CN109033089A - Sentiment analysis method and apparatus - Google Patents

Sentiment analysis method and apparatus

Info

Publication number
CN109033089A
CN109033089A (application number CN201811037201.3A)
Authority
CN
China
Prior art keywords
prediction model
classification result
model to be trained
output
feature data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811037201.3A
Other languages
Chinese (zh)
Other versions
CN109033089B (en)
Inventor
车天博
高维国
何晓冬
刘晓华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd, Beijing Jingdong Shangke Information Technology Co Ltd
Priority to CN201811037201.3A
Publication of CN109033089A
Application granted
Publication of CN109033089B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/30 Semantic analysis
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/01 Customer relationship services


Abstract

The present disclosure provides a sentiment analysis method and apparatus. The sentiment analysis apparatus performs feature extraction on a user conversation, inputs the extracted conversation features into a preset first prediction model and a preset second prediction model respectively, and inputs first feature data from the first prediction model into the second prediction model, so that the second prediction model fuses the first feature data with second feature data of its own and obtains a sentiment classification result of the user conversation using the fused feature data. By using transfer learning to migrate the knowledge in the first prediction model into the second prediction model, the present disclosure enables the second prediction model to obtain better classification results.

Description

Sentiment analysis method and apparatus
Technical field
The present disclosure relates to the field of information processing, and in particular to a sentiment analysis method and apparatus.
Background technique
As the window directly facing users, human customer service plays an increasingly important role in the Internet industry. The problem-solving ability of customer service directly affects the user experience and the user's impression of the company.
At present, user sentiment is scored by means of text classification. For example, a score of 1 indicates very dissatisfied, while a score of 5 indicates very satisfied.
Summary of the invention
The inventors have discovered through research that, in practical business scenarios, merely scoring user sentiment cannot accurately reveal the cause of the user's dissatisfaction, so the business cannot be effectively improved.
To this end, the present disclosure provides a scheme that can classify user sentiment according to the user conversation, thereby facilitating a timely understanding of the cause of the user's dissatisfaction.
According to one aspect of one or more embodiments of the present disclosure, a sentiment analysis method is provided, comprising: performing feature extraction on a user conversation; inputting the extracted user conversation features into a preset first prediction model and a preset second prediction model respectively; and inputting first feature data in the first prediction model into the second prediction model, so that the second prediction model fuses the first feature data with second feature data of its own and obtains a sentiment classification result of the user conversation using the fused feature data.
In some embodiments, the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model.
In some embodiments, each classification result of the first prediction model is associated with at least one classification result of the second prediction model.
In some embodiments, the classification results of the first prediction model include happy, neutral, and negative; the classification results of the second prediction model include happy, neutral, anxiety, anger, fear, sad, and lost, wherein the negative classification result of the first prediction model is associated with the anxiety, anger, fear, sad, and lost classification results of the second prediction model.
In some embodiments, the first prediction model and the second prediction model are character-based convolutional neural networks.
In some embodiments, training data is separately input into the first prediction model and a model to be trained, so that the first prediction model outputs a classification result, wherein the training data includes user conversation features; the feature data in the first prediction model is input into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model with its own feature data and outputs a classification result using the fused feature data; and the parameters of the model to be trained are adjusted according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model, so as to obtain the second prediction model.
In some embodiments, if no association exists between the classification result output by the model to be trained and the classification result output by the first prediction model, it is determined that a deviation exists between the classification result output by the model to be trained and the classification result output by the first prediction model.
According to another aspect of one or more embodiments of the present disclosure, a sentiment analysis apparatus is provided, comprising: a feature extraction module configured to perform feature extraction on a user conversation; a feature input module configured to input the extracted user conversation features into a preset first prediction model and a preset second prediction model respectively; and a transfer learning module configured to input first feature data in the first prediction model into the second prediction model, so that the second prediction model fuses the first feature data with second feature data of its own and obtains a sentiment classification result of the user conversation using the fused feature data.
In some embodiments, the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model.
In some embodiments, each classification result of the first prediction model is associated with at least one classification result of the second prediction model.
In some embodiments, the classification results of the first prediction model include happy, neutral, and negative; the classification results of the second prediction model include happy, neutral, anxiety, anger, fear, sad, and lost, wherein the negative classification result of the first prediction model is associated with the anxiety, anger, fear, sad, and lost classification results of the second prediction model.
In some embodiments, the first prediction model and the second prediction model are character-based convolutional neural networks.
In some embodiments, the above apparatus further includes a training module configured to: separately input training data into the first prediction model and a model to be trained, so that the first prediction model outputs a classification result, wherein the training data includes user conversation features; input the feature data in the first prediction model into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model with its own feature data and outputs a classification result using the fused feature data; and adjust the parameters of the model to be trained according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model, so as to obtain the second prediction model.
In some embodiments, the training module is further configured to determine that a deviation exists between the classification result output by the model to be trained and the classification result output by the first prediction model if no association exists between the two classification results.
According to another aspect of one or more embodiments of the present disclosure, a sentiment analysis apparatus is provided, comprising: a memory configured to store instructions; and a processor coupled to the memory and configured to execute, based on the instructions stored in the memory, the method of any of the above embodiments.
According to yet another aspect of one or more embodiments of the present disclosure, a computer-readable storage medium is provided, wherein the computer-readable storage medium stores computer instructions which, when executed by a processor, implement the method of any of the above embodiments.
Other features of the present disclosure and their advantages will become apparent from the following detailed description of exemplary embodiments of the present disclosure with reference to the accompanying drawings.
Detailed description of the invention
In order to explain the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present disclosure; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is an exemplary flowchart of a sentiment analysis method according to an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of a sentiment analysis network model according to an embodiment of the present disclosure;
Fig. 3 is an exemplary flowchart of a sentiment analysis method according to another embodiment of the present disclosure;
Fig. 4 is a schematic diagram of a sentiment analysis network model according to another embodiment of the present disclosure;
Fig. 5 is an exemplary block diagram of a sentiment analysis apparatus according to an embodiment of the present disclosure;
Fig. 6 is an exemplary block diagram of a sentiment analysis apparatus according to another embodiment of the present disclosure;
Fig. 7 is an exemplary block diagram of a sentiment analysis apparatus according to yet another embodiment of the present disclosure.
Specific embodiment
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings in the embodiments of the present disclosure. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present disclosure. The following description of at least one exemplary embodiment is in fact merely illustrative and in no way limits the present disclosure or its application or use. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
Unless specifically stated otherwise, the relative arrangement of the components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the present disclosure.
Meanwhile, it should be understood that, for ease of description, the sizes of the various parts shown in the drawings are not drawn according to actual proportional relationships.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques, methods, and apparatus should be considered part of the specification.
In all examples shown and discussed here, any specific value should be interpreted as merely illustrative rather than as a limitation. Therefore, other examples of the exemplary embodiments may have different values.
It should also be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further discussed in subsequent drawings.
Fig. 1 is an exemplary flowchart of a sentiment analysis method according to an embodiment of the present disclosure. In some embodiments, the method steps of this embodiment may be executed by a sentiment analysis apparatus.
In step 101, feature extraction is performed on a user conversation.
In step 102, the extracted user conversation features are input into a preset first prediction model and a preset second prediction model respectively.
In some embodiments, the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model. For example, the first prediction model outputs three classification results, while the second prediction model outputs seven. Thus, by drawing on the knowledge of the first prediction model, the second prediction model can perform a more fine-grained sentiment classification.
In some embodiments, each classification result of the first prediction model is associated with at least one classification result of the second prediction model. For example, the first prediction model is a three-class model whose classification results include happy, neutral, and negative. The second prediction model is a seven-class model whose classification results include happy, neutral, anxiety, fear, sad, lost, and anger. The happy classification result of the first prediction model is associated with the happy classification result of the second prediction model. The neutral classification result of the first prediction model is associated with the neutral classification result of the second prediction model. The negative classification result of the first prediction model is associated with the anxiety, anger, fear, sad, and lost classification results of the second prediction model.
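The coarse-to-fine label association described above can be sketched as a simple mapping. The label names follow this embodiment; the dictionary and helper function are a hypothetical illustration, not part of the patent itself:

```python
# Association between the 3-class model's labels and the 7-class model's
# labels, following the embodiment described above.
COARSE_TO_FINE = {
    "happy": {"happy"},
    "neutral": {"neutral"},
    "negative": {"anxiety", "anger", "fear", "sad", "lost"},
}

def is_associated(coarse_label, fine_label):
    # A fine-grained label is consistent with a coarse label exactly when
    # it belongs to that coarse label's associated set.
    return fine_label in COARSE_TO_FINE[coarse_label]

print(is_associated("negative", "anger"))  # True
print(is_associated("happy", "sad"))       # False
```

Note that every one of the seven fine labels is covered by exactly one coarse label, so the first model's output always constrains the second model's output.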
In some embodiments, the first prediction model and the second prediction model are character-based convolutional neural networks (charCNN).
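A character-based CNN of the kind mentioned here can be sketched minimally as follows. This is a toy NumPy forward pass under assumed dimensions, not the patent's actual architecture: characters are embedded, convolved over time, max-pooled, and projected to class logits.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, emb_dim, n_filters, kernel, n_classes = 50, 16, 8, 3, 3
E = rng.normal(size=(vocab_size, emb_dim))          # character embeddings
W = rng.normal(size=(n_filters, kernel * emb_dim))  # convolution filters
V = rng.normal(size=(n_classes, n_filters))         # output projection

def charcnn_logits(char_ids):
    x = E[char_ids]                                   # (seq_len, emb_dim)
    # Convolution over time: each window of `kernel` characters is
    # flattened and passed through the filters, followed by a ReLU.
    windows = np.stack([x[i:i + kernel].ravel()
                        for i in range(len(char_ids) - kernel + 1)])
    conv = np.maximum(windows @ W.T, 0.0)             # (n_windows, n_filters)
    pooled = conv.max(axis=0)                         # max-over-time pooling
    return V @ pooled                                 # class logits

logits = charcnn_logits(np.array([3, 7, 1, 9, 2]))
print(logits.shape)  # (3,)
```

Operating on character indices rather than word tokens is convenient for short, noisy customer-service messages, since no word segmentation is required.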
In step 103, the first feature data in the first prediction model is input into the second prediction model, so that the second prediction model fuses the first feature data with its own second feature data and obtains the sentiment classification result of the user conversation using the fused feature data.
In some embodiments, the first feature data from the first prediction model and the second feature data in the second prediction model may be spliced at the corresponding hidden layer of the second prediction model. For example, if the first feature data is a 200-dimensional vector and the second feature data is also a 200-dimensional vector, splicing them yields a 400-dimensional vector, which then continues to be processed in the second prediction model to obtain the sentiment classification result of the user conversation.
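The hidden-layer splice in this example can be sketched as follows, using the 200-dimension figures from the text. The random vectors and the output-layer weights are placeholders for illustration only:

```python
import numpy as np

rng = np.random.default_rng(42)
first_feat = rng.normal(size=200)   # feature data from the first model
second_feat = rng.normal(size=200)  # the second model's own feature data

# Splice the two feature vectors at the hidden layer ...
fused = np.concatenate([first_feat, second_feat])
print(fused.shape)  # (400,)

# ... then continue processing in the second model, e.g. a final
# fully-connected layer producing seven class logits.
W_out = rng.normal(size=(7, 400))
logits = W_out @ fused
print(logits.shape)  # (7,)
```

Concatenation keeps both representations intact and lets the subsequent layers learn how to weight the transferred features against the second model's own.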
In the sentiment analysis method provided by the above embodiment of the present disclosure, the features of the first prediction model are input into the second prediction model by way of transfer learning, thereby migrating the knowledge in the first prediction model into the second prediction model, so that the second prediction model can obtain better classification results.
Fig. 2 is a schematic diagram of a sentiment analysis network model according to an embodiment of the present disclosure. As shown in Fig. 2, the user conversation features are input into a preset first prediction model and a preset second prediction model respectively. The first prediction model is a three-class model with massive labeled data. The second prediction model is a seven-class model with relatively little labeled data. Each classification result of the first prediction model is associated with at least one classification result of the second prediction model. The first feature data in the first prediction model is input into the second prediction model, so that the second prediction model fuses the first feature data with its own second feature data and obtains the sentiment classification result of the user conversation using the fused feature data.
The results of performing sentiment analysis on example user conversations using the above embodiment are shown in Table 1.
Session ID | Emotion category | User conversation
1 | Neutral | Does the Pigeon nipple have a cross-cut hole?
2 | Lost | The watch is acting up; nothing comes up when I try to search
3 | Neutral | I just bought this power supply
4 | Angry | It's so slow!
5 | Neutral | Does the iPhone 5S come with 32G memory?
6 | Happy | Mwah, dear
7 | Neutral | How long are the size S trousers?
8 | Sad | Thanks anyway
9 | Sad | Even the reply can't give an answer
10 | Neutral | How do I set a timed shutdown?
11 | Neutral | Is there a more powerful one?
12 | Angry | My order still hasn't been delivered
Table 1
Fig. 3 is an exemplary flowchart of a sentiment analysis method according to another embodiment of the present disclosure. In some embodiments, the method steps of this embodiment may be executed by a sentiment analysis apparatus.
In step 301, training data is separately input into the first prediction model and a model to be trained, so that the first prediction model outputs a classification result, wherein the training data includes user conversation features.
In step 302, the feature data in the first prediction model is input into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model with its own feature data and outputs a classification result using the fused feature data.
In step 303, the parameters of the model to be trained are adjusted according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model, so as to obtain the second prediction model.
In some embodiments, if no association exists between the classification result output by the model to be trained and the classification result output by the first prediction model, it is determined that a deviation exists between the two classification results.
For example, if the first prediction model outputs the user sentiment as happy while the model to be trained outputs it as anger, there is a deviation between the classification results output by the model to be trained and by the first prediction model.
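Under this rule, a deviation arises exactly when the fine-grained label output by the model to be trained is not associated with the coarse label output by the first prediction model. A hedged sketch of such a check follows; the label names come from the embodiment, while the mapping, the function, and the fixed penalty value are assumptions for illustration (in practice the penalty would be folded into the training loss before the parameter update):

```python
# Coarse-to-fine label association from the embodiment above.
ASSOCIATION = {
    "happy": {"happy"},
    "neutral": {"neutral"},
    "negative": {"anxiety", "anger", "fear", "sad", "lost"},
}

def deviation_penalty(coarse_pred, fine_pred, weight=1.0):
    # Zero penalty when the trainee's fine label is associated with the
    # first model's coarse label; a fixed penalty otherwise.
    return 0.0 if fine_pred in ASSOCIATION[coarse_pred] else weight

# Example from the text: first model outputs "happy", trainee outputs "anger".
print(deviation_penalty("happy", "anger"))   # 1.0 -> deviation, adjust parameters
print(deviation_penalty("negative", "sad"))  # 0.0 -> consistent, no deviation
```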
Fig. 4 is a schematic diagram of a sentiment analysis network model according to another embodiment of the present disclosure. Training data is separately input into the first prediction model and the model to be trained, so that the first prediction model outputs a classification result. The first prediction model is a three-class model with massive labeled data; the second prediction model is a seven-class model with relatively little labeled data. The feature data in the first prediction model is input into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model with its own feature data and outputs a classification result using the fused feature data. The parameters of the model to be trained are adjusted according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model, so as to obtain the second prediction model.
Fig. 5 is an exemplary block diagram of a sentiment analysis apparatus according to an embodiment of the present disclosure. As shown in Fig. 5, the sentiment analysis apparatus includes a feature extraction module 51, a feature input module 52, and a transfer learning module 53.
The feature extraction module 51 is configured to perform feature extraction on a user conversation.
The feature input module 52 is configured to input the extracted user conversation features into a preset first prediction model and a preset second prediction model respectively.
In some embodiments, the number of classification results of the first prediction model is smaller than the number of classification results of the second prediction model. In some embodiments, each classification result of the first prediction model is associated with at least one classification result of the second prediction model. For example, the classification results of the first prediction model include happy, neutral, and negative, and the classification results of the second prediction model include happy, neutral, anxiety, anger, fear, sad, and lost, wherein the negative classification result of the first prediction model is associated with the anxiety, anger, fear, sad, and lost classification results of the second prediction model.
In some embodiments, the first prediction model and the second prediction model are character-based convolutional neural networks.
The transfer learning module 53 is configured to input the first feature data in the first prediction model into the second prediction model, so that the second prediction model fuses the first feature data with its own second feature data and obtains the sentiment classification result of the user conversation using the fused feature data.
Fig. 6 is an exemplary block diagram of a sentiment analysis apparatus according to another embodiment of the present disclosure. Fig. 6 differs from Fig. 5 in that, in the embodiment shown in Fig. 6, the sentiment analysis apparatus further includes a training module 54.
The training module 54 is configured to separately input training data into the first prediction model and a model to be trained, so that the first prediction model outputs a classification result, wherein the training data includes user conversation features. The training module 54 inputs the feature data in the first prediction model into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model with its own feature data and outputs a classification result using the fused feature data. The training module 54 adjusts the parameters of the model to be trained according to the deviation between the classification result output by the model to be trained and the classification result output by the first prediction model, so as to obtain the second prediction model.
In some embodiments, the training module 54 is further configured to determine that a deviation exists between the classification result output by the model to be trained and the classification result output by the first prediction model if no association exists between the two classification results.
Fig. 7 is an exemplary block diagram of a sentiment analysis apparatus according to yet another embodiment of the present disclosure. As shown in Fig. 7, the sentiment analysis apparatus includes a memory 71 and a processor 72.
The memory 71 is used for storing instructions. The processor 72 is coupled to the memory 71 and is configured to execute, based on the instructions stored in the memory, the method of any embodiment in Fig. 1 or Fig. 3.
As shown in Fig. 7, the sentiment analysis apparatus further includes a communication interface 73 for exchanging information with other devices. The apparatus also includes a bus 74, through which the processor 72, the communication interface 73, and the memory 71 communicate with one another.
The memory 71 may include a high-speed RAM and may also include a non-volatile memory, such as at least one disk memory. The memory 71 may also be a memory array. The memory 71 may also be divided into blocks, and the blocks may be combined into virtual volumes according to certain rules.
In addition, the processor 72 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present disclosure.
Table 2 is an experimental comparison between the scheme proposed by the above embodiments of the present disclosure and the prior art. Test 1 performs seven-class processing using a traditional charCNN model; Test 2 performs seven-class processing using the transfer learning of the present disclosure.
Test name | Precision | Recall | F1 score
Test 1 | 69.44% | 63.22% | 66%
Test 2 | 76.52% | 60.54% | 67.54%
Table 2
It can be seen from Table 2 that the scheme provided by the present disclosure effectively improves the accuracy rate of user sentiment classification, while the overall F1 score is also improved.
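As a sanity check, the reported F1 scores agree (to rounding) with the harmonic mean of the first two columns of Table 2, which suggests the "accuracy rate" column is precision. The helper below is a hypothetical illustration, not part of the patent:

```python
def f1_score(precision, recall):
    # F1 is the harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)

# Values from Table 2, expressed as fractions of 1.
print(round(100 * f1_score(0.6944, 0.6322), 2))  # Test 1: 66.18 (~66%)
print(round(100 * f1_score(0.7652, 0.6054), 2))  # Test 2: 67.6 (~67.54%)
```

The comparison illustrates the trade-off in Table 2: Test 2 gains substantially in precision at a small cost in recall, for a net F1 improvement.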
In some embodiments, the functional unit blocks described above can be implemented as a general-purpose processor, a programmable logic controller (PLC), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof for executing the functions described in the present disclosure.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above embodiments can be completed by hardware, or by a program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and the storage medium mentioned above can be a read-only memory, a magnetic disk, an optical disc, or the like.
The description of the present disclosure is given for the purpose of illustration and description, and is not intended to be exhaustive or to limit the present disclosure to the disclosed form. Many modifications and variations are obvious to those of ordinary skill in the art. The embodiments were selected and described in order to better illustrate the principles and practical applications of the present disclosure, and to enable those skilled in the art to understand the present disclosure and design various embodiments, with various modifications, suited to particular uses.

Claims (16)

1. A sentiment analysis method, comprising:
performing feature extraction on a user conversation;
inputting the extracted user conversation features into a preset first prediction model and a preset second prediction model respectively; and
inputting first feature data in the first prediction model into the second prediction model, so that the second prediction model fuses the first feature data with second feature data of its own and obtains a sentiment classification result of the user conversation using the fused feature data.
2. according to the method described in claim 1, wherein,
The classification results of first prediction model are less than the classification results of second prediction model.
3. according to the method described in claim 2, wherein,
Each classification results of first prediction model respectively at least one classification results phase of second prediction model Association.
4. according to the method described in claim 3, wherein,
The classification results of first prediction model include glad, neutral, negative;
The classification results of second prediction model include happiness, neutrality, anxiety, anger, fear, is sad, losing, wherein described The anxiety of the negative classification results of first prediction model and second prediction model anger, is feared, is sad and lose classification knot Fruit is associated.
5. according to the method described in claim 1, wherein,
First prediction model and second prediction model are the convolutional neural networks based on character.
6. The method according to any one of claims 1-5, further comprising:
inputting training data into the first prediction model and into a model to be trained, respectively, so that the first prediction model outputs classification results, wherein the training data includes user conversation features;
inputting feature data from the first prediction model into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model with feature data of its own and outputs classification results using the fused feature data;
adjusting parameters of the model to be trained according to the deviation between the classification results output by the model to be trained and the classification results output by the first prediction model, to obtain the second prediction model.
7. The method according to claim 6, wherein
if no association relationship exists between a classification result output by the model to be trained and the classification result output by the first prediction model, it is determined that a deviation exists between the classification results output by the model to be trained and the classification results output by the first prediction model.
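Under claims 6-7, a deviation arises exactly when the trainee model's fine-grained prediction has no association with the coarse prediction of the already-trained first model. A sketch of that check follows; the label names and the association table are illustrative assumptions:

```python
# Coarse-to-fine association table (as in claim 4); labels are
# illustrative English translations.
COARSE_TO_FINE = {
    "happy": {"happy"},
    "neutral": {"neutral"},
    "negative": {"anxious", "angry", "afraid", "sad", "lost"},
}

def deviation_mask(coarse_preds, fine_preds):
    """Per-sample deviation indicator: 1.0 where the fine-grained prediction
    is NOT associated with the coarse prediction, 0.0 otherwise.  During
    training, the deviating samples are the ones that drive parameter updates."""
    return [0.0 if fine in COARSE_TO_FINE.get(coarse, set()) else 1.0
            for coarse, fine in zip(coarse_preds, fine_preds)]
```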
8. A sentiment analysis device, comprising:
a feature extraction module configured to perform feature extraction on a user conversation;
a feature input module configured to input the extracted user conversation features into a preset first prediction model and a preset second prediction model, respectively;
a transfer learning module configured to input first feature data from the first prediction model into the second prediction model, so that the second prediction model fuses the first feature data with second feature data of its own and obtains an emotion classification result of the user conversation using the fused feature data.
9. The device according to claim 8, wherein
the first prediction model has fewer classification categories than the second prediction model.
10. The device according to claim 9, wherein
each classification category of the first prediction model is associated with at least one classification category of the second prediction model.
11. The device according to claim 10, wherein
the classification categories of the first prediction model include happy, neutral, and negative;
the classification categories of the second prediction model include happy, neutral, anxious, angry, afraid, sad, and lost, wherein the negative category of the first prediction model is associated with the anxious, angry, afraid, sad, and lost categories of the second prediction model.
12. The device according to claim 8, wherein
the first prediction model and the second prediction model are character-based convolutional neural networks.
13. The device according to any one of claims 8-12, further comprising:
a training module configured to: input training data into the first prediction model and into a model to be trained, respectively, so that the first prediction model outputs classification results, wherein the training data includes user conversation features; input feature data from the first prediction model into the model to be trained, so that the model to be trained fuses the feature data from the first prediction model with feature data of its own and outputs classification results using the fused feature data; and adjust parameters of the model to be trained according to the deviation between the classification results output by the model to be trained and the classification results output by the first prediction model, to obtain the second prediction model.
14. The device according to claim 13, wherein
the training module is further configured to determine that a deviation exists between the classification results output by the model to be trained and the classification results output by the first prediction model if no association relationship exists between those classification results.
15. A sentiment analysis device, comprising:
a memory configured to store instructions; and
a processor coupled to the memory, the processor configured to execute the instructions stored in the memory to implement the method of any one of claims 1-7.
16. A computer-readable storage medium storing computer instructions which, when executed by a processor, implement the method of any one of claims 1-7.
CN201811037201.3A 2018-09-06 2018-09-06 Emotion analysis method and device Active CN109033089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811037201.3A CN109033089B (en) 2018-09-06 2018-09-06 Emotion analysis method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811037201.3A CN109033089B (en) 2018-09-06 2018-09-06 Emotion analysis method and device

Publications (2)

Publication Number Publication Date
CN109033089A true CN109033089A (en) 2018-12-18
CN109033089B CN109033089B (en) 2021-01-26

Family

ID=64623779

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811037201.3A Active CN109033089B (en) 2018-09-06 2018-09-06 Emotion analysis method and device

Country Status (1)

Country Link
CN (1) CN109033089B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109829166A (en) * 2019-02-15 2019-05-31 重庆师范大学 People and host customer opinion mining method based on character-level convolutional neural network
CN110378726A (en) * 2019-07-02 2019-10-25 阿里巴巴集团控股有限公司 Recommendation method, system and electronic equipment for a target user
CN111626816A (en) * 2020-05-10 2020-09-04 石伟 Image interaction information processing method based on e-commerce live broadcast and cloud computing platform

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107247702A (en) * 2017-05-05 2017-10-13 桂林电子科技大学 Text sentiment analysis and processing method and system
CN107590134A (en) * 2017-10-26 2018-01-16 福建亿榕信息技术有限公司 Text sentiment classification method, storage medium and computer
CN107609009A (en) * 2017-07-26 2018-01-19 北京大学深圳研究院 Text sentiment analysis method, device, storage medium and computer equipment
CN108108355A (en) * 2017-12-25 2018-06-01 北京牡丹电子集团有限责任公司数字电视技术中心 Text sentiment analysis method and system based on deep learning
US20180165554A1 (en) * 2016-12-09 2018-06-14 The Research Foundation For The State University Of New York Semisupervised autoencoder for sentiment analysis
CN108460415A (en) * 2018-02-28 2018-08-28 国信优易数据有限公司 Pseudo-label generation model training method and pseudo-label generation method


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109829166A (en) * 2019-02-15 2019-05-31 重庆师范大学 People and host customer opinion mining method based on character-level convolutional neural network
CN109829166B (en) * 2019-02-15 2022-12-27 重庆师范大学 People and host customer opinion mining method based on character-level convolutional neural network
CN110378726A (en) * 2019-07-02 2019-10-25 阿里巴巴集团控股有限公司 Recommendation method, system and electronic equipment for a target user
CN111626816A (en) * 2020-05-10 2020-09-04 石伟 Image interaction information processing method based on e-commerce live broadcast and cloud computing platform

Also Published As

Publication number Publication date
CN109033089B (en) 2021-01-26

Similar Documents

Publication Publication Date Title
CN108319599A (en) Interactive method and apparatus
CN106649825B (en) Voice interaction system and creation method and device thereof
CN108984530A (en) Detection method and detection system for sensitive network content
CN109033089A (en) Sentiment analysis method and apparatus
CN110019770A (en) Method and apparatus for training classification models
CN108733644B (en) Text sentiment analysis method, computer-readable storage medium and terminal device
CN107305578A (en) Human-machine intelligent question answering method and device
CN107480122A (en) Artificial intelligence interaction method and artificial intelligence interaction device
CN107357787A (en) Semantic interaction method, apparatus and electronic equipment
CN105843796A (en) Microblog emotional tendency analysis method and device
CN103631874B (en) UGC label classification determining method and device for social platform
CN109918499A (en) Text classification method, device, computer equipment and storage medium
CN109145193A (en) Information pushing method and system
CN107589828A (en) Knowledge-graph-based human-machine interaction method and system
CN107273406A (en) Dialogue processing method and device in a task dialogue system
CN106601254A (en) Information input method, information input device and computing equipment
CN108108347B (en) Dialogue mode analysis system and method
CN107193948A (en) Human-computer dialogue data analysis method and device
CN109508373A (en) Calculation method, device and computer-readable storage medium for an enterprise public opinion index
CN106202053A (en) Social-network-driven microblog topic sentiment analysis method
CN110175323A (en) Method and device for generating message abstract
CN107093164A (en) Method and apparatus for generating image
CN109977225A (en) Public opinion analysis method and device
CN113282762A (en) Knowledge graph construction method and device, electronic equipment and storage medium
CN108829777A (en) Question reply method and device for a chat robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant