CN110245236B - Information presentation method and device and electronic equipment - Google Patents
- Publication number
- CN110245236B (application CN201910556396.0A)
- Authority
- CN
- China
- Prior art keywords
- emotion
- user
- information
- determining
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/35—Clustering; Classification
- G06F16/353—Clustering; Classification into predefined classes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/35—Clustering; Classification
- G06F16/358—Browsing; Visualisation therefor
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The embodiments of the present disclosure provide an information presentation method and apparatus, and an electronic device. The information presentation method includes: acquiring information to be determined; extracting user emotion keywords from the information to be determined; determining user emotion information based on the user emotion keywords by using a machine learning method; and presenting the user emotion information in a visual manner. The embodiments of the present disclosure solve the technical problem of how to present information that accurately reflects a user's true emotion: the presented information is intuitive and can be shown visually, visual fatigue is less likely to occur, users' emotions can be distinguished, and disputes arising from heated discussion can be avoided.
Description
Technical Field
The present disclosure relates to the field of information processing technologies, and in particular, to an information presentation method and apparatus, and an electronic device.
Background
At present, people often form views on particular things or events. For example, in the case of financial instruments such as stocks, users express their sentiment toward each stock by publishing viewpoints.
The prior art generally presents user viewpoint information in the form of a list.
However, this flat, narrative manner of presentation leaves the information cluttered, making it difficult to surface representative user viewpoint information from among many items of viewpoint information.
Therefore, the prior art, which presents user viewpoint information as a list, has the defect that it cannot accurately reflect the user's true emotion.
Brief Summary of the Present Disclosure
The embodiments of the present disclosure mainly aim to provide an information presentation method, an information presentation apparatus, and an electronic device, so as to solve the technical problem of how to present information that accurately reflects a user's true emotion.
In order to achieve the above object, in a first aspect, the present disclosure provides the following technical solutions:
an information presentation method, comprising:
acquiring information to be determined;
extracting user emotion keywords from the information to be determined;
determining user emotion information based on the user emotion keywords by using a machine learning method;
and presenting the user emotion information in a visual manner.
Further, the step of determining the user emotion information based on the user emotion keywords by using the machine learning method specifically includes:
classifying the user emotion keywords by emotion dimension to obtain a classification result;
acquiring the activity level of the information published by the user and the user's historical emotion information;
determining a weight for the classification result according to the activity level of the information published by the user and the historical emotion information;
and determining the user emotion information based on the classification result and its weight.
Further, the step of determining the user emotion information based on the classification result and its weight specifically includes:
scoring the classification result;
calculating a weighted sum of the scoring result and the weight;
and taking the emotion element in the emotion dimension corresponding to the maximum weighted-sum score as the user emotion information.
Further, after the step of determining the user emotion information based on the user emotion keywords by using the machine learning method, the method further includes:
assigning a predetermined tag to the user emotion information;
and presenting the user emotion information in the form of the tag.
In order to achieve the above object, in a second aspect, the present disclosure further provides the following technical solutions:
an information presentation device, comprising:
the acquisition module is used for acquiring information to be determined;
the extraction module is used for extracting user emotion keywords from the information to be determined;
the determining module is used for determining user emotion information based on the user emotion keywords by using a machine learning method;
and the first presentation module is used for presenting the user emotion information in a visual manner.
Further, the determining module is specifically configured to:
classify the user emotion keywords by emotion dimension to obtain a classification result;
acquire the activity level of the information published by the user and the user's historical emotion information;
determine a weight for the classification result according to the activity level and the historical emotion information;
and determine the user emotion information based on the classification result and its weight.
Further, the determining module is further configured to:
score the classification result;
calculate a weighted sum of the scoring result and the weight;
and take the emotion element in the emotion dimension corresponding to the maximum weighted-sum score as the user emotion information.
Further, the apparatus further comprises:
an assigning module, used for assigning a predetermined tag to the user emotion information;
and a second presentation module, used for presenting the user emotion information in the form of the tag.
In order to achieve the above object, in a third aspect, the present disclosure further provides the following technical solutions:
an electronic device comprising a processor and a memory; wherein:
the memory is used for storing a computer program;
the processor is configured to implement the method steps of any one of the first aspect when executing the program stored in the memory.
The embodiments of the present disclosure provide an information presentation method and apparatus, and an electronic device. The information presentation method includes: acquiring information to be determined; extracting user emotion keywords from the information to be determined; determining user emotion information based on the user emotion keywords by using a machine learning method; and presenting the user emotion information in a visual manner. Through the embodiments of the present disclosure, information that accurately reflects a user's true emotion can be presented; the information is intuitive and can be shown visually, visual fatigue is less likely to occur, users' emotions can be distinguished, and disputes arising from heated discussion can be avoided.
Of course, it is not necessary for any product practicing the present disclosure to achieve all of the above advantages at the same time.
In order to make the technical means of the present disclosure more clearly understood, and to make the above and other objects, features and advantages of the present disclosure more apparent, the following detailed description of preferred embodiments is given in conjunction with the accompanying drawings. Additional features and advantages of the disclosure will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the disclosure. The objectives and other advantages of the disclosure may be realized and attained by the structure particularly pointed out in the written description, the claims, and the appended drawings. The claimed subject matter is not limited to implementations that solve any or all of the disadvantages noted in the background.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below; the drawings are included as part of the present disclosure to aid understanding. The drawings described below are merely exemplary embodiments of the disclosure, and other drawings may be derived from them by those skilled in the art without inventive effort. Wherein:
FIG. 1 is a schematic flow diagram of an information presentation method according to an example embodiment;
fig. 2 is a schematic structural diagram of an information presentation apparatus according to an exemplary embodiment.
The drawings and written description above are not intended to limit the scope of the disclosure in any way, but rather to illustrate the concepts of the disclosure to those skilled in the art by referencing a particular embodiment. Also, the numerals and text in any of the figures are merely for the purpose of more clearly illustrating the disclosure and should not be taken as unduly limiting the scope of the disclosure.
Detailed Description
The embodiments of the present disclosure are described below with specific examples, and other advantages and effects of the present disclosure will be readily apparent to those skilled in the art from the disclosure in the specification. It is to be understood that the described embodiments are merely illustrative of some, and not restrictive, of the embodiments of the disclosure. The disclosure may be embodied or carried out in various other specific embodiments, and various modifications and changes may be made in the details within the description without departing from the spirit of the disclosure. It should be noted that, in the following embodiments and examples, features may be combined with each other to form a technical solution without conflict. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
It is noted that various aspects of the embodiments are described below within the scope of the appended claims. It should be apparent that the aspects described herein may be embodied in a wide variety of forms and that any specific structure and/or function described herein is merely illustrative. Based on the disclosure, one skilled in the art should appreciate that one aspect described herein may be implemented independently of any other aspects and that two or more of these aspects may be combined in various ways. For example, an apparatus may be implemented and/or a method practiced using any number of the aspects set forth herein. Additionally, such an apparatus may be implemented and/or such a method may be practiced using other structure and/or functionality in addition to one or more of the aspects set forth herein.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present disclosure, and the drawings only show the components related to the present disclosure rather than the number, shape and size of the components in actual implementation, and the type, amount and ratio of the components in actual implementation may be changed arbitrarily, and the layout of the components may be more complicated.
In addition, in the following description, specific details are provided to facilitate a thorough understanding of the examples. However, it will be understood by those skilled in the art that the aspects may be practiced without these specific details. The exemplary embodiments of the present disclosure and their descriptions are intended to be illustrative of the disclosure, but should not be construed to unduly limit the scope of the disclosure.
Currently, the prior art generally performs information presentation by listing user viewpoint information. However, the information presented in this way is cluttered, and it is difficult to present information that accurately reflects users' true emotions from among many users' viewpoints. In order to solve the technical problem of how to present information that accurately reflects a user's true emotion, the embodiments of the present disclosure provide an information presentation method. The method can be applied to a cloud, a server cluster, and the like. As shown in fig. 1, the method may mainly include:
s100: and acquiring information to be determined.
The information to be determined may be, for example, opinion information published by a user in a stock forum community, opinion information published by a user in a crude oil forum community, or the like.
S110: and extracting the emotion key words of the user from the information to be determined.
The user emotion keyword may be, for example, "rising", "like", "cow", "good", "purchase by organization", or the like.
In practical applications, the user emotion keywords can be extracted from the information to be determined by a natural language processing method.
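As a rough illustration of this extraction step, a minimal lexicon-matching sketch is shown below. The lexicon contents and the function name are hypothetical (the patent does not specify them), and a production system would use proper tokenization from an NLP toolkit rather than substring matching:

```python
# Hypothetical sentiment lexicon; the patent only gives a few example keywords.
EMOTION_LEXICON = {
    "rising", "like", "bullish", "good", "institutional buying",
    "falling", "dislike", "bearish", "bad", "sell-off",
}

def extract_emotion_keywords(text: str) -> list:
    """Return the emotion keywords found in a user's post.

    Crude substring matching stands in for real NLP tokenization.
    """
    return [kw for kw in EMOTION_LEXICON if kw in text]
```

For example, a post such as "the stock keeps rising and institutional buying is strong" would yield the keywords "rising" and "institutional buying".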
S120: and determining the emotion information of the user based on the emotion keywords of the user by using a machine learning method.
In an optional embodiment, the step may specifically include: the following steps S121 to S124. Wherein:
s121: and classifying the emotion keywords of the user according to the emotion dimensionality to obtain a classification result.
The emotional dimension may include, for example, positive emotion, negative emotion, and neutral emotion, and may also include positive emotion and negative emotion, etc.
For example, this step may classify the user emotion keywords as positive emotion keywords or negative emotion keywords.
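The classification of S121 can be sketched as a simple lexicon lookup. The polarity lists below are illustrative assumptions, since the patent does not enumerate them:

```python
# Hypothetical polarity lists for the emotion dimensions.
POSITIVE = {"rising", "like", "bullish", "good", "institutional buying"}
NEGATIVE = {"falling", "dislike", "bearish", "bad", "sell-off"}

def classify_keywords(keywords) -> dict:
    """Bucket each user emotion keyword into an emotion dimension."""
    result = {"positive": [], "negative": [], "neutral": []}
    for kw in keywords:
        if kw in POSITIVE:
            result["positive"].append(kw)
        elif kw in NEGATIVE:
            result["negative"].append(kw)
        else:
            result["neutral"].append(kw)
    return result
```

A trained classifier could replace the lookup without changing the surrounding pipeline.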
S122: and acquiring the activity and historical emotion information of the information issued by the user.
Taking users' viewpoint information on individual stocks as an example, this step considers at least the following case: a user's historical emotion can be matched against the emotion index of each stock, so that the proportion of the user's past emotion that agreed with those emotion indices can be taken into account.
Here, the "activity level of the information published by the user" refers, for example, to how actively the user publishes viewpoints on a given stock.
S123: determining a weight for the classification result according to the activity level of the information published by the user and the historical emotion information.
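The patent does not give a concrete formula for S123, so the following is only one plausible sketch, in which posting activity and historical-emotion consistency (both assumed to be normalized to [0, 1]) are blended into a single credibility weight:

```python
def determine_weight(activity: float, history_consistency: float) -> float:
    """Combine posting activity and historical-emotion consistency
    into one weight for the user's classification result.

    The equal 50/50 blend is an illustrative choice, not the
    patent's formula (which is unspecified).
    """
    activity = min(max(activity, 0.0), 1.0)
    history_consistency = min(max(history_consistency, 0.0), 1.0)
    return 0.5 * activity + 0.5 * history_consistency
```

Under this choice, a highly active user whose past emotions matched the emotion indices well receives a weight near 1, so their current viewpoint counts more in S124.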
s124: based on the classification result and the weight thereof, the emotion information of the user is determined.
Specifically, the present step S124 may include steps Sa1 to Sa 3. Wherein:
sa 1: and scoring the classification result.
For example, if the user emotion keywords are classified as positive emotion keywords and negative emotion keywords; scoring is respectively performed on the positive emotion keywords and the negative emotion keywords to serve as scores of emotion elements in the emotion dimensionality.
Sa 2: a weighted sum of the scoring results and the weights is calculated.
Sa 3: and taking the emotion elements in the emotion dimensionality corresponding to the maximum value of the summation score as the emotion information of the user.
For example, if the emotion element in the emotion dimension corresponding to the maximum value of the summation score is a positive emotion, the user emotion information is represented as a positive emotion.
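Steps Sa1 to Sa3 can be sketched as follows; the per-dimension scores and weights are assumed inputs (e.g. keyword counts from S121 and weights derived as in S123):

```python
def determine_user_emotion(scores: dict, weights: dict) -> str:
    """Sa2-Sa3 sketch: weight each emotion dimension's score, then
    return the emotion element of the dimension with the maximum
    weighted score.

    scores  -- {dimension: raw score}, e.g. keyword counts (Sa1)
    weights -- {dimension: weight} from activity/history (S123)
    """
    weighted = {dim: scores[dim] * weights.get(dim, 1.0) for dim in scores}
    return max(weighted, key=weighted.get)
```

Note that the weighting can flip the outcome relative to raw counts: three negative keywords with weight 0.4 (weighted score 1.2) lose to two positive keywords with weight 0.9 (weighted score 1.8).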
S130: and presenting the emotional information of the user in a visual mode.
The visualization method includes, but is not limited to, small videos, pictures, animations, and the like.
The emotion information of the user is presented in a visual mode, and the technical effect of discussion quantification can be achieved.
In an optional embodiment, after step S120, the method may further include:
Sb1: assigning a predetermined tag to the user emotion information;
Sb2: presenting the user emotion information in the form of the tag.
This technical solution achieves the technical effect of presenting the user's emotion in the form of a user-viewpoint tag.
In summary, by adopting the above technical solution, the embodiments of the present disclosure can present information that accurately reflects a user's true emotion; the information is intuitive and can be shown visually, so that visual fatigue is less likely to occur, users' emotions can be distinguished, and disputes arising from heated discussion can be avoided.
In the above, although the steps in the embodiment of the information presentation method are described in the above sequence, it should be clear to those skilled in the art that the steps in the embodiment of the present disclosure are not necessarily performed in the above sequence, and may also be performed in other sequences such as reverse sequence, parallel sequence, cross sequence, etc., and further, on the basis of the above steps, those skilled in the art may also add other steps, and these obvious modifications or equivalents should also be included in the protection scope of the present disclosure, and are not described herein again.
For convenience of description, only the relevant parts of the embodiments of the present disclosure are shown, and details of the specific techniques are not disclosed, please refer to the embodiments of the method of the present disclosure. Functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
In order to solve the technical problem of how to present information that accurately reflects a user's true emotion, the embodiments of the present disclosure further provide an information presentation apparatus. As shown in fig. 2, the apparatus may mainly include: an acquisition module 21, an extraction module 22, a determining module 23, and a first presentation module 24. The acquisition module 21 is configured to acquire information to be determined. The extraction module 22 is configured to extract user emotion keywords from the information to be determined. The determining module 23 is configured to determine user emotion information based on the user emotion keywords by using a machine learning method. The first presentation module 24 is configured to present the user emotion information in a visual manner.
In an alternative embodiment, the determining module 23 is specifically configured to: classify the user emotion keywords by emotion dimension to obtain a classification result; acquire the activity level of the information published by the user and the user's historical emotion information; determine a weight for the classification result according to the activity level and the historical emotion information; and determine the user emotion information based on the classification result and its weight.
In an alternative embodiment, the determining module 23 is further configured to: score the classification result; calculate a weighted sum of the scoring result and the weight; and take the emotion element in the emotion dimension corresponding to the maximum weighted-sum score as the user emotion information.
In an alternative embodiment, the apparatus may further include an assigning module and a second presentation module. The assigning module is configured to assign a predetermined tag to the user emotion information. The second presentation module is configured to present the user emotion information in the form of the tag.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus, the technical problems to be solved and the technical effects to be achieved in the foregoing description may refer to the corresponding process in the foregoing method embodiments, the technical problems to be solved and the technical effects to be achieved, and are not described herein again.
In summary, in the embodiments of the present disclosure, by using the acquisition module 21, the extraction module 22, the determining module 23 and the first presentation module 24, information that accurately reflects a user's true emotion can be presented; the information is intuitive and can be shown visually, so that visual fatigue is less likely to occur, users' emotions can be distinguished, and disputes arising from heated discussion can be avoided.
In addition, in order to solve the technical problem of how to present information accurately reflecting the real emotion of the user, the embodiment of the present disclosure also provides an electronic device, which includes a processor and a memory. Wherein the memory is used for storing a computer program. The processor is configured to implement the method steps described in any of the foregoing information presentation method embodiments when executing the program stored in the memory.
The processor may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), or a PLA (Programmable Logic Array). The processor may also include a main processor and a coprocessor: the main processor handles data in the awake state and is also called a CPU (Central Processing Unit), while the coprocessor is a low-power processor for handling data in the standby state. In some embodiments, the processor may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor may further include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
The memory may include one or more computer-readable storage media, which may be non-transitory. The memory may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in a memory is to store at least one instruction for execution by a processor.
In some exemplary embodiments, the electronic device further optionally comprises: a peripheral interface and at least one peripheral. The processor, memory and peripheral interface may be connected by bus or signal lines. Each peripheral may be connected to the peripheral interface via a bus, signal line, or circuit board.
The electronic devices include, but are not limited to, a cloud, a server cluster, and the like.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the electronic device, the technical problems to be solved thereby, and the technical effects to be achieved thereby may refer to the corresponding process in the foregoing method embodiments, the technical problems to be solved thereby, and the technical effects to be achieved thereby, and are not described herein again.
The foregoing describes the general principles of the present disclosure in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present disclosure are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present disclosure. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the disclosure is not intended to be limited to the specific details so described.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
It should be noted that the flowcharts and/or block diagrams referred to herein are not limited to the forms shown herein, and may be divided and/or combined.
It is also noted that in the systems and methods of the present disclosure, components or steps may be decomposed and/or re-combined. These decompositions and/or recombinations are to be considered equivalents of the present disclosure. The embodiments in the present specification are described in a related manner, each embodiment focuses on differences from other embodiments, and the same and similar parts in the embodiments are referred to each other. Various changes, substitutions and alterations to the techniques described herein may be made without departing from the techniques of the teachings as defined by the appended claims. Moreover, the scope of the claims of the present disclosure is not limited to the particular aspects of the process, machine, manufacture, composition of matter, means, methods and acts described above. Processes, machines, manufacture, compositions of matter, means, methods, or acts, presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding aspects described herein may be utilized. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or acts. Other embodiments of the present invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims.
The above description is only for the preferred embodiment of the present disclosure, and is not intended to limit the scope of the present disclosure. Any modification, equivalent replacement, improvement, change, addition, sub-combination and the like made within the spirit and principle of the present disclosure are included in the protection scope of the present disclosure.
Claims (7)
1. An information presentation method, comprising:
acquiring information to be determined;
extracting user emotion keywords from the information to be determined;
determining user emotion information based on the user emotion keywords by using a machine learning method;
and presenting the user emotion information in a visual manner;
wherein the step of determining the user emotion information based on the user emotion keywords by using the machine learning method specifically includes:
classifying the user emotion keywords by emotion dimension to obtain a classification result;
acquiring the activity level of the information published by the user and the user's historical emotion information;
determining a weight for the classification result according to the activity level of the information published by the user and the historical emotion information;
and determining the user emotion information based on the classification result and its weight.
2. The method according to claim 1, wherein the step of determining the user emotion information based on the classification result and its weight specifically comprises:
scoring the classification result;
calculating a weighted sum of the scoring result and the weight;
and taking the emotion element in the emotion dimension corresponding to the maximum weighted-sum score as the user emotion information.
3. The method of claim 1, wherein after the step of determining the user emotion information based on the user emotion keywords using the machine learning method, the method further comprises:
assigning a predetermined tag to the user emotion information;
and presenting the user emotion information in the form of the tag.
4. An information presentation apparatus, comprising:
the acquisition module is used for acquiring information to be determined;
the extraction module is used for extracting user emotion keywords from the information to be determined;
the determining module is used for determining user emotion information based on the user emotion keywords by using a machine learning method;
the first presentation module is used for presenting the user emotion information in a visual manner;
wherein the determining module is specifically configured to:
classify the user emotion keywords by emotion dimension to obtain a classification result;
acquire the activity level of the information published by the user and the user's historical emotion information;
determine a weight for the classification result according to the activity level and the historical emotion information;
and determine the user emotion information based on the classification result and its weight.
5. The apparatus according to claim 4, wherein the determining module is further configured to:
score the classification results;
calculate a weighted sum of the scoring results and the weights;
and take the emotion element in the emotion dimension corresponding to the maximum weighted score as the emotion information of the user.
6. The apparatus according to claim 4, further comprising:
an assignment module, configured to assign a predetermined tag to the emotion information of the user;
and a second presentation module, configured to present the emotion information of the user in the form of the tag.
7. An electronic device, comprising a processor and a memory, wherein:
the memory is configured to store a computer program;
and the processor is configured to implement the method steps of any one of claims 1-3 when executing the program stored in the memory.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910556396.0A CN110245236B (en) | 2019-06-25 | 2019-06-25 | Information presentation method and device and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110245236A CN110245236A (en) | 2019-09-17 |
CN110245236B (en) | 2021-07-20 |
Family
ID=67889560
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910556396.0A Active CN110245236B (en) | 2019-06-25 | 2019-06-25 | Information presentation method and device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110245236B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112667196A (en) * | 2021-01-28 | 2021-04-16 | Baidu Online Network Technology (Beijing) Co., Ltd. | Information display method and device, electronic equipment and medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106296282A (en) * | 2016-08-08 | 2017-01-04 | Nanjing University | An online-shopping product evaluation method based on user comments and historical ratings |
CN107066442A (en) * | 2017-02-15 | 2017-08-18 | Alibaba Group Holding Ltd. | Mood value detection method, device and electronic equipment |
CN107870896A (en) * | 2016-09-23 | 2018-04-03 | Suning Commerce Group Co., Ltd. | A dialog analysis method and device |
CN108363699A (en) * | 2018-03-21 | 2018-08-03 | Zhejiang University City College | A netizen academic emotion analysis method based on Baidu's mhkc |
CN104951807B (en) * | 2015-07-10 | 2018-09-25 | Womin High-Tech (Beijing) Co., Ltd. | Method and apparatus for determining stock market mood |
CN109040471A (en) * | 2018-10-15 | 2018-12-18 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Emotion advisory method, device, mobile terminal and storage medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106682929B (en) * | 2015-11-10 | 2021-01-22 | Beijing Gridsum Technology Co., Ltd. | Information analysis method and device |
US20170169008A1 (en) * | 2015-12-15 | 2017-06-15 | Le Holdings (Beijing) Co., Ltd. | Method and electronic device for sentiment classification |
CN106022676A (en) * | 2016-05-09 | 2016-10-12 | South China University of Technology | Method and apparatus for rating complaint willingness of logistics clients |
US20190122232A1 (en) * | 2017-10-25 | 2019-04-25 | Mashwork Inc. Dba Canvs | Systems and methods for improving classifier accuracy |
CN108764010A (en) * | 2018-03-23 | 2018-11-06 | Jiang Hanyu | Emotional state determination method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10546005B2 (en) | Perspective data analysis and management | |
US10496752B1 (en) | Consumer insights analysis using word embeddings | |
Vu et al. | An experiment in integrating sentiment features for tech stock prediction in twitter | |
US11182806B1 (en) | Consumer insights analysis by identifying a similarity in public sentiments for a pair of entities | |
US10685183B1 (en) | Consumer insights analysis using word embeddings | |
US10558759B1 (en) | Consumer insights analysis using word embeddings | |
US10509863B1 (en) | Consumer insights analysis using word embeddings | |
US10803248B1 (en) | Consumer insights analysis using word embeddings | |
US20160328761A1 (en) | Automatic review excerpt extraction | |
US20210157856A1 (en) | Positive/negative facet identification in similar documents to search context | |
US11030539B1 (en) | Consumer insights analysis using word embeddings | |
CN109101489A (en) | An automatic text summarization method and device, and an electronic device | |
US8290925B1 (en) | Locating product references in content pages | |
CN108805444A (en) | Evaluation method, device, equipment and computer-readable storage medium | |
CN112560461A (en) | News clue generation method and device, electronic equipment and storage medium | |
US20180247240A1 (en) | Judgment support system and judgment support method | |
CN110245236B (en) | Information presentation method and device and electronic equipment | |
US10042913B2 (en) | Perspective data analysis and management | |
US10685184B1 (en) | Consumer insights analysis using entity and attribute word embeddings | |
US20170277694A1 (en) | Search navigation element | |
KR102299525B1 (en) | Product Evolution Mining Method And Apparatus Thereof | |
US20210271637A1 (en) | Creating descriptors for business analytics applications | |
CN117251761A (en) | Data object classification method and device, storage medium and electronic device | |
CN112231468A (en) | Information generation method and device, electronic equipment and storage medium | |
CN111797633A (en) | Feature submission deduplication engine |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||