CN110245236A - Information demonstrating method, device and electronic equipment - Google Patents
- Publication number
- CN110245236A (application CN201910556396.0A)
- Authority
- CN
- China
- Prior art keywords
- information
- user emotion
- user
- emotional
- presented
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/35—Clustering; Classification
- G06F16/353—Clustering; Classification into predefined classes
- G06F16/358—Browsing; Visualisation therefor
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Embodiments of the present disclosure provide an information presentation method, an information presentation device, and an electronic device. The method includes: obtaining information to be determined; extracting user emotion keywords from the information to be determined; determining user emotion information based on the user emotion keywords using a machine-learning method; and presenting the user emotion information in a visual manner. The embodiments address the technical problem of how to present information that accurately reflects a user's true emotion: the presented information is visual and can be shown intuitively, is unlikely to cause visual fatigue, distinguishes between user emotions, and helps avoid inflaming conflict in discussions.
Description
Technical field
This disclosure relates to the technical field of information processing, and in particular to an information presentation method, an information presentation device, and an electronic device.
Background technique
Currently, people often publish opinions about particular things or events. For example, for financial instruments such as stocks, users express their mood about an individual stock by posting opinions about it.

The prior art generally presents user-opinion information by simply listing it.

However, this list-based presentation is tedious: the presented information is cluttered and unordered, and it is difficult to pick out the user-opinion information of interest from the mass of opinions.

Therefore, because it presents user opinions as a flat list, the prior art cannot accurately reflect a user's true emotion.
Summary of the disclosure
The main purpose of embodiments of the present disclosure is to provide an information presentation method, an information presentation device, and an electronic device that solve the technical problem of how to present information that accurately reflects a user's true emotion.
To achieve the above goal, in a first aspect, the present disclosure provides the following technical solution:
An information presentation method, comprising:
obtaining information to be determined;
extracting user emotion keywords from the information to be determined;
determining user emotion information based on the user emotion keywords, using a machine-learning method;
presenting the user emotion information in a visual manner.
Further, the step of determining user emotion information based on the user emotion keywords using a machine-learning method specifically includes:
classifying the user emotion keywords according to an emotional dimension to obtain classification results;
obtaining the liveness of the user's information publishing and the user's historical emotion information;
determining weights for the classification results according to that liveness and historical emotion information;
determining the user emotion information based on the classification results and their weights.
Further, the step of determining the user emotion information based on the classification results and their weights specifically includes:
scoring the classification results;
computing the weighted sum of the scores and the weights;
taking, as the user emotion information, the emotion element in the emotional dimension that corresponds to the maximum weighted sum.
Further, after the step of determining user emotion information based on the user emotion keywords using a machine-learning method, the method also includes:
assigning a predetermined label to the user emotion information;
presenting the user emotion information in the form of that label.
To achieve the above goal, in a second aspect, the present disclosure further provides the following technical solution:
An information presentation device, comprising:
an obtaining module for obtaining information to be determined;
an extraction module for extracting user emotion keywords from the information to be determined;
a determining module for determining user emotion information based on the user emotion keywords, using a machine-learning method;
a first presentation module for presenting the user emotion information in a visual manner.
Further, the determining module is specifically configured to:
classify the user emotion keywords according to an emotional dimension to obtain classification results;
obtain the liveness of the user's information publishing and the user's historical emotion information;
determine weights for the classification results according to that liveness and historical emotion information;
determine the user emotion information based on the classification results and their weights.
Further, the determining module is also configured to:
score the classification results;
compute the weighted sum of the scores and the weights;
take, as the user emotion information, the emotion element in the emotional dimension that corresponds to the maximum weighted sum.
Further, the device also includes:
an assigning module for assigning a predetermined label to the user emotion information;
a second presentation module for presenting the user emotion information in the form of that label.
To achieve the above goal, in a third aspect, the present disclosure further provides the following technical solution:
An electronic device, comprising a processor and a memory, wherein:
the memory stores a computer program; and
the processor, when executing the program stored in the memory, implements the method steps of any implementation of the first aspect.
Embodiments of the present disclosure provide an information presentation method, an information presentation device, and an electronic device. The information presentation method includes: obtaining information to be determined; extracting user emotion keywords from the information to be determined; determining user emotion information based on the user emotion keywords using a machine-learning method; and presenting the user emotion information in a visual manner. Embodiments of the present disclosure can thus present information that accurately reflects a user's true emotion; the information is visual and can be presented intuitively, is unlikely to cause visual fatigue, distinguishes between user emotions, and helps avoid inflaming conflict in discussions.
Of course, a product implementing the disclosure need not achieve all of the above advantages at the same time.
To make the technical means of the disclosure easier to understand and implementable in accordance with the contents of the specification, and to make the above and other objects, features, and advantages of the disclosure clearer, preferred embodiments are described in detail below with reference to the accompanying drawings. Other features and advantages of the disclosure are set forth in the following description; in part they become apparent from the description, or may be learned by practicing the disclosure. The objects and other advantages of the disclosure may be realized and obtained by the structures particularly pointed out in the specification, the claims, and the drawings. The claimed subject matter is not limited to implementations that solve any or all of the disadvantages noted in the background.
Brief description of the drawings
To explain the embodiments of the present disclosure or the prior-art technical solutions more clearly, the drawings needed to describe the embodiments or the prior art are briefly introduced below. Obviously, the drawings, which form part of this disclosure and serve its further understanding, show only some embodiments of the disclosure; a person of ordinary skill in the art can derive other drawings from them without creative effort. In the drawings:
Fig. 1 is a flow diagram of an information presentation method according to an exemplary embodiment;
Fig. 2 is a structural diagram of an information presentation device according to an exemplary embodiment.
The above drawings and the written description are not intended to limit the scope of protection of the disclosure in any way; rather, they illustrate the concepts of the disclosure to those skilled in the art by reference to specific embodiments. Likewise, the reference signs and text in the figures serve only to illustrate the disclosure more clearly and are not to be construed as unduly limiting its scope of protection.
Detailed description of embodiments
Embodiments of the present disclosure are explained below through specific examples; those skilled in the art can readily understand other advantages and effects of the disclosure from the content disclosed in this specification. Obviously, the described embodiments are only some, not all, of the embodiments of the disclosure. The disclosure may also be implemented or applied through other, different embodiments, and the details in this specification may be modified or varied in various ways based on different viewpoints and applications without departing from the spirit of the disclosure. It should be noted that, absent conflict, the features of the following embodiments may be combined with one another to form technical solutions. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the disclosure without creative effort fall within the scope of protection of the disclosure.
It should be noted that aspects of embodiments within the scope of the appended claims are described below. It should be apparent that the aspects described herein can be embodied in a wide variety of forms, and that any specific structure and/or function described herein is merely illustrative. Based on the disclosure, a person of ordinary skill in the art will appreciate that any aspect described herein can be implemented independently of any other aspect, and that two or more of these aspects can be combined in various ways. For example, a device may be implemented and/or a method practiced using any number of the aspects set forth herein. In addition, such a device may be implemented and/or such a method practiced using structures and/or functionality other than, or in addition to, one or more of the aspects set forth herein.
It should also be noted that the diagrams in the following embodiments illustrate the basic concept of the disclosure only schematically; they show only the components relevant to the disclosure rather than the actual number, shape, and size of components in an implementation. In an actual implementation, the type, quantity, and proportions of the components may vary freely, and the component layout may be considerably more complex.
In addition, specific details are provided in the following description for a thorough understanding of the examples. However, those skilled in the art will understand that the aspects can be practiced without these specific details. The illustrative examples of the disclosure and their explanation serve to explain the disclosure, but do not unduly limit its scope of protection.
Currently, the prior art generally presents information by listing user-opinion information. However, the information presented this way is cluttered and unordered, and it is difficult to surface, from the many user opinions, the information that accurately reflects a user's true emotion. To solve the technical problem of how to present information that accurately reflects a user's true emotion, embodiments of the present disclosure provide an information presentation method. The method may be applied to a cloud, a server, a server cluster, or the like. As shown in Fig. 1, the method may mainly include:
S100: obtain information to be determined.
The information to be determined may be, for example, opinion information posted by users in a community such as a stock forum or a crude-oil forum.
S110: extract user emotion keywords from the information to be determined.
The user emotion keywords may be, for example, "rising", "like", "bullish", "good", "institutions are buying", and so on.
In practice, the user emotion keywords can be extracted from the information to be determined by natural-language-processing methods.
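As a rough illustration of step S110: the patent does not name a specific NLP technique, so this minimal sketch stands in with tokenization plus a lookup in a small, hypothetical emotion lexicon (the lexicon contents are assumptions, not part of the patent).

```python
# Hypothetical emotion lexicon; a real system would use a much larger
# vocabulary or a trained sequence model instead of this lookup.
EMOTION_LEXICON = {"rising", "like", "bullish", "good", "buy"}

def extract_emotion_keywords(text: str) -> list[str]:
    """Return the emotion keywords found in the text, in order of appearance."""
    tokens = text.lower().replace(",", " ").replace(".", " ").split()
    return [t for t in tokens if t in EMOTION_LEXICON]

print(extract_emotion_keywords("This stock looks good, institutions buy, price rising"))
# ['good', 'buy', 'rising']
```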
S120: determine user emotion information based on the user emotion keywords, using a machine-learning method.
In an alternative embodiment, this step may specifically include the following steps S121 to S124:
S121: classify the user emotion keywords according to an emotional dimension to obtain classification results.
The emotional dimension may include, for example, positive, negative, and neutral emotion, or just positive and negative emotion, etc.
For example, this step may classify each user emotion keyword as a positive-emotion keyword or a negative-emotion keyword.
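Step S121 can be sketched as follows, assuming a two-valued emotional dimension (positive / negative) and a hypothetical polarity lexicon; the patent says a machine-learning classifier is used, so the lookup here is only a stand-in for that classifier.

```python
# Hypothetical keyword-to-polarity mapping standing in for a trained classifier.
POLARITY = {"rising": "positive", "good": "positive", "bullish": "positive",
            "falling": "negative", "bad": "negative", "bearish": "negative"}

def classify_keywords(keywords: list[str]) -> dict[str, list[str]]:
    """Group keywords into classification results by emotion polarity."""
    results: dict[str, list[str]] = {"positive": [], "negative": []}
    for kw in keywords:
        polarity = POLARITY.get(kw)
        if polarity is not None:
            results[polarity].append(kw)
    return results

print(classify_keywords(["rising", "bad", "good"]))
# {'positive': ['rising', 'good'], 'negative': ['bad']}
```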
S122: obtain the liveness of the user's information publishing and the user's historical emotion information.
Taking users' opinions about an individual stock as an example, this step at least considers the following: if a user's past emotions have tended to agree with the stock's subsequent mood index, the weight given to that user's emotion when computing the mood indexes of other stocks will generally be increased.
Here, the "liveness of the user's information publishing" refers to how actively the user posts about a given stock.
S123: determine weights for the classification results according to the user's publishing liveness and historical emotion information;
S124: determine the user emotion information based on the classification results and their weights.
Specifically, step S124 may include steps Sa1 to Sa3:
Sa1: score the classification results.
For example, if the user emotion keywords are classified into positive-emotion keywords and negative-emotion keywords, the positive-emotion keywords and the negative-emotion keywords are scored separately, giving a score for each emotion element in the emotional dimension.
Sa2: compute the weighted sum of the scores and the weights.
Sa3: take, as the user emotion information, the emotion element in the emotional dimension that corresponds to the maximum weighted sum.
For example, if the emotion element corresponding to the maximum weighted sum is positive emotion, the user emotion information is expressed as positive emotion.
S130: present the user emotion information in a visual manner.
The visual manner includes, but is not limited to, short videos, pictures, animations, and the like.
By presenting the user emotion information visually, embodiments of the present disclosure achieve the technical effect of quantifying a discussion.
In an alternative embodiment, after step S120, the method may also include:
Sb1: assign a predetermined label to the user emotion information;
Sb2: present the user emotion information in the form of that label.
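Steps Sb1 and Sb2 amount to attaching a predetermined label to the determined emotion and rendering it alongside the opinion. The label texts below are hypothetical; the patent only states that the label is predetermined.

```python
# Hypothetical predetermined labels for each emotion element (Sb1).
LABELS = {"positive": "[bullish]", "negative": "[bearish]", "neutral": "[neutral]"}

def present_with_label(opinion: str, emotion: str) -> str:
    """Render the opinion prefixed with its emotion label (Sb2)."""
    return f"{LABELS.get(emotion, '[?]')} {opinion}"

print(present_with_label("This stock will keep rising", "positive"))
# [bullish] This stock will keep rising
```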
Through the above technical solution, this embodiment achieves the technical effect of presenting a user's emotion in the form of a user-opinion label.
In conclusion the embodiment of the present disclosure, which by taking above-mentioned technical proposal, can be presented, accurately reflects the true feelings of user
The information of thread, and information is visually, information can be intuitively presented, be not likely to produce visual fatigue, can distinguish user
Mood, additionally it is possible to avoid intensifying discussion contradiction.
Although the steps of the information presentation method embodiment are described above in the given order, those skilled in the art will appreciate that the steps of embodiments of the present disclosure need not be executed in that order; they may also be executed in reverse order, in parallel, interleaved, or in other orders. Moreover, those skilled in the art may add further steps on top of the above; such obvious variants and equivalent replacements also fall within the scope of protection of the disclosure and are not described further here.
The following are device embodiments of the present disclosure, which can execute the steps implemented by the method embodiments above. For ease of description, only the parts relevant to the embodiments of the present disclosure are shown; for specific technical details not disclosed here, please refer to the method embodiments. Each functional unit in each device embodiment of the disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
To solve the technical problem of how to present information that accurately reflects a user's true emotion, embodiments of the present disclosure also provide an information presentation device. As shown in Fig. 2, the device may mainly include: an obtaining module 21, an extraction module 22, a determining module 23, and a first presentation module 24. The obtaining module 21 obtains the information to be determined. The extraction module 22 extracts user emotion keywords from the information to be determined. The determining module 23 determines user emotion information based on the user emotion keywords, using a machine-learning method. The first presentation module 24 presents the user emotion information in a visual manner.
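The module structure of Fig. 2 can be sketched as a thin wrapper that wires the extraction, determining, and presentation modules together as plain callables. The lambda bodies below are toy stand-ins, not the patent's actual module implementations.

```python
class InformationPresentationDevice:
    """Structural sketch of the device in Fig. 2."""
    def __init__(self, extract, determine, present):
        self.extract = extract      # extraction module 22
        self.determine = determine  # determining module 23
        self.present = present      # first presentation module 24

    def run(self, information: str) -> str:
        keywords = self.extract(information)   # S110
        emotion = self.determine(keywords)     # S120
        return self.present(emotion)           # S130

# Toy module implementations for illustration only.
device = InformationPresentationDevice(
    extract=lambda text: [w for w in text.lower().split() if w in {"rising", "bad"}],
    determine=lambda kws: "positive" if kws.count("rising") >= kws.count("bad") else "negative",
    present=lambda emotion: f"emotion: {emotion}",
)
print(device.run("Price rising fast"))  # emotion: positive
```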
In an alternative embodiment, the determining module 23 is specifically configured to: classify the user emotion keywords according to an emotional dimension to obtain classification results; obtain the liveness of the user's information publishing and the user's historical emotion information; determine weights for the classification results according to that liveness and historical emotion information; and determine the user emotion information based on the classification results and their weights.
In an alternative embodiment, the determining module 23 is also configured to: score the classification results; compute the weighted sum of the scores and the weights; and take, as the user emotion information, the emotion element in the emotional dimension that corresponds to the maximum weighted sum.
In an alternative embodiment, the device may also include an assigning module and a second presentation module. The assigning module assigns a predetermined label to the user emotion information. The second presentation module presents the user emotion information in the form of that label.
Those skilled in the art will readily appreciate that, for convenience and brevity of description, the specific working process of the device described above, the technical problems it solves, and the technical effects it achieves may be found in the corresponding processes, problems, and effects of the preceding method embodiments, and are not described further here.
In short, using the obtaining module 21, the extraction module 22, the determining module 23, and the first presentation module 24, embodiments of the present disclosure can present information that accurately reflects a user's true emotion; the information is visual and can be presented intuitively, is unlikely to cause visual fatigue, distinguishes between user emotions, and helps avoid inflaming conflict in discussions.
In addition, to solve the technical problem of how to present information that accurately reflects a user's true emotion, embodiments of the present disclosure also provide an electronic device comprising a processor and a memory. The memory stores a computer program. The processor, when executing the program stored in the memory, implements the method steps described in any of the foregoing information presentation method embodiments.
The processor may include one or more processing cores, e.g., a 4-core or 8-core processor. The processor may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor may also include a main processor and a coprocessor: the main processor handles data in the awake state and is also called the CPU (Central Processing Unit); the coprocessor is a low-power processor that handles data in the standby state. In some embodiments, the processor may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the screen. In some embodiments, the processor may also include an AI (Artificial Intelligence) processor for handling computation related to machine learning.
The memory may include one or more computer-readable storage media, which may be non-transitory. The memory may also include high-speed random-access memory and non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory stores at least one instruction to be executed by the processor.
In some exemplary embodiments, the electronic device optionally also includes a peripheral interface and at least one peripheral. The processor, the memory, and the peripheral interface may be connected by a bus or signal lines. Each peripheral may be connected to the peripheral interface by a bus, a signal line, or a circuit board.
The above electronic device includes, but is not limited to, a cloud, a server, a server cluster, and the like.
Those skilled in the art will readily appreciate that, for convenience and brevity of description, the specific working process of the electronic device described above, the technical problems it solves, and the technical effects it achieves may be found in the corresponding processes, problems, and effects of the preceding method embodiments, and are not described further here.
The basic principles of the disclosure have been described above in conjunction with specific embodiments. However, it should be noted that the merits, advantages, effects, and the like mentioned in the disclosure are merely examples and not limitations; they must not be taken as prerequisites of every embodiment of the disclosure. Furthermore, the specific details disclosed above serve only as examples for ease of understanding, not as limitations; they do not restrict the disclosure to being implemented with those specific details.
It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", and any other variants are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Absent further limitation, an element introduced by the phrase "including a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes it.
It should be noted that the flowcharts and/or block diagrams referred to herein are not limited to the forms shown here; they may also be divided and/or recombined.
It may also be noted that, in the systems and methods of the disclosure, the components or steps may be decomposed and/or recombined; such decompositions and/or recombinations should be regarded as equivalents of the disclosure. The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts of the embodiments may be referred to one another. Various changes, substitutions, and alterations may be made to the techniques described herein without departing from the techniques taught by the appended claims. Furthermore, the scope of the claims of the disclosure is not limited to the specific aspects of the processes, machines, manufactures, compositions of matter, means, methods, and acts described above; processes, machines, manufactures, compositions of matter, means, methods, or acts, whether currently existing or later developed, that perform substantially the same function or achieve substantially the same result as the corresponding aspects described herein may be used. The appended claims therefore include such processes, machines, manufactures, compositions of matter, means, methods, or acts within their scope. Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and the specific embodiments disclosed herein. The disclosure is intended to cover any variations, uses, or adaptations of the invention that follow its general principles and include common knowledge or customary techniques in the art not disclosed herein. The description and embodiments are to be regarded as illustrative only; the true scope of protection of the invention is indicated by the claims.
The above are only preferred embodiments of the disclosure and are not intended to limit its scope of protection. Any modification, equivalent replacement, improvement, change, addition, or sub-combination made within the spirit and principles of the disclosure is included in its scope of protection.
Claims (9)
1. An information presentation method, characterized by comprising:
obtaining information to be determined;
extracting user emotion keywords from the information to be determined;
determining user emotion information based on the user emotion keywords, using a machine-learning method; and
presenting the user emotion information in a visual manner.
2. The method according to claim 1, characterized in that the step of determining user emotion information based on the user emotion keywords using a machine-learning method specifically comprises:
classifying the user emotion keywords according to an emotional dimension to obtain classification results;
obtaining the liveness of the user's information publishing and the user's historical emotion information;
determining weights for the classification results according to that liveness and historical emotion information; and
determining the user emotion information based on the classification results and their weights.
3. The method according to claim 2, characterized in that the step of determining the user emotion information based on the classification results and their weights specifically comprises:
scoring the classification results;
computing the weighted sum of the scores and the weights; and
taking, as the user emotion information, the emotion element in the emotional dimension that corresponds to the maximum weighted sum.
4. The method according to claim 1, characterized in that, after the step of determining user emotion information based on the user emotion keywords using a machine-learning method, the method further comprises:
assigning a predetermined label to the user emotion information; and
presenting the user emotion information in the form of that label.
5. An information presentation device, characterized by comprising:
an obtaining module for obtaining information to be determined;
an extraction module for extracting user emotion keywords from the information to be determined;
a determining module for determining user emotion information based on the user emotion keywords, using a machine-learning method; and
a first presentation module for presenting the user emotion information in a visual manner.
6. The device according to claim 5, characterized in that the determining module is specifically configured to:
classify the user emotion keywords according to an emotional dimension to obtain classification results;
obtain the liveness of the user's information publishing and the user's historical emotion information;
determine weights for the classification results according to that liveness and historical emotion information; and
determine the user emotion information based on the classification results and their weights.
7. The device according to claim 6, wherein the determination module is further configured to:
Score the classification results;
Calculate a weighted sum of the scores and the weight;
Take the emotion element in the emotional dimension corresponding to the maximum weighted score as the user emotion information.
8. The device according to claim 5, further comprising:
An assignment module, configured to assign a predetermined label to the user emotion information;
A second presentation module, configured to present the user emotion information by means of the label.
9. An electronic device, comprising a processor and a memory, wherein:
The memory is configured to store a computer program;
The processor, when executing the program stored in the memory, implements the method steps of any one of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910556396.0A CN110245236B (en) | 2019-06-25 | 2019-06-25 | Information presentation method and device and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110245236A true CN110245236A (en) | 2019-09-17 |
CN110245236B CN110245236B (en) | 2021-07-20 |
Family
ID=67889560
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910556396.0A Active CN110245236B (en) | 2019-06-25 | 2019-06-25 | Information presentation method and device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110245236B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112667196A (en) * | 2021-01-28 | 2021-04-16 | 百度在线网络技术(北京)有限公司 | Information display method and device, electronic equipment and medium |
2019-06-25 | CN application CN201910556396.0A filed; granted as patent CN110245236B (en) | Active
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104951807A (en) * | 2015-07-10 | 2015-09-30 | 沃民高新科技(北京)股份有限公司 | Stock market emotion determining method and device |
CN104951807B (en) * | 2015-07-10 | 2018-09-25 | 沃民高新科技(北京)股份有限公司 | The determination method and apparatus of stock market's mood |
CN106682929A (en) * | 2015-11-10 | 2017-05-17 | 北京国双科技有限公司 | Information analysis method and device |
US20170169008A1 (en) * | 2015-12-15 | 2017-06-15 | Le Holdings (Beijing) Co., Ltd. | Method and electronic device for sentiment classification |
CN106022676A (en) * | 2016-05-09 | 2016-10-12 | 华南理工大学 | Method and apparatus for rating complaint willingness of logistics client |
CN106296282A (en) * | 2016-08-08 | 2017-01-04 | 南京大学 | A kind of net purchase Product evaluation method marked based on user comment and history |
CN107870896A (en) * | 2016-09-23 | 2018-04-03 | 苏宁云商集团股份有限公司 | A kind of dialog analysis method and device |
CN107066442A (en) * | 2017-02-15 | 2017-08-18 | 阿里巴巴集团控股有限公司 | Detection method, device and the electronic equipment of mood value |
US20190122232A1 (en) * | 2017-10-25 | 2019-04-25 | Mashwork Inc. Dba Canvs | Systems and methods for improving classifier accuracy |
CN108363699A (en) * | 2018-03-21 | 2018-08-03 | 浙江大学城市学院 | A kind of netizen's school work mood analysis method based on Baidu's mhkc |
CN108764010A (en) * | 2018-03-23 | 2018-11-06 | 姜涵予 | Emotional state determines method and device |
CN109040471A (en) * | 2018-10-15 | 2018-12-18 | Oppo广东移动通信有限公司 | Emotive advisory method, apparatus, mobile terminal and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Dwyer et al. | Immersive analytics: An introduction | |
Wu et al. | Understanding freehand gestures: a study of freehand gestural interaction for immersive VR shopping applications | |
Liu | Research on the application of multimedia elements in visual communication art under the Internet background | |
Kucher et al. | Active learning and visual analytics for stance classification with ALVA | |
US20220100807A1 (en) | Systems and methods for categorizing, evaluating, and displaying user input with publishing content | |
Zhou et al. | Pervasive social computing: augmenting five facets of human intelligence | |
TWI828888B (en) | Thread of conversation displaying method, computer readable recording medium and computer device | |
Hannon | Gender and status in voice user interfaces | |
WO2020151548A1 (en) | Method and device for sorting followed pages | |
Lugmayr et al. | Immersive interactive technologies in digital humanities: a review and basic concepts | |
CN110770765B (en) | Transforming a dictation idea into a visual representation | |
Ribes et al. | Trust indicators and explainable AI: A study on user perceptions | |
CN110245236A (en) | Information demonstrating method, device and electronic equipment | |
CN110189742A (en) | Determine emotion audio, affect display, the method for text-to-speech and relevant apparatus | |
Ross | Gender and media: A very short herstory | |
CN110287414A (en) | Information-pushing method, device and electronic equipment | |
Chen et al. | Classifying mood in plurks | |
CN114331198A (en) | Work order distribution method, equipment and storage medium | |
Paz et al. | Application of the communicability evaluation method to evaluate the user interface design: a case study in web domain | |
Lu et al. | AI Assistance for UX: A Literature Review Through Human-Centered AI | |
CN113704608A (en) | Personalized item recommendation method and device, electronic equipment and storage medium | |
Şemsioğlu et al. | Co-exploring the design space of emotional ar visualizations | |
Liu et al. | [Retracted] Visual Communication Design and Wireless Data Transmission Technology for Blockchain Big Data Information Presentation | |
Croker | Formation of the cloud: History, metaphor, and materiality | |
CN110795652A (en) | Promotional resource replacement method, device and computer readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||