CN109697472A - Sub-emotion marking-in method - Google Patents

Sub-emotion marking-in method

Info

Publication number
CN109697472A
CN109697472A
Authority
CN
China
Prior art keywords
emotion
threshold value
sub-emotion
degree
less
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811617912.8A
Other languages
Chinese (zh)
Other versions
CN109697472B (en)
Inventor
徐承迪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Jianxin Intelligent Technology Co ltd
Original Assignee
Hangzhou Rabbit Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Rabbit Network Technology Co Ltd
Priority to CN201811617912.8A
Publication of CN109697472A
Application granted
Publication of CN109697472B
Active legal status
Anticipated expiration legal status

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165: Evaluating the state of mind, e.g. depression, anxiety
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235: Details of waveform analysis
    • A61B5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Signal Processing (AREA)
  • Physiology (AREA)
  • Mathematical Physics (AREA)
  • Fuzzy Systems (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Educational Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention provides a sub-emotion marking-in method for describing a primary emotion vector in terms of sub-emotions, comprising: quantifying the expression degree of each primary emotion, the primary emotion vector taking the value a = {a0, a1, a2, a3, a4, a5, a6, a7}, where the positional values denote, in order, the degree of surprise, the degree of ecstasy, the degree of rage, the degree of grief, the degree of vigilance, the degree of adoration, the degree of loathing, and the degree of terror; and dividing each primary emotion into sub-emotions using a preset three-level classification algorithm. Based on this sub-emotion expression method, the present invention can output a textual emotion judgment result, which makes the user's emotional state easy to understand and provides a good user experience.

Description

Sub-emotion marking-in method
Technical field
The present invention relates to the field of intelligent healthcare, and more particularly to a sub-emotion marking-in method.
Background art
Research reports show that a user's emotional state is strongly related to the user's physical condition. How to monitor a user's emotional state at any time with modern intelligent wearable devices is therefore a problem that urgently needs to be solved.
The causes of emotion are complex and comprise multiple components, so the analysis of emotion has always been a difficult topic in academia, and judging emotion from physiological signals is harder still. How to obtain emotion from physiological signals therefore remains an open problem.
Summary of the invention
To solve the above technical problem, the present invention proposes a sub-emotion marking-in method. The present invention is specifically realized by the following technical solutions:
A sub-emotion marking-in method, the method being used to describe a primary emotion vector in terms of sub-emotions, comprising:
quantifying the expression degree of each primary emotion, the primary emotion vector taking the value a = {a0, a1, a2, a3, a4, a5, a6, a7}, where the positional values denote, in order, the degree of surprise, the degree of ecstasy, the degree of rage, the degree of grief, the degree of vigilance, the degree of adoration, the degree of loathing, and the degree of terror;
dividing each primary emotion into sub-emotions using a preset three-level classification algorithm.
Further, the expression degree of a primary emotion is quantified with a value from 0 to 100, where 100 denotes the strongest expression and 0 the weakest.
Further, a first threshold value and a second threshold value are set for each primary emotion; the first threshold values of different primary emotions may differ, and so may the second threshold values.
Further, dividing each primary emotion into sub-emotions using the preset three-level classification algorithm comprises:
if a0 is less than the first threshold value, the corresponding sub-emotion is distraction; if a0 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is surprise; if a0 is greater than the second threshold value, the corresponding sub-emotion is amazement;
if a1 is less than the first threshold value, the corresponding sub-emotion is calmness; if a1 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is joy; if a1 is greater than the second threshold value, the corresponding sub-emotion is ecstasy;
if a2 is less than the first threshold value, the corresponding sub-emotion is annoyance; if a2 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is anger; if a2 is greater than the second threshold value, the corresponding sub-emotion is rage;
if a3 is less than the first threshold value, the corresponding sub-emotion is pensiveness; if a3 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is sadness; if a3 is greater than the second threshold value, the corresponding sub-emotion is grief;
if a4 is less than the first threshold value, the corresponding sub-emotion is interest; if a4 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is anticipation; if a4 is greater than the second threshold value, the corresponding sub-emotion is vigilance;
if a5 is less than the first threshold value, the corresponding sub-emotion is acceptance; if a5 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is liking; if a5 is greater than the second threshold value, the corresponding sub-emotion is adoration;
if a6 is less than the first threshold value, the corresponding sub-emotion is boredom; if a6 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is disgust; if a6 is greater than the second threshold value, the corresponding sub-emotion is loathing;
if a7 is less than the first threshold value, the corresponding sub-emotion is apprehension; if a7 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is fear; if a7 is greater than the second threshold value, the corresponding sub-emotion is terror.
Further, the method further comprises:
outputting the primary emotion vector in textual form.
Further, the textual result is the union of the sub-emotions corresponding to the nonzero values in the emotion vector.
Further, before quantifying the expression degree of the primary emotions, the method further comprises:
acquiring raw physiological signals;
obtaining the primary emotion vector corresponding to the raw physiological signals.
An embodiment of the present invention provides a sub-emotion marking-in method that can output a textual emotion judgment result based on the sub-emotion expression method, so that the user's emotional state is easy to understand, providing a good user experience.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a coarse emotion marking-in model construction method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of the neural network training method provided by an embodiment of the present invention;
Fig. 3 is a flowchart of a fine emotion marking-in model construction method provided by an embodiment of the present invention;
Fig. 4 is a flowchart of a method for automatically performing coarse emotion acquisition provided by an embodiment of the present invention;
Fig. 5 is a flowchart of an accurate emotion acquisition method provided by an embodiment of the present invention;
Fig. 6 is a flowchart of the method for describing a primary emotion vector in terms of sub-emotions provided by an embodiment of the present invention.
Detailed description of the embodiments
To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", and the like in the specification, claims, and drawings are used to distinguish similar objects and need not describe a particular order or sequence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments described herein can be implemented in orders other than those illustrated or described here. In addition, the terms "comprising" and "having" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device containing a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units not expressly listed or inherent to such a process, method, product, or device.
An embodiment of the present invention discloses a coarse emotion marking-in model construction method. As shown in Fig. 1, the method comprises:
S1. Obtain a raw data set, each element of which comprises a raw physiological signal data packet and a primary emotion vector.
The raw physiological signal data packet is obtained from a wearable acquisition device worn by the subject while the subject is in a certain emotional state. To put the subject into a certain emotional state, emotion induction may be performed on the subject.
Specifically, acquiring the raw physiological signals comprises the following:
(1) Measure the skin conductance signal of the subject.
(2) Attach electrode patches to the left and right forehead and both earlobes of the subject to measure the subject's brain wave signal.
(3) Measure the pulse signal and respiration signal of the subject.
Specifically, in order to establish a more reasonable emotion marking-in model, the inventor drew on a large body of research results on user behavior characteristics and emotion expression, and extracted the primary emotions best able to express human emotion as the basis of the emotion-degree marking-in model. The primary emotions extracted in this embodiment are: surprise, ecstasy, rage, grief, vigilance, adoration, loathing, and terror. Obviously, people do not experience particularly strong emotions most of the time in daily life, and the inventor regards the intensity of an emotion as another key factor of the emotion marking-in model.
Based on the above research results, in this embodiment the positions of the primary emotion vector denote, in order: the degree of surprise, the degree of ecstasy, the degree of rage, the degree of grief, the degree of vigilance, the degree of adoration, the degree of loathing, and the degree of terror.
S2. Extract emotion information from the raw data set and construct a training set whose elements correspond one-to-one with those of the raw data set. The elements of the training set are emotion signal vectors composed of emotion signals, and there is a one-to-one correspondence between the emotion signal vectors and the primary emotion vectors.
Specifically, the emotion signals are obtained by performing data processing on the raw physiological signal data packets. This embodiment discloses the specific data processing method, which comprises the following:
(1) Extract the emotion information in the skin conductance signal, specifically:
sample the skin conductance signal discretely to obtain a discrete sample sequence {i, f(i)};
compute the first characteristic parameter of the skin conductance signal according to a first formula (not reproduced here), in which N is the number of discrete samples and t is a predetermined constant greater than 10 and less than 40;
compute the second characteristic parameter of the skin conductance signal according to a second formula (not reproduced here), in which p is a preset constant with a value between 0 and 1, N is the number of discrete samples, and Λ(i-1, p, N-1) = 2F1(0, -i+1; -N+1; 1/p), where 2F1(·) denotes the Gauss hypergeometric series.
(2) Extract the emotion information in the brain wave signal, specifically: count the occurrence frequencies of the α wave, β wave, and θ wave.
(3) Compute the statistics of the pulse signal and the respiration signal.
Accordingly, the positions of the emotion signal vector constituted from the emotion signals of each raw data set element denote, in order: the first characteristic parameter, the second characteristic parameter, the α wave occurrence frequency, the β wave occurrence frequency, the θ wave occurrence frequency, the average pulse rate, the average respiration intensity, and the average respiration rate. An illustrative sketch of this feature assembly follows.
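For illustration only, the following Python sketch assembles the eight-dimensional emotion signal vector described above. Because the two skin conductance formulas appear only as images in the original publication, the corresponding functions use clearly labeled stand-in computations; only the reproduced Λ term and the vector layout follow the text, and all helper names are invented for this sketch.

    import numpy as np
    from scipy.special import hyp2f1

    def gauss_lambda(i, p, N):
        # The one term reproduced in the text:
        # Lambda(i-1, p, N-1) = 2F1(0, -i+1; -N+1; 1/p).
        # With first argument 0 the Gauss series terminates immediately.
        return hyp2f1(0.0, -(i - 1), -(N - 1), 1.0 / p)

    def first_characteristic_parameter(f, t=20):
        # Stand-in only: the patented first formula is not reproduced in the
        # source. It depends on the samples f(i), N = len(f), and a constant
        # 10 < t < 40; a t-sample mean is used here just so the sketch runs.
        return float(np.mean(f[: min(t, len(f))]))

    def second_characteristic_parameter(f, p=0.5):
        # Stand-in only: the patented second formula is not reproduced. It
        # uses a preset constant 0 < p < 1 and the Lambda term above.
        N = len(f)
        w = np.array([gauss_lambda(i, p, N) for i in range(1, N + 1)])
        return float(np.dot(w, f) / N)

    def emotion_signal_vector(skin, band_freqs, pulse_rate_avg,
                              resp_intensity_avg, resp_rate_avg):
        # Layout per the embodiment: first and second characteristic
        # parameters, alpha/beta/theta wave occurrence frequencies, average
        # pulse rate, average respiration intensity, average respiration rate.
        alpha, beta, theta = band_freqs
        return np.array([
            first_characteristic_parameter(skin),
            second_characteristic_parameter(skin),
            alpha, beta, theta,
            pulse_rate_avg, resp_intensity_avg, resp_rate_avg,
        ])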
S3. Train a preset neural network model according to the one-to-one correspondence between the emotion signal vectors and the primary emotion vectors.
Specifically, the neural network model takes an emotion signal vector as input and the group number corresponding to that emotion signal vector as output.
Specifically, the neural network model comprises an input layer, neuron layers, and an output layer. The neuron layers comprise multiple sublayers; the neurons of each sublayer are numbered, and every sublayer contains the same number of neurons. Within each sublayer, adjacent neurons are interconnected for communication, and each neuron is connected for communication to the identically numbered neuron of the next sublayer.
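The connection pattern just described can be made concrete with a small sketch. The following Python fragment is an illustration under the stated reading of the text, not the patented implementation: it builds boolean masks for the two kinds of connections, lateral links between adjacent neurons inside one sublayer and forward links from each neuron to the identically numbered neuron of the next sublayer.

    import numpy as np

    def lateral_mask(n):
        # Within a sublayer of n numbered neurons, neuron k communicates
        # with its neighbours k-1 and k+1.
        m = np.zeros((n, n), dtype=bool)
        idx = np.arange(n - 1)
        m[idx, idx + 1] = True
        m[idx + 1, idx] = True
        return m

    def forward_mask(n):
        # Between consecutive sublayers, neuron k connects only to neuron k.
        return np.eye(n, dtype=bool)

    # Example with 8 neurons per sublayer, matching the eight-dimensional
    # input: 14 lateral links and 8 forward links.
    print(lateral_mask(8).sum(), forward_mask(8).sum())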
An embodiment of the present invention discloses the training method of the neural network. As shown in Fig. 2, the method comprises:
S31. Group the emotion signal vectors according to their primary emotion vectors to obtain a grouped data set.
Primary emotion vectors whose maximum value occurs at the same element position are placed in one group, and, correspondingly, their emotion signal vectors are placed in one group as well.
Eight groups of training data are therefore obtained in this embodiment: a surprise data group, an ecstasy data group, a rage data group, a grief data group, a vigilance data group, an adoration data group, a loathing data group, and a terror data group; correspondingly, the group numbers of these eight groups are 0-7.
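Step S31 can be sketched directly in Python (assuming the vectors are numpy arrays; the function name is invented for this illustration):

    import numpy as np

    def group_by_dominant_emotion(signal_vectors, emotion_vectors):
        # The group number 0-7 is the position of the largest component of
        # the primary emotion vector; each emotion signal vector inherits
        # the group of its paired primary emotion vector.
        groups = {k: [] for k in range(8)}
        for s, e in zip(signal_vectors, emotion_vectors):
            groups[int(np.argmax(e))].append(s)
        return groups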
S32. Train the neural network model on the grouped data set, adjusting the weight of each neuron node to obtain a target neural network. In the target neural network, the output obtained with any emotion signal vector of the grouped data set as input is consistent with that vector's grouping result.
In order to obtain an emotion marking-in result more accurately, an embodiment of the present invention further provides a fine emotion marking-in model construction method, which acts on each group of the grouped data set. As shown in Fig. 3, the method comprises:
S100. Compute the consistency of every emotion signal vector in the group, and choose the emotion signal vector with the minimum consistency as the target emotion signal vector.
In this embodiment, the consistency of an emotion signal vector ai is given by a consistency formula (not reproduced here), in which n is the number of emotion signal vectors in the group.
S200. Compute the target neighborhood set corresponding to the target emotion signal vector.
Specifically, the elements of the target neighborhood set are target neighborhoods. In this embodiment, a target neighborhood is the set of all emotion signal vectors in the annular region, centered on the target emotion signal vector, between radius (M-1)·R and radius M·R, where M is the characteristic value of the target neighborhood. In this embodiment the radius R is related to the consistency of the emotion signal vector; specifically, R is given by a radius formula (not reproduced here), in which q is an adjustable coefficient and O(xi) is the consistency of emotion signal vector ai. The characteristic values of the target neighborhoods are respectively 1, 2, ....
S300. Select the K target neighborhoods containing the most emotion signal vectors.
S400. Obtain the corresponding target emotion signal vector in each selected target neighborhood as a sub-classification center.
S500. Obtain the primary emotion vector corresponding to each sub-classification center. A sketch of steps S100-S500 follows.
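Since the consistency and radius formulas are not reproduced in the source, the following Python sketch substitutes stand-ins: consistency as the mean Euclidean distance to the other vectors in the group, and R as q times the target's consistency. Only the annulus construction, the top-K selection, and the choice of one representative per annulus follow the text, so every formula here should be read as an assumption. The group is assumed to be a 2-D numpy array.

    import numpy as np

    def consistency(vecs):
        # Stand-in for O(x_i): mean Euclidean distance to the other members.
        d = np.linalg.norm(vecs[:, None, :] - vecs[None, :, :], axis=-1)
        return d.sum(axis=1) / (len(vecs) - 1)

    def sub_classification_centers(vecs, q=1.0, k=3, n_rings=8):
        o = consistency(vecs)
        target = vecs[np.argmin(o)]        # S100: minimum-consistency vector
        r = q * float(o.min())             # stand-in for the radius formula
        dist = np.linalg.norm(vecs - target, axis=1)
        # S200: annular neighborhoods between (M-1)*R and M*R, M = 1..n_rings.
        rings = [vecs[((m - 1) * r <= dist) & (dist < m * r)]
                 for m in range(1, n_rings + 1)]
        # S300: keep the K annuli containing the most vectors.
        best = sorted(range(n_rings), key=lambda m: len(rings[m]),
                      reverse=True)[:k]
        # S400: one representative per selected annulus (closest to its mean).
        centers = []
        for m in best:
            ring = rings[m]
            if len(ring):
                centers.append(ring[np.argmin(
                    np.linalg.norm(ring - ring.mean(axis=0), axis=1))])
        return centers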
Based on the training result of the above neural network model, an embodiment of the present invention further provides a method for automatically performing coarse emotion acquisition. As shown in Fig. 4, the method comprises:
S101. Obtain the collected raw physiological signals.
S102. Perform data processing on the raw physiological signals to obtain the corresponding emotion signal vector.
S103. Obtain the group number corresponding to the emotion signal vector according to the neural network model, thereby obtaining the target group.
S104. Output the primary emotion vector corresponding to the target emotion signal vector in the target group.
Further, based on the result of the fine emotion marking-in model construction method, an embodiment of the present invention further provides an accurate emotion acquisition method, which is carried out on the basis of the output of step S104. As shown in Fig. 5, the method comprises:
S201. Compute the distance between each sub-classification center in the target group and the emotion signal vector.
S202. Output the primary emotion vector corresponding to the nearest sub-classification center.
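Steps S201-S202 amount to a nearest-center lookup. A minimal sketch, assuming the sub-classification centers and their primary emotion vectors are kept in parallel lists:

    import numpy as np

    def refine_emotion(signal_vector, centers, center_emotion_vectors):
        # S201: distance from the incoming emotion signal vector to every
        # sub-classification center in the target group.
        d = [np.linalg.norm(signal_vector - c) for c in centers]
        # S202: output the primary emotion vector of the nearest center.
        return center_emotion_vectors[int(np.argmin(d))]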
Further, in order to obtain a still finer division, an embodiment of the present invention further discloses a sub-emotion marking-in method. Through extensive study of users' emotion expression and behavioral characteristics, the inventor found that different expression degrees of a primary emotion produce different sub-emotions, and a sub-emotion marking-in method was therefore obtained by means of quantitative classification. The method is used to describe a primary emotion vector in terms of sub-emotions and, as shown in Fig. 6, comprises:
S10. Quantify the expression degree of each primary emotion.
Specifically, in this embodiment the expression degree of a primary emotion is quantified with a value from 0 to 100, where 100 denotes the strongest expression and 0 the weakest.
S20. Divide each primary emotion into sub-emotions using three-level classification.
Specifically, a first threshold value and a second threshold value are set for each primary emotion; the first threshold values of different primary emotions may differ, and so may the second threshold values. In this embodiment the primary emotion vector takes the value a = {a0, a1, a2, a3, a4, a5, a6, a7}, and the sub-emotions are divided as follows:
(1) If a0 is less than the first threshold value, the corresponding sub-emotion is distraction; if a0 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is surprise; if a0 is greater than the second threshold value, the corresponding sub-emotion is amazement.
(2) If a1 is less than the first threshold value, the corresponding sub-emotion is calmness; if a1 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is joy; if a1 is greater than the second threshold value, the corresponding sub-emotion is ecstasy.
(3) If a2 is less than the first threshold value, the corresponding sub-emotion is annoyance; if a2 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is anger; if a2 is greater than the second threshold value, the corresponding sub-emotion is rage.
(4) If a3 is less than the first threshold value, the corresponding sub-emotion is pensiveness; if a3 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is sadness; if a3 is greater than the second threshold value, the corresponding sub-emotion is grief.
(5) If a4 is less than the first threshold value, the corresponding sub-emotion is interest; if a4 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is anticipation; if a4 is greater than the second threshold value, the corresponding sub-emotion is vigilance.
(6) If a5 is less than the first threshold value, the corresponding sub-emotion is acceptance; if a5 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is liking; if a5 is greater than the second threshold value, the corresponding sub-emotion is adoration.
(7) If a6 is less than the first threshold value, the corresponding sub-emotion is boredom; if a6 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is disgust; if a6 is greater than the second threshold value, the corresponding sub-emotion is loathing.
(8) If a7 is less than the first threshold value, the corresponding sub-emotion is apprehension; if a7 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is fear; if a7 is greater than the second threshold value, the corresponding sub-emotion is terror.
The above sub-emotion marking-in method can be used to process the output results of step S104 and step S202.
Based on the above sub-emotion marking-in method, the primary emotion vector can be output in textual form, so that a textual result can be produced during emotion acquisition, which is clearly more conducive to the user's understanding. The textual result is the union of the sub-emotions corresponding to the nonzero values in the emotion vector. For example, if the primary emotion vector is {0, 0, 0, 0, 0, 0, 0, 98}, then terror is output; if the primary emotion vector is {56, 90, 0, 0, 0, 0, 0, 0}, then surprise and ecstasy are output; if the primary emotion vector is {0, 0, 78, 0, 0, 0, 80, 60}, then anger, loathing, and fear are output.
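The worked examples above can be reproduced with a short Python sketch. The threshold values below are invented for illustration (the embodiment leaves them per-emotion and otherwise unspecified); the sub-emotion names follow the three-level table of this embodiment.

    SUB_EMOTIONS = [
        ("distraction", "surprise", "amazement"),    # a0
        ("calmness", "joy", "ecstasy"),              # a1
        ("annoyance", "anger", "rage"),              # a2
        ("pensiveness", "sadness", "grief"),         # a3
        ("interest", "anticipation", "vigilance"),   # a4
        ("acceptance", "liking", "adoration"),       # a5
        ("boredom", "disgust", "loathing"),          # a6
        ("apprehension", "fear", "terror"),          # a7
    ]

    # Assumed per-emotion thresholds, chosen so the outputs match the
    # worked examples above; the patent does not specify actual values.
    FIRST = (30,) * 8
    SECOND = (70, 70, 80, 70, 70, 70, 70, 70)

    def textualize(a):
        # Union of the sub-emotions of all nonzero components, each
        # classified by the three-level rule against its own thresholds.
        words = []
        for k, v in enumerate(a):
            if v == 0:
                continue
            level = 0 if v < FIRST[k] else (1 if v < SECOND[k] else 2)
            words.append(SUB_EMOTIONS[k][level])
        return ", ".join(words)

    print(textualize([0, 0, 0, 0, 0, 0, 0, 98]))    # terror
    print(textualize([56, 90, 0, 0, 0, 0, 0, 0]))   # surprise, ecstasy
    print(textualize([0, 0, 78, 0, 0, 0, 80, 60]))  # anger, loathing, fear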
Embodiments of the present invention provide construction methods for a coarse emotion marking-in model and a fine emotion marking-in model, as well as methods for automatically performing coarse and fine emotion acquisition, which can judge a user's emotional state automatically from the user's physiological signals, with the remarkable advantages of multiple granularities, automation, and high accuracy. A sub-emotion expression method is also designed, on the basis of which a textual emotion judgment result can be output, making the user's emotional state easy to understand and providing a good user experience.
It should be understood that "multiple" herein means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B can mean: A alone, both A and B, or B alone. The character "/" generally indicates an "or" relationship between the objects before and after it.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above embodiments can be implemented by hardware, or by a program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and the storage medium mentioned above can be a read-only memory, a magnetic disk, an optical disc, or the like.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (7)

1. A sub-emotion marking-in method, the method being used to describe a primary emotion vector in terms of sub-emotions, characterized by comprising:
quantifying the expression degree of each primary emotion, the primary emotion vector taking the value a = {a0, a1, a2, a3, a4, a5, a6, a7}, where the positional values denote, in order, the degree of surprise, the degree of ecstasy, the degree of rage, the degree of grief, the degree of vigilance, the degree of adoration, the degree of loathing, and the degree of terror;
dividing each primary emotion into sub-emotions using a preset three-level classification algorithm.
2. The method according to claim 1, characterized in that:
the expression degree of a primary emotion is quantified with a value from 0 to 100, where 100 denotes the strongest expression and 0 the weakest.
3. The method according to claim 1, characterized in that:
a first threshold value and a second threshold value are set for each primary emotion; the first threshold values of different primary emotions may differ, and so may the second threshold values.
4. The method according to claim 1, characterized in that:
said dividing each primary emotion into sub-emotions using the preset three-level classification algorithm comprises:
if a0 is less than the first threshold value, the corresponding sub-emotion is distraction; if a0 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is surprise; if a0 is greater than the second threshold value, the corresponding sub-emotion is amazement;
if a1 is less than the first threshold value, the corresponding sub-emotion is calmness; if a1 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is joy; if a1 is greater than the second threshold value, the corresponding sub-emotion is ecstasy;
if a2 is less than the first threshold value, the corresponding sub-emotion is annoyance; if a2 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is anger; if a2 is greater than the second threshold value, the corresponding sub-emotion is rage;
if a3 is less than the first threshold value, the corresponding sub-emotion is pensiveness; if a3 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is sadness; if a3 is greater than the second threshold value, the corresponding sub-emotion is grief;
if a4 is less than the first threshold value, the corresponding sub-emotion is interest; if a4 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is anticipation; if a4 is greater than the second threshold value, the corresponding sub-emotion is vigilance;
if a5 is less than the first threshold value, the corresponding sub-emotion is acceptance; if a5 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is liking; if a5 is greater than the second threshold value, the corresponding sub-emotion is adoration;
if a6 is less than the first threshold value, the corresponding sub-emotion is boredom; if a6 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is disgust; if a6 is greater than the second threshold value, the corresponding sub-emotion is loathing;
if a7 is less than the first threshold value, the corresponding sub-emotion is apprehension; if a7 is greater than the first threshold value and less than the second threshold value, the corresponding sub-emotion is fear; if a7 is greater than the second threshold value, the corresponding sub-emotion is terror.
5. The method according to claim 1, characterized by further comprising:
outputting the primary emotion vector in textual form.
6. The method according to claim 5, characterized in that:
the textual result is the union of the sub-emotions corresponding to the nonzero values in the emotion vector.
7. The method according to claim 1, characterized in that, before said quantifying the expression degree of each primary emotion, the method further comprises:
acquiring raw physiological signals;
obtaining the primary emotion vector corresponding to the raw physiological signals.
CN201811617912.8A 2018-12-28 2018-12-28 Sub-emotion marking-in method Active CN109697472B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811617912.8A CN109697472B (en) 2018-12-28 2018-12-28 Sub-emotion marking-in method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811617912.8A CN109697472B (en) 2018-12-28 2018-12-28 Sub-emotion marking-in method

Publications (2)

Publication Number Publication Date
CN109697472A (en) 2019-04-30
CN109697472B CN109697472B (en) 2021-05-04

Family

ID=66232271

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811617912.8A Active CN109697472B (en) 2018-12-28 2018-12-28 Sub-emotion marking-in method

Country Status (1)

Country Link
CN (1) CN109697472B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021012684A1 (en) * 2019-07-23 2021-01-28 中译语通科技股份有限公司 Method and system for establishing market sentiment monitoring system

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030069728A1 (en) * 2001-10-05 2003-04-10 Raquel Tato Method for detecting emotions involving subspace specialists
CN1692341A (en) * 2002-12-11 2005-11-02 索尼株式会社 Information processing device and method, program, and recording medium
CN104951434A (en) * 2015-07-10 2015-09-30 沃民高新科技(北京)股份有限公司 Brand emotion determining method and device
CN104951807A (en) * 2015-07-10 2015-09-30 沃民高新科技(北京)股份有限公司 Stock market emotion determining method and device
CN105095415A (en) * 2015-07-10 2015-11-25 沃民高新科技(北京)股份有限公司 Method and apparatus for confirming network emotion
CN106236116A (en) * 2016-08-29 2016-12-21 无锡卓信信息科技股份有限公司 A kind of inmate's emotion monitoring method and system
CN106919673A (en) * 2017-02-21 2017-07-04 浙江工商大学 Text mood analysis system based on deep learning
CN106955112A (en) * 2017-03-17 2017-07-18 泉州装备制造研究所 Brain wave Emotion recognition method based on Quantum wavelet neural networks model
CN107220591A (en) * 2017-04-28 2017-09-29 哈尔滨工业大学深圳研究生院 Multi-modal intelligent mood sensing system
CN108095740A (en) * 2017-12-20 2018-06-01 姜涵予 A kind of user emotion appraisal procedure and device


Also Published As

Publication number Publication date
CN109697472B (en) 2021-05-04

Similar Documents

Publication Publication Date Title
George et al. Recognition of emotional states using EEG signals based on time-frequency analysis and SVM classifier.
Pradhan et al. Detection of seizure activity in EEG by an artificial neural network: A preliminary study
CN110222643A (en) A kind of Steady State Visual Evoked Potential Modulation recognition method based on convolutional neural networks
CN106919956A (en) Brain wave age forecasting system based on random forest
Esfahani et al. Using brain–computer interfaces to detect human satisfaction in human–robot interaction
CN106503799A (en) Deep learning model and the application in brain status monitoring based on multiple dimensioned network
US20210109140A1 (en) Method for identifying parameters of 10 kv static load model based on similar daily load curves
CN106175757B (en) Behaviour decision making forecasting system based on brain wave
CN112043473B (en) Parallel nested and autonomous preferred classifier for brain-myoelectricity fusion perception of intelligent artificial limb
Palaniappan et al. Individual identification technique using visual evoked potential signals
Altan et al. Deep belief networks based brain activity classification using EEG from slow cortical potentials in stroke
CN106510702B (en) The extraction of sense of hearing attention characteristics, identifying system and method based on Middle latency auditory evoked potential
Mustafa et al. The analysis of eeg spectrogram image for brainwave balancing application using ann
CN107292296A (en) A kind of human emotion wake-up degree classifying identification method of use EEG signals
CN109330613A (en) Human body Emotion identification method based on real-time brain electricity
CN107361767A (en) A kind of human emotion's potency classifying identification method using EEG signals
CN109359610A (en) Construct method and system, the data characteristics classification method of CNN-GB model
CN112806994A (en) System and method for predicting individual stress coping mode based on physiological signal
CN114145745B (en) Graph-based multitasking self-supervision emotion recognition method
CN109697472A (en) One seed mood incorporates method into
CN113397567B (en) Human behavior electroencephalogram signal classification method and system
CN110403602A (en) Improvement public space pattern feature extracting method for EEG signals sentiment analysis
CN109685149A (en) Mood finely incorporates model construction and the automatic method for carrying out mood and finely obtaining into
CN110458066A (en) A kind of age bracket classification method based on tranquillization state eeg data
CN115281692A (en) Closed-loop self-adaptive transcranial electrical stimulation device and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210419

Address after: 225300 No.8 Xingguo Road, high tech Zone, Gaogang District, Taizhou City, Jiangsu Province

Applicant after: Taizhou JINDA Electronic Technology Co.,Ltd.

Address before: 476, 4 floor, building A, 3 building, No. 301 foreshore Road, Changhe Road, Binjiang District, Zhejiang, Hangzhou, China 310052

Applicant before: HANGZHOU YITU NETWORK TECHNOLOGY Co.,Ltd.

GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231016

Address after: Room 311, Accelerator Production Workshop, Building B-2, Haichuang Technology Industrial Park, No. 627 Lugu Avenue, High tech Development Zone, Changsha City, Hunan Province, 410000

Patentee after: HUNAN JIANXIN INTELLIGENT TECHNOLOGY Co.,Ltd.

Address before: 225300 No.8 Xingguo Road, high tech Zone, Gaogang District, Taizhou City, Jiangsu Province

Patentee before: Taizhou JINDA Electronic Technology Co.,Ltd.