CN112370058A - Method for identifying and monitoring emotion of user based on mobile terminal - Google Patents
- Publication number
- CN112370058A CN112370058A CN202011256341.7A CN202011256341A CN112370058A CN 112370058 A CN112370058 A CN 112370058A CN 202011256341 A CN202011256341 A CN 202011256341A CN 112370058 A CN112370058 A CN 112370058A
- Authority
- CN
- China
- Prior art keywords
- data
- emotion
- user
- heart rate
- mobile terminal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
Abstract
The invention provides a method for identifying and monitoring the emotion of a user based on a mobile terminal. Unlike traditional prediction based solely on expression, behavior, or language, it collects diverse data generated by the user, such as communication records, sleep conditions, and APP usage, through the mobile phone and bracelet the user carries daily, and constructs a data set that broadly covers the user's life in order to perceive the user's emotion accurately. The method simplifies the collection process and improves user friendliness; it characterizes the user's emotional features from multiple dimensions, improving the accuracy of the recognition results; and in the data-processing stage it extracts key features for modeling, reducing the complexity of the subsequent machine-learning algorithm and shortening its running time.
Description
Technical Field
The invention relates to the field of data processing, in particular to a method for identifying and monitoring emotion of a user based on a mobile terminal.
Background
Currently, user emotion recognition is mainly divided into two categories: recognition based on traditional channels (face-based, text-based, voice-based, and physiological-feature-based user emotion recognition) and recognition based on new channels (gesture- and behavior-based user emotion recognition).
Traditional emotion recognition methods are well suited to capturing a user's instantaneous emotion in scientific research, but in real life a user's emotion may change from moment to moment, and users may prefer an average emotion over a period of time to an analysis of instantaneous emotion. Traditional methods also describe the user's emotion from a single angle only, which reduces recognition accuracy. Moreover, they sometimes require complex equipment to be worn for data collection; for example, emotion analysis based on brain waves requires the user to wear a cumbersome electroencephalogram acquisition device. Finally, in the subsequent network training, traditional methods train on open-source data sets, whose data may suffer from low naturalness and poor reliability, which directly affects recognition accuracy.
Disclosure of Invention
To address these defects, the invention provides a method for identifying and monitoring the emotion of a user based on a mobile terminal. Diverse daily data generated by mobile-terminal devices such as a smartphone and a smart bracelet are collected, characteristic values are extracted, and a model is created, so that the user's emotion can be described as accurately as possible from multiple aspects and angles without disturbing the user's daily life and work, while the analysis results are made available to other programs that need them.
The technical scheme of the invention is as follows: a method for identifying and monitoring emotion of a user based on a mobile terminal comprises the following steps:
step 1: collecting data of a user to generate a data set; the data comprise heart rate data and emotion data; the data set comprises a heart rate data set and an emotion data set;
step 2: the data preliminary processing comprises supplementing the condition that a part of data in a data set has characteristic value loss, integrating the data and extracting corresponding characteristic values;
and step 3: and performing emotion recognition, and selecting a model for recognition.
Further, the method for identifying and monitoring the emotion of the user based on the mobile terminal comprises the steps that heart rate data are divided into daytime heart rate data and nighttime heart rate data, and the daytime heart rate data are obtained through a heart rate sensor; the night heart rate data is obtained through a three-axis acceleration sensor and a heart rate sensor.
Further, the emotion data comprises an active class, a general class and a passive class.
Further, in the method for identifying and monitoring the emotion of the user based on the mobile terminal, a missing characteristic value is supplemented by an automatic strategy using the average of the other known values.
Furthermore, in the method for identifying and monitoring the emotion of the user based on the mobile terminal, the corresponding characteristic values are extracted as follows: after the data collected by the APP are classified and observed through graphic visualization, part of the data are integrated and the corresponding characteristic values are extracted.
Further, a method for identifying and monitoring the emotion of a user based on a mobile terminal is provided, wherein the emotion characteristics relate to APP indexes, physiological indexes, sleep indexes and communication indexes;
the emotional characteristics are described as follows:
whereinIs an APP index,Is used as a physiological index,Is used as an index of sleep,Is the communication index.
Further, in the mobile-terminal-based emotion recognition and monitoring method, the models comprise an SVM (support vector machine) model and a neural network model.
According to the invention, the user's daily data are collected through the smartphone and the smart bracelet, and characteristic data are selected for network training, so as to obtain the mapping between the user's externally expressed data and internal emotion and thereby recognize that internal emotion. Collecting daily data through mobile devices overcomes the low naturalness and poor reliability of the data used in traditional emotion recognition. First, the high penetration of smartphones in China guarantees the quantity of collectable data. Second, different people use their phones differently, which guarantees the specificity of the data and the reliability of the experimental results. Finally, the phone is portable and easy to carry: unlike traditional emotion recognition, this smartphone-based approach can conveniently acquire the data a user generates in daily life without requiring the user to wear any additional data-collection device.
The invention thus provides a method for identifying and monitoring the emotion of a user based on a mobile terminal. Unlike traditional methods that predict the user's emotion simply from expression, behavior, and language, it collects diverse data generated by the user, such as communication records, sleep conditions, and APP usage, through the mobile phone and bracelet the user carries daily, and constructs a data set that broadly covers the user's life, thereby perceiving the user's emotion accurately. The method simplifies the collection process and improves user friendliness; it characterizes the user's emotional features from multiple dimensions, improving the accuracy of the recognition results; and in the data-processing stage it extracts key features for modeling, reducing the complexity of the subsequent machine-learning algorithm and shortening its running time.
Drawings
FIG. 1 is a flow chart of the mobile-terminal-based user emotion recognition and monitoring method.
Detailed Description
The technical solution of the present invention is further described below with reference to the accompanying drawings: as shown in fig. 1, a method for identifying and monitoring emotion of a user based on a mobile terminal includes the following steps.
Step 1: collecting data of a user to generate a data set; the data comprise heart rate data and emotion data; the data sets include a heart rate data set and an emotion data set.
In order to obtain a more accurate result, the heart rate data are divided into daytime heart rate data and night-time heart rate data; the daytime heart rate data are obtained through a heart rate sensor, and the night-time heart rate data are obtained through a three-axis acceleration sensor together with a heart rate sensor. The emotion data comprise an active (positive) class, a general class, and a passive (negative) class.
Specifically, a mobile phone APP is designed to collect the data the user generates every day. Through the UsageStats API, the APP records the usage durations of α application programs, yielding a usage-duration set. The pedometer function is implemented through Android's STEP_DETECTOR and STEP_COUNTER sensors, which record the user's daily step count s and movement distance d. The user's total communication duration ct and communication count cf are collected through the PhoneStateListener.
Second, a bracelet APP is designed. The heart rate sensor carried on the bracelet records the daytime heart rate data; with time nodes set, the bracelet's three-axis acceleration sensor and heart rate sensor record the night-time heart rate data and three-axis acceleration data. The user's emotion data are collected every day and divided into three classes, 1 → positive, 0 → general, and -1 → negative, yielding the user's daily emotion data set. From the data generated by the mobile terminal every day and the user's daily emotional changes, the overall data set can be obtained.
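The daily record assembled above can be sketched as a simple structure (a hypothetical Python sketch; the field names are illustrative assumptions, not taken from the patent text):

```python
# Hypothetical sketch of one day's collected record, as described above.
# Field names are illustrative assumptions, not taken from the patent text.

def make_daily_record(app_usage, steps, distance_m, call_seconds, call_count,
                      day_hr, night_hr, night_accel, emotion):
    """Bundle one day's mobile-terminal data into a single record.

    emotion is labeled 1 (positive), 0 (general) or -1 (negative).
    """
    assert emotion in (-1, 0, 1), "emotion must be -1, 0 or 1"
    return {
        "app_usage": dict(app_usage),      # seconds of use per app (UsageStats)
        "steps": steps,                    # daily step count s
        "distance_m": distance_m,          # daily movement distance d
        "call_seconds": call_seconds,      # total communication duration ct
        "call_count": call_count,          # number of calls cf
        "day_hr": list(day_hr),            # hourly daytime heart-rate samples
        "night_hr": list(night_hr),        # night heart-rate samples (every 30 min)
        "night_accel": list(night_accel),  # night tri-axis acceleration samples
        "emotion": emotion,                # 1 -> positive, 0 -> general, -1 -> negative
    }

record = make_daily_record({"example_app": 3600}, 8000, 5200, 300, 4,
                           [72, 80, 95], [60, 58, 62], [(0.1, 0.0, 9.8)], 1)
```

A list of such records, one per day, corresponds to the data set described above.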
step 2: preliminary data processing, which comprises supplementing missing characteristic values for the part of the data set where values are lost, integrating the data, and extracting the corresponding characteristic values. A missing characteristic value is supplemented by an automatic strategy using the average of the other known values; the corresponding characteristic values are extracted by integrating part of the data after the APP-collected data have been classified and observed through graphic visualization. The emotional characteristics involve the APP index, the physiological index, the sleep index, and the communication index.
The specific method is as follows. The collected data are first curated, in two main parts: first, for the part of the data set with missing characteristic values, an automatic strategy supplements each missing value with the average of the other known values; second, the data are integrated: after the APP-collected data are classified and observed through graphic visualization, part of the data are integrated and the corresponding characteristic values are extracted.
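The averaging strategy for missing values can be sketched as follows (a minimal Python illustration of mean imputation; the patent does not specify the procedure beyond using the average of the known values):

```python
# Minimal sketch of the averaging strategy described above: a missing feature
# value (represented as None) is filled with the mean of the known values in
# the same feature column.
def impute_mean(column):
    known = [v for v in column if v is not None]
    mean = sum(known) / len(known)
    return [mean if v is None else v for v in column]

# One feature column with a missing entry; the known values average to 80.0.
filled = impute_mean([70.0, None, 80.0, 90.0])
```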
APP index
The APP index covers the APP category, the APP usage duration, and the APP usage count. Considering the kinds of APP on the market and their influence on the user's emotion, APPs are classified into three main categories: work, entertainment, and social. According to the APP category, the user's total work duration, total entertainment duration, and total social duration for the day can be obtained.
Considering that the usage durations of different APP categories influence the user's emotion to different degrees, the per-category total durations are weighted to obtain the APP index.
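The weighted combination can be sketched as below. The patent's actual weight values are not preserved in this text, so the weights here are illustrative assumptions only:

```python
# Sketch of the weighted APP index. The original weight values are not
# preserved in this text; the values below are illustrative assumptions.
W_WORK, W_FUN, W_SOCIAL = 0.5, 0.3, 0.2  # assumed per-category weights

def app_index(t_work, t_fun, t_social):
    """Weighted combination of the per-category daily usage durations (hours)."""
    return W_WORK * t_work + W_FUN * t_fun + W_SOCIAL * t_social

idx = app_index(8.0, 2.0, 1.0)
```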
physiological index
First, the user's daily step count s and movement distance d are obtained through the mobile phone APP. Second, through the heart rate sensor carried on the bracelet, with time nodes set, heart rate data and three-axis acceleration are collected once every hour, yielding the set of hourly daytime heart-rate maxima.
and (3) obtaining the variance of the daytime heart rate data to obtain a daytime heart rate abnormal value:
further, the physiological indexes can be obtained:
sleep index
Time nodes are set, and heart rate data and three-axis acceleration are collected every 30 min at night, yielding the set of night-time heart-rate maxima.
collect the bracelet triaxial acceleration's that corresponds in the same time interval night data simultaneously, obtain the data set:the parameters are obtained by processing the obtained triaxial accelerated data set as followsThe value: obtaining data sets corresponding to each time interval at nightFirstly, judging whether a user is in a sleep state or a waking state, taking a triaxial acceleration sensor as a leading part, assisting the heart rate, and when the acceleration calculation weight is smaller than the heart rate, namelyWhen the user is in sleep state, the time is synchronizedThe sum of the time intervals can obtain the total sleeping time of the userThe sleep state can be further classified, the heart rate is used as a leading factor, the acceleration sensor is used for assisting, and the sleep time is divided into deep sleep time and light sleep time:
further obtaining a sleep index:
communication index
The communication data comprise the total communication duration ct and the communication count cf, from which the average communication duration ct/cf can be obtained.
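The average call duration can be sketched directly; the zero-call guard below is an added assumption, since the patent text does not say how a day without calls is handled:

```python
# Sketch of the communication index core: average call duration ct / cf.
# The guard for days with no calls is an added assumption.
def avg_call_duration(ct_seconds, cf_count):
    return ct_seconds / cf_count if cf_count else 0.0

avg = avg_call_duration(600, 4)  # 600 s over 4 calls
```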
Combining the four indexes, the emotional characteristics are described as the feature vector E = (I_app, I_phy, I_slp, I_com).
Among the symbols used above, h_m(i,j) denotes the maximum daytime heart rate of the i-th user in the j-th time interval of the day.
Step 3: perform emotion recognition and select a model for recognition. The models comprise an SVM model and a neural network model.
Model selection
In order to select a better model, a plurality of classification methods capable of identifying and monitoring the emotion of the user are used for comparison.
Based on SVM model
The SVM is a very common and efficient classifier; it maps vectors into a higher-dimensional space and constructs a maximum-margin hyperplane in that space.
The invention uses TensorFlow's default Graph to create ops, uses placeholder() as the input container for the data, and evaluates the corresponding nodes of the TensorFlow computation graph through sess.run(). The model is trained continuously on the collected training data until a usable classification model is obtained.
The main steps of the experiment are as follows: analyzing a data set, performing cross validation, preprocessing data, drawing a data flow chart, defining a Gaussian kernel function, creating a dual loss function, creating a classification kernel function, creating a classification function, setting an optimizer, initializing, training and evaluating a model.
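The patent's TensorFlow 1.x placeholder/session code is not reproduced here, but the Gaussian kernel at the heart of the SVM step can be sketched in plain Python as a stand-in:

```python
# Sketch of the Gaussian (RBF) kernel used in the SVM step:
# K(x, z) = exp(-gamma * ||x - z||^2). Plain Python, used here as a
# stand-in for the patent's TensorFlow 1.x graph code.
import math

def rbf_kernel(x, z, gamma=1.0):
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

def kernel_matrix(xs, gamma=1.0):
    # Gram matrix over a list of feature vectors; symmetric, with 1.0 diagonal.
    return [[rbf_kernel(a, b, gamma) for b in xs] for a in xs]

K = kernel_matrix([[0.0, 0.0], [1.0, 0.0]], gamma=0.5)
```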
Neural network model
The neural network model, also called an artificial neural network, is a mathematical abstraction of the working principle of neurons in the human brain; it is divided into an input layer, hidden layers, and an output layer. The input layer receives the feature data, the number of hidden layers and the number of neurons per layer are defined, and the output layer outputs the result.
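The input/hidden/output structure just described can be sketched as a tiny feed-forward pass in plain Python. Layer sizes and weight values below are illustrative, not the patent's:

```python
# Sketch of the input -> hidden -> output structure described above.
# Layer sizes and weights are illustrative assumptions.
import math

def dense(x, weights, biases, activation):
    # One fully connected layer: y_j = act(sum_i x_i * w_ij + b_j),
    # where weights[j] holds the input weights of output neuron j.
    return [activation(sum(xi * w for xi, w in zip(x, col)) + b)
            for col, b in zip(weights, biases)]

sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
identity = lambda v: v

def forward(x, hidden_w, hidden_b, out_w, out_b):
    h = dense(x, hidden_w, hidden_b, sigmoid)   # hidden layer
    return dense(h, out_w, out_b, identity)     # output layer

# 2 input features -> 2 hidden neurons -> 1 output value
y = forward([1.0, 2.0],
            hidden_w=[[0.1, 0.2], [0.3, 0.4]], hidden_b=[0.0, 0.0],
            out_w=[[0.5, 0.5]], out_b=[0.0])
```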
The neural network model built in this project is trained on the data set collected by the mobile terminal from the user's daily life; through continuous optimization, a usable emotion recognition model is trained to recognize and monitor the user's daily emotion. Two higher-level TensorFlow APIs, Estimator and Dataset, are mainly used to construct the model. The collected data are divided into two parts, one used as training samples and one for evaluation, while the training process is visualized with TensorBoard.
The experiment is divided into the following steps: importing and analyzing a data set, describing data, defining a model type, defining a training input function, training a model, defining a test model, and evaluating the model.
Claims (7)
1. A method for identifying and monitoring emotion of a user based on a mobile terminal is characterized by comprising the following steps: the method comprises the following steps:
step 1: collecting data of a user to generate a data set; the data comprise heart rate data and emotion data; the data set comprises a heart rate data set and an emotion data set;
step 2: the data preliminary processing comprises supplementing the condition that a part of data in a data set has characteristic value loss, integrating the data and extracting corresponding characteristic values;
and step 3: and performing emotion recognition, and selecting a model for recognition.
2. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: the heart rate data are divided into daytime heart rate data and nighttime heart rate data, and the daytime heart rate data are obtained through a heart rate sensor; the night heart rate data is obtained through a three-axis acceleration sensor and a heart rate sensor.
3. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: the emotion data comprises an active class, a general class and a passive class.
4. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: a missing characteristic value is supplemented by an automatic strategy using the average of the other known values.
5. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: the corresponding characteristic values are extracted by integrating part of the data after the APP-collected data have been classified and observed through graphic visualization.
6. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 5, wherein: the emotional characteristics relate to APP indexes, physiological indexes, sleep indexes and communication indexes;
the emotional characteristics are described as follows:
7. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: the model comprises an SVM model and a neural network model.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011256341.7A CN112370058A (en) | 2020-11-11 | 2020-11-11 | Method for identifying and monitoring emotion of user based on mobile terminal |
PCT/CN2021/113012 WO2022100187A1 (en) | 2020-11-11 | 2021-08-17 | Mobile terminal-based method for identifying and monitoring emotions of user |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011256341.7A CN112370058A (en) | 2020-11-11 | 2020-11-11 | Method for identifying and monitoring emotion of user based on mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112370058A true CN112370058A (en) | 2021-02-19 |
Family
ID=74582794
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011256341.7A Pending CN112370058A (en) | 2020-11-11 | 2020-11-11 | Method for identifying and monitoring emotion of user based on mobile terminal |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112370058A (en) |
WO (1) | WO2022100187A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022100187A1 (en) * | 2020-11-11 | 2022-05-19 | 西北工业大学 | Mobile terminal-based method for identifying and monitoring emotions of user |
CN116631628A (en) * | 2023-07-21 | 2023-08-22 | 北京中科心研科技有限公司 | Method and device for identifying dysthymia and wearable equipment |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012025622A2 (en) * | 2010-08-27 | 2012-03-01 | Smartex S.R.L. | Monitoring method and system for assessment of prediction of mood trends |
US20140052682A1 (en) * | 2012-08-16 | 2014-02-20 | Samsung Electronics Co., Ltd. | Using physical sensory input to determine human response to multimedia content displayed on a mobile device |
CN105306703A (en) * | 2015-09-30 | 2016-02-03 | 西安沧海网络科技有限公司 | Emotion recognition wearable device based on smartphone |
CN106037749A (en) * | 2016-05-18 | 2016-10-26 | 武汉大学 | Old people falling monitoring method based on smart mobile phone and wearable device |
CN106510658A (en) * | 2016-10-25 | 2017-03-22 | 广东乐源数字技术有限公司 | Human body emotion judgment method based on bracelet |
CN106725382A (en) * | 2016-12-28 | 2017-05-31 | 天津众阳科技有限公司 | Sleep state judgement system and method based on action and HRV measurements |
CN110909876A (en) * | 2019-11-27 | 2020-03-24 | 上海交通大学 | Sign information monitoring method and system based on multiple physiological parameters and CNN |
CN111444863A (en) * | 2020-03-30 | 2020-07-24 | 华南理工大学 | Camera-based 5G vehicle-mounted network cloud-assisted driver emotion recognition method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TW201801037A (en) * | 2016-06-30 | 2018-01-01 | 泰金寶電通股份有限公司 | Emotion analysis method and electronic apparatus thereof |
CN108216254B (en) * | 2018-01-10 | 2020-03-10 | 山东大学 | Road anger emotion recognition method based on fusion of facial image and pulse information |
CN109670406B (en) * | 2018-11-25 | 2023-06-20 | 华南理工大学 | Non-contact emotion recognition method for game user by combining heart rate and facial expression |
CN110507335B (en) * | 2019-08-23 | 2021-01-01 | 山东大学 | Multi-mode information based criminal psychological health state assessment method and system |
CN111259895B (en) * | 2020-02-21 | 2022-08-30 | 天津工业大学 | Emotion classification method and system based on facial blood flow distribution |
CN112370058A (en) * | 2020-11-11 | 2021-02-19 | 西北工业大学 | Method for identifying and monitoring emotion of user based on mobile terminal |
- 2020-11-11: CN application CN202011256341.7A (CN112370058A) filed; status pending
- 2021-08-17: PCT application PCT/CN2021/113012 (WO2022100187A1) filed
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012025622A2 (en) * | 2010-08-27 | 2012-03-01 | Smartex S.R.L. | Monitoring method and system for assessment of prediction of mood trends |
US20140052682A1 (en) * | 2012-08-16 | 2014-02-20 | Samsung Electronics Co., Ltd. | Using physical sensory input to determine human response to multimedia content displayed on a mobile device |
CN105306703A (en) * | 2015-09-30 | 2016-02-03 | 西安沧海网络科技有限公司 | Emotion recognition wearable device based on smartphone |
CN106037749A (en) * | 2016-05-18 | 2016-10-26 | 武汉大学 | Old people falling monitoring method based on smart mobile phone and wearable device |
CN106510658A (en) * | 2016-10-25 | 2017-03-22 | 广东乐源数字技术有限公司 | Human body emotion judgment method based on bracelet |
CN106725382A (en) * | 2016-12-28 | 2017-05-31 | 天津众阳科技有限公司 | Sleep state judgement system and method based on action and HRV measurements |
CN110909876A (en) * | 2019-11-27 | 2020-03-24 | 上海交通大学 | Sign information monitoring method and system based on multiple physiological parameters and CNN |
CN111444863A (en) * | 2020-03-30 | 2020-07-24 | 华南理工大学 | Camera-based 5G vehicle-mounted network cloud-assisted driver emotion recognition method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022100187A1 (en) * | 2020-11-11 | 2022-05-19 | 西北工业大学 | Mobile terminal-based method for identifying and monitoring emotions of user |
CN116631628A (en) * | 2023-07-21 | 2023-08-22 | 北京中科心研科技有限公司 | Method and device for identifying dysthymia and wearable equipment |
Also Published As
Publication number | Publication date |
---|---|
WO2022100187A1 (en) | 2022-05-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107220591A (en) | Multi-modal intelligent mood sensing system | |
CN105877766B (en) | A kind of state of mind detection system and method based on the fusion of more physiological signals | |
CN110070105B (en) | Electroencephalogram emotion recognition method and system based on meta-learning example rapid screening | |
CN108304917A (en) | A kind of P300 signal detecting methods based on LSTM networks | |
CN110353702A (en) | A kind of emotion identification method and system based on shallow-layer convolutional neural networks | |
CN104484644B (en) | A kind of gesture identification method and device | |
CN105894039A (en) | Emotion recognition modeling method, emotion recognition method and apparatus, and intelligent device | |
CN107092894A (en) | A kind of motor behavior recognition methods based on LSTM models | |
CN110353673B (en) | Electroencephalogram channel selection method based on standard mutual information | |
CN105184325A (en) | Human body action recognition method and mobile intelligent terminal | |
CN114052735B (en) | Deep field self-adaption-based electroencephalogram emotion recognition method and system | |
CN109543526A (en) | True and false facial paralysis identifying system based on depth difference opposite sex feature | |
CN109902660A (en) | A kind of expression recognition method and device | |
CN106228200A (en) | A kind of action identification method not relying on action message collecting device | |
CN111598451B (en) | Control work efficiency analysis method, device and system based on task execution capacity | |
WO2022100187A1 (en) | Mobile terminal-based method for identifying and monitoring emotions of user | |
CN110974219A (en) | Human brain idea recognition system based on invasive BCI | |
CN108958482B (en) | Similarity action recognition device and method based on convolutional neural network | |
WO2021004510A1 (en) | Sensor-based separately deployed human body behavior recognition health management system | |
WO2024098649A1 (en) | Street greening quality testing method based on physiological arousal recognition | |
CN113208593A (en) | Multi-modal physiological signal emotion classification method based on correlation dynamic fusion | |
CN111063437A (en) | Personalized chronic disease analysis system | |
CN108717548B (en) | Behavior recognition model updating method and system for dynamic increase of sensors | |
CN109567832A (en) | A kind of method and system of the angry driving condition of detection based on Intelligent bracelet | |
CN113642432A (en) | Method for identifying human body posture by convolutional neural network based on covariance matrix transformation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |
WD01 | Invention patent application deemed withdrawn after publication | | Application publication date: 20210219 |