CN112370058A - Method for identifying and monitoring emotion of user based on mobile terminal - Google Patents

Method for identifying and monitoring emotion of user based on mobile terminal

Info

Publication number
CN112370058A
Authority
CN
China
Prior art keywords
data
emotion
user
heart rate
mobile terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011256341.7A
Other languages
Chinese (zh)
Inventor
李志刚
李斯羽
问静波
齐振翮
张娜娜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University
Priority to CN202011256341.7A
Publication of CN112370058A
Priority to PCT/CN2021/113012 (WO2022100187A1)
Legal status: Pending

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Psychiatry (AREA)
  • Engineering & Computer Science (AREA)
  • Educational Technology (AREA)
  • Biomedical Technology (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Physics & Mathematics (AREA)
  • Child & Adolescent Psychology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Developmental Disabilities (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention provides a method for identifying and monitoring a user's emotion based on a mobile terminal. Unlike traditional approaches that predict emotion from a single source such as facial expression, behavior or language, the method collects diverse data generated by the user, such as communication records, sleep conditions and APP usage, through the mobile phone and bracelet the user carries every day, and builds a data set that broadly covers the user's daily life in order to perceive the user's emotion accurately. The method simplifies the collection process and improves user friendliness; it describes the user's emotional characteristics from multiple dimensions, which improves the accuracy of the recognition result; and in the data processing stage it extracts key features for modeling, which reduces the complexity of the subsequent machine learning algorithm and shortens its running time.

Description

Method for identifying and monitoring emotion of user based on mobile terminal
Technical Field
The invention relates to the field of data processing, in particular to a method for identifying and monitoring emotion of a user based on a mobile terminal.
Background
Currently, user emotion recognition falls into two categories: recognition based on traditional approaches (face-based, text-based, voice-based and physiological-feature-based emotion recognition) and recognition based on newer approaches (gesture- and behavior-based emotion recognition).
Traditional emotion recognition methods are well suited to capturing a user's instantaneous emotion in research settings, but in real life the user's emotion may change from moment to moment, and the user may care more about the average emotion over a period of time than about any instantaneous emotion. Traditional methods also describe the user's emotion from only a single angle, which reduces recognition accuracy. Moreover, they sometimes require the user to wear complex equipment for data collection; for example, emotion analysis based on brain waves requires the user to wear a cumbersome electroencephalography acquisition device to collect the physiological features. Finally, traditional emotion recognition trains its networks on open-source data sets, whose data may suffer from low naturalness and poor soundness, which directly affects recognition accuracy.
Disclosure of Invention
To address these shortcomings, the invention provides a method for identifying and monitoring a user's emotion based on a mobile terminal. Diverse daily data generated by mobile terminal devices, such as a smart phone and a smart bracelet, are collected, feature values are extracted and a model is created, so that the user's emotion can be described as accurately as possible from multiple aspects and angles without interfering with the user's daily life and work, while the analysis result is made available to other programs that need it.
The technical scheme of the invention is as follows: a method for identifying and monitoring emotion of a user based on a mobile terminal comprises the following steps:
step 1: collecting data of a user to generate a data set; the data comprise heart rate data and emotion data; the data set comprises a heart rate data set and an emotion data set;
step 2: preliminary data processing, which comprises supplementing feature values that are missing for part of the data in the data set, integrating the data and extracting the corresponding feature values;
step 3: performing emotion recognition and selecting a model for the recognition.
Further, in the method for identifying and monitoring the emotion of the user based on the mobile terminal, the heart rate data are divided into daytime heart rate data and nighttime heart rate data; the daytime heart rate data are obtained through a heart rate sensor, and the nighttime heart rate data are obtained through a three-axis acceleration sensor and a heart rate sensor.
Further, the emotion data comprise a positive class, a general class and a negative class.
Further, in the method for identifying and monitoring the emotion of the user based on the mobile terminal, the missing feature values are supplemented with an automatic strategy that fills each one with the average of the other known values.
Furthermore, in the method for identifying and monitoring the emotion of the user based on the mobile terminal, the corresponding feature values are extracted by classifying the data collected by the APP, observing them through graphical visualization, and then integrating part of the data.
Further, in the method for identifying and monitoring the emotion of a user based on a mobile terminal, the emotional characteristics relate to an APP index, a physiological index, a sleep index and a communication index;
the emotional characteristic is described as a feature vector whose components are the APP index, the physiological index, the sleep index and the communication index.
Further, in the mobile-terminal-based method for identifying and monitoring the emotion of the user, the models comprise an SVM (support vector machine) model and a neural network model.
According to the invention, the user's daily data are collected through a smart phone and a smart bracelet, and characteristic data are selected for network training, so as to obtain the mapping between the user's externally observable data and internal emotion and thereby recognize the internal emotion. Collecting daily data with mobile devices can solve the problems of low data naturalness and poor soundness found in traditional emotion recognition. First, the high penetration of smart phones in China guarantees the quantity of collectable data; second, different people use their phones differently, which guarantees the specificity of the data and the soundness of the experimental results; finally, the mobile phone is portable, so unlike traditional emotion recognition, this new smartphone-based recognition can conveniently acquire the data a user generates in daily life without requiring the user to wear an additional data collection device.
The method for identifying and monitoring the emotion of a user based on a mobile terminal differs from traditional prediction based only on expression, behavior and language: it collects diverse data generated by the user, such as communication records, sleep conditions and APP usage, through the mobile phone and bracelet the user carries every day, and constructs a data set that broadly covers the user's life, thereby achieving accurate perception of the user's emotion. The method simplifies the collection process and improves user friendliness, describes the user's emotional characteristics from multiple dimensions to improve the accuracy of the recognition result, and, in the data processing stage, extracts key features for modeling, which reduces the complexity of the subsequent machine learning algorithm and shortens its running time.
Drawings
FIG. 1 is a flow chart of the mobile-terminal-based user emotion recognition and monitoring method.
Detailed Description
The technical solution of the present invention is further described below with reference to the accompanying drawings. As shown in fig. 1, a method for identifying and monitoring the emotion of a user based on a mobile terminal includes the following steps.
Step 1: collecting data of a user to generate a data set; the data comprise heart rate data and emotion data; the data sets include a heart rate data set and an emotion data set.
In order to obtain a more accurate result, the heart rate data are divided into daytime heart rate data and nighttime heart rate data; the daytime heart rate data are obtained through a heart rate sensor, and the nighttime heart rate data are obtained through a three-axis acceleration sensor and a heart rate sensor. The emotion data comprise a positive class, a general class and a negative class.
Specifically, a mobile phone APP is designed to collect the data generated by the user every day. The usage times of the α installed application programs are recorded through UsageStats, yielding a set of per-application usage durations. A pedometer function is implemented with Android's STEP_DETECTOR and STEP_COUNTER sensors, recording the user's daily step count s and movement distance d as a motion data set. The user's total daily communication duration ct and number of communications cf are collected through the PhoneStateListener, yielding a communication data set.
Secondly, a bracelet APP is designed: the heart rate sensor carried on the bracelet records the daytime heart rate data, and at preset time nodes the bracelet's three-axis acceleration sensor and heart rate sensor record the nighttime heart rate data and the corresponding three-axis acceleration data.
The user's emotion data are collected every day and divided into three categories, 1 → positive, 0 → general and -1 → negative, yielding the user's daily emotion data set.
From the data generated by the mobile terminal every day and the user's daily emotional change, the complete data set is obtained.
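As a minimal illustration of how one day of collected data might be grouped before processing (field names and values are illustrative assumptions, not taken from the filing), in Python:

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class DailyRecord:
    """One user-day of mobile-terminal data (field names are illustrative)."""
    app_usage: Dict[str, float]          # per-application usage duration, minutes
    steps: int                           # daily step count s
    distance_km: float                   # daily movement distance d
    call_duration_min: float             # total communication duration ct
    call_count: int                      # number of communications cf
    day_heart_rate: List[int]            # daytime heart rate samples (one per hour)
    night_heart_rate: List[int]          # nighttime heart rate samples (one per 30 min)
    night_acceleration: List[Tuple[float, float, float]]  # nighttime (ax, ay, az) samples
    emotion: int                         # self-reported label: 1 positive, 0 general, -1 negative

# One example record; the numbers are made up for illustration only.
record = DailyRecord(
    app_usage={"work_app": 130.0, "video_app": 45.0, "chat_app": 60.0},
    steps=8243, distance_km=5.6,
    call_duration_min=23.5, call_count=4,
    day_heart_rate=[72, 75, 80, 78],
    night_heart_rate=[60, 58, 57, 59],
    night_acceleration=[(0.01, 0.02, 0.98), (0.00, 0.01, 0.99)],
    emotion=1,
)
```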
Step 2: preliminary data processing, which comprises supplementing feature values that are missing for part of the data in the data set, integrating the data and extracting the corresponding feature values. The missing feature values are supplemented with an automatic strategy that fills each one with the average of the other known values; the corresponding feature values are extracted by classifying the data collected by the APP, observing them through graphical visualization, and then integrating part of the data. The emotional characteristics relate to the APP index, the physiological index, the sleep index and the communication index.
The specific method is as follows. The collected data are first sorted, which mainly involves three parts: first, for data in the data set with missing feature values, an automatic strategy supplements each missing value with the average of the other known values; second, the data are integrated; third, after the data collected by the APP are classified and observed through graphical visualization, part of the data are integrated and the corresponding feature values are extracted.
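A minimal sketch of the mean-imputation strategy, assuming the daily records are held in a pandas DataFrame with illustrative column names:

```python
import pandas as pd

# Illustrative daily feature table; NaN marks a day on which a value was not collected.
df = pd.DataFrame({
    "steps":        [8243, None, 10311, 7520],
    "distance_km":  [5.6, 4.1, None, 5.0],
    "call_minutes": [23.5, 40.0, 12.0, None],
})

# Automatic strategy: fill each missing feature value with the mean of the known
# values in the same column, as described in the preprocessing step.
df_filled = df.fillna(df.mean(numeric_only=True))
print(df_filled)
```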
APP index
The APP index comprises the APP type, the APP usage duration and the number of times the APP is used. Considering the kinds of APP on the market and their influence on the user's emotion, APPs are currently divided into three main categories: work, entertainment and social. According to the APP type, the user's total work duration, total entertainment duration and total social duration for the day can be obtained. Since the usage time of different APP types influences the user's emotion to different degrees, the totals of the different types are weighted and summed to obtain the APP index.
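A minimal sketch of this weighting, with hypothetical category weights (the filing does not specify the weight values):

```python
# Total daily usage per APP category, in minutes (illustrative values).
work_min, entertainment_min, social_min = 130.0, 45.0, 60.0

# Hypothetical weights expressing how strongly each category is assumed to
# relate to emotion; the actual weights are not given in the filing.
w_work, w_entertainment, w_social = 0.5, 0.3, 0.2

app_index = w_work * work_min + w_entertainment * entertainment_min + w_social * social_min
print(f"APP index: {app_index:.1f}")
```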
physiological index
Firstly, the user's daily step count and movement distance are obtained through the mobile phone APP, yielding a motion data set. Secondly, through the heart rate sensor carried on the bracelet, time nodes are set and heart rate data and three-axis acceleration are collected once every hour during the day, yielding the set of daytime heart rate maxima. The variance of the daytime heart rate data is then computed as the daytime heart rate abnormality value. From the motion data, the heart rate maxima and the heart rate abnormality value, the physiological index is obtained.
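A minimal sketch of the physiological quantities, assuming hourly daytime heart-rate samples; the way they are finally combined into the index is not specified in the filing, so the combination below is only an assumed placeholder:

```python
import statistics

steps, distance_km = 8243, 5.6
# Hourly daytime heart-rate samples from the bracelet (illustrative values).
day_heart_rate = [72, 75, 88, 80, 77, 95, 83, 79]

hr_max = max(day_heart_rate)                            # daytime heart-rate maximum
hr_abnormality = statistics.pvariance(day_heart_rate)   # variance used as the abnormality value

# Hypothetical combination into a single physiological index (weights assumed).
physio_index = (0.4 * (steps / 10000) + 0.2 * distance_km
                + 0.2 * (hr_max / 100) + 0.2 * (hr_abnormality / 100))
print(f"max HR: {hr_max}, HR abnormality: {hr_abnormality:.1f}, physiological index: {physio_index:.2f}")
```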
sleep index
A time node is set, and heart rate data and three-axis acceleration are collected every 30 minutes at night, yielding the set of nighttime heart rate maxima together with the nighttime three-axis acceleration data collected by the bracelet over the same time intervals. The three-axis acceleration data set is processed to obtain a weighted acceleration value for each interval, so that each nighttime time interval corresponds to one (heart rate, acceleration) data pair.
First, it is judged whether the user is asleep or awake, with the three-axis acceleration sensor leading and the heart rate assisting: when the weighted acceleration value is smaller than the heart-rate-based value, the user is judged to be asleep, and summing the corresponding time intervals gives the user's total sleep duration.
The sleep state can be further subdivided, with the heart rate leading and the acceleration sensor assisting, into deep sleep time and light sleep time. From the total sleep duration and the deep/light split, the sleep index is obtained.
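A minimal sketch of the two-stage sleep judgment, assuming 30-minute nighttime intervals; the thresholds and the definition of the sleep index are placeholders, not values from the filing:

```python
# One entry per 30-minute nighttime interval: (max heart rate, mean acceleration magnitude).
night_intervals = [(62, 0.02), (58, 0.01), (57, 0.04), (55, 0.01), (70, 0.35), (59, 0.02)]

ACC_SLEEP_THRESHOLD = 0.05   # assumed: low movement => asleep (acceleration-led stage)
HR_DEEP_THRESHOLD = 58       # assumed: low heart rate within sleep => deep sleep (heart-rate-led stage)
INTERVAL_MIN = 30

total_sleep = deep_sleep = 0
for hr, acc in night_intervals:
    if acc < ACC_SLEEP_THRESHOLD:            # stage 1: asleep vs awake, acceleration leading
        total_sleep += INTERVAL_MIN
        if hr <= HR_DEEP_THRESHOLD:          # stage 2: deep vs light sleep, heart rate leading
            deep_sleep += INTERVAL_MIN

light_sleep = total_sleep - deep_sleep
sleep_index = deep_sleep / total_sleep if total_sleep else 0.0   # assumed definition of the index
print(f"sleep {total_sleep} min (deep {deep_sleep}, light {light_sleep}), sleep index {sleep_index:.2f}")
```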
communication index
The communication data comprise the total communication duration ct and the number of communications cf; from these, the average communication duration can be obtained.
Combining the above indexes, the emotional characteristic of each user-day is described as a feature vector composed of the APP index, the physiological index, the sleep index and the communication index.
the above symbols have the following meanings:
Figure BDA00027732335200000716
-the total duration of usage of the application program on the ith day by ∞ by the ith user;
Figure BDA00027732335200000717
-the total number of steps for the ith user on the jth day;
Figure BDA0002773233520000081
-the ith user distance moved on day j;
Figure BDA0002773233520000082
-the total communication duration of the ith user on the jth day;
Figure BDA0002773233520000083
-the total number of communications of the ith user on the jth day;
Figure BDA0002773233520000084
-heart rate data for the ith user at the jth time interval on day j;
Figure BDA0002773233520000085
-the daytime heart rate data set of the ith user on the jth day;
Figure BDA0002773233520000086
-heart rate data set of the ith user on day j;
Figure BDA0002773233520000087
-nighttime three-axis acceleration for the ith user at the jth day for the epsilon time interval;
Figure BDA0002773233520000088
-nighttime three-axis acceleration data set of the ith user on the jth day;
Figure BDA0002773233520000089
-sentiment data of the ith user at day j;
Figure BDA00027732335200000810
-the total duration of the working APP used by the ith user on the jth day;
Figure BDA00027732335200000811
-the total duration of the entertainment APP used by the ith user on the jth day;
Figure BDA00027732335200000812
-the total length of time that the ith user used the social APP on day j;
h_mε-the ith user maximum daytime heart rate at the jth time interval on day j;
Figure BDA00027732335200000813
-the set of maximum values of heart rate for the ith user during the jth day;
Figure BDA00027732335200000814
-the ith user j's movement distance s and step number d data set;
Figure BDA00027732335200000815
-heart rate outliers during the jth day of the ith user;
Figure BDA00027732335200000816
-the physiological index value of the ith user on the jth day;
Figure BDA00027732335200000817
-the maximum value of the heart rate of the ith user at night on the jth day.
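As a minimal illustration of how the four per-day indices could be assembled into the emotional feature vector (the function names, the use of the average communication duration as the communication index, and the example values are assumptions, not taken from the filing):

```python
def communication_index(call_duration_min: float, call_count: int) -> float:
    """Average communication duration ct / cf (0 if the user made no calls)."""
    return call_duration_min / call_count if call_count else 0.0

def emotion_feature_vector(app_index: float, physio_index: float,
                           sleep_index: float, com_index: float) -> list:
    """Per-day emotional characteristic: [APP, physiological, sleep, communication] indices."""
    return [app_index, physio_index, sleep_index, com_index]

features = emotion_feature_vector(app_index=91.5, physio_index=0.87,
                                  sleep_index=0.42, com_index=communication_index(23.5, 4))
print(features)
```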
Step 3: performing emotion recognition and selecting a model for the recognition. The models comprise an SVM model and a neural network model.
Model selection
In order to select a better model, several classification methods capable of identifying and monitoring the user's emotion are compared.
SVM-based model
SVM is a very common and efficient classifier that establishes a maximum-margin hyperplane by mapping vectors into a higher-dimensional space.
The invention uses TensorFlow's default Graph to create the ops, uses placeholder() as the input container for the data, and evaluates the corresponding nodes of the TensorFlow computation graph through sess.run(). The model is trained continuously on the collected training data to obtain a usable classification model.
The main steps of the experiment are as follows: analyzing the data set, performing cross validation, preprocessing the data, drawing the data flow graph, defining the Gaussian kernel function, creating the dual loss function, creating the classification kernel function, creating the classification function, setting the optimizer, initializing, and training and evaluating the model.
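The filing implements the classifier with TensorFlow (Graph, placeholder() and sess.run()); as a compact stand-in for illustration only, the sketch below trains a Gaussian-kernel (RBF) SVM with scikit-learn on synthetic feature vectors of the four indices and applies cross validation as listed in the experiment steps:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in for the collected feature vectors [app, physiological, sleep, communication].
X = rng.normal(size=(300, 4))
y = rng.choice([-1, 0, 1], size=300)      # emotion labels: negative / general / positive

# Gaussian (RBF) kernel SVM, the same kernel family as the TensorFlow model described above.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale", C=1.0))
scores = cross_val_score(model, X, y, cv=5)  # cross validation, as in the experiment steps
print("cross-validated accuracy:", scores.mean())
```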
Neural network model
The neural network model, also called an artificial neural network, mathematically imitates the working principle of the neurons of the human brain, and is likewise divided into an input layer, hidden layers and an output layer. The input layer receives the feature data, the number of hidden layers and the number of neurons per layer are defined, and the output layer produces the result.
The neural network model constructed in this project is trained on the data set collected by the mobile terminal from the user's daily life; through continuous optimization, a usable emotion recognition model is obtained to recognize and monitor the user's daily emotion. Two high-level TensorFlow APIs are mainly used, Estimator and Dataset, to construct the neural network model. The collected data are divided into two parts, one used as samples for training and one for evaluation, while the training process is visualized with TensorBoard.
The experiment is divided into the following steps: importing and analyzing the data set, describing the data, defining the model type, defining the training input function, training the model, defining the test model, and evaluating the model.
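A minimal sketch of the Estimator/Dataset setup described above, on synthetic feature vectors; the hidden-layer sizes, label encoding and training steps are assumptions, and the code targets TensorFlow releases that still ship the tf.estimator API:

```python
import numpy as np
import tensorflow as tf  # written against the tf.estimator / tf.data APIs named in the text

rng = np.random.default_rng(0)
features = rng.normal(size=(300, 4)).astype("float32")   # [app, physiological, sleep, communication]
labels = rng.integers(0, 3, size=300)                     # 0 negative, 1 general, 2 positive (assumed encoding)

def train_input_fn():
    ds = tf.data.Dataset.from_tensor_slices(({"x": features}, labels))
    return ds.shuffle(300).batch(32).repeat()

def eval_input_fn():
    return tf.data.Dataset.from_tensor_slices(({"x": features}, labels)).batch(32)

feature_columns = [tf.feature_column.numeric_column("x", shape=(4,))]
estimator = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[16, 8],         # assumed hidden-layer sizes
    n_classes=3,
    model_dir="/tmp/emotion_dnn"  # checkpoints and TensorBoard summaries are written here
)
estimator.train(train_input_fn, steps=500)
print(estimator.evaluate(eval_input_fn))
```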

Claims (7)

1. A method for identifying and monitoring the emotion of a user based on a mobile terminal, characterized by comprising the following steps:
step 1: collecting data of a user to generate a data set, the data comprising heart rate data and emotion data, and the data set comprising a heart rate data set and an emotion data set;
step 2: preliminary data processing, which comprises supplementing feature values that are missing for part of the data in the data set, integrating the data and extracting the corresponding feature values;
step 3: performing emotion recognition and selecting a model for the recognition.
2. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: the heart rate data are divided into daytime heart rate data and nighttime heart rate data, and the daytime heart rate data are obtained through a heart rate sensor; the night heart rate data is obtained through a three-axis acceleration sensor and a heart rate sensor.
3. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: the emotion data comprise a positive class, a general class and a negative class.
4. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: the missing feature values are supplemented with an automatic strategy that fills each one with the average of the other known values.
5. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: the corresponding feature values are extracted by classifying the data collected by the APP, observing them through graphical visualization, and then integrating part of the data.
6. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 5, wherein: the emotional characteristics relate to APP indexes, physiological indexes, sleep indexes and communication indexes;
the emotional characteristics are described as a feature vector whose components are the APP index, the physiological index, the sleep index and the communication index.
7. The method for recognizing and monitoring the emotion of a user based on a mobile terminal as claimed in claim 1, wherein: the model comprises an SVM model and a neural network model.
CN202011256341.7A 2020-11-11 2020-11-11 Method for identifying and monitoring emotion of user based on mobile terminal Pending CN112370058A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011256341.7A CN112370058A (en) 2020-11-11 2020-11-11 Method for identifying and monitoring emotion of user based on mobile terminal
PCT/CN2021/113012 WO2022100187A1 (en) 2020-11-11 2021-08-17 Mobile terminal-based method for identifying and monitoring emotions of user

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011256341.7A CN112370058A (en) 2020-11-11 2020-11-11 Method for identifying and monitoring emotion of user based on mobile terminal

Publications (1)

Publication Number Publication Date
CN112370058A true CN112370058A (en) 2021-02-19

Family

ID=74582794

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011256341.7A Pending CN112370058A (en) 2020-11-11 2020-11-11 Method for identifying and monitoring emotion of user based on mobile terminal

Country Status (2)

Country Link
CN (1) CN112370058A (en)
WO (1) WO2022100187A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022100187A1 (en) * 2020-11-11 2022-05-19 西北工业大学 Mobile terminal-based method for identifying and monitoring emotions of user
CN116631628A (en) * 2023-07-21 2023-08-22 北京中科心研科技有限公司 Method and device for identifying dysthymia and wearable equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012025622A2 (en) * 2010-08-27 2012-03-01 Smartex S.R.L. Monitoring method and system for assessment of prediction of mood trends
US20140052682A1 (en) * 2012-08-16 2014-02-20 Samsung Electronics Co., Ltd. Using physical sensory input to determine human response to multimedia content displayed on a mobile device
CN105306703A (en) * 2015-09-30 2016-02-03 西安沧海网络科技有限公司 Emotion recognition wearable device based on smartphone
CN106037749A (en) * 2016-05-18 2016-10-26 武汉大学 Old people falling monitoring method based on smart mobile phone and wearable device
CN106510658A (en) * 2016-10-25 2017-03-22 广东乐源数字技术有限公司 Human body emotion judgment method based on bracelet
CN106725382A (en) * 2016-12-28 2017-05-31 天津众阳科技有限公司 Sleep state judgement system and method based on action and HRV measurements
CN110909876A (en) * 2019-11-27 2020-03-24 上海交通大学 Sign information monitoring method and system based on multiple physiological parameters and CNN
CN111444863A (en) * 2020-03-30 2020-07-24 华南理工大学 Camera-based 5G vehicle-mounted network cloud-assisted driver emotion recognition method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201801037A (en) * 2016-06-30 2018-01-01 泰金寶電通股份有限公司 Emotion analysis method and electronic apparatus thereof
CN108216254B (en) * 2018-01-10 2020-03-10 山东大学 Road anger emotion recognition method based on fusion of facial image and pulse information
CN109670406B (en) * 2018-11-25 2023-06-20 华南理工大学 Non-contact emotion recognition method for game user by combining heart rate and facial expression
CN110507335B (en) * 2019-08-23 2021-01-01 山东大学 Multi-mode information based criminal psychological health state assessment method and system
CN111259895B (en) * 2020-02-21 2022-08-30 天津工业大学 Emotion classification method and system based on facial blood flow distribution
CN112370058A (en) * 2020-11-11 2021-02-19 西北工业大学 Method for identifying and monitoring emotion of user based on mobile terminal


Also Published As

Publication number Publication date
WO2022100187A1 (en) 2022-05-19

Similar Documents

Publication Publication Date Title
CN107220591A (en) Multi-modal intelligent mood sensing system
CN105877766B (en) A kind of state of mind detection system and method based on the fusion of more physiological signals
CN110070105B (en) Electroencephalogram emotion recognition method and system based on meta-learning example rapid screening
CN108304917A (en) A kind of P300 signal detecting methods based on LSTM networks
CN110353702A (en) A kind of emotion identification method and system based on shallow-layer convolutional neural networks
CN104484644B (en) A kind of gesture identification method and device
CN105894039A (en) Emotion recognition modeling method, emotion recognition method and apparatus, and intelligent device
CN107092894A (en) A kind of motor behavior recognition methods based on LSTM models
CN110353673B (en) Electroencephalogram channel selection method based on standard mutual information
CN105184325A (en) Human body action recognition method and mobile intelligent terminal
CN114052735B (en) Deep field self-adaption-based electroencephalogram emotion recognition method and system
CN109543526A (en) True and false facial paralysis identifying system based on depth difference opposite sex feature
CN109902660A (en) A kind of expression recognition method and device
CN106228200A (en) A kind of action identification method not relying on action message collecting device
CN111598451B (en) Control work efficiency analysis method, device and system based on task execution capacity
WO2022100187A1 (en) Mobile terminal-based method for identifying and monitoring emotions of user
CN110974219A (en) Human brain idea recognition system based on invasive BCI
CN108958482B (en) Similarity action recognition device and method based on convolutional neural network
WO2021004510A1 (en) Sensor-based separately deployed human body behavior recognition health management system
WO2024098649A1 (en) Street greening quality testing method based on physiological arousal recognition
CN113208593A (en) Multi-modal physiological signal emotion classification method based on correlation dynamic fusion
CN111063437A (en) Personalized chronic disease analysis system
CN108717548B (en) Behavior recognition model updating method and system for dynamic increase of sensors
CN109567832A (en) A kind of method and system of the angry driving condition of detection based on Intelligent bracelet
CN113642432A (en) Method for identifying human body posture by convolutional neural network based on covariance matrix transformation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20210219