CN114305325A - Emotion detection method and device - Google Patents

Emotion detection method and device

Info

Publication number
CN114305325A
CN114305325A
Authority
CN
China
Prior art keywords
signal
user
emotion
target
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011063209.4A
Other languages
Chinese (zh)
Inventor
蔡云丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Cloud Computing Technologies Co Ltd
Original Assignee
Huawei Cloud Computing Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Cloud Computing Technologies Co Ltd filed Critical Huawei Cloud Computing Technologies Co Ltd
Priority to CN202011063209.4A priority Critical patent/CN114305325A/en
Publication of CN114305325A publication Critical patent/CN114305325A/en
Pending legal-status Critical Current

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application discloses an emotion detection method in the field of artificial intelligence. The method comprises the following steps: acquiring a PPG signal of a user's pulse through a PPG sensor; sending the PPG signal to a cloud server; receiving a user emotional state value sent by the cloud server, where the cloud server determines a physiological feature vector (comprising blood oxygen saturation, heart rate, and respiration rate) from the PPG signal and obtains the emotional state value by processing that vector with a neural network model; and outputting the user's emotional state according to the emotional state value. The method can improve emotion detection accuracy. The application also discloses a device capable of implementing the emotion detection method.

Description

Emotion detection method and device
Technical Field
The application relates to the field of artificial intelligence, in particular to a method and a device for emotion detection.
Background
Artificial intelligence (AI) is the theory, method, technique, and application system that uses digital computers, or machines controlled by digital computers, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use that knowledge to obtain the best results. In other words, artificial intelligence is the branch of computer science that attempts to understand the essence of intelligence and to produce new intelligent machines that can react in a manner similar to human intelligence. Artificial intelligence studies the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning, and decision making. Research in the field of artificial intelligence includes robotics, natural language processing, computer vision, decision and reasoning, human-computer interaction, recommendation and search, basic AI theory, and the like.
People in modern society are under great working and living stress, with the stress creating emotional problems of varying degrees, such as anxiety, depression, etc. In this regard, how to effectively detect and record mood swings becomes a concern.
Currently, one emotion reminding method is roughly as follows: the heart rate, respiration rate, body impedance, and body temperature variation of the user are detected separately, and the user is judged to be in a bad mood when a detection result exceeds the normal range of the corresponding parameter.
In practical applications, the normal range of each parameter is an empirically set estimate, so the accuracy of such a determination method is not high.
Disclosure of Invention
In view of this, the present application provides a method and an apparatus for emotion detection, which can improve emotion detection accuracy.
A first aspect provides an emotion detection method in which a PPG signal of a user's pulse is acquired by a PPG sensor and sent to a cloud server. The cloud server determines a physiological feature vector from the PPG signal, the physiological feature vector comprising at least two of blood oxygen saturation, heart rate, and respiration rate; the cloud server inputs the physiological feature vector into a neural network model, which outputs the user's emotional state value. After receiving the emotional state value sent by the cloud server, the device outputs the user's emotional state according to that value. In this implementation, after the PPG signal is acquired, the cloud server derives the physiological feature vector from the PPG signal and processes it with the neural network model to obtain the user's emotional state value. Because the neural network model can better fit actual user emotion data, detecting the user's emotional state with the neural network model is more accurate.
In another possible implementation manner, the emotion detection method further includes: generating an emotional state curve according to the emotional state values of the user at different moments; and when the duration of the negative emotion in the emotion state curve is longer than the preset duration, outputting a reminding signal for reminding the negative emotion. Therefore, the negative emotions of the user can be reminded, and the user can be helped to control the negative emotions.
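The duration check in the implementation above can be sketched as follows. The sampling interval, the threshold below which a state value counts as a negative emotion, and the function name are all illustrative assumptions, since the application does not fix a numeric encoding of emotional states:

```python
def negative_emotion_alert(state_values, sample_interval_s, neg_threshold, max_neg_duration_s):
    """Return True when negative emotion persists longer than the preset duration.

    Hypothetical convention: a state value below `neg_threshold` counts as
    negative; values are sampled every `sample_interval_s` seconds.
    """
    run = 0  # length of the current consecutive run of negative samples
    for v in state_values:
        run = run + 1 if v < neg_threshold else 0
        if run * sample_interval_s > max_neg_duration_s:
            return True  # duration of negative emotion exceeds the preset duration
    return False
```

The same scan over consecutive samples could also detect the "change amplitude within a set time" condition of the next implementation.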
In another possible implementation manner, when it is detected that the emotional state change amplitude is greater than the preset amplitude within the set time, a reminding signal is output. Therefore, the user can be reminded of severe emotional fluctuation, and the user can be helped to adjust the emotion.
A second aspect provides an emotion detection method in which target model parameters are acquired; receiving a PPG signal sent by a wearable device; determining a physiological feature vector from the PPG signal; inputting the physiological characteristic vector into a neural network model with target model parameters; and sending the emotional state value of the user output by the neural network model to the wearable device. The physiological characteristic vector includes at least two of a blood oxygen saturation, a heart rate, and a respiration rate. According to the implementation, after the PPG signal is acquired, the physiological characteristic vector is acquired according to the PPG signal, and the physiological characteristic vector is processed through the neural network model to obtain the emotional state value of the user. Because the neural network model can better fit actual user emotion data, the detection of the emotion state of the user according to the neural network model is more accurate.
In one possible implementation, obtaining the target model parameters includes: acquiring a plurality of user emotion data; acquiring initial model parameters of a neural network model; and processing the plurality of user emotion data and the initial model parameters through a neural network algorithm to obtain target model parameters. Each user emotion data includes a physiological feature vector and a user emotional state value. Alternatively, the neural network algorithm may be, but is not limited to, a back propagation neural network algorithm, an adaptive resonance theory network algorithm, a learning vector quantization network algorithm, a kohonen network algorithm, or a hopfield network algorithm. Therefore, various target model parameters can be obtained, and various neural network models can be generated for emotion state detection as each target model parameter corresponds to one neural network model.
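The parameter-fitting step described above (initial model parameters plus user emotion data in, target model parameters out) can be sketched with a plain back-propagation loop. The network size, tanh activation, squared-error loss, function names, and the assumption that features are pre-scaled to comparable ranges are illustrative, not taken from the application:

```python
import numpy as np

def train_target_params(feature_vecs, state_values, hidden=8, lr=0.1, epochs=3000, seed=0):
    """Fit initial model parameters to user emotion data with plain backprop.

    Sketch only: each feature vector is assumed pre-scaled (e.g. SpO2, heart
    rate, respiration rate mapped to comparable ranges) and each emotional
    state value is a scalar.
    """
    rng = np.random.default_rng(seed)
    X = np.asarray(feature_vecs, float)
    y = np.asarray(state_values, float).reshape(-1, 1)
    # initial model parameters
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)        # hidden layer: a(Wx + b)
        pred = h @ W2 + b2
        err = pred - y                  # gradient of 0.5 * squared error
        gW2 = h.T @ err / len(X); gb2 = err.mean(0)
        gh = (err @ W2.T) * (1 - h ** 2)
        gW1 = X.T @ gh / len(X); gb1 = gh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return (W1, b1, W2, b2)             # the "target model parameters"

def predict(params, x):
    """Run the trained network on one physiological feature vector."""
    W1, b1, W2, b2 = params
    return (np.tanh(np.asarray(x, float) @ W1 + b1) @ W2 + b2).item()
```

Any of the other listed algorithms (learning vector quantization, Kohonen, Hopfield, etc.) would replace this update rule while keeping the same data-in, parameters-out contract.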
In another possible implementation, determining the physiological feature vector from the PPG signal comprises: decomposing the PPG signal into a plurality of IMF components and a residual component using an empirical mode decomposition method, the plurality of IMF components including a noise dominant component and a signal dominant component; denoising each noise dominant component; performing morphological filtering on each signal dominant component; forming a target PPG signal by the denoised noise dominant component, the morphologically filtered signal dominant component and the residual component; a physiological feature vector is determined from the target PPG signal. Therefore, the collected PPG signal can be denoised, and the accuracy of emotion detection can be improved by taking the denoised PPG signal as input data of the neural network model.
In another possible implementation, decomposing the PPG signal into a plurality of IMF components and a residual component using the empirical mode decomposition method comprises: taking the PPG signal as the signal to be processed; acquiring the upper and lower envelopes of the signal to be processed; determining a target signal from the signal to be processed and its upper and lower envelopes; when the target signal is not an IMF component, taking the target signal as the signal to be processed and triggering acquisition of its upper and lower envelopes; when the target signal is an IMF component, recording the target signal and determining a difference signal from the signal to be processed and the target signal; when the frequency of the difference signal is greater than a preset frequency, updating the signal to be processed to the difference signal and triggering acquisition of its upper and lower envelopes; and when the frequency of the difference signal is less than or equal to the preset frequency, taking the difference signal as the residual component. This provides a feasible signal decomposition scheme and facilitates implementation of the solution.
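A minimal sketch of this sifting procedure in Python. It simplifies real empirical mode decomposition in two visible ways: envelopes are built by linear interpolation through the extrema rather than cubic splines, and a fixed sifting-iteration count plus an extrema count stand in for the IMF test and the preset-frequency test:

```python
import numpy as np

def sift_once(x):
    """One sifting step: subtract the mean of the upper and lower envelopes.

    Simplification: a signal with fewer than two maxima or minima is treated
    as the residual component (signalled by returning None).
    """
    n = np.arange(len(x))
    maxima = [i for i in range(1, len(x) - 1) if x[i] >= x[i - 1] and x[i] >= x[i + 1]]
    minima = [i for i in range(1, len(x) - 1) if x[i] <= x[i - 1] and x[i] <= x[i + 1]]
    if len(maxima) < 2 or len(minima) < 2:
        return None                              # too few extrema: residual
    upper = np.interp(n, maxima, x[maxima])      # upper envelope
    lower = np.interp(n, minima, x[minima])      # lower envelope
    return x - (upper + lower) / 2.0             # candidate IMF ("target signal")

def emd(x, max_imfs=5, sift_iters=8):
    """Decompose x into IMF components plus a residual component."""
    x = np.asarray(x, float)
    imfs, residue = [], x.copy()
    for _ in range(max_imfs):
        h = residue.copy()
        for _ in range(sift_iters):              # sift until h behaves like an IMF
            sifted = sift_once(h)
            if sifted is None:
                return imfs, residue             # nothing left to extract
            h = sifted
        imfs.append(h)                           # record the IMF component
        residue = residue - h                    # difference signal becomes new input
    return imfs, residue
```

By construction the IMF components and the residue sum back to the original signal, which is what lets the denoised components be reassembled into a target PPG signal.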
A third aspect provides an emotion detection apparatus, which includes a PPG sensor, a transmitting module, a receiving module, and an output module; the PPG sensor is used for acquiring a PPG signal of the pulse of the user; the sending module is used for sending the PPG signal to a cloud server; the receiving module is used for receiving a user emotion state value sent by the cloud server, the user emotion state value is obtained by processing a physiological characteristic vector by using a neural network model after the cloud server determines the physiological characteristic vector according to the PPG signal, and the physiological characteristic vector comprises blood oxygen saturation, heart rate and respiratory rate; and the output module is used for outputting the emotional state of the user according to the emotional state value of the user.
In another possible implementation manner, the emotion detection apparatus further includes a generation module, where the generation module is configured to generate an emotion state curve according to the emotion state values of the user at different times; the output module is further used for outputting a reminding signal for reminding the negative emotion when the duration of the negative emotion in the emotion state curve is longer than the preset duration.
The steps and advantages performed by the modules in the emotion detection apparatus of the third aspect can be found in the related description of the first aspect.
A fourth aspect provides an emotion detecting apparatus including: the device comprises an acquisition module, a receiving module, a determining module, a neural network processing module and a sending module; the acquisition module is used for acquiring target model parameters; the receiving module is used for receiving a PPG signal sent by the wearable equipment; the determining module is used for determining a physiological characteristic vector according to the PPG signal, wherein the physiological characteristic vector comprises the blood oxygen saturation, the heart rate and the respiratory rate; the neural network processing module is used for inputting the physiological characteristic vector into a neural network model with target model parameters; the sending module is used for sending the user emotion state value output by the neural network model to the wearable device.
In another possible implementation manner, the obtaining module is specifically configured to obtain a plurality of user emotion data, where each user emotion data includes a physiological feature vector and a user emotion state value; acquiring initial model parameters of a neural network model; and processing the plurality of user emotion data and the initial model parameters through a neural network algorithm to obtain target model parameters.
In another possible implementation, the neural network algorithm is a back propagation neural network algorithm, an adaptive resonance theory network algorithm, a learning vector quantization network algorithm, a kohonen network algorithm, or a hopfield network algorithm.
In another possible implementation manner, the determining module includes a decomposition unit, a denoising unit, a filtering unit, a construction unit, and a determining unit. The decomposition unit is used for decomposing the PPG signal into a plurality of IMF components and a residual component using an empirical mode decomposition method, the plurality of IMF components comprising noise-dominant components and signal-dominant components; the denoising unit is used for denoising each noise-dominant component; the filtering unit is used for performing morphological filtering on each signal-dominant component; the construction unit is used for constructing the denoised noise-dominant components, the morphologically filtered signal-dominant components, and the residual component into a target PPG signal; and the determining unit is used for determining a physiological feature vector from the target PPG signal.
In another possible implementation, the decomposition unit is specifically configured to: take the PPG signal as the signal to be processed; acquire the upper and lower envelopes of the signal to be processed; determine a target signal from the signal to be processed and its upper and lower envelopes; when the target signal is not an IMF component, take the target signal as the signal to be processed and trigger acquisition of its upper and lower envelopes; when the target signal is an IMF component, record the target signal and determine a difference signal from the signal to be processed and the target signal; when the frequency of the difference signal is greater than a preset frequency, update the signal to be processed to the difference signal and trigger acquisition of its upper and lower envelopes; and when the frequency of the difference signal is less than or equal to the preset frequency, take the difference signal as the residual component.
The steps and advantages performed by each module or unit in the emotion detection apparatus of the fourth aspect can be referred to the related description of the second aspect.
A fifth aspect provides a wearable device comprising a PPG sensor, a processor, and a memory; the PPG sensor is used for acquiring a PPG signal; the memory is used for storing programs and data; the processor is configured to implement the emotion detection method of the first aspect by executing a program.
A sixth aspect provides a cloud server comprising a processor and a memory, the memory for storing programs and data; the processor is configured to implement the emotion detection method of the second aspect by executing a program.
A seventh aspect provides an emotion detection system, which includes the wearable device of the fifth aspect and the cloud server of the sixth aspect.
An eighth aspect provides a computer-readable storage medium having stored therein instructions, which when run on a computer, cause the computer to perform the method of the above aspects.
A ninth aspect provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the above aspects.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic diagram of an artificial intelligence framework according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an application environment according to an embodiment of the present application;
FIG. 3 is a schematic diagram of another application environment provided by an embodiment of the present application;
fig. 4 is a signaling interaction diagram of an emotion detection method provided in an embodiment of the present application;
fig. 5 is a block diagram of an emotion detection apparatus according to an embodiment of the present application;
fig. 6 is a block diagram of an emotion detection apparatus according to an embodiment of the present application;
fig. 7 is a block diagram of a wearable device provided in an embodiment of the present application;
fig. 8 is a structural diagram of a cloud server provided in the embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
FIG. 1 shows a schematic diagram of an artificial intelligence body framework that describes the overall workflow of an artificial intelligence system, applicable to the general artificial intelligence field requirements.
The artificial intelligence topic framework described above is set forth below in terms of two dimensions, the "intelligent information chain" (horizontal axis) and the "IT value chain" (vertical axis).
The "intelligent information chain" reflects the sequence of processes from data acquisition onward: in general, intelligent information perception, intelligent information representation and formation, intelligent reasoning, intelligent decision making, and intelligent execution and output. In this process, the data undergoes a "data - information - knowledge - wisdom" refinement process.
The "IT value chain" reflects the value that artificial intelligence brings to the information technology industry, from the underlying infrastructure and information (the provision and processing of technology) up to the industrial ecology of the system.
(1) Infrastructure:
The infrastructure provides computing power support for the artificial intelligence system, communicates with the outside world, and is supported by a base platform. Communication with the outside is realized through sensors; computing power is provided by intelligent chips (hardware acceleration chips such as CPUs, NPUs, GPUs, ASICs, and FPGAs); the base platform includes distributed computing frameworks, networks, and other related platform guarantees and support, and can include cloud storage and computing, interconnection networks, and the like. For example, sensors communicate with the outside to acquire data, and the data is provided for computation to the intelligent chips in the distributed computing system provided by the base platform.
(2) Data
Data at the upper level of the infrastructure is used to represent the data source for the field of artificial intelligence. The data relates to graphs, images, voice and texts, and also relates to the data of the Internet of things of traditional equipment, including service data of the existing system and sensing data such as force, displacement, liquid level, temperature, humidity and the like.
(3) Data processing
Data processing typically includes data training, machine learning, deep learning, searching, reasoning, decision making, and the like.
The machine learning and the deep learning can perform symbolized and formalized intelligent information modeling, extraction, preprocessing, training and the like on data.
Inference means a process of simulating an intelligent human inference mode in a computer or an intelligent system, using formalized information to think about and solve a problem by a machine according to an inference control strategy, and a typical function is searching and matching.
The decision-making refers to a process of making a decision after reasoning intelligent information, and generally provides functions of classification, sequencing, prediction and the like.
(4) General capabilities
After the above-mentioned data processing, further based on the result of the data processing, some general capabilities may be formed, such as algorithms or a general system, e.g. translation, analysis of text, computer vision processing, speech recognition, recognition of images, etc.
(5) Intelligent product and industrial application
Intelligent products and industry applications refer to the products and applications of artificial intelligence systems in various fields; they are the encapsulation of the overall artificial intelligence solution, commercializing intelligent information decisions and realizing practical applications. The application fields mainly include: intelligent manufacturing, intelligent transportation, smart home, intelligent healthcare, intelligent security, autonomous driving, safe cities, intelligent terminals, and the like.
Referring to fig. 2, a system architecture 200 is provided in accordance with an embodiment of the present invention. The data acquisition device 260 is used to acquire PPG signals. Photoplethysmography (PPG) is a non-invasive method of detecting changes in blood volume in living tissue by electro-optical means. When a light beam of a certain wavelength irradiates the skin surface at the fingertip, the beam reaches a photoelectric receiver by transmission or reflection, and the light intensity detected by the receiver is weakened along the way by the absorption of the skin, muscle, and blood at the detection site. The absorption of light by skin, muscle, and other tissue remains essentially constant throughout the blood circulation, while the blood volume in the skin pulses under the action of the heart: when the heart contracts, the peripheral blood volume is largest, light absorption is greatest, and the detected light intensity is smallest; when the heart relaxes, the opposite holds and the detected light intensity is greatest. The light intensity received by the receiver therefore varies in a pulsatile manner, and converting this varying light intensity into an electrical signal yields the volume pulse blood flow. The volume pulse blood flow contains important physiological information about the blood flow, including but not limited to the blood oxygen saturation, heart rate, and respiration rate of the human body.
The data acquisition device 260 determines emotion-related physiological characteristics such as blood oxygen saturation, heart rate, respiration rate, and the like from the PPG signal, and then sets corresponding user emotional state values for the physiological characteristics, thereby obtaining user emotional data including the physiological characteristics and the user emotional state values. User emotion data is stored in database 230 and training device 220 generates target model/rules 201 based on the user emotion data maintained in database 230. How training device 220 derives a target model/rule 201 based on the user emotion data will be described in more detail below, and target model/rule 201 is able to derive a corresponding user emotional state value from the input physiological feature vector.
The target model/rule obtained by the training device 220 may be applied in different systems or devices. In fig. 2, the execution device 210 is configured with an I/O (input/output) interface 212 for data interaction with external devices, and a user may input data to the I/O interface 212 through a client device 240.
The execution device 210 may call data, code, etc. from the data storage system 250 and may store data, instructions, etc. in the data storage system 250.
The calculation module 211 processes the input data using the target model/rule 201. Specifically, the blood oxygen saturation, heart rate, and respiration rate are determined from the PPG signal acquired through the I/O interface 212; a physiological feature vector composed of the blood oxygen saturation, heart rate, and respiration rate is used as the input data of the target model/rule 201; and the user's emotional state value is obtained after processing by the target model/rule 201. The calculation module 211 may also perform denoising processing on the PPG signal.
The blood oxygen saturation is calculated roughly as follows: oxyhemoglobin (HbO2) and deoxyhemoglobin (Hb) are probed with red light (600-800 nm) and near-infrared light (800-1000 nm) respectively, yielding a PPG signal for HbO2 and a PPG signal for Hb, and the corresponding ratio is computed via the Beer-Lambert law to obtain the blood oxygen saturation. The absorption coefficient of Hb is higher in the 600-800 nanometer (nm) range, while that of HbO2 is higher in the 800-1000 nm range.
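The ratio mentioned above is commonly implemented as the "ratio of ratios" of the pulsatile (AC) and steady (DC) components of the two PPG channels. The linear calibration constants below (110 and 25) are a textbook approximation, not values taken from this application; real devices calibrate per sensor:

```python
def spo2_from_ppg(red_ac, red_dc, ir_ac, ir_dc):
    """Estimate blood oxygen saturation (%) from red/near-infrared PPG components.

    R = (AC_red / DC_red) / (AC_ir / DC_ir), then the empirical calibration
    SpO2 ~ 110 - 25 * R (illustrative constants), clamped to [0, 100].
    """
    r = (red_ac / red_dc) / (ir_ac / ir_dc)
    return max(0.0, min(100.0, 110.0 - 25.0 * r))
```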
The heart rate and respiration rate are calculated roughly as follows: the PPG signal is filtered, the number of peaks within a certain time is counted, and the heart rate and respiration rate are determined from that number. For example, if the signal is continuously sampled for 5 seconds and the number of peaks in those 5 seconds is N, the heart rate is N × 12 and the respiration rate is N × 6.
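The peak-counting step can be sketched as follows; in practice the cardiac and respiratory peaks would be counted on differently filtered versions of the signal, which is presumably why the example's respiration factor (N × 6) differs from the cardiac one (N × 12):

```python
def count_peaks(samples):
    """Count local maxima in a filtered PPG window."""
    return sum(
        1 for i in range(1, len(samples) - 1)
        if samples[i] > samples[i - 1] and samples[i] >= samples[i + 1]
    )

def heart_rate_bpm(samples, window_s=5.0):
    """N peaks in a window_s-second window -> N * (60 / window_s) beats per minute."""
    return count_peaks(samples) * 60.0 / window_s
```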
Finally, the I/O interface 212 returns the output of the calculation module 211 to the client device 240 for presentation to the user.
Further, the training device 220 may generate corresponding target models/rules 201 based on different data for different targets to provide better results to the user.
In the case shown in FIG. 2, the user may manually specify the data to be input into the execution device 210, for example by operating in an interface provided by the I/O interface 212. Alternatively, the client device 240 may automatically input data into the I/O interface 212 and obtain the results; if the client device 240 must obtain the user's authorization to input data automatically, the user can set the corresponding permissions in the client device 240. The user can view the results output by the execution device 210 on the client device 240, and the specific presentation form can be display, sound, action, and so on. The client device 240 may also serve as a data acquisition end, storing acquired PPG signals in the database 230.
It should be noted that fig. 2 is only a schematic diagram of a system architecture according to an embodiment of the present invention, and the position relationship between the devices, modules, etc. shown in the diagram does not constitute any limitation, for example, in fig. 2, the data storage system 250 is an external memory with respect to the execution device 210, and in other cases, the data storage system 250 may be disposed in the execution device 210.
Referring to fig. 3, a system architecture 300 is provided in accordance with an embodiment of the present invention. The execution device 210 is implemented by one or more servers, optionally in cooperation with other computing devices, such as: data storage, routers, load balancers, and the like; the execution device 210 may be disposed on one physical site or distributed across multiple physical sites. The execution device 210 may use data in the data storage system 250 or call program code in the data storage system 250 to implement the steps performed by the cloud server in the embodiment shown in fig. 4.
The user may operate wearable device 301 to interact with execution device 210. Wearable device 301 may be a smart watch, smart bracelet, smart headset, smart helmet, smart foot ring, or the like. Each user's wearable device 301 may interact with the enforcement device 210 via a communication network of any communication mechanism/communication standard, which may be a wide area network, a local area network, a peer-to-peer connection, etc., or any combination thereof.
In another implementation, one or more aspects of the execution device 210 may be implemented by each wearable device 301; for example, the wearable device 301 may provide local data or feedback calculations for the execution device 210.
It is noted that all functions of the execution device 210 may also be performed by the wearable device 301. For example, the wearable device 301 may implement the functions of the execution device 210 and provide services to its own user.
An artificial neural network (ANN), often simply called a neural network, can be understood as a computational model formed by a large number of interconnected nodes (or neurons). Each node represents a particular output function, called the excitation (activation) function. Every connection between two nodes carries a weighted value for the signal passing through it, called a weight, which is equivalent to the memory of the artificial neural network. When the connection pattern, weights, or excitation functions of the network change, the output of the network changes. A neural network may in essence approximate some algorithm or function, or express a logical strategy.
The operation of each layer in the neural network can be described mathematically as

y = a(W·x + b)

From the point of view of each layer of the neural network, this can be understood as completing the transformation from the input space to the output space (i.e., from the row space to the column space of the matrix) through five operations on the input space (the set of input vectors): 1. ascending/descending dimension; 2. scaling up/down; 3. rotation; 4. translation; 5. "bending". Operations 1, 2 and 3 are completed by W·x, operation 4 is completed by +b, and operation 5 is implemented by a(·). The word "space" is used here because the object being classified is not a single thing but a class of things; space refers to the collection of all individuals of that class of things. W is a weight vector, each value in the vector representing the weight of one neuron in that layer of the network. The vector W determines the spatial transformation from the input space to the output space described above; that is, the weight W of each layer controls how the space is transformed. The goal of training the neural network is ultimately to obtain the weight matrix of all layers of the trained network (a weight matrix formed by the vectors W of its many layers). Therefore, the training process of a neural network is essentially learning how to control the spatial transformation, and more specifically, learning the weight matrix.
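The layer operation described above, essentially y = a(W·x + b), can be sketched in NumPy. The shapes, values, and the choice of tanh as the activation are purely illustrative, not part of the application's actual model:

```python
import numpy as np

def layer_forward(W, x, b, activation=np.tanh):
    """One neural-network layer: scale/rotate via W.x, translate via +b,
    then 'bend' via the nonlinear activation a(.)."""
    return activation(W @ x + b)

# A 2-D input mapped to a 3-D output (an "ascending dimension" operation).
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
b = np.array([0.1, -0.1, 0.0])
x = np.array([0.2, 0.4])
y = layer_forward(W, x, b)
```

Here the 3x2 matrix W raises the input from two dimensions to three, illustrating operation 1 in the list above.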
Because it is desirable that the output of the neural network be as close as possible to the value it is actually intended to predict, the weight vector of each layer can be updated by comparing the network's current predicted value with the desired target value and adjusting the weights according to the difference between them (of course, there is usually an initialization process before the first update, in which parameters are preconfigured for each layer of the neural network). It is therefore necessary to define in advance how to measure the difference between the predicted value and the target value. This is the role of the loss function (or objective function), an important equation for measuring that difference. Taking the loss function as an example, a higher output value (loss) indicates a larger difference, so training the neural network becomes a process of reducing this loss as much as possible.
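As a minimal sketch of the idea above, the following uses mean squared error as the loss function and takes one gradient step that moves a weight vector so the loss shrinks. The linear model and learning rate are illustrative assumptions, not the application's actual configuration:

```python
import numpy as np

def mse_loss(predicted, target):
    """Mean squared error: a larger loss means a larger gap between the
    network's prediction and the value actually desired."""
    return np.mean((predicted - target) ** 2)

def sgd_step(w, x, target, lr=0.1):
    """One gradient-descent update on the weights of a linear model w.x:
    the weights move in the direction that reduces the loss."""
    predicted = w @ x
    grad = 2 * (predicted - target) * x   # d(loss)/dw for squared error
    return w - lr * grad

w = np.array([0.5, -0.5])
x = np.array([1.0, 2.0])
target = 1.0
loss_before = mse_loss(w @ x, target)
w = sgd_step(w, x, target)
loss_after = mse_loss(w @ x, target)
```

After the step, the loss is strictly smaller, which is exactly the "reduce the loss as much as possible" process described in the text.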
The operation of the neural network comprises two stages: a learning period and a working period. In the learning period, the connection weights of the neural network are modified to obtain the target model parameters of the neural network model; in the working period, the neural network model with the target model parameters computes on the input data to obtain a recognition result.
The emotion detection method of the present application is introduced below, taking a wearable device and a cloud server as examples. Referring to fig. 4, an embodiment of the emotion detection method provided by the present application includes:
step 401, the cloud server obtains target model parameters.
In this embodiment, the target model parameters are model parameters of the trained neural network model.
Step 402, the wearable device acquires a PPG signal of the pulse of the user through a PPG sensor.
In this embodiment, a PPG signal of the pulse of the user may be obtained by measuring a body part (e.g., a wrist, an arm, an ankle, an ear, etc.) of the user with the PPG sensor.
Step 403, the wearable device sends the PPG signal to a cloud server.
Step 404, the cloud server determines a physiological feature vector according to the PPG signal, wherein the physiological feature vector includes a blood oxygen saturation level, a heart rate and a respiratory rate.
Step 405, the cloud server inputs the physiological feature vector into a neural network model with target model parameters.
The cloud server takes the physiological feature vector as input data of the neural network model, and the user emotional state value is obtained after the input data is processed by the neural network model. Emotional states include, but are not limited to, neutral, angry, contemptuous, disgusted, fearful, happy, sad, and surprised. The user emotional state value may be a number, for example, 1 for neutral, 2 for happy, and 3 for sad. The user emotional state value may also be a binary value, e.g., 001 for neutral, 010 for happy, and 011 for sad.
Specifically, the neural network model includes an input layer, one or more hidden layers, and an output layer. After the input layer receives the physiological feature vector to be identified, it performs a weighted calculation on the vector and passes the result to the nodes of the first hidden layer; each hidden-layer node performs a weighted calculation on the outputs of the previous layer's nodes, and the result from the last hidden layer is passed to the nodes of the output layer. Since each output-layer node represents one class of emotional state, the computed values represent the probability of each emotional state. When the value of one output node is much larger than the values of the other nodes, that node is determined to win, and the emotional state it represents is taken as the emotional state corresponding to the physiological feature vector to be recognized.
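The forward pass just described, weighted calculations layer by layer followed by taking the winning output node as the emotional state, can be sketched as follows. The weights here are random and purely illustrative (an untrained stand-in, not a trained model), as are the class names and input values:

```python
import numpy as np

def forward(x, weights, biases):
    """Propagate a physiological feature vector through the layers;
    each layer applies a weighted sum followed by a sigmoid."""
    a = x
    for W, b in zip(weights, biases):
        a = 1.0 / (1.0 + np.exp(-(W @ a + b)))
    return a

EMOTIONS = ["neutral", "happy", "sad"]  # illustrative class order

# Illustrative weights: 3 inputs -> 4 hidden nodes -> 3 output nodes.
rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 3)), rng.normal(size=(3, 4))]
biases = [np.zeros(4), np.zeros(3)]

x = np.array([0.97, 0.72, 0.30])   # SpO2, normalized heart rate, resp. rate
scores = forward(x, weights, biases)
winner = EMOTIONS[int(np.argmax(scores))]  # the "winning" output node
```

The argmax step implements the rule that the node with the largest value wins and its emotional state is taken as the result.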
And step 406, the cloud server sends the user emotion state value output by the neural network model to the wearable device.
Optionally, the cloud server determines the emotional state of the user according to the emotional state value of the user, and sends the emotional state of the user to the wearable device.
Step 407, the wearable device outputs the emotional state of the user according to the emotional state value of the user.
After receiving the user emotional state value sent by the cloud server, the wearable device can remind the user of the current emotional state through text, vibration, audio signals (such as ringtones or music), or video as reminder signals.
In this embodiment, the neural network model can closely fit the actual emotion data of users, so the user's emotional state can be detected more accurately using the neural network model.
Secondly, the PPG sensor of the present application can acquire the PPG signal, from which physiological features such as blood oxygen saturation, heart rate, and respiratory rate are then calculated, so few sensors are needed.
Thirdly, the present application does not need to store large amounts of data, which lowers the storage requirement on the wearable device and has the advantage of low cost.
In another optional embodiment, the emotion detection method further includes: generating an emotional state curve according to the emotional state values of the user at different moments; and when the duration of the negative emotion in the emotion state curve is longer than the preset duration, outputting a reminding signal for reminding the negative emotion.
In the present embodiment, the duration of a negative emotion refers to how long the negative emotion lasts continuously; negative emotions include, but are not limited to, anger, contempt, disgust, fear, sadness, surprise, and the like. After an emotional state curve is drawn from the user's emotional state values over a statistical time period (such as one hour, one day, one week, or one month), the curve is examined, and when the duration of a negative emotion in the curve exceeds the preset duration, a reminder signal is output so that the user can adjust his or her emotion. The length of the statistical period and the preset duration can be set according to actual conditions and are not limited in this application.
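The duration check on the emotional state curve can be sketched as follows. The code assumes state value 3 denotes a negative emotion (sad) and one value per minute, matching the example used later in this application; the set of negative codes and the threshold are illustrative:

```python
NEGATIVE = {3}   # illustrative: 3 = sad; extend with other negative codes

def longest_negative_run(state_values):
    """Length of the longest run of consecutive negative emotional-state
    values in the emotional state curve."""
    longest = current = 0
    for v in state_values:
        current = current + 1 if v in NEGATIVE else 0
        longest = max(longest, current)
    return longest

def should_remind(state_values, preset_duration):
    """Output a reminder when the negative emotion lasts longer than the
    preset duration (here measured in samples, e.g. minutes)."""
    return longest_negative_run(state_values) > preset_duration

# One value per minute; remind when sadness lasts more than 4 minutes.
curve = [1, 2, 3, 3, 3, 3, 3, 1, 2]
```

With this curve, the sad run lasts 5 minutes, so a preset duration of 4 triggers the reminder while a preset duration of 5 does not.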
It should be understood that if the change amplitude of the emotional state within a certain period is detected to be greater than a preset amplitude, a reminder signal is output to alert the user that severe emotional fluctuation has occurred. This can serve as a warning for conditions affected by severe emotional fluctuation (such as heart disease or hypertension), enabling timely intervention, and it also helps in analyzing such conditions so as to prevent certain diseases. The length of the detection period can be set according to actual conditions and is not limited in this application.
In another optional embodiment, a plurality of user emotion data are acquired, each user emotion data comprises a physiological feature vector and a user emotion state value, and the physiological feature vector comprises blood oxygen saturation, heart rate and respiration rate; acquiring initial model parameters of a neural network model; and processing the plurality of user emotion data and the initial model parameters through a neural network algorithm to obtain target model parameters.
Optionally, the emotion data of the plurality of users are divided into a training set, a validation set and a test set. The training set is used to modify the initial model parameters to first candidate model parameters. Determining a fitness of a neural network model having a first candidate model parameter from the user emotion data of the validation set; and if the fitting degree is smaller than the preset fitting degree, adjusting the initial model parameters, wherein the training set is used for modifying the adjusted initial model parameters into second candidate model parameters. And if the fitting degree of the neural network model with the second candidate model parameters reaches the preset fitting degree, determining the second candidate model parameters as target model parameters. According to the above method, a model parameter having a degree of fitting larger than a preset degree of fitting can be obtained, thereby preventing overfitting. The test set is used to test the error of the neural network model with the target model parameters.
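The division into training, validation, and test sets can be sketched as follows. The 70/15/15 split ratio, the random seed, and the record format (feature tuple plus state value) are illustrative assumptions, not values fixed by this application:

```python
import random

def split_emotion_data(data, train=0.7, val=0.15, seed=0):
    """Shuffle the user emotion data and split it into training,
    validation, and test sets (the remainder becomes the test set)."""
    data = data[:]
    random.Random(seed).shuffle(data)
    n = len(data)
    n_train = round(n * train)
    n_val = round(n * val)
    return (data[:n_train],
            data[n_train:n_train + n_val],
            data[n_train + n_val:])

# Each record: (physiological feature vector, user emotional state value).
records = [((0.95, 70 + i % 30, 14 + i % 6), 1 + i % 3)
           for i in range(5000)]
train_set, val_set, test_set = split_emotion_data(records)
```

The training set modifies the candidate model parameters, the validation set measures the degree of fitting, and the held-out test set estimates the error of the final model, as described above.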
The model parameters of the neural network model comprise weights from input layer nodes to hidden layer nodes, weights from the hidden layer nodes to the hidden layer nodes, weights from the hidden layer nodes to output layer nodes, the number of layers of the hidden layer and the like. The initial model parameters are the initial values of the above parameters. The neural network algorithm may be, but is not limited to, a Back Propagation (BP) neural network algorithm, an adaptive resonance theory network algorithm, a learning vector quantization network algorithm, a kohonen network algorithm, or a hopfield network algorithm.
In an optional embodiment, the determining the physiological feature vector from the PPG signal comprises: decomposing the PPG signal into a plurality of Intrinsic Mode Function (IMF) components and a residual component using an empirical mode decomposition method, the plurality of IMF components including a noise dominant component and a signal dominant component; denoising each noise dominant component; performing morphological filtering on each signal dominant component; forming a target PPG signal by the denoised noise dominant component, the morphologically filtered signal dominant component and the residual component; a physiological feature vector is determined from the target PPG signal.
In this embodiment, Empirical Mode Decomposition (EMD) is a signal processing method suitable for processing nonlinear or non-stationary signals, which overcomes some disadvantages of the conventional time domain analysis method and has strong adaptivity. The decomposition of the PPG signal into IMF components and a residual component using an empirical mode decomposition method can be expressed as:
x(t) = c_1(t) + c_2(t) + ... + c_n(t) + r_n(t)

where x(t) represents the PPG signal, c_i(t) is the i-th IMF component, n is the total number of IMF components, and r_n(t) represents the residual component.
Optionally, denoising each noise-dominant component includes: and denoising each noise dominant component by using a wavelet-like threshold processing method.
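One common wavelet-style thresholding choice is soft thresholding, sketched below. This is an illustrative stand-in and not necessarily the exact "wavelet-like threshold processing method" intended by the application; the threshold value is an assumption:

```python
import numpy as np

def soft_threshold(c, thr):
    """Wavelet-style soft thresholding: zero out coefficients below the
    threshold and shrink the rest toward zero, suppressing the
    small-amplitude noise in a noise-dominant component."""
    return np.sign(c) * np.maximum(np.abs(c) - thr, 0.0)

noisy_imf = np.array([0.05, -0.8, 0.02, 1.2, -0.03])
denoised = soft_threshold(noisy_imf, thr=0.1)
```

Samples with magnitude below the threshold (likely noise) become zero, while larger samples keep their sign and lose only the threshold amount.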
In this embodiment, the cloud server can denoise the PPG signal through an empirical mode decomposition method, and then determines the oxyhemoglobin saturation, the heart rate and the respiratory rate according to the denoised PPG signal, so that more accurate oxyhemoglobin saturation, heart rate and respiratory rate can be obtained, thereby increasing the accuracy of emotion detection. It should be understood that in addition to the cloud server being able to de-noise the PPG signal, the wearable device may also be able to de-noise the PPG signal.
In another alternative embodiment, decomposing the PPG signal into a plurality of IMF components and a residual component using an empirical mode decomposition method includes:
step A1: the PPG signal is taken as the signal x (t) to be processed.
Step A2: obtain the upper envelope xu(t) and the lower envelope xl(t) of the signal to be processed.
Find the local maxima and minima of x(t), and compute the upper envelope xu(t) and the lower envelope xl(t) by cubic spline interpolation.
Step A3: determine the target signal d1(t) from the signal to be processed x(t) and its upper and lower envelopes xu(t) and xl(t).
Optionally, d1(t) = x(t) - (xu(t) + xl(t))/2.
Step A4: when the target signal d1(t) is not the IMF component, the target signal d1(t) is taken as the signal x (t) to be processed, and step a2 and step A3 are triggered.
Step A5: when the target signal d1(t) is an IMF component, the target signal d1(t) is recorded and a difference signal r1(t) is determined according to the signal x (t) to be processed and the target signal d1 (t).
The IMF component c1(t) = d1(t);
r1(t) = x(t) - c1(t).
step A6: when the frequency of the difference signal r1(t) is greater than the predetermined frequency, the signal x (t) to be processed is updated to the difference signal r1(t), and step a2 to step a5 are triggered.
Step A7: when the frequency of the difference signal r1(t) is less than or equal to the predetermined frequency, the difference signal is regarded as the residual component.
When the frequency of the difference signal r1(t) is less than or equal to the predetermined frequency, it indicates that the difference signal is a stable signal, and the iteration is stopped, and the EMD decomposition is completed.
In this embodiment, a plurality of IMF components and one residual component can be obtained by the above method iteration, thereby providing a feasible solution.
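Steps A1 to A7 can be sketched as follows. To keep the sketch short, the envelopes use linear interpolation in place of the cubic splines described above, and a fixed number of sifting passes stands in for the IMF test of step A4 and the frequency-based stopping rule; it is an illustrative sketch, not the application's exact decomposition:

```python
import numpy as np

def sift_once(x):
    """One sifting pass (steps A2-A3): subtract the mean of the upper and
    lower envelopes. Linear interpolation stands in for cubic splines."""
    t = np.arange(len(x))
    maxima = [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] >= x[i + 1]]
    minima = [i for i in range(1, len(x) - 1) if x[i - 1] > x[i] <= x[i + 1]]
    pts_u = np.array([0] + maxima + [len(x) - 1])
    pts_l = np.array([0] + minima + [len(x) - 1])
    xu = np.interp(t, pts_u, x[pts_u])   # upper envelope xu(t)
    xl = np.interp(t, pts_l, x[pts_l])   # lower envelope xl(t)
    return x - (xu + xl) / 2.0           # d1(t) = x(t) - (xu(t) + xl(t))/2

def first_imf(x, n_sift=8):
    """Iterate sifting a fixed number of times (a simple stand-in for the
    IMF test in step A4); return the first IMF c1(t) and the difference
    signal r1(t) = x(t) - c1(t) of step A5."""
    d = x.copy()
    for _ in range(n_sift):
        d = sift_once(d)
    return d, x - d

t = np.linspace(0.0, 1.0, 512)
x = np.sin(2 * np.pi * 25 * t) + 2.0 * t   # fast oscillation plus slow trend
c1, r1 = first_imf(x)
```

By construction, c1(t) + r1(t) reconstructs x(t) exactly; in full EMD, r1(t) would be fed back through steps A2 to A5 until the remaining difference signal is stable (step A7).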
The emotion detection method of the present application is introduced in a specific application scenario as follows:
in this specific application scenario, it is assumed that the number of user emotion data is 5000. The emotional states include neutral, happy and sad, which are respectively represented by 1, 2 and 3.
The neural network algorithm takes the BP neural network algorithm as an example, and the initial hidden layer number of the BP neural network model is set to be 10. Taking the physiological characteristic vector as input data of a BP neural network model, taking a user emotion state value as an expected value of the BP neural network model, adopting a BP neural network algorithm to carry out operation, and carrying out iterative training when an average error between an output value of the BP neural network and the expected value is greater than or equal to a preset error; and when the average error is smaller than the preset error, obtaining a first candidate model parameter, and stopping training. And calculating the fitting degree of the BP neural network model with the first candidate model parameters, adjusting the number of hidden layers of the BP neural network model to 12 when the fitting degree of the BP neural network model is smaller than the preset fitting degree, and then obtaining the second candidate model parameters according to a BP neural network algorithm. And when the fitting degree of the BP neural network model with the second candidate model parameters reaches or exceeds the preset fitting degree, determining the second candidate model parameters as target model parameters.
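The BP training loop described above can be sketched in NumPy on toy data. The network size, learning rate, preset error, iteration budget, and the toy "physiological feature" samples below are all illustrative assumptions, not the application's actual configuration:

```python
import numpy as np

def train_bp(X, Y, hidden=10, lr=0.5, preset_error=0.05, max_iter=5000, seed=0):
    """Full-batch BP training: iterate while the average error between the
    network output and the expected value is >= the preset error."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, Y.shape[1]))
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(max_iter):
        H = sig(X @ W1)                 # hidden-layer outputs
        out = sig(H @ W2)               # output-layer outputs
        err = Y - out
        if np.mean(np.abs(err)) < preset_error:
            break                       # average error below preset: stop
        d_out = err * out * (1 - out)            # back-propagated deltas
        d_hid = (d_out @ W2.T) * H * (1 - H)
        W2 += lr * H.T @ d_out
        W1 += lr * X.T @ d_hid
    out = sig(sig(X @ W1) @ W2)
    return W1, W2, float(np.mean(np.abs(Y - out)))

# Toy samples: each sample's largest feature marks its emotional class.
X = np.array([[0.9, 0.1, 0.1], [0.8, 0.2, 0.1],
              [0.1, 0.9, 0.2], [0.2, 0.8, 0.1],
              [0.1, 0.2, 0.9], [0.2, 0.1, 0.8]])
Y = np.eye(3)[[0, 0, 1, 1, 2, 2]]      # one-hot expected values
W1, W2, avg_err = train_bp(X, Y)
```

Adjusting the `hidden` argument and retraining corresponds to the paragraph's step of changing the hidden-layer configuration when the degree of fitting is too low.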
Taking a smart watch as an example of the wearable device: after the smart watch acquires the PPG signal at the current moment, it sends the PPG signal to the cloud server; the cloud server determines the blood oxygen saturation, heart rate, and respiratory rate from the PPG signal, and then inputs these physiological feature data into the neural network model with the target model parameters. Assuming the user emotional state value obtained by the neural network model from the physiological feature data is 3, the cloud server sends 3 to the smart watch, and the smart watch determines from the value 3 that the user's emotional state is sad. Taking a collection duration of one hour as an example, the user emotional state value is transmitted every minute. When the user's emotional state value is 3 for 5 consecutive minutes, a reminder voice is broadcast so that the user can adjust his or her emotion.
In the specific application scenario, the number of the user emotion state values, the number of hidden layers, the acquisition time, the preset time for measuring the duration of the negative emotion, the period for sending the user emotion state values, and the like are illustrative examples, and the values can be set according to actual conditions, which is not limited in the application.
The above embodiment describes representing the user's emotional state with a number and obtaining the target model parameters from the physiological feature vector, the user emotional state value, and the initial model parameters, so as to obtain a neural network model with the target model parameters. The degree of an emotion can likewise be represented by a number or a binary value, and a neural network model can be established for the physiological feature vector and the emotion degree. For example, 21 may indicate slightly happy, 22 moderately happy, and 23 very happy. Emotion degrees are not limited to this example; other emotion degrees can be represented by analogy. The present application can also obtain the target model parameters from the physiological feature vector, the user's emotion degree, and the initial model parameters, thereby obtaining a neural network model with the target model parameters. It should be understood that this process is similar to obtaining the target model parameters from the physiological feature vector, the user emotional state value, and the initial model parameters, and details are not repeated here.
The emotion detection method of the present application is described above, and the emotion detection device of the present application is described below. Referring to fig. 5, the present application provides an emotion detection apparatus 500 including:
a PPG sensor 501 for acquiring a PPG signal of the pulse of the user;
a sending module 502, configured to send the PPG signal to a cloud server;
a receiving module 503, configured to receive a user emotional state value sent by the cloud server, where the user emotional state value is obtained by processing a physiological characteristic vector using a neural network model after the cloud server determines the physiological characteristic vector according to the PPG signal, and the physiological characteristic vector includes a blood oxygen saturation level, a heart rate, and a respiratory rate;
and an output module 504, configured to output the user emotional state according to the user emotional state value.
The emotion detection apparatus 500 in this embodiment can implement the functions performed by the wearable device in the embodiment shown in fig. 4. The information interaction and execution process between the modules/units in the emotion detection apparatus 500 shown in fig. 5 are based on the same concept, and the technical effect brought by the method embodiment is the same as that of the present application, and specific contents can be referred to the description in the foregoing method embodiment of the present application, and are not repeated here.
In an optional embodiment, the emotion detection apparatus 500 further includes:
the generating module is used for generating an emotional state curve according to the emotional state values of the user at different moments;
and the output module is also used for outputting a reminding signal for reminding the negative emotion when the duration of the negative emotion in the emotion state curve is longer than the preset duration.
Referring to fig. 6, the present application provides an emotion detection apparatus 600 including:
an obtaining module 601, configured to obtain target model parameters;
a receiving module 602, configured to receive a photoplethysmography (PPG) signal sent by a wearable device;
a determining module 603, configured to determine a physiological feature vector from the PPG signal, the physiological feature vector including a blood oxygen saturation, a heart rate, and a respiratory rate;
a neural network processing module 604 for inputting the physiological feature vectors into a neural network model having target model parameters;
a sending module 605, configured to send the user emotional state value output by the neural network model to the wearable device.
The emotion detection apparatus 600 in this embodiment can implement the steps executed by the cloud server in the embodiment shown in fig. 4. The information interaction and execution process between the modules/units in the emotion detection apparatus 600 shown in fig. 6 are based on the same concept, and the technical effect brought by the method embodiment is the same as that of the present application, and specific contents can be referred to the description in the foregoing method embodiment of the present application, and are not repeated here.
In an alternative embodiment of the method of the invention,
the obtaining module 601 is specifically configured to obtain a plurality of user emotion data, where each user emotion data includes a physiological feature vector and a user emotion state value; acquiring initial model parameters of a neural network model; and processing the plurality of user emotion data and the initial model parameters through a neural network algorithm to obtain target model parameters.
In another alternative embodiment, the neural network algorithm is a BP neural network algorithm, an adaptive resonance theory network algorithm, a learning vector quantization network algorithm, a kohonen network algorithm, or a hopfield network algorithm.
In another alternative embodiment, the determining module 603 includes:
a decomposition unit for decomposing the PPG signal into a plurality of intrinsic mode function, IMF, components and a residual component using an empirical mode decomposition method, the plurality of IMF components including a noise dominant component and a signal dominant component;
a denoising unit, configured to denoise each noise-dominant component;
the filtering unit is used for performing morphological filtering on each signal dominant component;
the construction unit is used for constructing the denoised noise dominant component, the morphologically filtered signal dominant component and the residual component into a target PPG signal;
a determination unit for determining a physiological feature vector from the target PPG signal.
Optionally, the decomposition unit is specifically configured to use the PPG signal as a signal to be processed; acquiring an upper envelope and a lower envelope of a signal to be processed; determining a target signal according to the signal to be processed and the upper envelope and the lower envelope of the signal to be processed; when the target signal is not the IMF component, the target signal is used as a signal to be processed, and a decomposition unit is triggered to acquire an upper envelope and a lower envelope of the signal to be processed; recording the target signal when the target signal is an IMF component; determining a difference signal according to the signal to be processed and the target signal; when the frequency of the differential signal is greater than the preset frequency, updating the signal to be processed into the differential signal, and triggering to acquire an upper envelope and a lower envelope of the signal to be processed; and when the frequency of the difference signal is less than or equal to the preset frequency, taking the difference signal as a residual component.
The wearable device and the cloud server of the application are introduced from the hardware perspective as follows:
referring to fig. 7, one embodiment of a wearable device 700 of the present application includes:
a PPG sensor 701, a processor 702, a memory 703, a receiver 704, a transmitter 705, and a display 706, connected by a bus 707;
a PPG sensor 701 for acquiring a target PPG signal of the user's pulse;
a transmitter 705 for transmitting the target PPG signal to the cloud server under control of the processor 702;
a receiver 704, configured to receive, under control of the processor 702, a user emotional state value sent by the cloud server, where the user emotional state value is obtained by processing a physiological feature vector using a neural network model after the cloud server determines the physiological feature vector according to a target PPG signal, and the physiological feature vector includes a blood oxygen saturation level, a heart rate, and a respiratory rate;
a display 706 for displaying the user emotional state according to the user emotional state value under control of the processor 702.
The memory 703 is used to store programs and data;
in an alternative embodiment, the processor 702 is configured to generate an emotional state curve based on the emotional state values of the user at different times; and when the duration of the negative emotion in the emotional state curve is longer than the preset duration, controlling the display 706 to output a reminding signal for reminding the negative emotion.
Referring to fig. 8, an embodiment of a cloud server 800 of the present application includes:
a processor 801, a memory 802, a receiver 803, and a transmitter 804;
the memory 802 is used for storing programs and data;
the receiver 803 is used for receiving data;
the transmitter 804 is used for transmitting data;
the processor 801 executes the following method by calling a program stored in the memory 802:
acquiring target model parameters;
receiving a PPG signal sent by a wearable device;
determining a physiological feature vector from the PPG signal, the physiological feature vector comprising blood oxygen saturation, heart rate and respiration rate;
inputting the physiological characteristic vector into a neural network model with target model parameters;
and sending the emotional state value of the user output by the neural network model to the wearable device.
In an alternative embodiment, the processor 801 is further configured to perform the following method:
acquiring a plurality of user emotion data, wherein each user emotion data comprises a physiological feature vector and a user emotion state value; acquiring initial model parameters of a neural network model; and processing the plurality of user emotion data and the initial model parameters through a neural network algorithm to obtain target model parameters.
In another alternative embodiment, the neural network algorithm is a BP neural network algorithm, an adaptive resonance theory network algorithm, a learning vector quantization network algorithm, a kohonen network algorithm, or a hopfield network algorithm.
In another alternative embodiment, the processor 801 is specifically configured to perform the following method:
decomposing the PPG signal into a plurality of Intrinsic Mode Function (IMF) components and a residual component by using an empirical mode decomposition method, wherein the IMF components comprise a noise dominant component and a signal dominant component;
denoising each noise dominant component;
performing morphological filtering on each signal dominant component;
forming a target PPG signal by the denoised noise dominant component, the morphologically filtered signal dominant component and the residual component;
a physiological feature vector is determined from the target PPG signal.
In another alternative embodiment, the processor 801 is specifically configured to perform the following method:
taking the PPG signal as a signal to be processed;
acquiring an upper envelope and a lower envelope of a signal to be processed;
determining a target signal according to the signal to be processed and the upper envelope and the lower envelope of the signal to be processed;
when the target signal is not the IMF component, taking the target signal as a signal to be processed, and triggering to acquire an upper envelope and a lower envelope of the signal to be processed;
recording the target signal when the target signal is an IMF component;
determining a difference signal according to the signal to be processed and the target signal;
when the frequency of the differential signal is greater than the preset frequency, updating the signal to be processed into the differential signal, and triggering to acquire an upper envelope and a lower envelope of the signal to be processed;
and when the frequency of the difference signal is less than or equal to the preset frequency, taking the difference signal as a residual component.
The application provides an emotion detection system, which includes wearable device 700 in the embodiment shown in fig. 7 and cloud server 800 in the embodiment shown in fig. 8.
The present application provides a computer storage medium comprising instructions that, when run on a computer, cause the computer to perform the emotion detection method of any one of the above embodiments or alternative embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in accordance with the present application are generated, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another via a wired connection (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium that a computer can access, or a data storage device such as a server or data center integrated with one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the present disclosure as defined by the appended claims.

Claims (15)

1. A method of emotion detection, comprising:
acquiring a photoplethysmography (PPG) signal of a pulse of a user by a PPG sensor;
sending the PPG signal to a cloud server;
receiving a user emotion state value sent by the cloud server, wherein the user emotion state value is obtained by processing a physiological characteristic vector by using a neural network model after the cloud server determines the physiological characteristic vector according to the PPG signal, and the physiological characteristic vector comprises blood oxygen saturation, heart rate and respiratory rate;
and outputting the emotional state of the user according to the emotional state value of the user.
2. The method of claim 1, further comprising:
generating an emotional state curve according to the emotional state values of the user at different moments;
and when the duration of the negative emotion in the emotion state curve is longer than the preset duration, outputting a reminding signal for reminding the negative emotion.
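The reminder logic of claims 1-2 can be sketched as follows. This is a minimal illustration only: it assumes the emotion state values are sampled at regular intervals, and uses a hypothetical threshold below which a value counts as negative emotion; the function name, threshold, and duration are mine, not the patent's.

```python
NEGATIVE_THRESHOLD = 0.0  # illustrative: state values below this count as negative emotion
PRESET_DURATION = 3       # illustrative: consecutive samples standing in for a time duration

def should_remind(state_values, threshold=NEGATIVE_THRESHOLD, duration=PRESET_DURATION):
    """Return True when the emotion state curve stays negative for longer
    than the preset duration, i.e. when a reminder signal should be output."""
    run = 0
    for v in state_values:
        run = run + 1 if v < threshold else 0
        if run > duration:
            return True
    return False
```

A curve that dips negative only briefly produces no reminder; a sustained negative run longer than the preset duration does.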
3. A method of emotion detection, comprising:
acquiring target model parameters;
receiving a photoplethysmography (PPG) signal sent by a wearable device;
determining a physiological feature vector from the PPG signal, the physiological feature vector comprising blood oxygen saturation, heart rate and respiration rate;
inputting the physiological feature vector into a neural network model with the target model parameters;
and sending the user emotion state value output by the neural network model to the wearable device.
4. The method of claim 3, wherein the obtaining target model parameters comprises:
acquiring a plurality of user emotion data, wherein each user emotion data comprises a physiological feature vector and a user emotion state value;
acquiring initial model parameters of a neural network model;
and processing the plurality of user emotion data and the initial model parameters through a neural network algorithm to obtain target model parameters.
5. The method of claim 4, wherein the neural network algorithm is a back-propagation neural network algorithm, an adaptive resonance theory network algorithm, a learning vector quantization network algorithm, a Kohonen network algorithm, or a Hopfield network algorithm.
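As one hedged illustration of claims 3-5, the sketch below trains a small back-propagation network that maps a physiological feature vector (blood oxygen saturation, heart rate, respiration rate) to an emotion state value, yielding target model parameters from initial parameters and user emotion data. The synthetic data, network size, and learning rate are assumptions for illustration; the patent does not fix any of them.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 3))        # 32 samples of [SpO2, heart rate, respiration rate], normalized
y = X @ np.array([0.2, -0.5, 0.3])  # synthetic emotion state values (illustrative labels)

# initial model parameters: one hidden layer with tanh activation
W1 = rng.normal(scale=0.1, size=(3, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8,));   b2 = 0.0
lr = 0.05

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # forward pass
    pred = h @ W2 + b2
    err = pred - y                    # gradient of 0.5 * MSE w.r.t. pred
    # back-propagate the error to every parameter
    gW2 = h.T @ err / len(X); gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# the trained (W1, b1, W2, b2) play the role of the target model parameters
mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

After training, the resulting parameters would be loaded into the cloud-side model that scores incoming physiological feature vectors.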
6. The method according to any one of claims 3 to 5, wherein the determining a physiological feature vector from the PPG signal comprises:
decomposing the PPG signal into a plurality of intrinsic mode function, IMF, components and a residual component using an empirical mode decomposition method, the plurality of IMF components including a noise dominant component and a signal dominant component;
denoising each of the noise-dominated components;
performing morphological filtering on each signal dominant component;
forming a target PPG signal by the denoised noise dominant component, the morphologically filtered signal dominant component and the residual component;
determining a physiological feature vector from the target PPG signal.
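Assuming the IMF components have already been separated into noise-dominant and signal-dominant groups, the cleaning pipeline of claim 6 might look as below. Soft thresholding and a flat-element opening/closing filter are illustrative stand-ins, since the patent does not specify the denoising or morphological-filtering operations.

```python
import numpy as np

def soft_threshold(x, t):
    """Illustrative denoising for the noise-dominant IMFs: shrink values toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def morph_filter(x, w=3):
    """Illustrative morphological filtering: opening then closing with a
    flat structuring element of width w."""
    def erode(s):  return np.array([s[max(0, i - w // 2):i + w // 2 + 1].min() for i in range(len(s))])
    def dilate(s): return np.array([s[max(0, i - w // 2):i + w // 2 + 1].max() for i in range(len(s))])
    opened = dilate(erode(x))
    return erode(dilate(opened))

def clean_ppg(noise_imfs, signal_imfs, residual, t=0.1):
    """Recombine denoised noise-dominant IMFs, filtered signal-dominant
    IMFs, and the residual component into the target PPG signal."""
    parts  = [soft_threshold(c, t) for c in noise_imfs]
    parts += [morph_filter(c) for c in signal_imfs]
    return np.sum(parts, axis=0) + residual
```

The physiological feature vector (blood oxygen saturation, heart rate, respiration rate) would then be computed from the returned target PPG signal.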
7. The method of claim 6, wherein the decomposing the PPG signal into a plurality of IMF components and a residual component using an empirical mode decomposition method comprises:
taking the PPG signal as a signal to be processed;
acquiring an upper envelope and a lower envelope of the signal to be processed;
determining a target signal according to the signal to be processed, the upper envelope and the lower envelope of the signal to be processed;
when the target signal is not an IMF component, taking the target signal as the signal to be processed, and triggering the step of acquiring the upper envelope and the lower envelope of the signal to be processed;
when the target signal is an IMF component, recording the target signal, and determining a difference signal according to the signal to be processed and the target signal;
when the frequency of the differential signal is greater than a preset frequency, updating the signal to be processed into the differential signal, and triggering the step of acquiring the upper envelope and the lower envelope of the signal to be processed;
and when the frequency of the difference signal is less than or equal to a preset frequency, taking the difference signal as a residual component.
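The sifting loop recited in claim 7 can be sketched as follows. This is a deliberately simplified stand-in: linear interpolation replaces the spline envelopes of standard EMD, a fixed sifting count replaces the IMF test, and a count of extrema replaces the preset-frequency test; all of these simplifications are mine, not the patent's.

```python
import numpy as np

def local_extrema(x, cmp):
    """Indices of interior local maxima (cmp=np.greater) or minima (cmp=np.less)."""
    return [i for i in range(1, len(x) - 1) if cmp(x[i], x[i - 1]) and cmp(x[i], x[i + 1])]

def envelope(x, idx):
    """Envelope through the given extrema; linear interpolation stands in
    for the cubic splines usually used in EMD."""
    pts = [0] + idx + [len(x) - 1]
    return np.interp(np.arange(len(x)), pts, x[pts])

def sift(x, n_imfs=3, n_sift=10):
    """Follow the loop of claim 7: sift a candidate until it is treated as an
    IMF (here: a fixed number of sifting passes), record it, and repeat on the
    difference signal until it is low-frequency (too few extrema)."""
    x = np.asarray(x, dtype=float)
    imfs, resid = [], x.copy()
    for _ in range(n_imfs):
        if len(local_extrema(resid, np.greater)) < 2 or len(local_extrema(resid, np.less)) < 2:
            break  # the difference signal is low-frequency: keep it as the residual component
        h = resid.copy()
        for _ in range(n_sift):
            upper = envelope(h, local_extrema(h, np.greater))
            lower = envelope(h, local_extrema(h, np.less))
            h = h - (upper + lower) / 2.0  # subtract the mean of the upper and lower envelopes
        imfs.append(h)            # record the target signal as an IMF component
        resid = resid - h         # continue on the difference signal
    return imfs, resid
```

By construction the recorded IMF components plus the residual reconstruct the input signal exactly.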
8. An emotion detection device, comprising:
the photoplethysmography (PPG) sensor is used for acquiring a PPG signal of a pulse of a user;
the sending module is used for sending the PPG signal to a cloud server;
the receiving module is used for receiving a user emotion state value sent by the cloud server, wherein the user emotion state value is obtained by processing a physiological characteristic vector by using a neural network model after the cloud server determines the physiological characteristic vector according to the PPG signal, and the physiological characteristic vector comprises blood oxygen saturation, heart rate and respiratory rate;
and the output module is used for outputting the emotional state of the user according to the emotional state value of the user.
9. The apparatus of claim 8, wherein the emotion detection apparatus further comprises:
the generating module is used for generating an emotional state curve according to the emotional state values of the user at different moments;
and the output module is also used for outputting a reminding signal for reminding the negative emotion when the duration of the negative emotion in the emotion state curve is longer than the preset duration.
10. An emotion detection device, comprising:
the acquisition module is used for acquiring target model parameters;
the receiving module is used for receiving a photoplethysmography (PPG) signal sent by a wearable device;
a determining module for determining a physiological feature vector from the PPG signal, the physiological feature vector comprising a blood oxygen saturation, a heart rate and a respiration rate;
a neural network processing module for inputting the physiological feature vector into a neural network model having the target model parameters;
and the sending module is used for sending the user emotion state value output by the neural network model to the wearable device.
11. The emotion detection apparatus of claim 10,
the acquisition module is specifically used for acquiring a plurality of user emotion data, and each user emotion data comprises a physiological feature vector and a user emotion state value; acquiring initial model parameters of a neural network model; and processing the plurality of user emotion data and the initial model parameters through a neural network algorithm to obtain target model parameters.
12. The emotion detection apparatus of claim 11, wherein the neural network algorithm is a back-propagation neural network algorithm, an adaptive resonance theory network algorithm, a learning vector quantization network algorithm, a Kohonen network algorithm, or a Hopfield network algorithm.
13. The emotion detection apparatus of any of claims 10 to 12, wherein the determination module comprises:
a decomposition unit for decomposing the PPG signal into a plurality of intrinsic mode function, IMF, components and a residual component using an empirical mode decomposition method, the plurality of IMF components including a noise dominant component and a signal dominant component;
a denoising unit, configured to denoise each noise-dominant component;
the filtering unit is used for performing morphological filtering on each signal dominant component;
the construction unit is used for constructing the denoised noise dominant component, the morphologically filtered signal dominant component and the residual component into a target PPG signal;
a determination unit for determining a physiological feature vector from the target PPG signal.
14. The emotion detection apparatus of claim 13,
the decomposition unit is specifically configured to: use the PPG signal as a signal to be processed; acquire an upper envelope and a lower envelope of the signal to be processed; determine a target signal according to the signal to be processed and its upper and lower envelopes; when the target signal is not an IMF component, take the target signal as the signal to be processed and again acquire the upper envelope and the lower envelope of the signal to be processed; when the target signal is an IMF component, record the target signal and determine a difference signal according to the signal to be processed and the target signal; when the frequency of the difference signal is greater than a preset frequency, update the signal to be processed to the difference signal and again acquire the upper envelope and the lower envelope of the signal to be processed; and when the frequency of the difference signal is less than or equal to the preset frequency, take the difference signal as a residual component.
15. A computer storage medium comprising instructions that, when run on a computer, cause the computer to perform the emotion detection method of any of claims 1 to 7.
CN202011063209.4A 2020-09-30 2020-09-30 Emotion detection method and device Pending CN114305325A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011063209.4A CN114305325A (en) 2020-09-30 2020-09-30 Emotion detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011063209.4A CN114305325A (en) 2020-09-30 2020-09-30 Emotion detection method and device

Publications (1)

Publication Number Publication Date
CN114305325A true CN114305325A (en) 2022-04-12

Family

ID=81032569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011063209.4A Pending CN114305325A (en) 2020-09-30 2020-09-30 Emotion detection method and device

Country Status (1)

Country Link
CN (1) CN114305325A (en)


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115047824A (en) * 2022-05-30 2022-09-13 青岛海尔科技有限公司 Digital twin multimodal device control method, storage medium, and electronic apparatus
CN115429272A (en) * 2022-09-16 2022-12-06 济南大学 Psychological health state assessment method and system based on multi-modal physiological signals
CN115429272B (en) * 2022-09-16 2024-04-30 济南大学 Psychological health state assessment method and system based on multi-mode physiological signals
CN115840890A (en) * 2023-02-24 2023-03-24 北京科技大学 Emotion recognition method and device based on non-contact physiological signals
CN115944293A (en) * 2023-03-15 2023-04-11 汶上县人民医院 Neural network-based hemoglobin level prediction system for kidney dialysis
CN115944293B (en) * 2023-03-15 2023-05-16 汶上县人民医院 Neural network-based hemoglobin level prediction system for kidney dialysis

Similar Documents

Publication Publication Date Title
CN114305325A (en) Emotion detection method and device
Cavallo et al. Emotion modelling for social robotics applications: a review
Salichs et al. A new approach to modeling emotions and their use on a decision-making system for artificial agents
Sadeeq et al. Neural networks architectures design, and applications: A review
KR102154676B1 (en) Method for training top-down selective attention in artificial neural networks
Churamani et al. Learning empathy-driven emotion expressions using affective modulations
KR20130082701A (en) Emotion recognition avatar service apparatus and method using artificial intelligences
O'Halloran et al. A Comparison of Deep Learning Models in Human Activity Recognition and Behavioural Prediction on the MHEALTH Dataset.
CN111161883A (en) Disease prediction system based on variational self-encoder and electronic equipment thereof
Pandey et al. A multistage deep residual network for biomedical cyber-physical systems
CN114357237A (en) Electrocardiosignal and music signal matching method, system, device and medium
Ma et al. A prediction method for transport stress in meat sheep based on GA-BPNN
US20230136939A1 (en) User experience modeling system
Dash et al. Robust multiclass ECG arrhythmia detection using balanced trained neural network
Ktistakis et al. A multimodal human-machine interaction scheme for an intelligent robotic nurse
Gembaczka et al. Combination of sensor-embedded and secure server-distributed artificial intelligence for healthcare applications
Bielskis et al. Modelling of intelligent multi-agent based E-Health care system for people with movement disabilities
Raina et al. Intelligent and Interactive Healthcare System (I 2 HS) Using Machine Learning
KR102394615B1 (en) Blood pressure measuring device wereable on wrist of user
Ganapathy et al. Sensor based efficient decision making framework for remote healthcare
Jadhav et al. Heart sounds segmentation and classification using adaptive learning neural networks
Bien et al. Soft computing based emotion/intention reading for service robot
US20210063972A1 (en) Collaborative human edge node devices and related systems and methods
EP3787849A1 (en) Method for controlling a plurality of robot effectors
Mekruksavanich et al. Heterogeneous Recognition of Human Activity with CNN and RNN-based Networks using Smartphone and Smartwatch Sensors

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination