NL1043927B1 - A system for emotion detection and a method for personalized patterns in emotion-related physiology thereof - Google Patents

A system for emotion detection and a method for personalized patterns in emotion-related physiology thereof

Info

Publication number
NL1043927B1
Authority
NL
Netherlands
Prior art keywords
emotion
signals
personalized
content signals
control signals
Prior art date
Application number
NL1043927A
Other languages
Dutch (nl)
Inventor
Rinaldo Meinders Erwin
Marvin Smits Reon
De Vries Stefan
Maria Johanna Klep Denise
Bos Jasper
Original Assignee
Mentech Innovation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mentech Innovation
Priority to NL1043927A
Application granted
Publication of NL1043927B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/088 Non-supervised learning, e.g. competitive learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/09 Supervised learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A system for capturing and converting emotion content signals from a human or animal subject for emotion feedback interaction, said system comprising:
- Sensors arranged for sensing changes in patterns of physiological, visual, auditive or environmental signals (emotional content signals) that are related to changes in individuals' emotional state.
- Means for processing the emotion content signals into emotion control signals using neural networks or artificial intelligence algorithms based on parametric representations of said emotion content signals.
- Means for advanced labelling of the emotion content signals via categorization of labels.
- Means for personalization of the emotion control signals via supervised model training with said labelled emotion content signals.
- Means for further personalization of the emotion content signals via unsupervised model training.
- Means to visualize the said personalized emotion control signals.
- Means to notify or alarm a human (or animal) on the derived emotion control signals through sound, visual notifications, text, or haptic feedback.
characterized in that the labels are categorized in self-reporting, observation, and automatic detection, and that the labels are further categorized in labels with 1) information of the emotion and/or mood, 2) information of the body position and movement, and 3) information of the environment condition of the human or animal subject.

Description

TITLE A system for emotion detection and a method for personalized patterns in emotion-related physiology thereof
FIELD OF THE INVENTION
[001] The invention relates to a system for capturing and converting emotion content signals from a human or animal subject for emotion-controlled interaction, said system comprising:
- Sensors arranged for sensing changes in patterns of physiological, visual, auditive or environmental signals (emotional content signals) that are related to changes in individuals' emotional state.
- Means for processing the emotion content signals into emotion control signals using neural networks or artificial intelligence algorithms based on parametric representations of said emotion content signals.
- Means for advanced labelling of the emotion content signals via categorization of labels.
- Means for personalization of the emotion control signals via supervised model training with said labelled emotion content signals.
- Means for further personalization of the emotion content signals via unsupervised model training.
- Means to visualize the said personalized emotion control signals.
- Means to notify or alarm a human (or animal) on the derived emotion control signals through sound, visual notifications, text, or haptic feedback.
characterized in that the labels are categorized in self-reporting, observation, and automatic detection, and that the labels are further categorized in labels with 1) information of the emotion and/or mood, 2) information of the body position and movement, and 3) information of the environment condition of the human or animal subject.
[002] The invention also relates to a system, further comprising means to label the emotion content signals via a sensor, preferably of the following type:
- An acoustic sensor.
- An image sensor.
- A smart phone.
- An accelerometer sensor.
- A gyroscope.
- A computer device.
- A pressure sensor.
- A switch (on-off) sensor.
- A switch with multiple positions.
- A sensor to input text messages.
- A touch sensor.
- A temperature sensor.
- A proximity sensor.
- A heart rate sensor.
- A skin conductance sensor.
- A respiration rate sensor.
- A geographical position sensor.
[003] The invention also relates to a method for personalized labelling for enhancing the reliability of the emotion control signals, the method comprising the steps of:
a. Exposing the human or animal subject to a controlled and interactive environmental setting.
b. Sensing and receiving the emotional content signals of said human or animal subject, said emotional content signals being generated by said subject due to said exposure to the interactive environmental setting.
c. Labelling the emotional content signals, by categorizing them in self-reporting, automatic detection, or observation, the labels further categorized in information about 1) the emotion and/or mood, 2) the body position and movement, and 3) the environment condition of the human or animal subject.
d. Training of a personalized emotion detection model via converting the labelled emotion content signals into personalized emotion control signals using supervised artificial intelligence algorithms, such as neural networks.
e. Further personalization of the emotion detection model via converting unlabelled content signals into personalized emotion control signals using unsupervised artificial intelligence algorithms, such as neural networks.
f. Storage of the characteristic parameters describing the personalized emotion detection model.
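As an illustration of steps d and f, the following minimal sketch trains a small neural network on labelled feature windows and stores the resulting model. It is a sketch under stated assumptions, not the patent's implementation: the scikit-learn classifier, the four-feature window layout, and the synthetic data are assumptions made for demonstration only.

```python
# Minimal sketch of steps d) and f): supervised training of a personalized
# emotion detection model on labelled physiological windows, then storing
# the trained model. Feature layout and label encoding are assumptions.
import joblib
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Stand-in for parametric representations of emotion content signals:
# one row per time window (e.g. mean heart rate, skin conductance level,
# respiration rate, acceleration magnitude).
X = rng.normal(size=(200, 4))
# Stand-in labels from self-reporting or observation, e.g. a 5-scale arousal score.
y = rng.integers(1, 6, size=200)

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X, y)                                          # step d): supervised training

joblib.dump(model, "personalized_emotion_model.joblib")  # step f): parameter storage
```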
[004] The invention also relates to a method, further comprising the step of using grouped-wise, sample-wise, or population-wise labelling.
[005] The invention also relates to a method, further comprising the steps of using personalized and grouped-wise, sample-wise, or population-wise labelling.
[006] The invention also relates to a method, further comprising the step of re-labelling the emotion content signals.
[007] The invention also relates to a method, further comprising the step of using acceleration data for automatic labelling of the body stature, movement, and position of the human or animal subject.
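A minimal sketch of such automatic labelling from acceleration data follows; the window length, the motion threshold, and the tilt computation are assumptions for illustration, not values taken from the patent.

```python
# Illustrative sketch: deriving coarse body-position/movement labels from a
# window of 3-axis acceleration samples. All thresholds are assumptions.
import numpy as np

def label_activity(acc: np.ndarray) -> str:
    """acc: (n_samples, 3) acceleration in g over one analysis window."""
    magnitude = np.linalg.norm(acc, axis=1)
    motion = np.std(magnitude)  # variability of |a| as a movement proxy
    # Angle between the mean z-axis reading and gravity as a posture proxy.
    tilt = np.degrees(np.arccos(np.clip(
        acc[:, 2].mean() / max(magnitude.mean(), 1e-9), -1.0, 1.0)))
    if motion > 0.5:
        return "walking/active"
    return "lying" if tilt > 60 else "sitting/standing"

window = np.tile([0.0, 0.0, 1.0], (100, 1))  # 2 s of rest at 50 Hz, upright
print(label_activity(window))                # -> sitting/standing
```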
[008] The invention also relates to a method, further comprising the step of using a multi-scale score of the arousal and valence labels of the emotion or mood to label the emotion content signals.
[009] The invention also relates to a method, further comprising the steps of using a 5-scale arousal score to label the emotion content signals:
- Score 1 (inactive).
- Score 2 (below average aroused).
- Score 3 (average aroused).
- Score 4 (above average aroused).
- Score 5 (highly aroused).
[010] The invention also relates to a method, further comprising the steps of using a 5-scale valence score to label the emotion content signals:
1. Score 1 (highly negative emotions).
2. Score 2 (moderate negative emotions).
3. Score 3 (neutral emotions).
4. Score 4 (moderate positive emotions).
5. Score 5 (highly positive emotions).
[011] The invention also relates to a method, further comprising the step of using a combination of the arousal and valence score to label the emotion content signals, preferably a:
- 5-scale arousal and 5-scale valence scale to label the emotion content signals.
- 7-scale arousal and 7-scale valence scale to label the emotion content signals.
- 9-scale arousal and 9-scale valence scale to label the emotion content signals.
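For illustration, the label structure implied by paragraphs [008]-[011] could be represented as a small record combining an arousal score, a valence score, and the label category; the field names and encodings below are hypothetical assumptions, not taken from the patent.

```python
# Hypothetical sketch of a combined multi-scale arousal/valence emotion label.
from dataclasses import dataclass

AROUSAL_5 = {1: "inactive", 2: "below average aroused", 3: "average aroused",
             4: "above average aroused", 5: "highly aroused"}
VALENCE_5 = {1: "highly negative", 2: "moderate negative", 3: "neutral",
             4: "moderate positive", 5: "highly positive"}

@dataclass
class EmotionLabel:
    timestamp: float  # when the labelled emotion occurred (epoch seconds)
    arousal: int      # 1..5 on the arousal scale
    valence: int      # 1..5 on the valence scale
    source: str       # "self-reporting" | "observation" | "automatic detection"

label = EmotionLabel(timestamp=1_700_000_000.0, arousal=4, valence=2,
                     source="observation")
print(AROUSAL_5[label.arousal], "/", VALENCE_5[label.valence])
```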
[012] The invention also relates to a method, further comprising the step of using a multi-scale arousal scale and a multi-scale valence scale, characterized in that both scales are different from each other, to label the emotion content signals.
[013] The invention also relates to a method, further comprising the step of using visual, video, or auditory observations to label the emotion content signals.
[014] The invention also relates to a method, further comprising the step of using behaviour analysis to label the emotion content signals.
[015] The invention also relates to a method to convert the personalized emotion control signals into emotion interaction feedback signals, comprising the steps of:
- Classifying the personalized emotion control signals in states of high, medium, and low arousal.
- Classifying the personalized emotion control signals in states of negative, neutral, and positive valence.
- Deriving an emotion based on the classified states of valence and arousal.
- Converting the derived emotion and related emotion control signals in personalized feedback signals.
- Notifying or alarming a human (or animal) on the derived emotion control signals.
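A minimal sketch of this conversion is given below: continuous arousal and valence estimates are discretized into the three states named above, an emotion is looked up from the combined state, and a traffic-light style feedback signal is derived. The thresholds and the emotion lookup table are illustrative assumptions only.

```python
# Illustrative sketch of [015]: classify arousal/valence into discrete states,
# derive an emotion, and convert it into a traffic-light feedback signal.
def classify(value: float, low: float = 0.33, high: float = 0.66) -> str:
    """Map a normalized estimate in [0, 1] to low/medium/high (assumed thresholds)."""
    return "low" if value < low else "high" if value > high else "medium"

EMOTION = {("high", "negative"): "tense/angry",
           ("high", "positive"): "excited/happy",
           ("low", "negative"): "bored/depressed",
           ("low", "positive"): "calm/relaxed"}

def feedback(arousal: float, valence: float) -> tuple[str, str]:
    a = classify(arousal)
    v = {"low": "negative", "medium": "neutral", "high": "positive"}[classify(valence)]
    emotion = EMOTION.get((a, v), "neutral")
    light = {"high": "red", "medium": "orange", "low": "green"}[a]  # traffic light
    return emotion, light

print(feedback(0.9, 0.1))  # -> ('tense/angry', 'red')
```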
[016] The invention also relates to a method in which the personalized feedback signals, the notifications or alarms, are of the type:
- Multi-scale visualization (traffic light).
- Acoustic signal.
- Voice message.
- Text message.
- Sound message.
- Vibration signal.
- Diagrams.
[017] The invention also relates to a method, in which the personalized emotion control signals are visualized as levels of high stress, medium stress, and low stress.
[018] The invention also relates to a method, in which the personalized emotion control signals are visualized in a dashboard.
[019] The invention also relates to a computer implemented method that is executed in the cloud, enabling the carrying out of a method as described above, thereby providing an automatic analysis of the scoring results and the timing of tests.
[020] The invention also relates to a computer program stored on a non-volatile record carrier, said computer program containing instruction codes which, when executed, carry out such a method.
[021] The invention also relates to a method and computer program for classifying the personalized emotion control signals as diagnostic criteria for mental or physical disorders.
[022] The invention also relates to a method and computer program for classifying the personalized emotion control signals as parametric variables to enable the comparison within and between individuals.
BACKGROUND OF THE INVENTION
[023] 2.6% of the world population has an intellectual impairment (IQ below 75); this is around 200 million people. Worldwide nearly 50 million people suffer from dementia/Alzheimer's disease (source: BrightFocus Foundation), with nearly 10 million new cases every year. Between 5-8% of the general population aged 60 and over has dementia. It is estimated that 1 out of 160 children have Autism Spectrum Disorder (1% of the world population).
[024] Cognitively impaired clients are vulnerable to stress. Their caregivers face the daily challenge to identify and regulate their client's stress and emotional state, based on non-verbal cues. A delay in recognition of stress can result in challenging behaviour, such as aggression or agitation (see Janssen, C. G. C., Schuengel, C. & Stolk, J. (2002). Understanding challenging behaviour in people with severe and profound intellectual disability: a stress-attachment model. Journal of Intellectual Disability Research, 46(6), 445-453). Challenging behaviour has negative consequences for quality of life, makes client care and support difficult, and the burden on caregivers results in higher-than-average work-place absenteeism and staff turnover.
[025] Challenging behaviour occurs frequently in people with dementia (over 75% in primary care and over 80% in nursing homes) and in people with intellectual disability (70% in residential settings and 4% outside residential settings). Challenging behaviour in people with dementia or intellectual disability is widespread, persistent and increases in severity over time. Challenging behaviour has negative consequences on quality of life. It can harm clients and other residents, it makes client care and management difficult for healthcare professionals, and it increases sick leave and drop-out of staff. Challenging behaviour is one of the most important reasons for transitioning from community care to expensive intramural care. Timely and effective prevention and management of challenging behaviour may lower the burden on relatives and professionals, and avoid early admission to long-term care. Prevention or early monitoring and management of challenging behaviour may result in huge cost savings.
[026] In recent years, increased attention for emotion detection is noticed. Emotion states include states of pleasure (for instance happiness), displeasure (for instance sadness), low arousal (for instance quietness), and high arousal (for instance surprise). Social media make use of icons to express emotions. Emotion is expressed by facial, vocal, and postural expressions. Emotion can be determined from physiological reactions (activation or arousal, for instance increases in heart rate), the change in activity in the autonomic nervous system (ANS), blood pressure responses, skin responses, pupillary responses, brain waves, and heart responses. Examples include IBM's emotion mouse (Ark, Dryer, & Lu 1998) and a variety of wearable sensors designed by the Affective Computing Group at MIT (e.g. Picard 2000).
[027] In recent years, the availability of measurement devices to measure physiological parameters of users is growing. Examples include heartbeat sensors, respiratory sensors, skin conductance sensors, blood pressure sensors, temperature sensors, oxygen sensors, accelerometer sensors, motion sensors, and GPS sensors. These sensors are more and more integrated in the human vicinity (for instance in smart watches, clothing, or shoes) or are embedded in the body (for instance underneath the human skin, or inside the body). The quality of the content of the signals has also increased.
[028] Further information on content analysis is generally available to the person skilled in the art, see for example the articles:
- 'Activity-aware Mental Stress Detection Using Physiological Sensors' by Sun F.T., Kuo C., Cheng H.T., Buthpitiya S., Collins P., Griss M. from Carnegie Mellon University and Nokia Research Center, published in: Griss M., Yang G. (eds) Mobile Computing, Applications, and Services. MobiCASE 2010. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 76. Springer, Berlin, Heidelberg.
- 'Towards Mental Stress Detection Using Wearable Physiological Sensors' by Wijsman J., Grundlehner B., Liu H., Hermens H., Penders J., in Conf Proc IEEE Eng Med Biol Soc. 2011;2011:1798-801. doi: 10.1109/IEMBS.2011.6080512.
- 'Stress Detection Using Wearable Physiological Sensors' by Sandulescu V., Andrews S., Ellis D., Bellotto N., Mozos O.M. (2015). In: Ferrández Vicente J., Álvarez-Sánchez J., de la Paz López F., Toledo-Moreo F., Adeli H. (eds) Artificial Computation in Biology and Medicine. IWINAC 2015. Lecture Notes in Computer Science, vol 9107. Springer, Cham.
- 'Stress Recognition Using Wearable Sensors and Mobile Phones' by Akane Sano, Rosalind W. Picard, in: Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII '13), pages 671-676.
[029] International patent application No. WO20060980371A2 and US patent application No. US2014234815A1 both disclose an emotion content control system for combining emotion content signals for feedback interaction according to the preamble of claim 1.
[030] In addition to the increased quantity of emotion content and the increased distribution flexibility of these signals, vocal/sound, facial images, and postural expressions are sources that provide information about the emotion level of a user. Pattern recognition can be applied to derive emotion levels from these parameters.
[031] In addition to these sources of emotion content, increased attention is given to deriving emotion content from neurotransmitters or hormones. Information about the presence and the temporal concentration of these hormones in a human or animal body provides insight into the emotional status. Pattern recognition of the concentration profiles can be applied to derive emotion levels from these parameters. The hormone dopamine is a feel-good hormone and is created in the brain. The hormone oxytocin strengthens the bond between persons; mother's milk contains much of this hormone. Endorphins help to cope with stress and pain; they act as a kind of pain relief. Serotonin is the happiness hormone, generated in the gut and brain.
[032] More information about the working principles of neurotransmitters and hormones is given in:
- www.medicalnewstoday.com/ko/serotonin-facts-232248
- en.wikipedia.org/wiki/Serotonin
- www.sciencedirect.com/science/article/pii/S0166432814004768
- www.newhealthadvisor.com/Serotonin-and-Dopamine.html
- www.life-enhancement.com/magazine/article/178-5-htp-enhance-your-mood-your-sleep-and-a-lot-more
- eocinstitute.org/meditation/dhea_gaba_cortisol_hgh_melatonin_serotonin_endorphins
[033] Many means nowadays exist to augment the perception of senses. 3D imaging via virtual reality glasses is for instance used to augment the experience of watching movies. Surround sound is used to augment the sensation of audio. 4D cinemas use all kinds of tricks to augment the sensation of a performance, via movement, water droplets, etc.
[034] The care industry anticipates these developments by introducing care games to augment the sensing performance of vulnerable people, such as mentally disabled persons or elderly people with dementia. Care games can for instance augment the sensation of feeling, or the interaction between an image and movement of the body. For example, a care game used to stimulate the activity of mentally disabled people or elderly people with dementia can be equipped with emotion detection to increase the participation in the game. Excitement can be stimulated by increased complexity of the offered game features; boredom can be avoided by offering different features or levels.
[035] It has also been suggested that improved user experience may be achieved by providing emotion content signals. Quasi-live performances of great singers are common practice, augmenting the experience by showing live recordings, including live voice/singing, live dance/performance and other visual live elements. Augmentation of the performance by the live emotion of the remote or passed-away performer will augment the experience of the public. How great would it be to listen to a recorded live performance of Elvis's 'How Great Thou Art', with the sensation of feeling his emotions as well via recorded emotion data from past live performances? Or to experience the sensation of scoring a goal during a world championship soccer game.
[036] It has also been suggested that improved gaming experience may be achieved by providing emotion content signals. If the emotion of a game player is determined and simultaneously provided as input signal to control the course of a game, the gaming experience will be influenced. For example, if a gamer wants to relax, the game can be programmed in such a way that excitement, captured by the emotion content signal, is mitigated by changing the degree of difficulty, the pace of the game, the appearance of the game, the environmental setting, the appearance of the characters and personalities, etc. It may also be programmed to enhance the emotional status via the emotion control signal.
[037] It has also been suggested that improved training performance may be achieved by providing emotion content signals. If the emotion of a person, a horse or a dog during training is determined and simultaneously provided as input signal to steer the training program, the results of the program may be enhanced. For instance, the emotional status of a dressage horse may be used to influence the training program. If the trainer notices stress build-up, he can decide to practise an exercise known to the horse to reduce stress and to give the horse confidence. In case the trainer detects happiness or positive emotions, he can decide to increase the degree of difficulty, or practise a difficult element of dressage programs. The same applies for dogs and other animals. Also for sportsmen, the emotional status may be used to steer the training program, based on positive and negative attributes. Also for soldiers, the emotional status may be used to steer the training program.
[038] It has also been suggested that improved education and learning experience may be achieved by providing emotion content signals. If the emotion of a student during education or a learning experience is determined and simultaneously provided as input signal to steer the educational program (E-learning or school class), the efficiency of the educational effort is increased. If the student experiences stress, the teacher may decide to introduce stress-relaxation exercises; if the student experiences happiness and a positive vibe or flow, the teacher may decide to increase the degree of difficulty. The emotion content signal may also be used to change the subject of the learning program, or the degree of difficulty of the exercises.
[039] It has also been suggested that improved mission experience may be achieved by providing emotion content signals. If the emotion of a soldier or peace worker during operations and missions is determined and simultaneously provided as input signal to determine the deployment of a soldier or peace worker, the efficiency of the operation is increased. If the soldier experiences stress, the officer in charge may decide to redefine the soldier's deployment in the mission or operation.
[040] It has also been suggested that improved sports experience may be achieved by providing emotion content signals. If the emotion of a sportsman during exercise, training or a real game is determined and simultaneously provided as input signal to steer the sports achievement, the efficiency of the sports achievement is increased. If the sportsman experiences stress, the coach may decide to introduce stress-relaxation exercises; if the sportsman experiences happiness and a positive vibe or flow, the coach may decide to increase the degree of difficulty. The emotion content signal may also be used to change the subject of the sports program, or the degree of difficulty of the exercises.
[041] It has also been suggested that artificial intelligence of a robotic apparatus may be achieved by providing emotion content signals to it. The robot or autonomous robotic apparatus can be provided with emotion control signals to make the robotic apparatus for instance autonomous, interactive with the environment, responsive to emotional situations, sensitive to environmental influences, etc.
[042] United States patent US 8,256,825 B2 discloses an emotion script generating method, which is based on receiving means, generating means, adjusting means, and providing means. However, the system of US 8,256,825 B2 tends to have several associated disadvantages, including the following: the system does not include personalized emotion labels to train a personalized model via artificial intelligence algorithms.
[043] European patent no. EP2845539(A1) discloses a method for calculating normalized physiological signals by taking into consideration differences in physiology signals due to environmental factors. The normalized physiological signals are used to derive stress levels. United States Patent Application Publication No. US 2017/0316164 A1 discloses a method for estimating a condition of a person via an ensemble of machine learners. The machine learners are trained individually to make an estimate of the condition of the person based on features from a single physiological sensor or environment sensor, thereby avoiding re-training or cost-intensive data collection and labelling. However, the method of US2017/0316164(A1) tends to have several associated disadvantages, including the following: the system does not include time-resolved personalized and categorized emotion labels to train a personalized model via artificial intelligence algorithms.
[044] Hence, a system for emotion detection based on sensors arranged for sensing changes in patterns of physiological, visual, auditive or environmental signals (emotional content signals) that are related to changes in individuals' emotional state, means for processing the emotion content signals into emotion control signals using neural networks or artificial intelligence algorithms based on parametric representations of said emotion content signals, means for advanced labelling of the emotion content signals via categorization of labels, means for personalization of the emotion control signals via supervised model training with said labelled emotion content signals, means for further personalization of the emotion content signals via unsupervised model training, means to visualize the said personalized emotion control signals, and means to notify or alarm a human (or animal) on the derived emotion control signals through sound, visual notifications, text, or haptic feedback, characterized in that the labels are categorized in self-reporting, observation, and automatic detection, and that the labels are further categorized in labels with 1) information of the emotion and/or mood, 2) information of the body position and movement, and 3) information of the environment condition of the human or animal subject, is advantageous.
[045] Accordingly, the invention preferably seeks to mitigate, alleviate, or eliminate one or more of the abovementioned disadvantages singly or in any combination.
[046] According to a first aspect of the invention, parametric characteristics of the emotion content signal are converted to emotion control signals, i.e. 'personalized', by providing information about personal emotional experiences in the form of labels and the moment they occur. The physiological response of a human or animal subject to a prompted emotional stimulus is representative for the perceived emotion by that human or animal subject. Pattern recognition methodologies are used to derive emotions from the physiological response to specific stimuli. In this approach, an artificial intelligence model, such as a neural network model, is trained with labelled and/or unlabelled physiological data from test persons. The trained model can identify emotions based on generic patterns that are hidden in the large dataset from the test persons. If the trained model is applied to new data, it identifies personalized emotion control signals, based on the individual's personalized physiological patterns, as specific emotional values, such as valence or arousal. This identification of emotional values like valence or arousal can be augmented if the model is trained with personal data from the human or animal subject. Personal emotion labels can be provided by an observer or by the human subject her/himself (self-reporting).
[047] A second aspect of the invention is that a multi-scale score for the arousal and valence of the emotional experience described by an individual, or of the emotional behaviour as described by an observer of the person that is using the invention, is used to label the emotion content signals. Scores are categorized based on the intensity, duration, and value of experienced and/or perceived 1) basic emotions, 2) physiological activity, 3) body movements, 4) cognitive-emotional arousal and 5) emotional valence. A five-scale score for arousal reads for instance: score 1 (inactive) - score 2 (below average aroused) - score 3 (average aroused) - score 4 (above average aroused) - score 5 (highly aroused). A five-scale valence score to personalize the emotion content signals reads for instance: score 1 (highly negative emotions) - score 2 (moderate negative emotions) - score 3 (neutral emotions) - score 4 (moderate positive emotions) - score 5 (highly positive emotions). A further aspect of the invention is to combine the five-scale arousal and the five-scale valence score to personalize the emotion content signals. Also three-scale, seven-scale, nine-scale, or even higher-scale scores can be used to personalize the emotion content signals. Also a dominant score, for instance related to a burst of rage or an escalation, can be used to personalize the emotion content signals. In one embodiment, these labels are provided by the human subject himself (so-called quantified self). In another embodiment, the labels are provided by someone else (so-called quantified other).
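One way to read this first aspect in code: pre-train a model on group data so that it learns the generic patterns, then continue training on the individual's labelled data to personalize it. The sketch below uses scikit-learn's partial_fit for both phases; the synthetic data, network size, and number of personalization passes are assumptions for demonstration only.

```python
# Illustrative sketch of [046]: a model trained on group data and then further
# trained ("personalized") with an individual's labelled physiological windows.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
classes = np.array([1, 2, 3, 4, 5])  # e.g. a 5-scale arousal label

# Stand-ins for group-level and personal labelled data.
X_group, y_group = rng.normal(size=(1000, 4)), rng.integers(1, 6, size=1000)
X_personal, y_personal = rng.normal(size=(50, 4)), rng.integers(1, 6, size=50)

model = MLPClassifier(hidden_layer_sizes=(32,), random_state=1)
model.partial_fit(X_group, y_group, classes=classes)  # learn generic patterns
for _ in range(20):                                   # personalization passes
    model.partial_fit(X_personal, y_personal)

print(model.predict(X_personal[:3]))  # personalized emotion control estimates
```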
[048] A third aspect of the invention is that the provided emotion labels create personalized emotion control signals, which in turn are used to generate multi-scale emotion interaction feedback signals. These personalized emotion control signals are used for intervention, or for alarming or notification.
[049] The emotion control signal may specifically comprise meta-data which indicates the status. For example, the meta-data may indicate that the emotion status of a person follows a certain trajectory, for instance a periodic change in stress during the day (low in the morning, higher in the afternoon, low in the evening) or during the year (high in winter, low in summer), or a change in emotion status during holidays. The meta-data may also directly indicate reference characteristics or objects; the data may for example indicate specific events like Christmas, or periods in life. The analysis of the emotion content signal may allow for a fully automated extraction of the reference content information without requiring any additional information to be included. For example, possible seasonal variation in the emotion content signal may become apparent. The reference content information is adapted to the current and temporal emotion content signal via a self-learning algorithm. This self-learning ensures up-to-date reference content information. For example, the self-learning algorithm identifies repeating periods of stress from the emotion content signals.
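By way of illustration, such a self-learning step could detect a repeating stress period with a simple autocorrelation over a stress time series; the synthetic daily cycle and the 12-48 hour search window below are assumptions, not part of the patent.

```python
# Illustrative sketch of [049]: detecting a repeating (e.g. daily) pattern in a
# stress time series via autocorrelation. Data and lag range are assumptions.
import numpy as np

hours = np.arange(24 * 14)                            # two weeks of hourly stress
stress = 0.5 + 0.3 * np.sin(2 * np.pi * hours / 24)   # hidden daily cycle
stress += np.random.default_rng(2).normal(0, 0.05, hours.size)

x = stress - stress.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:]    # autocorrelation, lag >= 0
period = int(np.argmax(acf[12:48])) + 12              # search 12-48 h lags
print(f"dominant repeating period: {period} h")       # -> ~24 h
```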
[050] According to a different feature of the invention, the personalized emotion control signal may be used to visualize the stress level of a human or animal subject. A care giver can use the personalized emotion control signals to notice stress development in people with dementia, an intellectual disability or a physical challenge. A care giver can also use these signals for behavioural analysis.
[051] According to a different feature of the invention, the personalized emotion control signal may be used to control an interactive game. The emotion control signal is for instance used to pre-set or change the level of the game, the speed, the intensity, the degree of difficulty, etc. It may be a gradual or a sudden change. In case of a sad person, the game may be programmed by the emotion control signal to let the person experience a winning feeling, by easier assignments or by faster character building. In case of a happy person, the game may be programmed by the emotion control signal to give the person more challenges, via difficulty levels, fewer bonus points, or slower character building.
[052] According to a different feature of the invention, the personalized emotion control signal might be used to control a learning or training device. The emotions prompted by the learning device (E-learning, training, etc.) are detected, processed, and used via an emotion control signal to adapt the learning or training device accordingly. The device can be programmed to flatten emotion levels, strengthen emotion levels, etc.
[053] According to a different feature of the invention, the personalized emotion control signal may be used to control an actuator to enhance body or physiological parameters.
Also, the personalized emotion control signal may be used to expose a human subject to heat or vibration.
[054] According to a different feature of the invention, the personalized emotion control signal may be used for diagnosis and treatment of physiological conditions, stress-related disorders, somatic symptom disorders, dissociative disorders, substance abuse disorders, developmental disorders, bipolar and related disorders, anxiety disorders, mood disorders, schizophrenia and psychotic disorders, dementia, eating disorders and related problems.
[055] These and other aspects, features and advantages of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
[056] An embodiment of the invention will be described, by way of example only, with reference to the drawings, in which: FIG. 1 represents the emotion classification diagram with the arousal and valence axes, in which common emotions are visualized.
FIG. 2 illustrates the working principle of a system for personalized emotion detection, consisting of a sensor device arranged for sensing and processing emotion content signals, a connection to the cloud for cloud-based processing of the emotion content signals, a device to collect emotion labels for personalization, and a device for visualization of the personalized emotion control signals.
DETAILED DESCRIPTION OF THE DRAWINGS
[057] The following description focuses on an embodiment of the invention applicable to an emotion content system particularly suited for a professional care giver environment, but it will be appreciated that the invention is not limited to this application. For brevity, the term content signal has in the description been used to include both single signal sequences and multiple signal sequences.
[058] FIG. 1 illustrates the emotion classification diagram. The emotion radar is a well-known graphical visualization of the different emotions a human (or animal) subject might experience. The vertical axis represents the arousal scale, from low arousal to neutral (origin) to high arousal. The horizontal axis represents the valence scale, from negative, via neutral (origin), to positive. Negative low-arousal emotions (such as bored, depressed, and tired) are plotted in the lower-left quadrant. Negative high-arousal emotions (such as tense, angry and frustrated) are plotted in the upper-left quadrant. Positive low-arousal emotions (like calm, relaxed and content) are plotted in the lower-right quadrant. Positive high-arousal emotions (like excited, delighted, and happy) are plotted in the upper-right quadrant.
[059] FIG. 2 illustrates the working principle of a system for personalized emotion detection. A sensor system (100) is used to collect emotion content signals (101). The sensor system consists of a receiver (102), a processor (103) coupled to the receiver (102), a storage device (104) coupled to the processor (103), and a transmitter (105) coupled to the processor (103). The processor (103) is operable to process multiple emotion content signals and to generate the enhanced emotion content signal (106). In the preferred embodiment, the receiver (102), the processor (103), the storage device (104) and the transmitter (105) are embedded in garment-integrated wearables (such as a smart sock, glove, or shirt), conventional smart watches or other wearable devices.
[060] The sensor system (100) comprises a receiver (102) which receives the emotion content signal (101) from an external source (the human or animal subject). The receiver (102) comprises all necessary functionality required for receiving the emotion content signal and for extracting or converting it into a suitable format. For example, for a heartbeat sensor signal the receiver (102) comprises all required functionality for amplifying, filtering, demodulating and decoding the received signal to generate a baseband emotion content signal. Signal processing parameters might be stored in the storage device (104).
[061] The emotion content signal (101) consists of more than one emotion content signal (101-1), (101-2) ... (101-n), being body signals, physiological signals, vocal signals, facial expression signals, or pre-processed emotion content signals. The emotion content signals (101-1), (101-2) ... (101-n) come typically from the human or animal subject. Examples of emotion content signals include heartbeat, skin conductance, facial expressions, vocal signals, concentrations of hormones, etc. The enhanced emotion content signal is transmitted to the cloud via a transmitter (105). For this, several implementations and protocols can be used, such as 4G/5G, Bluetooth, WiFi, and IoT/LoRa.
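For illustration, the sensor-side processing might reduce each analysis window of raw signals to a small parametric feature vector before transmission to the cloud; the features chosen below (heart-rate mean and spread, plus a crude skin-conductance rise sum) are assumptions, not the patent's actual parametric representation.

```python
# Illustrative sketch of [059]-[061]: turning raw emotion content signals
# (101-1 ... 101-n) into a compact parametric representation (106) for
# transmission. Feature choices and sampling rates are assumptions.
import numpy as np

def parametric_representation(heart_rate: np.ndarray,
                              skin_conductance: np.ndarray) -> np.ndarray:
    """One feature vector per analysis window."""
    return np.array([
        heart_rate.mean(), heart_rate.std(),        # cardiac level and variability
        skin_conductance.mean(),                    # tonic skin conductance level
        np.diff(skin_conductance).clip(min=0).sum() # crude sum of conductance rises
    ])

rng = np.random.default_rng(3)
hr = 70 + 5 * rng.standard_normal(60)   # 60 s of 1 Hz heart-rate samples (assumed)
sc = 2 + 0.1 * rng.standard_normal(60)  # skin conductance in microsiemens (assumed)
print(parametric_representation(hr, sc))
```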
[062] Receiving means (202), again via 4G/5G, Bluetooth, WiFi, and IoT/LoRa, receive the enhanced emotion content signals. The cloud-based processor (203) converts the emotion content signals via artificial intelligence (AI) and pattern recognition methodologies, such as machine learning or neural networks, into emotion control signals. Labels are typically stored in the data storage (204) and are used to classify the emotion content signals. The enhanced emotion label signals (207) are used to provide enhanced emotion control signals (208) via model training. Meta-data stored in the storage device (204) comprises information which is indicative of the emotion history of the user. The enhanced emotion label signals (207) are received by an emotion label signal receiver (206). The enhanced emotion control signals (208) are transmitted by the emotion control signal transmitter (205).
[063] The emotion labels (309) are obtained via self-reporting, observation, and automatic detection. The category self-reporting contains all means to self-report an emotion, including but not limited to sensors for measuring voice (a microphone, acoustic sensor), sensors to input text (smart phone, computer device, tablet), and sensors to input values (like a switch, a touch sensor, a pressure sensor). The category observation contains all means for observation of an emotion, including but not limited to sensors for measuring voice (a microphone, acoustic sensor), sensors to input text (smart phone, computer device, tablet), sensors to input values (like a switch, a touch sensor, a pressure sensor), sensors to capture facial expressions (image sensor, video, camera), etc. The category automatic detection contains all means for automatic detection of labels, including but not limited to sensors for measuring acceleration, sensors to determine the geographical position (GPS), sensors to determine the physiological status of the subject (heart rate, skin conductance, respiration rate, etc.), sensors to capture facial expressions (image sensor, video, camera), sensors for measuring voice (a microphone, acoustic sensor), etc. The labels are further categorized in labels with 1) information of the emotion and/or mood, 2) information of the body position and movement, and 3) information of the environment condition of the human or animal subject.
[064] The applied artificial intelligence (AI) and pattern recognition methodologies can be based on both supervised and unsupervised learning. For example, an applied model can be trained with physiology data and labels collected in a reference setting with a control group of test persons who are exposed to prompted emotions or stressors.
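For the unsupervised branch, a sketch under stated assumptions: unlabelled feature windows are clustered so that recurring personal states emerge without any labels, after which the discovered states can be mapped onto emotion control signals. The two-cluster K-means and the synthetic data below are illustrative only.

```python
# Illustrative sketch of the unsupervised learning in [064]: clustering
# unlabelled physiological windows into recurring personal states.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
X_unlabelled = np.vstack([rng.normal(0, 1, size=(100, 4)),   # e.g. calm windows
                          rng.normal(3, 1, size=(100, 4))])  # e.g. aroused windows

states = KMeans(n_clusters=2, n_init=10, random_state=4).fit_predict(X_unlabelled)
print(np.bincount(states))  # sizes of the two discovered states
```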
[065] The user interface apparatus (300) is used to visualize the emotion control signals and to collect the emotion labels. The emotion labels (309) are collected via the emotion label signal receiver (306) and processed via the emotion control signal processing unit (303). The processing unit generates an enhanced emotion label signal (307). The enhanced emotion label signal is used to train artificial intelligence models and to deliver the enhanced emotion control signals (208).
[066] The enhanced emotion control signals (208) are received by the emotion control signal receiver (302), processed by the emotion control signal processing unit (303) and transmitted to the user via the emotion control signal actuator (305). The actuator generates the feedback interaction signal (308). The emotion control signal actuator (305) can be a loudspeaker, creating sound as the feedback interaction signal (308). The emotion control signal actuator (305) can be a display, creating an image as the feedback interaction signal (308). This image can be a traffic light, a colour, a text message, a visual, a pictogram, etc. The emotion control signal actuator (305) can be a micro electro-mechanical system (MEMS), creating mechanical vibration as the feedback interaction signal (308).
[067] The emotion control signals (106) may be derived from the different signals of the emotion content signal (101), for example by suitable repetition, selectivity, or emotion content. Alternatively, or additionally, existing pre-stored emotion control signals may be used. For example, the emotion content storage apparatus may comprise many pre-stored emotion content signals corresponding to different possible events and reference information characteristics.
[068] In one embodiment, the sensor system (100) and the user interface (UI) apparatus (300) are separate devices. In another embodiment, the sensor system (100) and the user interface (UI) apparatus (300) are integrated in one device.
[069] In one embodiment, the emotion content signals (101), emotion labels (309) and feedback interaction signal (308) relate to the same human subject (quantified-self application, self-reporting). In another embodiment, the emotion content signals (101) and emotion labels (309) relate to the same human (or animal) subject, but the feedback interaction signal (308) is received by another human (or animal) subject (quantified-other application).
[070] In one embodiment, the feedback interaction signal (308) is represented as a traffic light indicating a state of stress of a person with misunderstood behaviour, such as people with dementia or an intellectual disability. The emotion labels (309) are collected by the care giver to train the artificial intelligence model to get enhanced emotion control signals (208). The feedback interaction signal (308) is in that case used to provide better care, for instance by early notification of stress development such that the care giver can provide the necessary interventions to avoid an escalation, or by better understanding behaviour through an in-depth behavioural analysis of the stressors.
[071] In one embodiment, the feedback interaction signal (308) is represented as a dashboard indicating the time-resolved stress levels of a person with misunderstood behaviour, such as people with dementia or an intellectual disability. The emotion labels (309) are collected by the care giver to train the artificial intelligence model to get enhanced emotion control signals (208). The feedback interaction signal (308) is in that case used to provide better care, for instance by better understanding behaviour through an in-depth behavioural analysis of the stressors.
[072] In one embodiment, the feedback interaction signal (308) is represented as a dashboard indicating the time-resolved stress levels of a person himself or herself (Quantified Self). The emotion labels (309) are collected by the person himself or herself to train the artificial intelligence model to get enhanced emotion control signals (208). The feedback interaction signal (308) is in that case used to provide better insight into a person's stress perception, such that the person can for instance take the necessary interventions to avoid aggression, escalations, or burn-out. The dashboard can also be used for understanding a person's behaviour through an in-depth behavioural analysis of the stressors.
[073] In one embodiment, meta-data may thus be extracted from the emotion content signal and used to select a suitable pre-stored emotion content signal. This signal may have characteristics amended to correspond to e.g. the victory of a soccer game.
[074] Thus, in some embodiments, the determination of content may be used to determine estimates of the reference content information for a given emotion content signal. For example, if it is determined that the emotion content signal relates to a football match, an emotion control signal comprising e.g. the scoring of a goal or the sensation of a victory may be generated.
[075] The user interface apparatus (300) may provide the emotion control signal selectively.
For example, the emotion control signal may be provided only when predefined events occur. As a specific example, an emotion content storage apparatus may be provided as a medical apparatus which contains a number of features and control means, including for example the following:
- Control input for changing the intensity of the emotion experience.
- Control input for selecting an emotion genre.
- Control input for selecting a content item category.
- Control input for changing the dynamics of the emotion control signal.
- Control input for controlling an emotion contrast.
- Control input or automatic means for selecting and/or storing a user profile.
- Means for entering a self-learning mode (e.g., measuring or determining characteristics of the operations of the user interface apparatus such as the number of emotions; for example, succeeding stress events can be emphasized in time).
- Polarization control means, e.g., for controlling that enhancement occurs only for predefined events.
- A source selector for selecting source information for the emotion control signal, such as e.g. which information from the emotion content signal to use (heartbeat, skin conductance, respiratory data, etc.).
- A purpose selector for selecting e.g. a purpose of the emotion experience, thereby allowing the emotion control signal to be selected to achieve this purpose most suitably.
- A mood selector.
[076] The invention can be implemented in any suitable form including hardware, software, firmware, or any combination of these. However, preferably, the invention is implemented at least partly as computer software running on one or more (cloud-based) data processors and/or digital signal processors. The elements and components of an embodiment of the invention may be physically, functionally, and logically implemented in any suitable way. The functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. As such, the invention may be implemented in a single unit or may be physically and functionally distributed between different units and processors.
[077] Although the present invention has been described in connection with the preferred embodiment, it is not intended to be limited to the specific form set forth herein. Rather, the scope of the present invention is limited only by the accompanying claims. In the claims, the term comprising does not exclude the presence of other elements or steps.
Furthermore, although individually listed, a plurality of means, elements or method steps may be implemented by e.g. a single unit or processor.
Additionally, although individual features may be included in different claims, these may possibly be advantageously combined, and the inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous.
In addition, singular references do not exclude a plurality.
Thus, references to "a", "an", "first", "second" etc. do not preclude a plurality.
FIGURES
NUMBER LISTING
(100) = sensor system
(101) = emotion content signals
(101-1) = emotion content signal
(101-2) = emotion content signal
(101-n) = emotion content signal
(102) = emotion content signal receiver
(103) = processor
(104) = storage device
(105) = transmitter
(106) = enhanced emotion content signal
(200) = model process and storage device
(202) = emotion content signal receiver
(203) = model processing unit
(204) = model parameter and data storage
(205) = emotion control signal transmitter
(206) = emotion label signal receiver
(207) = emotion label signal
(208) = emotion control signal
(300) = UI apparatus
(302) = emotion control signal receiver
(303) = emotion control signal processing unit
(305) = emotion control signal actuator
(306) = emotion label signal receiver
(307) = enhanced emotion label signal
(308) = feedback interaction signal
(309) = emotion label

Claims (22)

CONCLUSIESCONCLUSIONS 1. Een systeem voor het vastleggen en omzetten van emotie-inhoudsignalen van een menselijk of dierlijk onderwerp voor emotie-gecontroleerde interactie, waarbij het systeem bestaat uit: - Sensoren die zijn ingericht voor het waarnemen van veranderingen in patronen van fysiologische, visuele, auditieve of omgevingssignalen (signalen van emotionele inhoud) die verband houden met veranderingen in de emotionele toestand van individuen.A system for capturing and converting emotion content signals from a human or animal subject for emotion-controlled interaction, the system comprising: - Sensors adapted to sense changes in patterns of physiological, visual, auditory or environmental cues (signals of emotional content) associated with changes in the emotional state of individuals. - Middelen voor het verwerken van de emotie-inhoudsignalen tot emotiecontrolesignalen met behulp van neurale netwerken of kunstmatige intelligentie-algoritmen op basis van parametrische representaties van de emotie- inhoudsignalen.Means for processing the emotion content signals into emotion control signals using neural networks or artificial intelligence algorithms based on parametric representations of the emotion content signals. - Middelen voor geavanceerde labeling van de emotie-inhoudsignalen via categorisatie van labels.- Means for advanced labeling of the emotion content signals through label categorization. - Middelen voor personalisatie van de emotiecontrolesignalen via begeleide modeltraining met genoemde gelabelde emotie-inhoudsignalen.Means for personalization of the emotion control signals through guided model training with said labeled emotion content signals. - Middelen voor verdere personalisatie van de emotie-inhoudsignalen via onbewaakte modeltraining.- Means for further personalization of the emotion content signals via unsupervised model training. - Middelen om de genoemde gepersonaliseerde emotiecontrolesignalen te visualiseren.- Means to visualize said personalized emotion control signals. - Middelen om een mens (of dier) te waarschuwen of alarmeren over de afgeleide emotiecontrolesignalen door middel van geluid, visuele meldingen, tekst of haptische feedback.- Means to warn or alarm a human (or animal) about the derived emotion control signals through sound, visual notifications, text or haptic feedback. gekenmerkt doordat de labels gecategoriseerd zijn in zelfrapportage, observatie en automatische detectie, en dat de labels verder gecategoriseerd zijn in labels met 1) informatie over de emotie en / of stemming 2) informatie over de lichaamspositie en beweging, en 3 ) informatie over de omgevingsconditie van de mens of dier.characterized in that the labels are categorized into self-report, observation, and automatic detection, and that the labels are further categorized into labels containing 1) information about the emotion and/or mood 2) information about the body position and movement, and 3) information about the environmental condition of man or animal. 2. Systeem volgens conclusie 1, verder omvattende middelen om via een sensor de emotie-inhoudsignalen te labelen, bij voorkeur van het volgende type: - Een akoestische sensor.2. System according to claim 1, further comprising means for labeling the emotion content signals via a sensor, preferably of the following type: - An acoustic sensor. - Een beeldsensor.- An image sensor. - Een slimme telefoon.- A smart phone. - Een accelerometer sensor.- An accelerometer sensor. - Een gyroscoop.- A gyroscope. 
- Een computerapparaat.- A computing device. - Een druksensor.- A pressure sensor. - Een schakelaar (aan-uit) sensor.- A switch (on-off) sensor. - Een schakelaar met meerdere standen.- A switch with multiple positions. - Een sensor om tekstberichten in te voeren.- A sensor to enter text messages. - Een aanraaksensor.- A touch sensor. - Een temperatuursensor.- A temperature sensor. - Een naderingssensor.- A proximity sensor. - Een hartslagsensor.- A heart rate sensor. - Een huidgeleidingssensor.- A skin conductance sensor. - Een ademhalingsfrequentiesensor.- A respiration rate sensor. - Een geografische positiesensor.- A geographic position sensor. 3. Een methode voor gepersonaliseerd labelen om de betrouwbaarheid van de emotiecontrolesignalen te vergroten, waarbij de methode de volgende stappen omvat: a) De mens of het dier blootstellen aan een gecontroleerde en interactieve omgeving.3. A method of personalized labeling to increase the reliability of the emotion control signals, the method comprising the following steps: a) Exposing the human or animal to a controlled and interactive environment. b) Het waarnemen en ontvangen van de emotionele inhoudssignalen van het menselijke of dierlijke subject, waarbij de emotionele inhoudssignalen worden gegenereerd door het subject als gevolg van de blootstelling aan de interactieve omgeving.b) Perceiving and receiving the emotional content signals from the human or animal subject, wherein the emotional content signals are generated by the subject as a result of exposure to the interactive environment. c) Het labelen van de emotionele inhoudssignalen, door ze te categoriseren in zelfrapportage, automatische detectie of observatie, de labels verder gecategoriseerd in informatie over 1) de emotie en / of stemming 2) de lichaamspositie en beweging, en 3) de omgevingsconditie van de mens of dier.c) Labeling the emotional content signals, by categorizing them into self-report, automatic detection, or observation, further categorizing the labels into information about 1) the emotion and/or mood 2) the body position and movement, and 3) the environmental condition of the human or animal. d) Training van een gepersonaliseerd emotiedetectiemodel door de gelabelde emotie- inhoudssignalen om te zetten in gepersonaliseerde emotiecontrolesignalen met behulp van kunstmatige intelligentie-algoritmen, zoals neurale netwerken.d) Training a personalized emotion detection model by converting the labeled emotion content signals into personalized emotion control signals using artificial intelligence algorithms, such as neural networks. e) Verdere personalisatie van het emotiedetectiemodel door niet-gelabelde inhoudssignalen om te zetten in gepersonaliseerde emotiecontrolesignalen met behulp van niet-gecontroleerde kunstmatige intelligentie-algoritmen, zoals neurale netwerken.e) Further personalization of the emotion detection model by converting unlabeled content signals into personalized emotion control signals using unsupervised artificial intelligence algorithms, such as neural networks. fl Opslag van de karakteristieke parameters die het gepersonaliseerde emotiedetectiemodel beschrijven.fl Storage of the characteristic parameters that describe the personalized emotion detection model. 4. Werkwijze volgens conclusie 3, verder omvattende de stap van het gebruik van gegroepeerde, steekproefsgewijze of populatiegewijze labeling.The method of claim 3, further comprising the step of using batch, sample, or population labeling. 5. 
5. A method according to claim 3, further comprising the steps of using personalized and group-wise, sample-wise or population-wise labelling.

6. A method according to claim 3, further comprising the step of relabelling the emotion content signals.

7. A method according to claim 3, further comprising the step of using acceleration data to automatically label the body status, movement and position of the human or animal subject.

8. A method according to claim 3, further comprising the step of using a multi-scale score of the arousal and valence labels of the emotion or mood to label the emotion content signals.

9. A method according to claim 8, further comprising the step of using a 5-scale arousal score to label the emotion content signals:
- Score 1 (inactive).
- Score 2 (below-average arousal).
- Score 3 (average arousal).
- Score 4 (above-average arousal).
- Score 5 (very aroused).

10. A method according to claim 8, further comprising the step of using a 5-scale valence score to label the emotion content signals:
- Score 1 (very negative emotions).
- Score 2 (moderately negative emotions).
- Score 3 (neutral emotions).
- Score 4 (moderately positive emotions).
- Score 5 (very positive emotions).

11. A method according to claim 8, further comprising the step of using a combination of the arousal and valence scores to label the emotion content signals, preferably a:
- 5-scale arousal and 5-scale valence scale.
- 7-scale arousal and 7-scale valence scale.
- 9-scale arousal and 9-scale valence scale.
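The 5-scale arousal and valence scores of claims 9 and 10, and their combination in claim 11, can be sketched as a simple labelling helper; the score meanings are transcribed from the claims, while the function name and return format are assumptions for illustration only:

```python
# Score meanings transcribed from claims 9 and 10.
AROUSAL_5 = {1: "inactive", 2: "below-average arousal", 3: "average arousal",
             4: "above-average arousal", 5: "very aroused"}
VALENCE_5 = {1: "very negative emotions", 2: "moderately negative emotions",
             3: "neutral emotions", 4: "moderately positive emotions",
             5: "very positive emotions"}

def label_sample(arousal: int, valence: int) -> dict:
    """Attach a combined 5-scale arousal/valence label (claim 11) to a sample."""
    if arousal not in AROUSAL_5 or valence not in VALENCE_5:
        raise ValueError("scores must lie on the 1-5 scale")
    return {"arousal": (arousal, AROUSAL_5[arousal]),
            "valence": (valence, VALENCE_5[valence])}
```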
12. A method according to claim 8, further comprising the step of using a multi-point arousal scale and a multi-point valence scale, characterized in that the two scales differ from each other, to label the emotion content signals.

13. A method according to claim 3, further comprising the step of using visual, video or auditory observations to label the emotion content signals.

14. A method according to claim 3, further comprising the step of using behavioural analysis to label the emotion content signals.

15. A method for converting the personalized emotion control signals into feedback signals for emotion interaction, comprising the following steps:
- Classifying the personalized emotion control signals into states of high, medium and low arousal.
- Classifying the personalized emotion control signals into states of negative, neutral and positive valence.
- Deriving an emotion from the classified states of valence and arousal.
- Converting the derived emotion and related emotion control signals into personalized feedback signals.
- Notifying or alarming a human (or animal) about the derived emotion control signals.

16. A method according to claim 15, wherein the personalized feedback signals, the notifications or alarms, are of the type:
- Multi-scale visualization (traffic light).
- Acoustic signal.
- Voice message.
- Text message.
- Sound message.
- Vibration signal.
- Diagrams.

17. A method according to claim 15, wherein the personalized emotion control signals are visualized as levels of high stress, medium stress and low stress.

18. A method according to claim 15, wherein the personalized emotion control signals are visualized in a dashboard.
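A minimal sketch of the conversion steps of claim 15, combined with the traffic-light visualization of claims 16 and 17; the thresholds and the mapping from arousal/valence states to a named emotion are illustrative assumptions, since the claims do not fix concrete values:

```python
def classify(value: float, low: float = 0.33, high: float = 0.66) -> str:
    """Three-way classification of a normalized control signal."""
    return "low" if value < low else ("high" if value > high else "medium")

def derive_feedback(arousal: float, valence: float) -> dict:
    arousal_state = classify(arousal)                        # high/medium/low arousal
    valence_state = {"low": "negative", "medium": "neutral",
                     "high": "positive"}[classify(valence)]  # valence states
    # Derive an emotion from the two classified states (hypothetical mapping).
    emotion = {("high", "negative"): "stressed", ("high", "positive"): "excited",
               ("low", "negative"): "gloomy", ("low", "positive"): "relaxed"}\
        .get((arousal_state, valence_state), "neutral")
    # Traffic-light style notification as in claims 16 and 17.
    light = {"high": "red", "medium": "orange", "low": "green"}[arousal_state]
    return {"emotion": emotion, "traffic_light": light}

print(derive_feedback(arousal=0.8, valence=0.2))
# {'emotion': 'stressed', 'traffic_light': 'red'}
```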
19. A computer-implemented method executed in the cloud, enabling a method according to claim 3 and/or claim 15 to be carried out, thereby providing automatic analysis of the scoring results and of the timing of tests.

20. A computer program stored on a non-volatile record carrier, the computer program comprising instruction codes which, when executed, carry out a method according to claim 19.

21. A method and computer program for classifying the personalized emotion control signals as diagnostic criteria for mental or physical disorders.

22. A method and computer program for classifying the personalized emotion control signals as parametric variables to enable comparison within and between individuals.
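As a sketch of the parametric variables of claim 22, a subject's personalized emotion control signals could be reduced to a few summary statistics that allow comparison within and between individuals; the specific statistics chosen here are assumptions, not prescribed by the claim:

```python
import statistics

def parametric_summary(control_signals: list[float]) -> dict:
    """Summarize one subject's emotion control signals as parametric variables."""
    return {"mean": statistics.fmean(control_signals),
            "stdev": statistics.pstdev(control_signals),
            "peak": max(control_signals)}

# Within-individual comparison: the same subject on two different days.
day_1 = parametric_summary([0.2, 0.4, 0.9, 0.5])
day_2 = parametric_summary([0.1, 0.2, 0.3, 0.2])
```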
NL1043927A 2021-02-09 2021-02-09 A system for emotion detection and a method for personalized patterns in emotion-related physiology thereof NL1043927B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
NL1043927A NL1043927B1 (en) 2021-02-09 2021-02-09 A system for emotion detection and a method for personalized patterns in emotion-related physiology thereof

Publications (1)

Publication Number Publication Date
NL1043927B1 true NL1043927B1 (en) 2022-09-09

Family

ID=83744578

Country Status (1)

Country Link
NL (1) NL1043927B1 (en)

Similar Documents

Publication Publication Date Title
US11974851B2 (en) Systems and methods for analyzing brain activity and applications thereof
US20200337631A1 (en) Systems, environment and methods for identification and analysis of recurring transitory physiological states and events using a portable data collection device
McArthur et al. Toward an ecological theory of social perception.
Clark The experience machine: how our minds predict and shape reality
US20220314078A1 (en) Virtual environment workout controls
Kritikos et al. Personalized virtual reality human-computer interaction for psychiatric and neurological illnesses: a dynamically adaptive virtual reality environment that changes according to real-time feedback from electrophysiological signal responses
EP3984044A1 (en) Virtual reality therapeutic systems
WO2018215575A1 (en) System or device allowing emotion recognition with actuator response induction useful in training and psychotherapy
WO2018074224A1 (en) Atmosphere generating system, atmosphere generating method, atmosphere generating program, and atmosphere estimating system
McDaniel et al. Therapeutic haptics for mental health and wellbeing
KR102235716B1 (en) Learning disorder diagnosing/cure apparatus and method using virtual reality
CN116529750A (en) Method and system for interface for product personalization or recommendation
NL1043927B1 (en) A system for emotion detection and a method for personalized patterns in emotion-related physiology thereof
Drigas et al. Games for empathy for sensitive social groups
Oberzaucher et al. Everything is movement: on the nature of embodied communication
Mladenovic Computational modeling of user states and skills for optimizing BCI training tasks
Hong et al. The quantified self
Lancioni et al. Assistive technology
NL1042207B1 (en) An emotion content control system for combining emotion content signals for feedback interaction and a method for enhancing the interaction with the human or animal subject thereof
EP4343787A1 (en) Method and system for monitoring a person in an environment
Schipor et al. Making E-Mobility Suitable for Elderly
JP6963669B1 (en) Solution providing system and mobile terminal
WO2022118955A1 (en) Solution providing system
Oh Exploring Design Opportunities for Technology-Supported Yoga Practices at Home
Baber et al. Creating Affording Situations with Animate Objects