WO2022176808A1 - Information processing system, information processing device, information processing method, and information processing program - Google Patents
Information processing system, information processing device, information processing method, and information processing program
- Publication number
- WO2022176808A1 (application PCT/JP2022/005712)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- subject
- information processing
- level
- mood
- Prior art date
Links
- 230000010365 information processing Effects 0.000 title claims abstract description 126
- 238000003672 processing method Methods 0.000 title claims description 20
- 230000036651 mood Effects 0.000 claims abstract description 112
- 210000004556 brain Anatomy 0.000 claims abstract description 81
- 230000008921 facial expression Effects 0.000 claims abstract description 80
- 230000014509 gene expression Effects 0.000 claims description 106
- 230000002996 emotional effect Effects 0.000 claims description 93
- 230000008451 emotion Effects 0.000 claims description 51
- 238000012545 processing Methods 0.000 claims description 19
- 238000004458 analytical method Methods 0.000 claims description 16
- 238000013523 data management Methods 0.000 claims description 13
- 230000007177 brain activity Effects 0.000 claims description 11
- 230000006872 improvement Effects 0.000 claims description 11
- 238000005259 measurement Methods 0.000 claims description 11
- 230000007613 environmental effect Effects 0.000 claims description 10
- 230000007935 neutral effect Effects 0.000 claims description 10
- 238000000034 method Methods 0.000 claims description 9
- 239000000284 extract Substances 0.000 claims description 8
- 230000003595 spectral effect Effects 0.000 claims description 4
- 238000007726 management method Methods 0.000 claims 1
- 230000036541 health Effects 0.000 abstract description 11
- 238000009223 counseling Methods 0.000 abstract description 10
- 210000000467 autonomic pathway Anatomy 0.000 abstract description 2
- 230000003340 mental effect Effects 0.000 abstract description 2
- 238000010586 diagram Methods 0.000 description 22
- 238000004891 communication Methods 0.000 description 10
- 230000015654 memory Effects 0.000 description 10
- 230000006870 function Effects 0.000 description 7
- 238000001514 detection method Methods 0.000 description 6
- 238000003745 diagnosis Methods 0.000 description 5
- 230000006996 mental state Effects 0.000 description 4
- 230000008569 process Effects 0.000 description 4
- 208000020016 psychiatric disease Diseases 0.000 description 4
- 230000006399 behavior Effects 0.000 description 3
- 230000004044 response Effects 0.000 description 3
- 238000012360 testing method Methods 0.000 description 3
- 206010012374 Depressed mood Diseases 0.000 description 2
- 230000009471 action Effects 0.000 description 2
- 230000003542 behavioural effect Effects 0.000 description 2
- 238000006243 chemical reaction Methods 0.000 description 2
- 238000010801 machine learning Methods 0.000 description 2
- 238000003825 pressing Methods 0.000 description 2
- 238000010079 rubber tapping Methods 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 238000012935 Averaging Methods 0.000 description 1
- 208000007684 Occupational Stress Diseases 0.000 description 1
- 210000003403 autonomic nervous system Anatomy 0.000 description 1
- 230000002490 cerebral effect Effects 0.000 description 1
- 238000009225 cognitive behavioral therapy Methods 0.000 description 1
- 239000000470 constituent Substances 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 210000001002 parasympathetic nervous system Anatomy 0.000 description 1
- 230000005180 public health Effects 0.000 description 1
- 239000007787 solid Substances 0.000 description 1
- 238000001228 spectrum Methods 0.000 description 1
- 238000005728 strengthening Methods 0.000 description 1
- 210000002820 sympathetic nervous system Anatomy 0.000 description 1
- 238000012800 visualization Methods 0.000 description 1
Images
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/15—Biometric patterns based on physiological signals, e.g. heartbeat, blood flow
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/176—Dynamic expression
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/70—Multimodal biometrics, e.g. combining information from different biometric modalities
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/63—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/66—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for extracting parameters related to health condition
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H10/00—ICT specially adapted for the handling or processing of patient-related medical or healthcare data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/57—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for processing of video signals
Definitions
- the present invention analyzes and visualizes various subject data, such as questionnaire responses, voice, facial expressions, pulse waves, and heartbeats, to realize early detection of mental disorders and thereby contribute to solving related social problems.
- the present invention relates to an information processing system, an information processing apparatus, an information processing method, and an information processing program (hereinafter referred to as "information processing system, etc.").
- subjects use a medical interview site on a network such as the Internet to answer a questionnaire about stress (a stress check), and then receive counseling by video chat with a counselor while a pulse wave measuring device takes measurements.
- the subject's interview data, facial expression image data, voice data, pulse wave/heart rate data, and the like are acquired, and the stress level, brain fatigue level, and mood level are calculated from the acquired data.
- the emotions (mental state) of the subject can thereby be visualized, providing information processing that supports diagnosis and treatment.
- the fatigue/stress examination system described in Patent Document 1 analyzes electrocardiogram/pulse wave data measured by an electrocardiogram/pulse wave measuring instrument using an analysis server on the cloud side, and analyzes the balance and strength of the autonomic nervous system. The state of stress can therefore be grasped numerically, and the analysis data can be transmitted to the client terminal and displayed visually there.
- the health value estimation system described in Patent Document 2 uses location and movement information acquired from sensors in a mobile terminal such as a smartphone, together with usage history such as power on/off events, application startup logs, and the number of calls made. Characteristic behaviors that appear under stress (behavioral features) are classified into multiple clusters and quantified, and machine learning over pre-measured heart rate data is used to build a model that estimates the relationship between these features and stress conditions. By comparing behavioral feature values newly acquired with the mobile terminal against the constructed estimation model, a health value indicating the subject's own health condition can be estimated.
- the fatigue/stress examination system described in Patent Document 1 simultaneously measures the subject's electrocardiogram and pulse wave, determines the state of the subject's autonomic nerves from the electrocardiogram/pulse wave data, and quantifies the degree of fatigue and the stress tendency, which are centrally managed as fatigue/analysis result data.
- however, the fatigue/analysis result data do not reflect the subjective judgments based on the subject's own voice and facial expression that are made in interviews with doctors, occupational physicians, public health nurses, and the like, so the subject's emotions may not be analyzed with high accuracy.
- the health value estimation system described in Patent Document 2 can build an estimation model using accurately quantified data that serve as appropriate teacher data for supervised machine learning, and can thereby estimate the subject's own health value.
- however, the health value estimation system cannot accurately quantify a depressed mood if the subject is not aware of the depression, and subjective judgments based on voice and facial expressions cannot be used as teacher data.
- consequently, an estimation model that sufficiently reflects the subject's emotions (mental state) cannot be constructed, and the health value estimation system described in Patent Document 2 may likewise be unable to analyze the subject's emotions with high accuracy.
- in addition to the subject's quantitative data (pulse wave, heart rate, etc.) measured by a measuring instrument, the present invention acquires, for example, stress check result data based on a stress check questionnaire (the simple occupational stress questionnaire, Non-Patent Document 1) and data such as the subject's voice and facial expression images during counseling using a communication tool such as video chat (video call). It calculates stress level, brain fatigue level, and mood level values from these data and plots the calculated values in a three-dimensional space consisting of X, Y, and Z axes, thereby visualizing the subject's emotions (mental state) and providing an information processing system and the like that can support diagnosis and treatment.
- an information processing apparatus that is connected to a subject's terminal device and visualizes the subject's emotions comprises: a data management unit that acquires at least data on the subject's voice, facial expression image, and pulse wave; an emotional expression engine unit that calculates the brain fatigue level based on the frequency of the voice, extracts the subject's emotions from the facial expression image to calculate the mood level, and performs frequency analysis on the pulse wave by fast Fourier transform, extracting the high-frequency and low-frequency bands to calculate the stress level; and a three-axis processing unit that displays a graph in which a point is plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in a three-dimensional space consisting of X, Y, and Z axes.
- the audio data is obtained by recording at least part of a video call with the subject via the terminal device
- the data on the facial expression image is obtained by recording at least part of a video call with the subject via the terminal device,
- the data management unit associates the data on the subject's voice, facial expression image, and pulse wave with the date and time at which the data were acquired, and stores them in a storage means of the information processing apparatus,
- the three-axis processing unit displays a graph in which points are plotted chronologically at the coordinates corresponding to the subject's brain fatigue level, mood level, and stress level for each date and time in the three-dimensional space.
- the three-dimensional space is divided into a plurality of type classification categories
- the three-axis processing unit notifies which of the plurality of type classification categories in the three-dimensional space contains the coordinate point corresponding to the subject's brain fatigue level, mood level, and stress level.
- an improvement proposal to be proposed to the subject is determined for each of the plurality of type-specific categories.
- the emotional expression engine unit notifies the improvement proposal for the category that contains the coordinate point corresponding to the subject's brain fatigue level, mood level, and stress level in the three-dimensional space.
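As a concrete illustration of the category lookup above, the sketch below maps a (stress, brain fatigue, mood) point into one of eight octants of the three-dimensional space and looks up an improvement proposal. The threshold, the category names, and the proposal texts are all illustrative assumptions; the patent does not disclose the actual partitioning or wording.

```python
# Sketch of the type-category lookup. The eight octant categories,
# the 50-point threshold, and the improvement messages are
# illustrative assumptions; the patent does not fix them.

def classify(stress: float, fatigue: float, mood: float,
             threshold: float = 50.0) -> str:
    """Map a (stress, fatigue, mood) point to one of 8 octant categories."""
    key = (stress >= threshold, fatigue >= threshold, mood >= threshold)
    names = {
        (False, False, True):  "stable",
        (False, False, False): "low mood",
        (True,  False, True):  "tense but positive",
        (True,  False, False): "stressed",
        (False, True,  True):  "tired but positive",
        (False, True,  False): "fatigued",
        (True,  True,  True):  "overworked",
        (True,  True,  False): "at risk",
    }
    return names[key]

# Hypothetical improvement proposals, one per category.
PROPOSALS = {
    "at risk": "Consider consulting a counselor or physician.",
    "stable": "Keep up your current routine.",
}

category = classify(stress=72.0, fatigue=65.0, mood=30.0)
print(category)                                  # -> at risk
print(PROPOSALS.get(category, "No specific proposal."))
```

In practice the category boundaries would be tuned clinically rather than fixed at the midpoint of each axis.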
- the audio data is obtained by recording, continuously for at least a predetermined recording time, the subject reading aloud a predetermined fixed phrase displayed on the terminal device during a video call with the subject via the terminal device.
- the emotion expression engine unit executes a brain activity index measurement algorithm that measures CEM values representing a brain activity index, thereby obtaining one or more CEM values from the subject's voice,
- the degree of brain fatigue is the average value of the one or more CEM values.
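The patent does not disclose the CEM measurement algorithm itself, so the sketch below simply assumes that some black-box measurement has already produced one CEM value per voice segment and shows only the averaging step that yields the brain fatigue level.

```python
# Averaging step only: the CEM measurement algorithm is not
# disclosed in the patent, so the input values here are assumed
# to come from an external black box (one value per voice segment).

def brain_fatigue_level(cem_values: list[float]) -> float:
    """Brain fatigue level = average of one or more CEM values."""
    if not cem_values:
        raise ValueError("at least one CEM value is required")
    return sum(cem_values) / len(cem_values)

print(round(brain_fatigue_level([0.42, 0.48, 0.45]), 2))  # -> 0.45
```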
- the pulse wave data is obtained by dividing the pulse wave measured by the pulse wave measuring device into sections, each of a predetermined time interval.
- for each section of the pulse wave, the emotion expression engine unit applies a Hamming window to the pulse wave within the section, and for the windowed pulse wave calculates the pulse interval PPI, which is the interval from one peak of the pulse wave to the next, together with the corresponding time,
- for each section of the pulse wave, the emotional expression engine unit creates a time-PPI graph in which points are plotted at the coordinates corresponding to the pulse interval PPI and the time, in a two-dimensional space with time on the horizontal axis and PPI on the vertical axis,
- the emotional expression engine unit interpolates between the discrete values in the time-PPI graph, applies a fast Fourier transform (FFT), and calculates the power spectral density (PSD) of the FFT result in the low-frequency band (the LF value) and the high-frequency band (the HF value), respectively.
- the stress level is based on at least one of the LF value, the HF value and the LF/HF value.
- the low frequency band is from 0.04 Hz to less than 0.15 Hz,
- the high frequency band is from 0.15 Hz to less than 0.4 Hz.
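The PPI-to-PSD pipeline above can be sketched as follows, assuming NumPy is available. This is a minimal illustration with an assumed 4 Hz resampling rate and a synthetic PPI series; it omits the per-section Hamming windowing and peak detection, and uses a plain FFT-based periodogram for the PSD.

```python
import numpy as np

# Resample the irregular time-PPI series onto a uniform grid, apply
# an FFT, and sum the power spectral density over the LF (0.04-0.15 Hz)
# and HF (0.15-0.4 Hz) bands. The 4 Hz resampling rate and the
# synthetic PPI values below are illustrative assumptions.

def lf_hf(t, ppi, fs=4.0):
    """Return (LF, HF, LF/HF) power from beat times t (s) and PPI (s)."""
    # Interpolate the discrete time-PPI points onto a uniform grid.
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    ppi_u = np.interp(grid, t, ppi)
    ppi_u = ppi_u - ppi_u.mean()            # remove the DC component
    # One-sided FFT and power spectral density.
    spec = np.fft.rfft(ppi_u)
    freq = np.fft.rfftfreq(len(ppi_u), d=1.0 / fs)
    psd = (np.abs(spec) ** 2) / (fs * len(ppi_u))
    lf = psd[(freq >= 0.04) & (freq < 0.15)].sum()
    hf = psd[(freq >= 0.15) & (freq < 0.4)].sum()
    return lf, hf, lf / hf

# Synthetic PPI series oscillating at 0.1 Hz (inside the LF band).
t = np.arange(0.0, 120.0, 0.8)               # beat times, ~75 bpm
ppi = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * t)
lf, hf, ratio = lf_hf(t, ppi)
print(ratio > 1.0)  # LF-dominant signal -> True
```

A higher LF/HF ratio is conventionally read as relative sympathetic dominance, which is why the claim bases the stress level on the LF, HF, or LF/HF values.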
- the facial expression image data includes at least a moving image of the subject's facial expression recorded continuously for at least a predetermined recording time during a video call with the subject via the terminal device.
- the emotional expression engine unit executes a facial expression recognition algorithm on the moving image of the subject's facial expression included in the facial expression image data, and counts each of the multiple emotional expressions recognized from the moving image,
- the emotional expression engine unit calculates a ratio for each of the plurality of emotional expressions, multiplies the ratio for each emotional expression by a predetermined weight assigned to that emotional expression, and thereby calculates a mood index for each emotional expression,
- the mood level is based on the value obtained by dividing the maximum mood index among the per-expression mood indexes by the total of the per-expression mood indexes.
- the plurality of emotional expressions are happy, surprise, neutral, fear, anger, disgust, and sadness.
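The count → ratio → weighted mood-index calculation can be sketched as below. The per-expression weights are hypothetical placeholders; the patent derives its weights from Russell's model but does not publish the actual numbers.

```python
# Sketch of the mood calculation: count recognized expressions,
# convert counts to ratios, weight each ratio, then take
# max(weighted index) / sum(weighted indexes). The WEIGHTS values
# are hypothetical placeholders, not the patent's actual weights.

EXPRESSIONS = ("happy", "surprise", "neutral", "fear",
               "anger", "disgust", "sadness")

WEIGHTS = {"happy": 1.0, "surprise": 0.8, "neutral": 0.6,
           "fear": 0.4, "anger": 0.3, "disgust": 0.2, "sadness": 0.1}

def mood_level(counts: dict[str, int]) -> float:
    """Mood level = max per-expression mood index / total mood index."""
    total = sum(counts.get(e, 0) for e in EXPRESSIONS)
    ratios = {e: counts.get(e, 0) / total for e in EXPRESSIONS}
    indexes = {e: WEIGHTS[e] * ratios[e] for e in EXPRESSIONS}
    return max(indexes.values()) / sum(indexes.values())

counts = {"happy": 60, "neutral": 30, "sadness": 10}
print(round(mood_level(counts), 3))
```

Because the result is a max over a sum of non-negative terms, the mood level always lies in (0, 1], with 1 meaning a single expression dominated the recording.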
- the data management unit acquires environmental data including at least temperature and humidity in addition to data relating to the subject's voice, facial expression image, and pulse wave.
- the emotion expression engine multiplies each of the brain fatigue level, the mood level, and the stress level by a weighting factor determined from the discomfort index calculated from the temperature and humidity included in the environmental data, thereby adjusting each of these values.
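As a sketch of this environmental adjustment: the widely used discomfort-index formula DI = 0.81T + 0.01H(0.99T - 14.3) + 46.3 (T in °C, H in %) is shown below, but the mapping from discomfort index to weighting factor, and the factor values themselves, are hypothetical assumptions; the patent does not disclose its coefficients.

```python
# Environmental adjustment sketch. The discomfort-index formula is
# the commonly used one; the weight() mapping and its 1.1 factor are
# hypothetical assumptions, not values from the patent.

def discomfort_index(temp_c: float, humidity_pct: float) -> float:
    """Common discomfort index: DI = 0.81T + 0.01H(0.99T - 14.3) + 46.3."""
    return 0.81 * temp_c + 0.01 * humidity_pct * (0.99 * temp_c - 14.3) + 46.3

def weight(di: float) -> float:
    """Hypothetical: amplify scores slightly in uncomfortable conditions."""
    if di >= 80 or di < 60:        # hot/humid or cold -> stronger effect
        return 1.1
    return 1.0

def adjust(fatigue, mood, stress, temp_c, humidity_pct):
    """Scale all three indices by the environment-derived weight."""
    w = weight(discomfort_index(temp_c, humidity_pct))
    return fatigue * w, mood * w, stress * w

di = discomfort_index(30.0, 80.0)       # hot, humid day
print(round(di, 1))                      # -> 82.9
f, m, s = adjust(50.0, 50.0, 50.0, 30.0, 80.0)
print(round(f, 1), round(m, 1), round(s, 1))  # -> 55.0 55.0 55.0
```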
- in addition to the data on the subject's voice, facial expression image, and pulse wave, the data management unit acquires medical interview data including the score of the subject's stress check result,
- the emotional expression engine multiplies each of the brain fatigue level, the mood level, and the stress level by a weighting factor determined according to the score included in the medical interview data, thereby adjusting each of these values.
- the information processing method is executed in a server connectable to a subject's terminal device via a network, and includes: a step of acquiring at least data on the subject's voice, facial expression image, and pulse wave from the terminal device; a step of calculating the brain fatigue level based on the frequency of the voice, extracting the subject's emotions from the facial expression image to calculate the mood level, and performing frequency analysis on the pulse wave by fast Fourier transform, extracting the high-frequency and low-frequency bands to calculate the stress level; and a step of displaying a graph in which a point is plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in a three-dimensional space consisting of X, Y, and Z axes.
- the audio data is obtained by recording at least part of a video call with the subject via the terminal device
- the data on the facial expression image is obtained by recording at least part of a video call with the subject via the terminal device,
- the pulse wave data is obtained by dividing the pulse wave measured by the pulse wave measuring device into sections, each of a predetermined time interval.
- the information processing system includes the information processing device and a terminal device that can access the information processing device via a network; the terminal device transmits at least the voice data, the facial expression image data, and the pulse wave data to the information processing device; the information processing device receives the voice data, the facial expression image data, and the pulse wave data, and calculates the brain fatigue level, the mood level, and the stress level based on the received data,
- a graph is then sent to the terminal device, which displays it with a point plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in a three-dimensional space consisting of X, Y, and Z axes.
- An embodiment of the information processing program according to the present invention is a program that, when executed by a computer, causes the computer to function as each unit of the information processing apparatus.
- Another embodiment of the information processing program according to the present invention is a program that, when executed by a computer, causes the computer to execute each step of the information processing method.
- the information processing system and the like according to the present invention use not only quantitative data such as pulse waves and heartbeats obtained from a pulse wave measuring device, but also stress check results and data such as the subject's voice and facial expression images captured during counseling via video call. By calculating stress level, brain fatigue level, and mood level values from these data and plotting them in a three-dimensional space consisting of X, Y, and Z axes, the system visualizes the subject's emotions and can support diagnosis and treatment. The subject's emotions can thus be analyzed with high accuracy, which enables early detection of mental disorders and can contribute to solving social issues.
- FIG. 1 is a block diagram showing an example hardware configuration of an information processing apparatus according to an embodiment of the present invention
- FIG. 1 is a block diagram showing the configuration of an information processing device according to one embodiment of the present invention
- FIG. 4 is a diagram showing an example of data stored in a user information database of the information processing apparatus shown in FIG. 3 in tabular form
- FIG. is a flowchart showing the flow of processing for collecting various data from the subject's terminal device.
- FIG. 4 is a diagram showing an example of a user interface for conducting a stress check using a questionnaire
- FIG. 10 is a diagram showing an example of a user interface prompting user registration after a stress check;
- FIG. 10 is a diagram showing an example of a screen displaying stress check results in the form of a radar chart;
- FIG. 10 is a diagram showing an example of a screen displaying comments about stress check results.
- FIG. 4 is a diagram showing how a pulse wave measuring device is used to measure a pulse wave and heart rate from a subject's fingertip.
- FIG. 10 is a diagram showing an example of a screen display of a subject's terminal device when the subject's pulse wave and heart rate are measured by a pulse wave measuring instrument.
- FIG. 10 is a diagram showing an example of a screen display for acquiring an image of a subject's facial expression from a video call between a counselor during counseling and the subject.
- FIG. 10 is a diagram showing an example of a screen display for acquiring a subject's voice from a video call between a counselor during counseling and the subject.
- FIG. 4 is a schematic diagram showing the configuration of an emotional expression engine section that calculates various indices representing brain fatigue, mood, and stress from various collected data.
- FIG. 4 is a diagram for explaining weighting determined based on Russell's circumplex model of affect;
- FIG. 4 is a diagram showing an example of calculating mood levels from various emotional expressions obtained based on Russell's circumplex model of affect;
- FIG. 4 is a diagram showing an example of numerical conversion for plotting the obtained values of various indexes representing brain fatigue, mood and stress in a three-axis space.
- FIG. 4 is a diagram showing an example of a graph display of a three-dimensional space consisting of X-, Y-, and Z-axes plotting changes in a subject's emotion obtained by an emotion expression engine in chronological order
- FIG. 10 is a diagram showing an example of a graph display of a three-dimensional space composed of X, Y, and Z axes in which changes in emotions of another subject obtained by the emotion expression engine are plotted in chronological order
- FIG. 4 is a diagram showing an example of classification categories by type defined in a three-dimensional space consisting of a tension axis (X-axis), a brain fatigue axis (Y-axis), and a mood axis (Z-axis).
- FIG. 1 shows an example of the configuration of an information processing system according to one embodiment of the present invention.
- An information processing system for visualizing the emotions of a subject illustratively includes an information processing device 10 and n (n is an arbitrary integer value equal to or greater than 1) terminal devices 20-n.
- terminal devices 20-1, 20-2 to 20-n are shown as n terminal devices.
- when these n terminal devices are described without distinction, reference numerals are partly omitted and they are simply referred to as "terminal devices 20".
- the information processing device 10 is, for example, a computer, such as a server, that is connectable to the network N.
- the terminal device 20 is a terminal connectable to the network N, such as a personal computer, a notebook computer, a smart phone, a mobile phone, or the like.
- the network N may be, for example, an open network such as the Internet, an intranet connected by a dedicated line, or a closed network.
- the network N is not limited to this, and a closed network and an open network can be used in combination, as appropriate, according to the required security level and the like.
- the information processing device 10 and the terminal device 20 are connected to the network N and can communicate with each other.
- the subject (user) can use the terminal device 20 to access the information processing device 10 and transmit the responses to the stress check questionnaire (interview sheet) to the information processing device 10 .
- the stress check questionnaire is, for example, a stress check questionnaire in the Ministry of Health, Labor and Welfare stress check implementation program (Non-Patent Document 1).
- the subject can also make a video call (video chat) with the counselor via the terminal device 20 in order to receive counseling from the counselor.
- the terminal device 20 can transmit data on the subject's pulse wave measured using a pulse wave measuring device to the information processing device 10 .
- a device that measures a pulse wave from a subject's fingertip can be used as the pulse wave measuring device (Non-Patent Document 2).
- the information processing device 10 can acquire at least data on the subject's voice, facial expression image, and pulse wave from the terminal device 20, and based on these data can calculate indices expressing the subject's emotions (mental state), such as the brain fatigue level, the mood level, and the stress level.
- FIG. 2 is a block diagram showing an example of the hardware configuration of an information processing device according to one embodiment of the present invention.
- the reference numerals corresponding to the hardware of the information processing apparatus 10 are written without parentheses.
- since the hardware configuration of the terminal device 20 is similar to that of the information processing device 10, the reference numerals corresponding to the hardware of the terminal device 20 are written in parentheses.
- the information processing device 10 is, for example, a server (computer), and illustratively includes a CPU (Central Processing Unit) 11, a memory 12 including a ROM (Read Only Memory) and a RAM (Random Access Memory), a bus 13, an input/output interface 14, an input unit 15, an output unit 16, a storage unit 17, and a communication unit 18.
- the CPU 11 executes various processes according to programs recorded in the memory 12 or programs loaded from the storage unit 17 to the memory 12 .
- the CPU 11 can execute, for example, a program for causing a server (computer) to function as an information processing device capable of visualizing the subject's emotions and supporting diagnosis and treatment. It is also possible to implement at least part of the functions of the information processing apparatus in the form of hardware such as an application specific integrated circuit (ASIC).
- the memory 12 also stores data necessary for the CPU 11 to execute various processes as appropriate.
- the CPU 11 and memory 12 are interconnected via a bus 13 .
- An input/output interface 14 is also connected to the bus 13 .
- An input unit 15 , an output unit 16 , a storage unit 17 and a communication unit 18 are connected to the input/output interface 14 .
- the input unit 15 can be realized by an input device, such as a keyboard and a mouse, that is independent of the main body housing the other parts of the information processing device 10, and various information can be input according to user operations. Note that the input unit 15 may also be configured with various buttons, a touch panel, a microphone, or the like.
- the output unit 16 is composed of a display, a speaker, etc., and outputs data related to text, still images, moving images, audio, and the like.
- the text data, still image data, moving image data, audio data, etc. output by the output unit 16 are output from a display, a speaker, or the like as characters, images, moving images, or audio so that the user can recognize them.
- the storage unit 17 is composed of a storage device such as a semiconductor memory (for example, DRAM (Dynamic Random Access Memory)), a solid state drive (SSD), or a hard disk, and can store various data.
- the communication unit 18 implements communication with other devices.
- the communication unit 18 can communicate with other devices (for example, the terminal devices 20-1, 20-2 to 20-n) via the network N.
- the information processing apparatus 10 is appropriately provided with a drive as necessary.
- Removable media such as magnetic disks, optical disks, magneto-optical disks, or semiconductor memories are installed in the drives as appropriate.
- the removable media store programs for realizing the function of visualizing the subject's emotions by calculating the subject's stress level, brain fatigue level, and mood level and plotting them in a three-dimensional space consisting of the X-axis, Y-axis, and Z-axis, as well as various data such as text data and image data.
- Programs and various data read from the removable medium by the drive are installed in the storage unit 17 as required.
- the terminal device 20 illustratively includes a CPU 21, a memory 22, a bus 23, an input/output interface 24, an input unit 25, an output unit 26, a storage unit 27, and a communication unit 28.
- each of these units has the same function as the unit of the same name in the above-described information processing device 10, differing only in reference numeral, so redundant description is omitted.
- the hardware included in the terminal device 20, together with a display and a speaker, may be realized as an integrated device.
- FIG. 3 is a block diagram showing the configuration of an information processing apparatus according to one embodiment of the present invention.
- when a server (computer) executes a program that acquires at least the subject's voice, facial expression image, and pulse wave data from the terminal device 20, calculates the brain fatigue level based on the frequency of the voice, extracts the subject's emotion from the facial expression image to calculate the mood level, performs frequency analysis on the pulse wave by fast Fourier transform and extracts the high-frequency and low-frequency bands to calculate the stress level, and displays a graph plotting points at the coordinates corresponding to the brain fatigue level, mood level, and stress level in a three-dimensional space consisting of the X-axis, Y-axis, and Z-axis,
- the server illustratively functions as the information processing device 10, including at least an emotional expression engine unit 111, a three-axis processing unit 112, and a data management unit 113, and
- the storage unit 17 can be made to function as the user information database 171.
- the user information database 171 may instead be configured as an external storage device separate from the information processing device 10; cloud storage, for example, can be used as the external storage device. Although the user information database 171 is described here as one storage device, it may be divided into two or more storage devices.
- the emotional expression engine unit 111 can calculate indexes of the subject's emotion using brain fatigue, mood, and stress. For example, the emotional expression engine unit 111 calculates the brain fatigue level based on the frequency of the voice, extracts the subject's emotion from the facial expression image to calculate the mood level, and performs frequency analysis on the pulse wave by fast Fourier transform, extracting the high-frequency and low-frequency bands to calculate the stress level.
- the three-axis processing unit 112 can generate a graph in which points are plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level calculated by the emotional expression engine unit 111, and display it on the information processing device 10 or the terminal device 20.
- the three-axis processing unit 112 can generate, for each date and time at which data on the subject's voice, facial expression image, and pulse wave was acquired, a graph in which points are plotted in chronological order at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in a three-dimensional space consisting of the X-axis, the Y-axis, and the Z-axis, and display it on the information processing device 10 or the terminal device 20.
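- the chronological plotting described above can be sketched as follows. This is a minimal illustration, assuming ISO date-time strings as keys and a helper named `trajectory_points` (both are assumptions for this example; the patent does not name any plotting library).

```python
def trajectory_points(samples):
    """samples: mapping of ISO date-time string -> (stress, brain_fatigue, mood).
    Returns X, Y, Z coordinate lists in chronological order, ready to hand
    to any 3-D plotting routine (e.g. matplotlib's Axes3D.plot)."""
    ordered = sorted(samples.items())  # ISO strings sort chronologically
    xs = [v[0] for _, v in ordered]
    ys = [v[1] for _, v in ordered]
    zs = [v[2] for _, v in ordered]
    return xs, ys, zs

# Example with two hypothetical measurement sessions:
xs, ys, zs = trajectory_points({
    "2022-02-15T10:00": (1.33, 373.755, 0.85),
    "2022-02-14T10:00": (1.50, 400.0, 0.60),
})
print(xs)  # → [1.5, 1.33]
```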
- the data management unit 113 can acquire at least data on the subject's voice, facial expression image, and pulse wave from the terminal device 20 and store the data in the storage means (for example, the user information database 171) of the information processing device 10.
- the data management unit 113 can store the data on the subject's voice, facial expression image, and pulse wave in the storage means (user information database 171) of the information processing device in association with the date and time when the data was acquired.
- FIG. 4 shows an example of data stored in the user information database of the information processing device shown in FIG. 3 in tabular form.
- table R1 stores a user ID, which is information for identifying a subject (user), in association with sex (male, female, or other) and age.
- the user information database 171 can store tables R2 and R3, for example, in association with user information in table R1.
- table R2 records, in association with date and time, the voice data, facial expression image data, pulse wave data, interview data including the subject's responses to the stress check, the subject's behavior, and the like received from the terminal device 20.
- the date and time included in table R2 are, for example, the date and time when the data on the subject's voice, facial expression image, and pulse wave was received from the terminal device 20, or the date and time when the subject accessed the information processing device 10 using the terminal device 20.
- table R3 stores the subject's stress level, brain fatigue level, and mood level calculated by the emotional expression engine unit 111 as the X-axis, Y-axis, and Z-axis values.
- Standardization of the date and time can be realized, for example, by converting the date and time into UNIX (registered trademark) time.
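- as a concrete illustration of the date-and-time standardization mentioned above, the following sketch converts an ISO-8601 date-time string to UNIX time; treating a timestamp without an offset as UTC is an assumption made only for this example.

```python
from datetime import datetime, timezone

def to_unix_time(iso_datetime: str) -> int:
    """Convert an ISO-8601 date-time string to UNIX time (seconds since
    1970-01-01 00:00:00 UTC), the standardized form mentioned above."""
    dt = datetime.fromisoformat(iso_datetime)
    if dt.tzinfo is None:
        dt = dt.replace(tzinfo=timezone.utc)  # assumption: treat naive times as UTC
    return int(dt.timestamp())

print(to_unix_time("2022-02-14T00:00:00+00:00"))  # → 1644796800
```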
- FIG. 5 is a flow chart showing the flow of processing for collecting various data from the subject's terminal device.
- the processing is executed by the terminal device 20, for example.
- the terminal device 20 collects interview data including the subject's responses to the stress check questionnaire and transmits the data to the information processing device 10 (step S1). In step S1, life log data recording the subject's life, activities, actions, and the like can also be collected together with the interview data and transmitted to the information processing device 10.
- the terminal device 20 displays the interview results on the screen (step S2). FIGS. 6 to 9 show screen displays of the terminal device 20 when the processes from step S1 to step S2 are executed.
- FIG. 6 shows an example of a user interface for conducting a stress check using a questionnaire, as displayed on the screen of the terminal device 20. Below the message "A. I would like to ask you about your job. Please select the most appropriate answer.", the question "1. I have to do a lot of work" is displayed. The subject can select one of the options "yes", "somewhat agree", "somewhat disagree", and "no" by a selection operation such as clicking or tapping. The same applies to the other questions. The number of questions can be, for example, the 57 items of the stress check questionnaire of the Ministry of Health, Labour and Welfare (Non-Patent Document 1).
- FIG. 7 shows an example of a user interface prompting user registration after the stress check. Following the message "The test is now complete. Register as a user and check your results.", the subject's information is registered by pressing the registration button.
- FIG. 8 shows an example of a screen displaying the results of the stress check as a radar chart.
- the radar chart shows that the closer to the center, the higher the subject's stress.
- FIG. 9 shows an example of a screen displaying comments about the stress check results.
- the terminal device 20 can display, on the upper half of the screen, a comment from the counselor (expert) according to the subject's interview results, such as "You seem to be in a somewhat high-stress state now."
- the terminal device 20 displays, in the lower half of the screen, a message such as "Registration for Step 2, which allows chat consultation with experts and stress measurement using a measuring instrument, has currently reached its upper limit. If you are interested in Step 2, please register from the registration page below." By performing a selection operation such as clicking or tapping "To user registration" at the bottom of the screen, the subject can be encouraged to register more detailed personal information. The subject's personal information transmitted from the terminal device 20 to the information processing device 10 is stored, for example, in the user information database by the data management unit 113.
- FIG. 10 shows how a pulse wave measuring device is used to measure a pulse wave and a heartbeat from a subject's fingertip.
- the pulse wave measuring device 30 includes a body portion 32 , a waveform display portion 34 provided on the body portion 32 , and a measurement portion 36 .
- by pressing a fingertip against the measurement unit 36 of the pulse wave measuring device 30, the subject's pulse wave and heart rate can be measured.
- a waveform based on the pulse wave and heart rate is displayed on the waveform display unit 34.
- the measurement time by the pulse wave measuring device 30 can be set to 180 seconds (3 minutes) per time, for example.
- the terminal device 20 can be communicatively connected to the pulse wave measuring device 30 and can receive the subject's pulse wave data from the pulse wave measuring device 30 .
- FIG. 11 shows an example of the screen display of the subject's terminal device when the subject's pulse wave and heart rate are measured by the pulse wave measuring device.
- the terminal device 20 can, for example, display a waveform based on the subject's pulse wave data on the screen, and can also display the pulse interval (Peak-to-Peak Interval: PPI), the low-frequency (LF) / high-frequency (HF) components of the pulse wave data, and the like.
- the terminal device 20 records at least part of the video call (video chat) with the counselor or expert to acquire data on the subject's facial expression image and data on the subject's voice (step S4).
- the subject's pulse wave can be continuously measured by the pulse wave measuring device in step S3, and the terminal device 20 can collect pulse wave data even during the video call.
- the terminal device 20 can collect environmental data including temperature, humidity, and the like and transmit it to the information processing device 10 (step S5).
- the processing of steps S4 and S5 can be performed by the information processing device 10 instead of the terminal device 20.
- FIG. 12 shows an example of a screen display for acquiring an image of a subject's facial expression from a video call between a counselor during counseling and the subject.
- the subject's facial expression is displayed on the upper side of the screen shown in FIG. 12, and the counselor is displayed on the lower side of the screen.
- the terminal device 20 or the information processing device 10 records the video call during counseling until at least a predetermined recording time (for example, 15 minutes) is reached.
- the data on the subject's facial expression image is data obtained by continuously recording a moving image of the subject's facial expression for at least a predetermined recording time during the video call with the subject via the terminal device 20.
- FIG. 13 shows an example of a screen display for acquiring the subject's voice from a video call between a counselor during counseling and the subject.
- fixed phrases for the subject to read aloud (for example, "Once upon a time, an old man and an old woman lived in a certain place. The old man went to the mountains.") are displayed at the top of the screen.
- the counselor is displayed at the bottom of the screen.
- the terminal device 20 or the information processing device 10 records the subject's reading of the fixed phrase until at least a predetermined recording time (for example, about 40 seconds) is reached.
- the data on the voice is data obtained by recording, for at least a predetermined recording time, the subject's voice reading aloud a predetermined fixed phrase displayed on the terminal device 20 during the video call with the subject via the terminal device 20.
- FIG. 14 is a schematic diagram showing the configuration of an emotional expression engine section that calculates various indexes representing brain fatigue, mood, and stress from various collected data.
- the emotional expression engine unit 111 can be functionally divided into an emotional expression core unit 111A and an integration unit 111B.
- the emotional expression core unit 111A can calculate the brain fatigue level, mood level, and stress level, which are quantitative indexes related to brain fatigue, mood, and stress.
- the emotional expression core unit 111A can normalize the date and time, calculate the discomfort index, and the like.
- the integration unit 111B adjusts the values of the brain fatigue level, the mood level, and the stress level calculated by the emotional expression core unit 111A by multiplying each of them by a weighting coefficient determined based on the discomfort index calculated from the temperature and humidity included in the environmental data.
- the integration unit 111B can also acquire interview data including the score of the subject's stress check results and adjust the values of the brain fatigue level, the mood level, and the stress level calculated by the emotional expression core unit 111A by multiplying each of them by a weighting coefficient determined according to the score included in the interview data.
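- the patent does not state which discomfort-index formula is used; the sketch below assumes the form commonly used in Japan (DI = 0.81T + 0.01H(0.99T − 14.3) + 46.3), and the mapping from discomfort index to weighting coefficient is purely hypothetical.

```python
def discomfort_index(temp_c: float, humidity_pct: float) -> float:
    """A widely used form of the discomfort index (assumption: the patent
    does not specify which formula the integration unit uses)."""
    return 0.81 * temp_c + 0.01 * humidity_pct * (0.99 * temp_c - 14.3) + 46.3

def adjust(values, weight):
    """Multiply each index (brain fatigue, mood, stress) by a weighting
    coefficient, as the integration unit does with the core unit's output."""
    return tuple(v * weight for v in values)

di = discomfort_index(25.0, 50.0)   # ≈ 71.775
# Hypothetical mapping from discomfort index to a weighting coefficient:
weight = 1.1 if di >= 80 else 1.0
print(adjust((373.755, 0.8, 1.33), weight))  # → (373.755, 0.8, 1.33)
```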
- the emotional expression engine unit 111 calculates the degree of brain fatigue in the emotional expression core unit 111A from the subject's voice data obtained as input data.
- the brain fatigue level is obtained by calculating the CEM (Cerebral Exponent Macro) value, which represents a brain activity index; a brain activity index measurement algorithm (the SiCECA algorithm, Non-Patent Document 3) can calculate the brain activity index (CEM value) from the voice.
- the emotional expression engine unit 111 can acquire one or more (for example, about 2 to 5) CEM values per subject from the voice data of the subject by executing the brain activity index measurement algorithm.
- the brain fatigue level corresponds, for example, to the average of one or more of these CEM values.
- Fig. 17 shows an example of numerical conversion for plotting the obtained values of various indexes representing brain fatigue, mood and stress on a three-axis space.
- in the example of FIG. 17, four CEM values of 431.08, 360.73, 342.76, and 360.45 are acquired by the brain activity index measurement algorithm, and the brain fatigue level is 373.755, the average of those CEM values.
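- the averaging step can be illustrated directly with the FIG. 17 values (a minimal sketch; the SiCECA algorithm that produces the CEM values is external and not reproduced here):

```python
def brain_fatigue_level(cem_values):
    """Brain fatigue level = average of the CEM values returned by the
    brain activity index measurement algorithm."""
    return sum(cem_values) / len(cem_values)

# The four CEM values from FIG. 17:
print(round(brain_fatigue_level([431.08, 360.73, 342.76, 360.45]), 3))  # → 373.755
```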
- the emotional expression engine unit 111 calculates the mood level in the emotional expression core unit 111A from the subject's facial expression image data acquired as input data.
- the mood level is determined according to the number of counts of a plurality of emotional expressions recognized from moving images of facial expressions of the subject based on a facial expression recognition algorithm.
- as the facial expression recognition algorithm, the algorithm of the open-source software "Face classification and detection" (Non-Patent Document 4) can be used.
- the emotional expression engine unit 111 can recognize a plurality of emotional expressions from the moving image of the subject's facial expression included in the data related to the facial expression image.
- the plurality of emotional expressions are, for example, happy, surprise, neutral, fear, angry, disgust, and sad in Russell's circumplex model of emotions (Non-Patent Document 5).
- the emotional expression engine unit 111 executes the facial expression recognition algorithm (for example, the open-source software "Face classification and detection") to recognize the subject's facial expression from the moving image included in the facial expression image data, and counts each of the plurality of emotional expressions recognized.
- the emotional expression engine unit 111 calculates a ratio for each of the plurality of emotional expressions and multiplies the predetermined weight for each emotional expression by its ratio to calculate a mood index for each emotional expression. The predetermined weights for the plurality of emotional expressions can be determined based on Russell's circumplex model of emotions as shown in FIG. 15, for example as shown in FIG. 16.
- FIG. 15 is a diagram for explaining the weighting determined based on Russell's circumplex model of emotions.
- FIG. 16 shows an example of calculating the mood level from the various emotional expressions obtained based on Russell's circumplex model of emotions. The predetermined weights are assigned in descending order of happy, surprise, neutral, fear, angry, disgust, and sad. Referring to FIG. 16, for example, the weighting factor for happy can be set to 100, surprise to 70, neutral to 50, and so on.
- among the ratios for the emotional expressions, neutral shows the highest ratio at 69.91001, followed by happy at 6.299213.
- multiplying the neutral ratio (F) of 69.91001 by its predetermined weighting factor (G) of 50 gives a mood index F×G value of 3495.501.
- similarly, multiplying the happy ratio of 6.299213 by its weighting factor of 100 gives a mood index F×G value of 629.9213.
- the emotional expression engine unit 111 multiplies a predetermined weight for each of the plurality of emotional expressions by the ratio of each of the plurality of emotional expressions for each emotional expression to calculate the mood index for each emotional expression.
- the emotional expression engine unit 111 can set the mood level to the value obtained by dividing the largest mood index among the mood indexes for the respective emotional expressions by the total of those mood indexes.
- in the example of FIG. 17, the neutral mood index of 3495.501 is the maximum mood index.
- since there is no value for disgust in this example, its display is omitted.
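- a minimal sketch of the mood-level calculation using the figures above: the weights for happy, surprise, and neutral come from FIG. 16, while the remaining weights and the partial set of ratios are illustrative assumptions.

```python
# Weights for happy/surprise/neutral follow FIG. 16; the rest are
# assumptions chosen only to complete the example.
WEIGHTS = {"happy": 100, "surprise": 70, "neutral": 50,
           "fear": 30, "angry": 20, "disgust": 10, "sad": 0}

def mood_level(ratios):
    """ratios: emotion -> share (%) of frames recognized as that expression.
    Mood index per emotion = ratio x weight; the mood level is the largest
    index divided by the sum of all indexes."""
    indexes = {e: r * WEIGHTS[e] for e, r in ratios.items()}
    return max(indexes.values()) / sum(indexes.values())

# Partial ratios from FIG. 17 (only neutral and happy are given there):
shares = {"neutral": 69.91001, "happy": 6.299213}
print(round(mood_level(shares), 4))  # → 0.8473
```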
- the emotional expression engine unit 111 calculates the stress level in the emotional expression core unit 111A from the data on the subject's pulse wave (heartbeat) obtained as input data.
- the pulse wave data is data obtained by dividing the pulse wave measured by the pulse wave measuring device 30 into sections each having a predetermined time interval (for example, 180 seconds).
- for each section of the pulse wave, the emotional expression engine unit 111 divides the pulse wave within the section by a Hamming window and, for the pulse wave within the Hamming window, calculates the pulse interval PPI, i.e., the interval from one pulse wave peak to the next, and the corresponding time.
- for each section of the pulse wave, the emotional expression engine unit 111 generates a time-PPI graph in which points are plotted at the coordinates corresponding to the pulse interval PPI and the time in a two-dimensional space with time on the horizontal axis and PPI on the vertical axis.
- the emotional expression engine unit 111 interpolates between the discrete values in the time-PPI graph (for example, by linear interpolation or cubic spline interpolation), applies the fast Fourier transform (FFT), and integrates the resulting power spectral density (PSD) over the low-frequency band and the high-frequency band, respectively, to calculate the LF value corresponding to the low-frequency component, and the HF value and the LF/HF value corresponding to the high-frequency component.
- the emotional expression engine unit 111 can use the LF/HF value (sympathetic nervous system index) as the stress level.
- the stress level can be based on at least one of the LF value, the HF value and the LF/HF value.
- the maximum value of LF/HF can be set to 2, for example, and the HF value can be used as a parasympathetic nervous system index.
- when a plurality of LF/HF values are obtained, their average is taken; in the example of FIG. 17, LF/HF values of 1.22 and 1.44 are calculated, and their average is 1.33.
- the maximum stress value (MAX) is inverted so that it becomes the minimum (MIN), converting the values onto a single axis relative to a standard value.
- the stress level axis is also referred to as the tension axis, the mood level axis as the mood axis, and the brain fatigue level axis as the brain fatigue axis.
- when the HF value is also used as the stress level, it is converted onto a single axis so that maximum stress is the minimum and maximum relaxation is the maximum, passing through neutral.
- the low frequency section can be from 0.04 Hz or more to less than 0.15 Hz, and the high frequency section can be from 0.15 Hz or more to less than 0.4 Hz.
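- the PPI-to-LF/HF pipeline described above can be sketched with NumPy as follows; the resampling rate and the periodogram PSD estimate are implementation choices not specified in the patent, and only the band limits (0.04-0.15 Hz and 0.15-0.4 Hz) come from the text.

```python
import numpy as np

def lf_hf(peak_times, ppi_values, fs=4.0):
    """Resample the discrete time-PPI series onto a uniform grid by linear
    interpolation, apply a Hamming window (as the text describes), take the
    FFT, and integrate the power spectral density over the LF (0.04-0.15 Hz)
    and HF (0.15-0.4 Hz) bands. fs is an assumed resampling rate."""
    t = np.arange(peak_times[0], peak_times[-1], 1.0 / fs)
    x = np.interp(t, peak_times, ppi_values)     # linear interpolation
    x = (x - x.mean()) * np.hamming(len(x))      # detrend + Hamming window
    psd = np.abs(np.fft.rfft(x)) ** 2 / (len(x) * fs)  # periodogram PSD
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = psd[(freqs >= 0.15) & (freqs < 0.4)].sum()
    return lf, hf, lf / hf

# Synthetic 180-second section: PPI oscillating at 0.1 Hz (inside the LF
# band), so LF should dominate and LF/HF should exceed 1.
times = np.arange(0.0, 180.0, 0.8)               # beat times, ~75 bpm
ppi = 800.0 + 50.0 * np.sin(2 * np.pi * 0.1 * times)
lf, hf, ratio = lf_hf(times, ppi)
print(ratio > 1.0)  # → True
```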
- the emotional expression engine unit 111 can acquire, as input data, the time (date and time), environmental data, and the like, in addition to the data on the subject's voice, facial expression image, and pulse wave (heartbeat).
- the time (date and time) is converted by the emotional expression core unit 111A into a standardized time (for example, UNIX time). The brain fatigue level, the mood level, and the stress level can then be multiplied by predetermined weighting coefficients in the integration unit 111B.
- FIG. 18 shows an example of a graph display of a three-dimensional space consisting of the X-axis, Y-axis, and Z-axis plotting changes in a subject's emotion obtained by the emotion expression engine in chronological order.
- FIG. 19 shows an example of a graph display of a three-dimensional space consisting of X, Y, and Z axes in which changes in emotions of another subject obtained by the emotion expression engine are plotted in chronological order.
- the three-axis processing unit 112 can generate and display a graph in which points are plotted at the coordinates corresponding to the brain fatigue level, mood level, and stress level in a three-dimensional space consisting of the X-axis, Y-axis, and Z-axis.
- the 3-axis processing unit 112 can also display a graph in which points are plotted chronologically at coordinates corresponding to the subject's brain fatigue level, mood level, and stress level for each date and time in a 3-dimensional space. This makes it possible to visualize changes in the subject's emotions, thereby realizing highly accurate analysis of the subject's emotions.
- such a three-dimensional graph display integrates the three axes of brain fatigue, mood, and stress with the time axis, allowing multidimensional analysis to be displayed in an easily understandable form.
- with radar charts and the like, diagnostic results are indicated, but correlations between factors are rarely shown.
- in contrast, the correlations among the results can be displayed in an easy-to-understand manner as shown in FIGS. 18 and 19.
- the axes of the three-dimensional graph display can be interchanged as appropriate according to the information the user wants to know.
- by measuring stress chronologically and classifying the results into type-specific categories (clusters) according to patterns (trends), prediction of future states becomes possible.
- FIG. 20 shows an example of type classification categories defined in a three-dimensional space consisting of a tension axis (X axis), a brain fatigue axis (Y axis), and a mood axis (Z axis).
- in FIG. 20, the lower left corner is the origin; the closer a point is to the origin, the higher the stress (X-axis), the higher the brain fatigue (Y-axis), and the lower the mood (Z-axis).
- in the example of the subject shown in FIG. 18, it can be confirmed that the subject's state is moving away from the origin, whereas in the example of another subject shown in FIG. 19, a state of high stress (X-axis), high brain fatigue (Y-axis), and depressed mood (Z-axis) continues; if there is a risk of mental illness, countermeasures can be considered in advance.
- the type classification categories defined in the three-dimensional space shown in FIG. 20 can be defined, for example, as shown in the table below.
- the three-dimensional space is divided into a plurality of type-specific classification categories, and the three-axis processing unit 112 can notify the subject's terminal device 20, the information processing device 10, or the like of the category to which the coordinate point corresponding to the subject's brain fatigue level, mood level, and stress level belongs.
- an improvement plan to be proposed to the subject is determined for each of the plurality of type-specific categories, and the emotional expression engine unit 111 can notify the subject's terminal device 20, the information processing device 10, or the like of the improvement plan for the category to which the coordinate point corresponding to the subject's brain fatigue level, mood level, and stress level in the three-dimensional space belongs.
- the emotional expression engine unit 111 can notify jogging, stretching, trekking, mindfulness, and yoga marked with a circle as improvement measures.
- the emotional expression engine unit 111 notifies stretching, yoga, and cognitive behavioral therapy marked with a circle as improvement measures.
- Appropriate improvement measures can be proposed according to the category by type.
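- the category lookup and per-category improvement proposal can be sketched as below; the axis thresholds and the plan table are illustrative assumptions, since the actual FIG. 20 category table is not reproduced in this excerpt.

```python
# Hypothetical plan table keyed by octant-style category; the real
# per-category improvement plans are defined in the patent's table.
PLANS = {
    (0, 0, 0): ["stretching", "yoga", "cognitive behavioral therapy"],
    (1, 1, 1): ["jogging", "trekking", "mindfulness"],
}

def category(stress, fatigue, mood, thresholds=(1.0, 375.0, 0.5)):
    """Map a (stress, fatigue, mood) point to an octant-style category.
    0 marks the 'bad' side of an axis (high stress, high fatigue, low mood);
    the threshold values are assumptions for illustration."""
    return (int(stress < thresholds[0]),
            int(fatigue < thresholds[1]),
            int(mood >= thresholds[2]))

cat = category(1.33, 373.755, 0.85)
print(cat, PLANS.get(cat, ["consult a specialist"]))
```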
- the information processing system and the like according to the present invention acquire not only quantitative data such as pulse waves and heartbeats obtained from pulse wave measuring devices, but also data such as stress check results and the subject's voice and facial expression images from counseling via video calls, calculate the stress level, brain fatigue level, and mood level from these data, and plot them in a three-dimensional space consisting of the X-, Y-, and Z-axes.
- the information processing system and the like can be used for a wide range of purposes, such as stress checks at companies, for individuals, and at schools, strengthening sports mentality, improving concentration during study, and mental measurement for entrance exams.
Description
an emotional expression engine unit that calculates the brain fatigue level based on the frequency of the voice, extracts the subject's emotion from the facial expression image to calculate the mood level, and performs frequency analysis on the pulse wave by fast Fourier transform, extracting the high-frequency band and the low-frequency band to calculate the stress level; and
a three-axis processing unit that displays a graph in which points are plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in a three-dimensional space consisting of the X-axis, the Y-axis, and the Z-axis,
wherein
the data on the voice is acquired by recording at least part of a video call with the subject via the terminal device,
the data on the facial expression image is acquired by video-recording at least part of a video call with the subject via the terminal device, and
the data on the pulse wave is acquired via the terminal device from a pulse wave measuring device that measures the subject's pulse wave.
The three-axis processing unit displays, in the three-dimensional space, a graph in which points are plotted in chronological order at the coordinates corresponding to the subject's brain fatigue level, mood level, and stress level for each date and time.
The three-axis processing unit notifies the category, among the plurality of type-specific classification categories in the three-dimensional space, to which the coordinate point corresponding to the subject's brain fatigue level, mood level, and stress level belongs.
The emotional expression engine unit notifies the improvement plan for the category to which the coordinate point corresponding to the subject's brain fatigue level, mood level, and stress level in the three-dimensional space belongs.
The brain fatigue level is an average value of the one or more CEM values.
For each section of the pulse wave, the emotional expression engine unit generates a time-PPI graph in which points are plotted at the coordinates corresponding to the pulse interval PPI and the time in a two-dimensional space with time on the horizontal axis and PPI on the vertical axis,
the emotional expression engine unit interpolates between the discrete values in the time-PPI graph, applies the fast Fourier transform FFT, and integrates the power spectral density PSD of the FFT result over the low-frequency band and the high-frequency band, respectively, to calculate the LF value corresponding to the low-frequency component, and the HF value and the LF/HF value corresponding to the high-frequency component, and
the stress level is based on at least one of the LF value, the HF value, and the LF/HF value.
The high-frequency band is from 0.15 Hz or more to less than 0.4 Hz.
The emotional expression engine unit calculates a ratio for each of the plurality of emotional expressions and multiplies the predetermined weight for each of the plurality of emotional expressions by the ratio for that emotional expression to calculate a mood index for each emotional expression, and
the mood level is based on a value obtained by dividing the largest mood index among the mood indexes for the respective emotional expressions by the total of the mood indexes for the respective emotional expressions.
The emotional expression engine adjusts the values of the brain fatigue level, the mood level, and the stress level by multiplying each of them by a weighting coefficient determined based on the discomfort index calculated from the temperature and humidity included in the environmental data.
The emotional expression engine adjusts the values of the brain fatigue level, the mood level, and the stress level by multiplying each of them by a weighting coefficient determined according to the score included in the interview data.
acquiring from the terminal device at least data on the subject's voice, facial expression image, and pulse wave;
calculating the brain fatigue level based on the frequency of the voice, extracting the subject's emotion from the facial expression image to calculate the mood level, and performing frequency analysis on the pulse wave by fast Fourier transform, extracting the high-frequency band and the low-frequency band to calculate the stress level; and
displaying a graph in which points are plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in the three-dimensional space consisting of the X-axis, the Y-axis, and the Z-axis,
wherein
the data on the voice is acquired by recording at least part of a video call with the subject via the terminal device,
the data on the facial expression image is acquired by video-recording at least part of a video call with the subject via the terminal device, and
the data on the pulse wave is acquired via the terminal device from a pulse wave measuring device that measures the subject's pulse wave.
the information processing device; and
a terminal device capable of accessing the information processing device via a network,
wherein
the terminal device transmits at least the data on the voice, the data on the facial expression image, and the data on the pulse wave to the information processing device, and
the information processing device receives the data on the voice, the data on the facial expression image, and the data on the pulse wave, transmits the brain fatigue level, the mood level, and the stress level calculated based on the received data to the terminal device, and causes the terminal device to display a graph in which points are plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in a three-dimensional space consisting of the X-axis, the Y-axis, and the Z-axis.
11, 21: CPU
12, 22: memory
13, 23: bus
14, 24: input/output interface
15, 25: input unit
16, 26: output unit
17, 27: storage unit
18, 28: communication unit
20, 20-1, 20-2, 20-n: terminal device
30: pulse wave measuring instrument
32: main body
34: waveform display unit
36: measuring unit
111: emotional expression engine unit
111A: emotional expression core unit
111B: stacking unit
112: three-axis processing unit
113: data management unit
171: user information database
Claims (31)
- An information processing device that is connected to a subject's terminal device and visualizes the subject's emotions, the device comprising:
a data management unit that acquires at least data on the subject's voice, facial expression images, and pulse wave;
an emotional expression engine unit that calculates a brain fatigue level based on the frequency of the voice, extracts the subject's emotions from the facial expression images to calculate a mood level, and performs frequency analysis on the pulse wave by fast Fourier transform, extracting a high-frequency band and a low-frequency band, to calculate a stress level; and
a three-axis processing unit that displays a graph in which points are plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in a three-dimensional space consisting of an X axis, a Y axis, and a Z axis,
wherein the data on the voice is acquired by recording the audio of at least part of a video call with the subject via the terminal device,
the data on the facial expression images is acquired by recording the video of at least part of a video call with the subject via the terminal device, and
the data on the pulse wave is acquired via the terminal device from a pulse wave measuring instrument that measures the subject's pulse wave.
- The information processing device according to claim 1, wherein the data management unit stores the data on the subject's voice, facial expression images, and pulse wave in storage means of the information processing device in association with the date and time at which the data was acquired, and
the three-axis processing unit displays a graph in which, for each date and time, points are plotted in chronological order at the coordinates corresponding to the subject's brain fatigue level, mood level, and stress level in the three-dimensional space.
- The information processing device according to claim 1 or 2, wherein the three-dimensional space is divided into a plurality of type-based classification categories, and
the three-axis processing unit notifies which of the plurality of type-based classification categories in the three-dimensional space contains the point at the coordinates corresponding to the subject's brain fatigue level, mood level, and stress level.
- The information processing device according to claim 3, wherein an improvement suggestion to be proposed to the subject is defined for each of the plurality of type-based categories, and
the emotional expression engine unit notifies the improvement suggestion for the category that contains the point at the coordinates corresponding to the subject's brain fatigue level, mood level, and stress level in the three-dimensional space.
- The information processing device according to any one of claims 1 to 4, wherein the data on the voice is data obtained by continuously recording, for at least a predetermined recording time, the subject reading aloud a predetermined fixed phrase displayed on the terminal device during a video call with the subject via the terminal device.
- The information processing device according to claim 5, wherein the emotional expression engine unit acquires one or more CEM values per subject from the data on the voice by executing a brain activity index measurement algorithm that measures a CEM value representing a brain activity index, and
the brain fatigue level is the average of the one or more CEM values.
- The information processing device according to any one of claims 1 to 6, wherein the data on the pulse wave is data in which the pulse wave measured by the pulse wave measuring instrument is divided into sections of a predetermined time interval.
- The information processing device according to claim 7, wherein, for each section of the pulse wave, the emotional expression engine unit divides the pulse wave within the section with a Hamming window and, for the pulse wave within the Hamming window, calculates the pulse-to-pulse interval PPI, which is the interval from one pulse peak to the next, and its time,
for each section of the pulse wave, the emotional expression engine unit generates a time-PPI graph in which points are plotted at the coordinates corresponding to the pulse-to-pulse interval PPI and the time, in a two-dimensional space with time on the horizontal axis and PPI on the vertical axis,
the emotional expression engine unit interpolates between the discrete values in the time-PPI graph, applies a fast Fourier transform (FFT), and integrates the power spectral density (PSD) of the FFT result over the low-frequency band and the high-frequency band, respectively, to calculate an LF value corresponding to the low-frequency component, an HF value corresponding to the high-frequency component, and an LF/HF value, and
the stress level is based on at least one of the LF value, the HF value, and the LF/HF value.
- The information processing device according to any one of claims 1 to 8, wherein the low-frequency band is from 0.04 Hz (inclusive) to 0.15 Hz (exclusive), and the high-frequency band is from 0.15 Hz (inclusive) to 0.4 Hz (exclusive).
- The information processing device according to any one of claims 1 to 9, wherein the data on the facial expression images is data obtained by continuously recording, for at least a predetermined recording time, a moving image of the subject's facial expressions during a video call with the subject via the terminal device.
- The information processing device according to claim 10, wherein the emotional expression engine unit counts each of a plurality of emotional expressions recognized from the moving image of the subject's facial expressions included in the data on the facial expression images by executing a facial expression recognition algorithm,
the emotional expression engine unit calculates the proportion of each of the plurality of emotional expressions and multiplies each proportion by a predetermined weight for that emotional expression to calculate a mood index for each emotional expression, and
the mood level is based on the value obtained by dividing the largest of the per-expression mood indices by the sum of the per-expression mood indices.
- The information processing device according to claim 11, wherein the plurality of emotional expressions are happy, surprise, neutral, fear, angry, disgust, and sad in Russell's circumplex model of affect.
- The information processing device according to any one of claims 1 to 12, wherein the data management unit acquires, in addition to the data on the subject's voice, facial expression images, and pulse wave, environmental data including at least temperature and humidity, and
the emotional expression engine adjusts the values of the brain fatigue level, the mood level, and the stress level by multiplying each of them by a weighting factor determined based on a discomfort index calculated from the temperature and humidity included in the environmental data.
- The information processing device according to any one of claims 1 to 12, wherein the data management unit acquires, in addition to the data on the subject's voice, facial expression images, and pulse wave, interview data including a score from the subject's stress-check result, and
the emotional expression engine adjusts the values of the brain fatigue level, the mood level, and the stress level by multiplying each of them by a weighting factor determined according to the score included in the interview data.
- An information processing method executed on a server connectable to a subject's terminal device via a network, the method comprising:
acquiring at least data on the subject's voice, facial expression images, and pulse wave from the terminal device;
calculating a brain fatigue level based on the frequency of the voice, extracting the subject's emotions from the facial expression images to calculate a mood level, and performing frequency analysis on the pulse wave by fast Fourier transform, extracting a high-frequency band and a low-frequency band, to calculate a stress level; and
displaying a graph in which points are plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in a three-dimensional space consisting of an X axis, a Y axis, and a Z axis,
wherein the data on the voice is acquired by recording the audio of at least part of a video call with the subject via the terminal device,
the data on the facial expression images is acquired by recording the video of at least part of a video call with the subject via the terminal device, and
the data on the pulse wave is acquired via the terminal device from a pulse wave measuring instrument that measures the subject's pulse wave.
- The information processing method according to claim 15, wherein acquiring the data on the subject's voice, facial expression images, and pulse wave includes storing that data in storage means of the server in association with the date and time at which the data was acquired, and
displaying the graph includes displaying a graph in which, for each date and time, points are plotted in chronological order at the coordinates corresponding to the subject's brain fatigue level, mood level, and stress level in the three-dimensional space.
- The information processing method according to claim 15 or 16, wherein the three-dimensional space is divided into a plurality of type-based classification categories, and
displaying the graph includes notifying which of the plurality of type-based classification categories in the three-dimensional space contains the point at the coordinates corresponding to the subject's brain fatigue level, mood level, and stress level.
- The information processing method according to claim 17, wherein an improvement suggestion to be proposed to the subject is defined for each of the plurality of type-based categories, and
calculating the brain fatigue level, the mood level, and the stress level includes notifying the improvement suggestion for the category that contains the point at the coordinates corresponding to the subject's brain fatigue level, mood level, and stress level in the three-dimensional space.
- The information processing method according to any one of claims 15 to 18, wherein the data on the voice is data obtained by continuously recording, for at least a predetermined recording time, the subject reading aloud a predetermined fixed phrase displayed on the terminal device during a video call with the subject via the terminal device.
- The information processing method according to claim 19, wherein calculating the brain fatigue level, the mood level, and the stress level includes acquiring one or more CEM values per subject from the data on the voice by executing a brain activity index measurement algorithm that measures a CEM value representing a brain activity index, and
the brain fatigue level is the average of the one or more CEM values.
- The information processing method according to any one of claims 15 to 20, wherein the data on the pulse wave is data in which the pulse wave measured by the pulse wave measuring instrument is divided into sections of a predetermined time interval.
- The information processing method according to claim 21, wherein calculating the brain fatigue level, the mood level, and the stress level includes:
for each section of the pulse wave, dividing the pulse wave within the section with a Hamming window and, for the pulse wave within the Hamming window, calculating the pulse-to-pulse interval PPI, which is the interval from one pulse peak to the next, and its time;
for each section of the pulse wave, generating a time-PPI graph in which points are plotted at the coordinates corresponding to the pulse-to-pulse interval PPI and the time, in a two-dimensional space with time on the horizontal axis and PPI on the vertical axis; and
interpolating between the discrete values in the time-PPI graph, applying a fast Fourier transform (FFT), and integrating the power spectral density (PSD) of the FFT result over the low-frequency band and the high-frequency band, respectively, to calculate an LF value corresponding to the low-frequency component, an HF value corresponding to the high-frequency component, and an LF/HF value,
wherein the stress level is based on at least one of the LF value, the HF value, and the LF/HF value.
- The information processing method according to any one of claims 15 to 22, wherein the low-frequency band is from 0.04 Hz (inclusive) to 0.15 Hz (exclusive), and the high-frequency band is from 0.15 Hz (inclusive) to 0.4 Hz (exclusive).
- The information processing method according to any one of claims 15 to 23, wherein the data on the facial expression images is data obtained by continuously recording, for at least a predetermined recording time, a moving image of the subject's facial expressions during a video call with the subject via the terminal device.
- The information processing method according to claim 24, wherein calculating the brain fatigue level, the mood level, and the stress level includes:
counting each of a plurality of emotional expressions recognized from the moving image of the subject's facial expressions included in the data on the facial expression images by executing a facial expression recognition algorithm; and
calculating the proportion of each of the plurality of emotional expressions and multiplying each proportion by a predetermined weight for that emotional expression to calculate a mood index for each emotional expression,
wherein the mood level is based on the value obtained by dividing the largest of the per-expression mood indices by the sum of the per-expression mood indices.
- The information processing method according to claim 25, wherein the plurality of emotional expressions are happy, surprise, neutral, fear, angry, disgust, and sad in Russell's circumplex model of affect.
- The information processing method according to any one of claims 15 to 26, wherein acquiring and storing the data on the subject's voice, facial expression images, and pulse wave includes acquiring, in addition to that data, environmental data including at least temperature and humidity, and
calculating the brain fatigue level, the mood level, and the stress level includes adjusting the values of the brain fatigue level, the mood level, and the stress level by multiplying each of them by a predetermined weighting factor determined based on a discomfort index calculated from the temperature and humidity included in the environmental data.
- The information processing method according to any one of claims 15 to 26, wherein acquiring and storing the data on the subject's voice, facial expression images, and pulse wave includes acquiring, in addition to that data, interview data including a score from the subject's stress-check result, and
calculating the brain fatigue level, the mood level, and the stress level includes adjusting the values of the brain fatigue level, the mood level, and the stress level by multiplying each of them by a weighting factor determined according to the score included in the interview data.
- An information processing system comprising:
the information processing device according to any one of claims 1 to 14; and
a terminal device that can access the information processing device via a network,
wherein the terminal device transmits at least the data on the voice, the data on the facial expression images, and the data on the pulse wave to the information processing device, and
the information processing device receives the data on the voice, the data on the facial expression images, and the data on the pulse wave, transmits the brain fatigue level, the mood level, and the stress level calculated based on the received data to the terminal device, and causes the terminal device to display a graph in which points are plotted at the coordinates corresponding to the brain fatigue level, the mood level, and the stress level in a three-dimensional space consisting of an X axis, a Y axis, and a Z axis.
- An information processing program that, when executed by a computer, causes the computer to function as each unit of the information processing device according to any one of claims 1 to 14.
- An information processing program that, when executed by a computer, causes the computer to perform each step of the information processing method according to any one of claims 15 to 28.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/277,691 US20240008785A1 (en) | 2021-02-17 | 2022-02-14 | Information processing system, information processing device, information processing method, and information processing program |
JP2023500826A JPWO2022176808A1 (ja) | 2021-02-17 | 2022-02-14 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-023407 | 2021-02-17 | ||
JP2021023407 | 2021-02-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022176808A1 true WO2022176808A1 (ja) | 2022-08-25 |
Family
ID=82930611
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/005712 WO2022176808A1 (ja) | 2021-02-17 | 2022-02-14 | 情報処理システム、情報処理装置、情報処理方法及び情報処理プログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240008785A1 (ja) |
JP (1) | JPWO2022176808A1 (ja) |
WO (1) | WO2022176808A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023074589A1 (ja) * | 2021-10-28 | 2023-05-04 | 株式会社センシング | 情報処理装置、情報処理システム、及びプログラム |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117958821B (zh) * | 2024-03-11 | 2024-07-23 | 中国人民解放军海军特色医学中心 | 一种用于情绪感知的探究实验*** |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013027570A (ja) * | 2011-07-28 | 2013-02-07 | Panasonic Corp | 心理状態評価装置、心理状態評価システム、心理状態評価方法およびプログラム |
US20130298044A1 (en) * | 2004-12-30 | 2013-11-07 | Aol Inc. | Mood-based organization and display of co-user lists |
JP2015009047A (ja) * | 2013-07-01 | 2015-01-19 | 国立大学法人長岡技術科学大学 | 感性状態判定装置、感性状態判定方法及び感性状態判定用コンピュータプログラム |
JP2019004924A (ja) * | 2017-06-20 | 2019-01-17 | 株式会社東芝 | システム及び方法 |
JP2019185230A (ja) * | 2018-04-04 | 2019-10-24 | 学校法人明治大学 | 会話処理装置、会話処理システム、会話処理方法及びプログラム |
JP2020074805A (ja) * | 2018-11-05 | 2020-05-21 | 株式会社安藤・間 | ドライバーの状態推定方法及び装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022176808A1 (ja) | 2022-08-25 |
US20240008785A1 (en) | 2024-01-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Carneiro et al. | New methods for stress assessment and monitoring at the workplace | |
Terzis et al. | Measuring instant emotions based on facial expressions during computer-based assessment | |
US10376197B2 (en) | Diagnosing system for consciousness level measurement and method thereof | |
Kaklauskas et al. | Web-based biometric computer mouse advisory system to analyze a user's emotions and work productivity | |
Nacke | An introduction to physiological player metrics for evaluating games | |
WO2022176808A1 (ja) | 情報処理システム、情報処理装置、情報処理方法及び情報処理プログラム | |
Harrison | The Emotiv mind: Investigating the accuracy of the Emotiv EPOC in identifying emotions and its use in an Intelligent Tutoring System | |
Cernea et al. | EEG-based measurement of subjective parameters in evaluations | |
Gervasi et al. | Applications of affective computing in human-robot interaction: State-of-art and challenges for manufacturing | |
US20130172693A1 (en) | Diagnosing system for consciousness level measurement and method thereof | |
Mattsson et al. | Evaluating four devices that present operator emotions in real-time | |
Andrisevic et al. | Detection of heart murmurs using wavelet analysis and artificial neural networks | |
Ahmadi et al. | Quantifying occupational stress in intensive care unit nurses: An applied naturalistic study of correlations among stress, heart rate, electrodermal activity, and skin temperature | |
Weiß et al. | Effects of image realism on the stress response in virtual reality | |
CN106175800A (zh) | 一种基于人体运动状态数据的心境状态量化评估方法 | |
JP2017224256A (ja) | 診断システム、診断方法、サーバ装置、及びプログラム | |
KR20090027027A (ko) | 인지적 자극 하에서 측정된 생체신호를 이용한 정신적질환분석방법 | |
JP4378455B2 (ja) | 心理状態測定装置 | |
KR100397188B1 (ko) | 실시간 종합 감성 평가 방법 및 그 시스템 | |
Xavier et al. | A Hybrid Evaluation Approach for the Emotional State of Information Systems Users. | |
US10820851B2 (en) | Diagnosing system for consciousness level measurement and method thereof | |
Jaiswal et al. | Effective assessment of cognitive load in real-world scenarios using wrist-worn sensor data | |
Kane et al. | Technological developments in assessment | |
Küster et al. | Measuring emotions online: Expression and physiology | |
Al-Musawi et al. | Implementation and user testing of a system for visualizing continuous health data and events |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22756126 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023500826 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18277691 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22756126 Country of ref document: EP Kind code of ref document: A1 |