CN112842266A - Sleep stage identification method based on human body monitoring sleep data - Google Patents

Sleep stage identification method based on human body monitoring sleep data

Info

Publication number
CN112842266A
CN112842266A (application CN202011639274.7A)
Authority
CN
China
Prior art keywords
sleep
data
stage
staging
unit time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011639274.7A
Other languages
Chinese (zh)
Other versions
CN112842266B (en)
Inventor
李杜
傅其祥
伍假真
吴文韬
彭浩堃
陈香丽
徐迪
黄容
李博雅
胡毅超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Zennze Technology Co ltd
Original Assignee
Hunan Dongsheng Nanxiang Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Dongsheng Nanxiang Intelligent Technology Co ltd filed Critical Hunan Dongsheng Nanxiang Intelligent Technology Co ltd
Priority to CN202011639274.7A
Priority claimed from CN202011639274.7A
Publication of CN112842266A
Application granted
Publication of CN112842266B
Legal status: Active (current)
Anticipated expiration

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/48: Other medical applications
    • A61B5/4806: Sleep evaluation
    • A61B5/4812: Detecting sleep stages or cycles

Abstract

The invention provides a sleep stage identification method based on human body monitoring sleep data, which comprises the following steps: step S1, training on sleep data to obtain the convergence point parameters required for non-contact monitoring of human sleep state data; step S2, using the trained convergence point parameters to identify and judge sleep stages, specifically comprising: step S21, recognizing the sleep state; step S22, recognizing the sleep stage; and step S23, forming the complete sleep cycle time. The invention adopts a novel and concise algorithm to divide the stages of the human sleep cycle and generate a complete sleep analysis report for evaluating sleep quality, thereby helping to solve clinical problems related to sleep.

Description

Sleep stage identification method based on human body monitoring sleep data
Technical Field
The invention relates to the technical field of big data applications and artificial intelligence, and in particular to a sleep stage identification method based on human body monitoring sleep data.
Background
In 1953, the studies of Eugene Aserinsky and Nathaniel Kleitman discovered the characteristic eye movements of sleep, and on that basis the two sleep concepts of rapid eye movement (REM) sleep and non-rapid eye movement (NREM) sleep were proposed. In 1957, Dement and Kleitman carried out complete overnight sleep studies, recorded electroencephalograms with periodic characteristics, and accordingly proposed criteria for dividing the sleep cycle. In 1968, Rechtschaffen and Anthony Kales divided human sleep into rapid eye movement sleep and non-rapid eye movement sleep, further subdivided into periods of falling asleep, light sleep, moderate sleep and deep sleep. Sleep research in China began in the 1970s; a large number of scholars have since entered the field, and sleep research has developed rapidly in China.
Disclosure of Invention
The invention provides a sleep stage identification method based on human body monitoring sleep data, which divides each stage of the human sleep cycle with a novel and simple algorithm and generates a complete sleep analysis report to evaluate sleep quality, thereby helping to solve clinical problems related to sleep.
In order to achieve the above object, the sleep stage identification method based on human body monitoring sleep data provided by the invention comprises the following steps:
step S1, training on sleep data to obtain the convergence point parameters required for non-contact monitoring of human sleep state data;
step S2, using the trained convergence point parameters to identify and judge sleep stages, specifically comprising the following steps:
step S21, sleep state recognition: judging whether the human body is in a sleep state within each unit time according to the real index data of the human sleep stages and the initial sleep state judgment rule;
step S22, sleep stage identification: processing the index data within the unit time and judging the sleep stage: a deep sleep stage, a light sleep stage, a rapid eye movement stage, or an awake stage;
step S23, forming the complete sleep cycle time: splicing the deep sleep, light sleep, rapid eye movement and awake periods obtained in step S22 in chronological order to form a complete sleep time sequence.
Preferably, step S1 specifically includes the following steps:
step S11, extracting the training data set required for non-contact monitoring of human sleep state data;
step S12, training on the sleep data according to the training data set and solving for the convergence point parameters required for non-contact monitoring of human sleep state data.
Preferably, the training data set in step S11 is specifically a manually and actively labeled uploaded historical data set, and the data set of attribute values x for the convergence point parameters includes data sets of the detection distance between the device and the human body, the respiration rate, the heart rate, and the signal intensity.
Preferably, step S11 specifically obtains historical, known, complete overnight sleep data, uploaded in full by the device as real sleep monitoring data, comprising the known complete overnight sleep data and known sleep stage standard data, where the known sleep stage standard data include the specific sleep times of sleep onset, wake-up, deep sleep, light sleep, and rapid eye movement;
step S12 specifically includes the following steps:
step S121, processing the sleep data according to the training data set of historical known complete overnight sleep data, and solving for the convergence point parameters required for non-contact monitoring of human sleep state data;
step S122, verifying the required convergence point parameters against the known sleep stage standard data to obtain a verification set.
Preferably, solving for the convergence point parameters required by the model in step S121 specifically includes the following steps:
step S1211, preprocessing data;
calculating the weight w of each attribute value x at each sleep time point in the training data set:
score = |x - max(x)| / std(x)
[weight formula, given in the original as an image, expressing w in terms of score, Σ score, max(score) and θ]
where θ = 0.00001,
std(x) is the standard deviation of the attribute value x, max(x) is the maximum of the attribute value x, score is the intermediate parameter for obtaining the weight, Σ score is the sum of the intermediate parameters over all attribute values x, and max(score) is the maximum of the intermediate parameters over all attribute values x;
calculating the clustering result point value Truth from each attribute value x and its weight w using the weighted median function WeightedMedian:
Truth = WeightedMedian(x, w)
each attribute value x corresponds to a weight; the weights are accumulated from the starting point of the data until 1/2 of the total is reached, and that attribute value x data point is selected as the convergence value Truth of the training data set;
step S1212, selecting the feature variable: calculating the standard deviation of the clustering result convergence point [formula given in the original as an image] to determine the error fluctuation range interval of the convergence point.
Preferably, step S22 specifically includes the following steps:
step S221, judging the sleep stage within each N-unit-time period: sliding a window over periods of N unit times and traversing them, judging the sleep stage within each N-unit-time period according to the sleep staging rules;
step S222, splicing the N-unit-time period data and updating the sleep stages: scoring the sleep stages over the N-unit-time periods according to the sleep state update rule and updating the sleep stages again.
Preferably, the unit time in step S21 is 1 s, and the N-unit-time period in step S221 is specifically a 60 s period (N = 60).
Preferably, the initial sleep state judgment rule in step S21 is specifically: if the detection distance between the device and the human body is less than (the corresponding convergence point + the convergence point standard deviation), the heart rate is greater than 0, the respiration rate is greater than 0, and the signal intensity is greater than -5, the unit time is judged to be in a sleep state; otherwise, the unit time is judged to be in a non-sleep state.
Preferably, the sleep staging rules in step S221 are specifically:
when the state is 'sleep' and the number of occurrences within the N-unit-time period is greater than or equal to N, the sleep stage is judged to be 'deep sleep';
when the state is 'sleep' and the number of occurrences within the N-unit-time period is less than N and greater than or equal to 2/3 N, the sleep stage is judged to be 'light sleep';
when the state is 'sleep' and the number of occurrences within the N-unit-time period is less than 2/3 N and greater than or equal to 1/2 N, the sleep stage is judged to be 'rapid eye movement';
when the state is 'sleep' and the number of occurrences within the N-unit-time period is less than 1/2 N and greater than or equal to 0, the sleep stage is judged to be 'awake'.
Preferably, the sleep state update rule in step S222 specifically includes the following steps:
step S2221, defining a set X as the four attribute values (detection distance, respiration rate, heart rate and signal intensity) corresponding to the deep sleep, light sleep and rapid eye movement stages;
step S2222, normalizing the four attribute values in the set X [normalization formula given in the original as an image];
step S2223, summing the standard deviations of the four normalized attributes:
score2 = std(detection distance) + std(respiration rate) + std(heart rate) + std(signal intensity)
where std(detection distance) is the standard deviation of the detection distance between the device and the human body, std(respiration rate) is the standard deviation of the respiration rate, std(heart rate) is the standard deviation of the heart rate, std(signal intensity) is the standard deviation of the signal intensity, and score2 is the sum of the four standard deviations;
step S2224, updating the sleep stage according to the standard deviation sum score2:
when in a sleep stage that is not 'awake' and score2 < t1, the updated sleep stage is 'deep sleep';
when in a sleep stage that is not 'awake' and score2 ≥ t1 and score2 < t2, the updated sleep stage is 'light sleep';
when in a sleep stage that is not 'awake' and score2 ≥ t2, the updated sleep stage is 'rapid eye movement';
where the sleep stage thresholds satisfy t1 < t2, and t1 and t2 are empirical values from historical training data.
The invention adopts a novel and simple algorithm to divide the stages of the human sleep cycle, dividing the overnight sleep data into the four states of awake, rapid eye movement, light sleep and deep sleep, and generates a complete sleep analysis report to evaluate sleep quality, thereby helping to solve clinical problems related to sleep.
Drawings
FIG. 1 is a schematic diagram of the sleep stage identification method based on human body monitoring sleep data according to the present invention;
FIG. 2 is a schematic diagram of obtaining the weighted convergence point value Truth according to a preferred embodiment of the sleep stage identification method based on human body monitoring sleep data according to the present invention;
FIG. 3 is a diagram illustrating sleep stage identification according to a preferred embodiment of the sleep stage identification method based on human body monitoring sleep data according to the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments.
In view of the above problems, the invention provides a sleep stage identification method based on human body monitoring sleep data, as shown in FIG. 1, comprising the following steps:
step S1, training on sleep data to obtain the convergence point parameters required for non-contact monitoring of human sleep state data;
step S2, using the trained convergence point parameters to identify and judge sleep stages.
Step S1 specifically includes the following steps:
step S11, extracting the training data set required for non-contact monitoring of human sleep state data; the training data set is specifically a manually and actively labeled uploaded historical data set, and the data set of attribute values x for the convergence point parameters comprises data sets of the detection distance between the device and the human body, the respiration rate, the heart rate and the signal intensity. For example: knowing that x is the time of going to sleep one night and y is the time of getting up the next day, the training data are the data from point 'xx:xx:xx' to point 'xx:xx:xx' uploaded by the device (which need to be extracted manually for training). The specific structure of the acquired data is as follows:
[table showing the structure of the acquired data, given in the original as an image]
The historical known complete overnight sleep data are obtained; these are uploaded in full by the device and constitute real sleep monitoring data, comprising the known complete overnight sleep data and known sleep stage standard data, where the known sleep stage standard data include the specific times of sleep onset, wake-up, deep sleep, light sleep and rapid eye movement;
step S12, training on the sleep data according to the training data set and solving for the convergence point parameters required for non-contact monitoring of human sleep state data;
step S12 specifically includes the following steps:
step S121, processing the sleep data according to the training data set of historical known complete overnight sleep data, and solving for the convergence point parameters required for non-contact monitoring of human sleep state data;
step S122, verifying the required convergence point parameters against the known sleep stage standard data to obtain a verification set.
Solving for the convergence point parameters required by the model in step S121 specifically includes the following steps:
step S1211, preprocessing data;
calculating the weight w of each attribute value x at each sleep time point in the training data set:
score = |x - max(x)| / std(x)
[weight formula, given in the original as an image, expressing w in terms of score, Σ score, max(score) and θ]
where θ = 0.00001,
std(x) is the standard deviation of the attribute value x, max(x) is the maximum of the attribute value x, score is the intermediate parameter for obtaining the weight, Σ score is the sum of the intermediate parameters over all attribute values x, and max(score) is the maximum of the intermediate parameters over all attribute values x;
calculating the clustering result point value Truth from each attribute value x and its weight w using the weighted median function WeightedMedian:
Truth = WeightedMedian(x, w)
each attribute value x corresponds to a weight; the total weight is computed first, the weights are then accumulated in order from the starting point, and when the running sum reaches 1/2 of the total, that attribute value x data point is selected as the convergence value Truth of the training data set;
For example, given the weight of each point in the data set, order the points:
[table of example points and their weights, given in the original as an image]
As shown in FIG. 2 and the table above, following the quicksort idea, a pivot number is found and the sequence is divided into left and right segments; according to the weight sums of the two segments, the search recurses into the left or right half. At the middle point w = 0.328 and f(v) = 0.550, and that point is the so-called 'true value'; that is, the convergence point Truth found is 3.
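Expressed as code, a minimal Python sketch of this training step follows. The score formula and the accumulate-to-half-the-total weighted median follow the text above; the weight normalization is an assumption, since the original weight formula survives only as an image, and all names (attribute_weights, weighted_median, the sample array x) are illustrative.

```python
import numpy as np

def attribute_weights(x: np.ndarray, theta: float = 0.00001) -> np.ndarray:
    # score = |x - max(x)| / std(x), as defined in step S1211.
    score = np.abs(x - x.max()) / x.std()
    # Assumed normalization into weights; the published formula (an image)
    # involves score, sum(score), max(score) and theta in some arrangement.
    return (score + theta) / (score.sum() + theta * len(score))

def weighted_median(x: np.ndarray, w: np.ndarray) -> float:
    # Sort by attribute value, accumulate weights from the starting point,
    # and return the first value whose cumulative weight reaches 1/2 of the total.
    order = np.argsort(x)
    cum = np.cumsum(w[order])
    idx = int(np.searchsorted(cum, 0.5 * cum[-1]))
    return float(x[order][idx])

x = np.array([1.0, 2.0, 3.0, 3.0, 3.0, 4.0, 9.0])  # illustrative samples
truth = weighted_median(x, attribute_weights(x))    # convergence point Truth
```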
Step S1212, selecting the feature variable: calculating the standard deviation of the clustering result convergence point [formula given in the original as an image] to determine the error fluctuation range interval of the convergence point.
The model construction in the preceding stage yields the parameters output by model training, namely the convergence point and the convergence point standard deviation. Because the data participating in training are real sleep data, the model's trained result is real index data for the human sleep stages, so these parameters can serve as an important basis for judging the sleep state (sleeping or non-sleeping). Next, the initial sleep state is judged at a granularity of 1 s, using the following rule to decide whether each 1 s granule is in a sleep state.
As shown in FIG. 3, step S2 specifically includes the following steps:
step S21, sleep state recognition: judging whether the human body is in a sleep state within each unit time according to the real index data of the human sleep stages and the initial sleep state judgment rule; the unit time is 1 s.
The initial sleep state judgment rule checks whether the model convergence point condition is satisfied, specifically: if the detection distance between the device and the human body is less than (the corresponding convergence point + the convergence point standard deviation), the heart rate is greater than 0, the respiration rate is greater than 0, and the signal intensity is greater than -5, the unit time is judged to be in a sleep state; otherwise, the unit time is judged to be in a non-sleep state.
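A minimal sketch of this per-second judgment, assuming one reading of the translated condition: the detection distance is compared against its trained convergence point plus one standard deviation, and the signal intensity against a fixed floor of -5. The function and parameter names are illustrative, not the patent's.

```python
def is_sleep_second(distance: float, respiration: float, heart_rate: float,
                    signal: float, dist_truth: float, dist_std: float) -> bool:
    """Judge one 1 s slot as 'sleep' or 'non-sleep' (step S21)."""
    return (distance < dist_truth + dist_std  # within the trained distance band
            and heart_rate > 0                # vital signs detected
            and respiration > 0
            and signal > -5)                  # assumed signal-intensity floor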
step S22, sleep stage identification: processing the index data results within the unit times and judging the sleep stage: a deep sleep stage, a light sleep stage, a rapid eye movement stage, or an awake stage.
Step S22 specifically includes the following steps:
step S221, judging the sleep stage within each N-unit-time period: sliding a window over periods of N unit times and traversing them, judging the sleep stage within each N-unit-time period according to the sleep staging rules; the N-unit-time period is a 60 s period (N = 60). That is, the per-second results of step S21 are traversed from the 1st second in windows of 60 s, and each window's sleep state is classified as deep sleep, light sleep, rapid eye movement or awake; each window is independent, and no data repeat or overlap between window intervals.
In this embodiment, the window slides in 60 s periods, and the sleep stage is determined by the number of sleep occurrences within a period.
The sleep staging rules are specifically:
when the state is 'sleep' and the number of occurrences within the N-unit-time period is greater than or equal to N, the sleep stage is judged to be 'deep sleep';
when the state is 'sleep' and the number of occurrences within the N-unit-time period is less than N and greater than or equal to 2/3 N, the sleep stage is judged to be 'light sleep';
when the state is 'sleep' and the number of occurrences within the N-unit-time period is less than 2/3 N and greater than or equal to 1/2 N, the sleep stage is judged to be 'rapid eye movement';
when the state is 'sleep' and the number of occurrences within the N-unit-time period is less than 1/2 N and greater than or equal to 0, the sleep stage is judged to be 'awake'. In this embodiment:

Rule | Sleep stage
State is 'sleep' and occurrences within 60 s ≥ 60 | Deep sleep
State is 'sleep' and occurrences within 60 s < 60 and ≥ 40 | Light sleep
State is 'sleep' and occurrences within 60 s < 40 and ≥ 30 | Rapid eye movement
State is 'sleep' and occurrences within 60 s < 30 and ≥ 0 | Awake
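A minimal sketch of this windowed staging rule, applying the N, 2/3 N and 1/2 N thresholds from the table to non-overlapping 60 s windows (function and variable names are illustrative):

```python
def stage_window(states: list, n: int = 60) -> str:
    # Count 1 s slots judged "sleep" and apply the N, 2/3 N, 1/2 N thresholds.
    sleep_count = sum(1 for s in states if s == "sleep")
    if sleep_count >= n:
        return "deep sleep"
    if sleep_count >= (2 * n) // 3:      # 40 when n = 60
        return "light sleep"
    if sleep_count >= n // 2:            # 30 when n = 60
        return "rapid eye movement"
    return "awake"

def stage_all(per_second: list, n: int = 60) -> list:
    # Windows are independent and non-overlapping, per step S221.
    return [stage_window(per_second[i:i + n], n)
            for i in range(0, len(per_second), n)]
```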
step S222, splicing the N-unit-time period data and updating the sleep stages: scoring the sleep stages over the N-unit-time periods according to the sleep state update rule and updating the sleep stages again.
A normalized weighted score analysis is performed on the preliminarily judged deep sleep, light sleep and rapid eye movement states to update the sleep state accurately: each sleep stage obtained above is extracted, a scoring function computes the sleep data score under that stage, and the sleep states (deep sleep, light sleep, rapid eye movement) are updated again according to the score.
In this embodiment, the 60 s period data are spliced into sleep stages, and each stage's data are re-judged and updated, for example: deep sleep is changed to light sleep, or light sleep to rapid eye movement.
The sleep state update rule specifically includes the following steps:
step S2221, defining a set X as the four attribute values (detection distance, respiration rate, heart rate and signal intensity) corresponding to the deep sleep, light sleep and rapid eye movement stages;
step S2222, normalizing the four attribute values in the set X [normalization formula given in the original as an image];
step S2223, summing the standard deviations of the four normalized attributes:
score2 = std(detection distance) + std(respiration rate) + std(heart rate) + std(signal intensity)
where std(detection distance) is the standard deviation of the detection distance between the device and the human body, std(respiration rate) is the standard deviation of the respiration rate, std(heart rate) is the standard deviation of the heart rate, std(signal intensity) is the standard deviation of the signal intensity, and score2 is the sum of the four standard deviations;
step S2224, updating the sleep stage according to the standard deviation sum score2:
when in a sleep stage that is not 'awake' and score2 < t1, the updated sleep stage is 'deep sleep';
when in a sleep stage that is not 'awake' and score2 ≥ t1 and score2 < t2, the updated sleep stage is 'light sleep';
when in a sleep stage that is not 'awake' and score2 ≥ t2, the updated sleep stage is 'rapid eye movement';
where the sleep stage thresholds satisfy t1 < t2, and t1 and t2 are empirical values from historical training data. In this embodiment:
Rule | Sleep stage update
Sleep stage not 'awake' and score2 < 0.15 | Deep sleep
Sleep stage not 'awake' and score2 ≥ 0.15 and score2 < 0.2 | Light sleep
Sleep stage not 'awake' and score2 ≥ 0.2 | Rapid eye movement
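A minimal sketch of the update step under stated assumptions: min-max normalization stands in for the formula that survives only as an image, and t1 = 0.15, t2 = 0.2 are the embodiment's empirical thresholds.

```python
import numpy as np

T1, T2 = 0.15, 0.2  # empirical thresholds from historical training data

def _minmax(v: np.ndarray) -> np.ndarray:
    # Assumed normalization; the patent's formula is given only as an image.
    rng = v.max() - v.min()
    return (v - v.min()) / rng if rng > 0 else np.zeros_like(v)

def update_stage(stage: str, distance, respiration, heart_rate, signal) -> str:
    """Re-label one non-'awake' window from score2 (steps S2221-S2224)."""
    if stage == "awake":
        return stage
    # score2 = sum of the standard deviations of the four normalized attributes.
    score2 = sum(_minmax(np.asarray(v, dtype=float)).std()
                 for v in (distance, respiration, heart_rate, signal))
    if score2 < T1:
        return "deep sleep"
    if score2 < T2:
        return "light sleep"
    return "rapid eye movement"
```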
step S23, forming the complete sleep cycle time: splicing the deep sleep, light sleep, rapid eye movement and awake periods obtained in step S22 in chronological order to form a complete sleep time sequence.
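A minimal illustrative sketch of this splice, assuming each window is identified by its start offset in seconds:

```python
def splice(stages: list, start_second: int = 0, n: int = 60) -> list:
    # Attach each 60 s window's stage to its start offset, in time order.
    return [(start_second + i * n, stage) for i, stage in enumerate(stages)]
```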
The invention adopts a novel and simple algorithm to divide the stages of the human sleep cycle, dividing the overnight sleep data into the four states of awake, rapid eye movement, light sleep and deep sleep, and generates a complete sleep analysis report to evaluate sleep quality, thereby helping to solve clinical problems related to sleep.
The invention provides a sleep stage identification method based on human body monitoring sleep data, i.e. a sleep data analysis method for sleep stage identification. The sleep stage identification algorithm first extracts historical known complete overnight sleep data and trains on them to obtain the convergence point required by the model; it then repeatedly verifies the model against data from real known sleep stages (sleep report indexes such as sleep start and end times and the deep sleep, light sleep, rapid eye movement and awake periods) to obtain the most reasonable and accurate convergence point parameters; finally, the trained convergence point and model are used to process the 24-hour sleep monitoring data uploaded by the device and generate a complete sleep stage time sequence.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A sleep stage identification method based on human body monitoring sleep data, characterized by comprising the following steps:
step S1, training on sleep data to obtain the convergence point parameters required for non-contact monitoring of human sleep state data;
step S2, using the trained convergence point parameters to identify and judge sleep stages, specifically comprising the following steps:
step S21, sleep state recognition: judging whether the human body is in a sleep state within each unit time according to the real index data of the human sleep stages and the initial sleep state judgment rule;
step S22, sleep stage identification: processing the index data within the unit time and judging the sleep stage: a deep sleep stage, a light sleep stage, a rapid eye movement stage, or an awake stage;
step S23, forming the complete sleep cycle time: splicing the deep sleep, light sleep, rapid eye movement and awake periods obtained in step S22 in chronological order to form a complete sleep time sequence.
2. The sleep stage identification method based on human body monitoring sleep data as claimed in claim 1, wherein step S1 specifically includes the following steps:
step S11, extracting the training data set required for non-contact monitoring of human sleep state data;
step S12, training on the sleep data according to the training data set and solving for the convergence point parameters required for non-contact monitoring of human sleep state data.
3. The sleep stage identification method based on human body monitoring sleep data as claimed in claim 2, wherein the training data set in step S11 is specifically a manually and actively labeled uploaded historical data set, and the data set of attribute values x for the convergence point parameters includes data sets of the detection distance between the device and the human body, the respiration rate, the heart rate, and the signal intensity.
4. The sleep stage identification method based on human body monitoring sleep data as claimed in claim 3, wherein step S11 specifically obtains historical, known, complete overnight sleep data, uploaded in full by the device as real sleep monitoring data, comprising the known complete overnight sleep data and known sleep stage standard data, where the known sleep stage standard data include the specific sleep times of sleep onset, wake-up, deep sleep, light sleep, and rapid eye movement;
step S12 specifically includes the following steps:
step S121, processing the sleep data according to the training data set of historical known complete overnight sleep data, and solving for the convergence point parameters required for non-contact monitoring of human sleep state data;
step S122, verifying the required convergence point parameters against the known sleep stage standard data to obtain a verification set.
5. The sleep stage identification method based on human body monitoring sleep data as claimed in claim 4, wherein solving for the convergence point parameters required by the model in step S121 specifically includes the following steps:
step S1211, preprocessing data;
calculating the weight w of each attribute value x at each sleep time point in the training data set:
score = |x - max(x)| / std(x)
[weight formula, given in the original as an image, expressing w in terms of score, Σ score, max(score) and θ]
where θ = 0.00001,
std(x) is the standard deviation of the attribute value x, max(x) is the maximum of the attribute value x, score is the intermediate parameter for obtaining the weight, Σ score is the sum of the intermediate parameters over all attribute values x, and max(score) is the maximum of the intermediate parameters over all attribute values x;
calculating the clustering result point value Truth from each attribute value x and its weight w using the weighted median function WeightedMedian:
Truth = WeightedMedian(x, w)
each attribute value x corresponds to a weight; the weights are accumulated from the starting point of the data until 1/2 of the total is reached, and that attribute value x data point is selected as the convergence value Truth of the training data set;
step S1212, selecting the feature variable: calculating the standard deviation of the clustering result convergence point [formula given in the original as an image] to determine the error fluctuation range interval of the convergence point.
6. The sleep stage identification method based on human body monitoring sleep data as claimed in claim 3, wherein step S22 specifically comprises the following steps:
step S221, judging the sleep stage within each N-unit-time period: sliding a window over periods of N unit times and traversing them, judging the sleep stage within each N-unit-time period according to the sleep staging rules;
step S222, splicing the N-unit-time period data and updating the sleep stages: scoring the sleep stages over the N-unit-time periods according to the sleep state update rule and updating the sleep stages again.
7. The sleep stage identification method based on human body monitoring sleep data as claimed in claim 6, wherein the unit time in step S21 is 1 s, and the N-unit-time period in step S221 is specifically a 60 s period (N = 60).
8. The sleep stage identification method based on human body monitoring sleep data as claimed in claim 5, wherein the initial sleep state judgment rule in step S21 is specifically: if the detection distance between the device and the human body is less than (the corresponding convergence point + the convergence point standard deviation), the heart rate is greater than 0, the respiration rate is greater than 0, and the signal intensity is greater than -5, the unit time is judged to be in a sleep state; otherwise, the unit time is judged to be in a non-sleep state.
9. The sleep stage identification method based on human body monitoring sleep data as claimed in claim 6, wherein the sleep staging rules in step S221 are specifically:
when the state is 'sleep' and the number of occurrences within the N-unit-time period is greater than or equal to N, the sleep stage is judged to be 'deep sleep';
when the state is 'sleep' and the number of occurrences within the N-unit-time period is less than N and greater than or equal to 2/3 N, the sleep stage is judged to be 'light sleep';
when the state is 'sleep' and the number of occurrences within the N-unit-time period is less than 2/3 N and greater than or equal to 1/2 N, the sleep stage is judged to be 'rapid eye movement';
when the state is 'sleep' and the number of occurrences within the N-unit-time period is less than 1/2 N and greater than or equal to 0, the sleep stage is judged to be 'awake'.
10. The sleep stage identification method based on human body monitoring sleep data as claimed in claim 9, wherein the sleep state update rule in step S222 specifically includes the following steps:
step S2221, defining a set X as the four attribute values (detection distance, respiration rate, heart rate and signal intensity) corresponding to the deep sleep, light sleep and rapid eye movement stages;
step S2222, normalizing the four attribute values in the set X [normalization formula given in the original as an image];
step S2223, summing the standard deviations of the four normalized attributes:
score2 = std(detection distance) + std(respiration rate) + std(heart rate) + std(signal intensity)
where std(detection distance) is the standard deviation of the detection distance between the device and the human body, std(respiration rate) is the standard deviation of the respiration rate, std(heart rate) is the standard deviation of the heart rate, std(signal intensity) is the standard deviation of the signal intensity, and score2 is the sum of the four standard deviations;
step S2224, updating the sleep stage according to the standard deviation sum score2:
when in a sleep stage that is not 'awake' and score2 < t1, the updated sleep stage is 'deep sleep';
when in a sleep stage that is not 'awake' and score2 ≥ t1 and score2 < t2, the updated sleep stage is 'light sleep';
when in a sleep stage that is not 'awake' and score2 ≥ t2, the updated sleep stage is 'rapid eye movement';
where the sleep stage thresholds satisfy t1 < t2, and t1 and t2 are empirical values from historical training data.
CN202011639274.7A 2020-12-31 Sleep stage identification method based on human body monitoring sleep data Active CN112842266B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011639274.7A CN112842266B (en) 2020-12-31 Sleep stage identification method based on human body monitoring sleep data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011639274.7A CN112842266B (en) 2020-12-31 Sleep stage identification method based on human body monitoring sleep data

Publications (2)

Publication Number Publication Date
CN112842266A true CN112842266A (en) 2021-05-28
CN112842266B CN112842266B (en) 2024-05-14




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20240410
Address after: 410000 Hotel and Apartment 501, Phase II, Xianglu International Garden, 61 Lufeng Road, High-tech Development Zone, Changsha City, Hunan Province
Applicant after: HUNAN ZENNZE TECHNOLOGY CO.,LTD. (China)
Address before: 618, building 1, Xiangyu wisdom, 579 Chezhan North Road, Dongfeng Road Street, Kaifu District, Changsha City, Hunan Province, 410000
Applicant before: Hunan Dongsheng Nanxiang Intelligent Technology Co.,Ltd. (China)
GR01 Patent grant