US20170055898A1 - Determining Sleep Stages and Sleep Events Using Sensor Data - Google Patents


Info

Publication number
US20170055898A1
Authority
US
United States
Prior art keywords
sleep
heart rate
data
sleep stage
time period
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/249,108
Other versions
US10321871B2 (en)
Inventor
Amrit Bandyopadhyay
Gilmer Blankenship
Raghu Upender
Madhvi Upender
Chris Giles
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Awarables Inc
Original Assignee
Awarables Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Awarables Inc filed Critical Awarables Inc
Priority to US15/249,108 (granted as US10321871B2)
Assigned to Awarables, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GILES, CHRIS; BANDYOPADHYAY, AMRIT; BLANKENSHIP, GILMER; UPENDER, MADHVI; UPENDER, RAGHU
Publication of US20170055898A1
Application granted
Publication of US10321871B2
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/48: Other medical applications
    • A61B 5/4806: Sleep evaluation
    • A61B 5/4812: Detecting sleep stages or cycles
    • A61B 5/0002: Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B 5/0015: Remote monitoring of patients using telemetry characterised by features of the telemetry system
    • A61B 5/0022: Monitoring a patient using a global network, e.g. telephone networks, internet
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/0205: Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B 5/024: Detecting, measuring or recording pulse rate or heart rate
    • A61B 5/02405: Determining heart rate variability
    • A61B 5/02438: Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
    • A61B 5/0402
    • A61B 5/08: Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B 5/0816: Measuring devices for examining respiratory frequency
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118: Determining activity level
    • A61B 5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/30: Input circuits therefor
    • A61B 5/303: Patient cord assembly, e.g. cable harness
    • A61B 5/316: Modalities, i.e. specific diagnostic methods
    • A61B 5/318: Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B 5/346: Analysis of electrocardiograms
    • A61B 5/349: Detecting specific parameters of the electrocardiograph cycle
    • A61B 5/352: Detecting R peaks, e.g. for synchronising diagnostic apparatus; Estimating R-R interval
    • A61B 5/4818: Sleep apnoea
    • A61B 5/486: Bio-feedback
    • A61B 5/68: Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801: Arrangements specially adapted to be attached to or worn on the body surface
    • A61B 5/6802: Sensor mounted on worn items
    • A61B 5/6804: Garments; Clothes
    • A61B 5/681: Wristwatch-type devices
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/7246: Details of waveform analysis using correlation, e.g. template matching or determination of similarity
    • A61B 5/7253: Details of waveform analysis characterised by using transforms
    • A61B 5/7257: Details of waveform analysis using Fourier transforms
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267: Classification of physiological signals or data involving training the classification device
    • A61B 5/7271: Specific aspects of physiological measurement analysis
    • A61B 5/7278: Artificial waveform generation or derivation, e.g. synthesising signals from measured signals
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/742: Details of notification to user using visual displays
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20: ICT for computer-aided diagnosis, e.g. based on medical expert systems
    • A61B 2503/00: Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/12: Healthy persons not otherwise provided for, e.g. subjects of a marketing survey
    • A61B 2560/00: Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B 2560/02: Operational features
    • A61B 2560/0242: Operational features adapted to measure environmental factors, e.g. temperature, pollution
    • A61B 2562/00: Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/02: Details of sensors specially adapted for in-vivo measurements
    • A61B 2562/0219: Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B 2562/06: Arrangements of multiple sensors of different types
    • A61B 5/02416: Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
    • A61B 5/0826: Detecting or evaluating apnoea events
    • A61B 5/369: Electroencephalography [EEG]
    • A61B 5/40: Detecting, measuring or recording for evaluating the nervous system
    • A61B 5/4058: Evaluating the central nervous system
    • A61B 5/4064: Evaluating the brain
    • A61B 5/6813: Sensors specially adapted to be attached to a specific body part
    • A61B 5/6823: Trunk, e.g. chest, back, abdomen, hip
    • A61B 5/6887: Sensors mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6898: Portable consumer electronic devices, e.g. music players, telephones, tablet computers

Definitions

  • the present disclosure relates to systems to monitor and analyze the quality and quantity of a person's sleep.
  • a person's sleep can be assessed with a polysomnogram (PSG), which is a multi-channel procedure carried out in a sleep laboratory. Typically, the procedure requires labor-intensive technician support, resulting in an expensive process.
  • PSG polysomnogram
  • the studies are typically performed for a single night in a Sleep Laboratory and sometimes also during the day to study daytime sleepiness, e.g., with a Multiple Sleep Latency Test (MSLT).
  • MSLT Multiple Sleep Latency Test
  • the results of the sleep study are primarily (i) indices related to apnea events, such as an Apnea-Hypopnea Index (AHI), and (ii) sleep staging outputs that indicate the stages of sleep that occurred.
  • AHI Apnea-Hypopnea Index
  • a typical PSG requires multiple EEG (electroencephalogram) channels, an EOG (electrooculogram), an EKG (electrocardiogram), an EMG (electromyography), and analysis of data from other sensors.
  • EEG electroencephalogram
  • EOG electrooculogram
  • EKG electrocardiogram
  • EMG electromyography
  • a computer-implemented method includes obtaining sensor data from biometric and environmental sensors, and using sensor fusion to predict stages of sleep, detect sleep events, determine sleep metrics and score the sleep sessions.
  • sleep stages can be determined from heart rate measurements and inertial sensors.
  • the techniques described in the present application can extend the benefits of sleep analysis with clinically validated mechanisms, achieving results similar to those obtained in a sleep lab, in the comfort of the home environment, over multiple nights.
  • the home environment is the normal sleep environment of the subject, providing a more realistic assessment of the subject's sleep behavior. Recording over several nights permits a more accurate analysis that can be used to extract long-term sleep patterns.
  • the systems described herein can use a variety of techniques to analyze sleep-related sensor data and estimate sleep stages and sleep events from the data.
  • the stages of a person's sleep can be determined using, for example, one or more of heart rate data, heart rate variability (HRV) data, and sensor data indicating motion.
  • HRV heart rate variability
  • sleep stages including REM, slow wave sleep or deep sleep (N3), light sleep (N1, N2) and Wake are estimated using processes including but not limited to sensor signal pattern matching, learning, HRV frequency analysis, sensor fusion, approximate entropy detection, and/or rule-based decision making.
  • the processing techniques discussed herein can address several technical challenges in the implementation of a small wearable body data recorder that is not tethered to external computers.
  • the processing techniques discussed herein allow the small, battery-operated body data recorder to produce analysis results that can effectively identify sleep stages while minimizing power consumption, network communication bandwidth, and computation while maintaining a high degree of accuracy in assigning sleep stages.
  • the body data recorder may communicate with other devices over a low-power or low-bandwidth wireless connection, such as Bluetooth, which can significantly constrain the amount of data that can be reliably transmitted.
  • the body data recorder may be designed to avoid wired connections with outside devices to avoid inconveniencing the subject, since wires connecting to other devices could potentially alter the subject's behavior and negatively affect the measurements. Further, transferring significant amounts of data is power-inefficient for the battery-operated body data recorder. By performing at least the initial processing of EKG and heartbeat data locally within the wearable body data recorder, the amount of data that needs to be stored or transmitted and the amount of power consumed during transmission are minimized. Rather than storing or sending data for every EKG sample, the processing techniques discussed herein allow the results of beat detection, frequency analysis, and approximate entropy calculations, among others, to be stored and transmitted instead. The body data recorder may send this data for further processing, e.g., by a mobile device or a remote computer system.
  • the body data recorder itself completes the processing discussed here to generate sleep stage labels.
  • the use of a trained classifier, generated histograms, and the other values described provides an accurate technique for determining sleep stages while also providing computational efficiency that allows the techniques to be implemented on a small, power- and processing-constrained device.
  • the estimation of sleep stages is made by a sleep stage classifier that has been trained using examples of sensor data and corresponding data indicating actual sleep stages.
  • the method includes a procedure for training the system using examples of input signals representing sensor measurements and corresponding data indicating actual sleep stages.
  • the method can include evaluating a set of input signals using the trained system to predict sleep stages.
  • a computer-implemented method may use raw heart rate, smoothed heart rate, LF/HF, and approximate entropy as input signals to said system. Signals can be passed along with ground truth to the training system to generate a sleep stage evaluation engine, which can evaluate new input signals to produce a prediction of sleep stage.
  • sensor data may be obtained from a wearable device that houses sensors to capture EKG data and sensors to capture motion data such as accelerometers and gyroscopes.
  • instead of using EKG data, an estimation of heart rate can be taken directly as input to the method.
  • EKG data and heart rate data can be taken from sources that are worn or not in contact with the subject.
  • a sensor using cardioballistic EKG technology may be included in bedding that does not require skin contact.
  • EKG data, motion data, breathing rate data and sound data are provided as input to the engine.
  • the heart rate data may be obtained in the form of pulse-rate data from a wrist-worn device or pulse-oximeter.
  • sensor data or biometric data is obtained from polysomnography data recorded in a sleep laboratory.
  • a method performed by one or more computing devices includes: obtaining, by the one or more computing devices, sensor data generated by one or more sensors over a time period while a person is sleeping; dividing, by the one or more computing devices, the time period into a series of intervals; analyzing, by the one or more computing devices, heart rate and the changes in the heart rate of the person indicated by the sensor data over the intervals; based on the analysis of the heart rate changes, assigning, by the one or more computing devices, sleep stage labels to different portions of the time period; and providing, by the one or more computing devices, an indication of the assigned sleep stage labels.
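For illustration, the outer loop of such a method might look like the following sketch. The 30-second interval length, the one-sample-per-second heart-rate input, and the simple mean-heart-rate thresholding rule are all illustrative assumptions, not details taken from this disclosure:

```python
# Hypothetical sketch of the claimed method's outer loop: divide a sleep
# session into fixed intervals, derive a simple heart-rate feature per
# interval, and assign a placeholder stage label.

def assign_stage_labels(heart_rate, sample_rate_hz=1.0, interval_s=30):
    """heart_rate: one sample per second (bpm). Returns one label per interval."""
    samples_per_interval = int(interval_s * sample_rate_hz)
    session_mean = sum(heart_rate) / len(heart_rate)
    labels = []
    for start in range(0, len(heart_rate) - samples_per_interval + 1,
                       samples_per_interval):
        window = heart_rate[start:start + samples_per_interval]
        mean_hr = sum(window) / len(window)
        # Illustrative rule only: low HR relative to the session suggests
        # deeper sleep; high HR suggests wake.
        if mean_hr < 0.9 * session_mean:
            labels.append("deep")
        elif mean_hr > 1.1 * session_mean:
            labels.append("wake")
        else:
            labels.append("light")
    return labels
```

In practice the per-interval feature would be replaced by the HRV, entropy, and classifier machinery described in the remaining paragraphs.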
  • other embodiments include corresponding systems, apparatus, and computer programs, configured to perform the actions of the methods, encoded on computer storage devices.
  • a system of one or more computing devices can be so configured by virtue of software, firmware, hardware, or a combination of them installed on the system that in operation cause the system to perform the actions.
  • One or more computer programs can be so configured by virtue of having instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions. Implementations can include one or more of the features discussed below.
  • the sleep stage labels include wake, rapid eye movement (REM) sleep, light sleep, and deep or slow-wave sleep.
  • REM rapid eye movement
  • assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to REM sleep, and (ii) an indication whether the interval is classified as REM sleep.
  • assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to light sleep, and (ii) an indication whether the interval is classified as light sleep.
  • assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to slow-wave sleep, and (ii) an indication whether the interval is classified as slow-wave sleep.
  • assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to a wake stage, and (ii) an indication whether the interval is classified as a wake stage.
  • assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to the person being awake or asleep, and (ii) an indication whether the person is classified as being awake or asleep.
  • the sensor data includes EKG signal data from an EKG sensor.
  • analyzing the changes in the heart rate of the person includes determining heart rate variability characteristic scores for different portions of the sleep session, and the sleep stage labels are assigned based at least in part on the heart rate variability scores.
  • analyzing the changes in the heart rate of the person includes determining measures of randomness of heartbeat data for different sliding windows of the time period, wherein the sleep stage labels are assigned based at least in part on the measures of randomness.
  • the measure of randomness of heartbeat data is computed by determining a measure of approximate entropy based on the heartbeat data.
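The approximate-entropy computation referenced above can be sketched as follows. The parameters m=2 and r = 0.2 × standard deviation are conventional defaults from the ApEn literature, not values specified in this disclosure:

```python
import math

# Standard approximate entropy (ApEn) over a series of inter-beat values:
# low ApEn means the series is regular, higher ApEn means more random.

def approx_entropy(series, m=2, r=None):
    n = len(series)
    if r is None:
        mean = sum(series) / n
        r = 0.2 * (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    def phi(m):
        # All length-m subsequences (template vectors).
        templates = [series[i:i + m] for i in range(n - m + 1)]
        counts = []
        for t in templates:
            # Chebyshev distance <= r counts as a match (self-match included).
            c = sum(1 for u in templates
                    if max(abs(a - b) for a, b in zip(t, u)) <= r)
            counts.append(c / len(templates))
        return sum(math.log(c) for c in counts) / len(counts)
    return phi(m) - phi(m + 1)
```

A perfectly regular heartbeat series yields ApEn near zero, while an irregular series yields a larger value, which is what makes it usable as the randomness measure described here.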
  • analyzing the heart rate of the person includes evaluating the value of the EKG signal in each interval with respect to the values in a neighborhood of the interval, wherein the neighborhood for each interval is a time window that extends from a first time threshold that precedes the interval to a second time threshold that follows the interval.
  • the obtained sensor data corresponds to a sleep session of the person; and analyzing the heart rate of the person includes evaluating an absolute value of the EKG signal with respect to the rest of the heart rate values for the sleep session and predetermined thresholds.
  • analyzing the changes in the heart rate of the person includes: generating a heart-rate variability (HRV) signal; performing a frequency analysis of the HRV signal; and examining a ratio of low frequency components of the HRV signal to high frequency components of the HRV signal.
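One conventional way to realize this frequency analysis is to resample the inter-beat-interval series evenly, take its power spectrum, and compare power in the standard LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) bands. The 4 Hz resampling rate and the band edges below are common choices in the HRV literature, not values specified in this disclosure:

```python
import numpy as np

# Sketch of an LF/HF ratio: resample RR intervals onto an even time grid,
# take the FFT power spectrum, and sum power in the two standard bands.

def lf_hf_ratio(rr_intervals_s, fs=4.0):
    # Beat times from cumulative RR intervals, then even resampling.
    beat_times = np.cumsum(rr_intervals_s)
    t = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    hrv = np.interp(t, beat_times, rr_intervals_s)
    hrv = hrv - hrv.mean()                    # remove the DC component
    spectrum = np.abs(np.fft.rfft(hrv)) ** 2  # power spectrum
    freqs = np.fft.rfftfreq(len(hrv), d=1.0 / fs)
    lf = spectrum[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = spectrum[(freqs >= 0.15) & (freqs < 0.40)].sum()
    return lf / hf
```

A series whose beat-to-beat variation is dominated by slow (~0.1 Hz) modulation produces a ratio above 1; fast respiratory-band (~0.3 Hz) modulation pushes it below 1.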
  • HRV heart-rate variability
  • the sensor data includes movement data from one or more motion sensors; and the method includes determining correlations of the movement data with sensor data indicating heart beats of the person, wherein the sleep stage labels are assigned based at least in part on the movement data and the correlations between the movement data and the sensor data indicating the heart beats of the person.
  • assigning a sleep stage label to a particular portion of the time period includes: obtaining likelihood scores from multiple different sleep stage analysis functions; and performing a sleep stage classification decision for the particular portion of the time period based on a combination of the likelihood scores from the multiple different sleep stage analysis functions.
  • multiple sleep stage functions and likelihoods are fused into a single sleep stage label by evaluating a classification function.
  • assigning a sleep stage label to a particular portion of the time period includes: providing multiple distinct signals that are separately correlated to sleep stages as input to a sleep stage classifier; and obtaining a single sleep stage label for the particular portion of the time period based on output of the sleep stage classifier.
  • the method includes: obtaining data sets indicating signals or function outputs, the data sets being labeled with sleep stage labels corresponding to the data sets; and training a sleep stage classifier based on the data sets to produce output indicating likelihoods that input data corresponds to a sleep stage from among a set of sleep stages.
  • the method includes: determining, for each of the signals or function outputs, a signal range histogram for the signal or function output; and using the signal range histograms to train a sleep stage classifier.
  • the method includes determining a signal range histogram for a signal, wherein the signal range histogram indicates, for each particular signal range of multiple signal ranges, a count of examples which have a particular sleep label assigned and have a value of the signal in the particular signal range.
  • determining the signal range histogram for the signal includes determining counts for each sleep label of the multiple sleep labels, for each signal range of the multiple signal ranges.
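A signal range histogram of this kind might be built as in the following sketch; the 5-unit bin width is an illustrative choice, not a value from this disclosure:

```python
from collections import defaultdict

# For one input signal, bucket the labeled training examples by signal
# value and count how often each sleep stage label occurs in each bucket.

def build_signal_histogram(values, labels, bin_width=5.0):
    """values: signal value per labeled example; labels: its sleep stage.
    Returns {bin_index: {stage_label: count}}."""
    hist = defaultdict(lambda: defaultdict(int))
    for v, lab in zip(values, labels):
        hist[int(v // bin_width)][lab] += 1
    return hist
```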
  • the method includes training or using a sleep stage classifier configured to receive, as input, signals indicating measurements during a sleep session of a person; and wherein the sleep stage classifier is configured to generate, for each interval of the sleep session, (i) a sleep stage probability distribution that indicates a likelihood for each of multiple different sleep stages, and (ii) a sleep stage label.
  • a sleep stage classifier configured to receive, as input, signals indicating measurements during a sleep session of a person; and wherein the sleep stage classifier is configured to generate, for each interval of the sleep session, (i) a sleep stage probability distribution that indicates a likelihood for each of multiple different sleep stages, and (ii) a sleep stage label.
  • the sleep stage classifier is configured to generate the sleep stage probability distribution and the sleep stage label for a particular interval by: for each of the signals provided as input to the sleep stage classifier, accessing a histogram for the signal and determining a count indicated by the histogram for each sleep stage label corresponding to a value of the signal during the particular interval; and computing, for each of the sleep stages, a total count across all signals.
  • data used to train a sleep stage classifier and to classify sleep stage labels includes (i) signals including data used in polysomnograms and breathing rate data, and (ii) time-correlations of sleep stages among the signals.
  • analyzing the changes in the heart rate of the person includes analyzing overlapping windows of the sensor data.
  • the sensor data includes heartbeat data from a heartbeat sensor.
  • the heartbeat sensor is an infrared sensor or an optical sensor.
  • the heartbeat sensor is a pulse rate sensor.
  • analyzing the changes in the heart rate of the person includes sampling the EKG signal at a rate between 100 Hz and 2 kHz; and applying a low pass filter to the sampled signal.
  • dividing the time period into a series of intervals includes dividing the time period into adjacent periods each having a same duration.
  • dividing the time period into a series of intervals includes using overlapping sliding windows having the same duration, with a sliding window being centered at each sample of heart rate data.
  • analyzing the changes in heart rate includes detecting heartbeats indicated by the sensor data.
  • detecting the heartbeats includes: detecting peaks in an EKG signal; determining a derivative signal from the EKG signal by determining, for each sample of the EKG signal, a difference between a current sample and the immediately previous sample; and identifying R waves based on applying one or more thresholds to the derivative signal.
  • detecting the heartbeats includes: identifying local maxima and local minima in an EKG signal; computing ranges between the maximum value of the EKG signal and minimum value of the EKG signal within each of multiple windows of the EKG signal; and assigning, to each of the multiple windows, a probability of the window representing a beat.
  • detecting heartbeats includes: evaluating data indicating a set of heartbeats determined by a heartbeat detection process; determining, based on the evaluation, a likelihood score indicating a likelihood that a beat was omitted from the detected heartbeats by the heartbeat detection process; and, based on the likelihood score, labeling additional beats where a beat is computed to have likely been missed in beat detection.
  • analyzing the changes in heart rate includes determining R-R intervals for the detected heartbeats, the R-R intervals indicating an amount of time between adjacent R wave peaks in the EKG signal.
  • analyzing the changes in heart rate includes averaging the R-R intervals for beats detected within each second and then computing a measure of beats-per-minute corresponding to the average R-R interval.
  • analyzing the changes in heart rate includes determining an average heart rate for each of multiple windows of the EKG signal.
  • each of the multiple windows has a same duration, and the duration of each window is between 0.5 seconds and 5 seconds.
  • obtaining sensor data generated by one or more sensors over the time period while a person is sleeping includes obtaining sensor data for a sleep session that represents a single night.
  • obtaining the sensor data, dividing the time period into a series of intervals, and analyzing the heart rate and the changes in the heart rate are performed by a wearable, battery-operated device, wherein the wearable device includes sensors that detect the sensor data.
  • assigning the sleep stage labels and providing the indication of the assigned sleep stage labels are performed by the wearable device.
  • the method includes providing, by the wearable device, results of analyzing the heart rate and the changes in the heart rate to a second device for transmission to a server system; and wherein assigning the sleep stage labels and providing the indication of the assigned sleep stage labels are performed by the server system.
  • the second device is a mobile phone.
  • providing the indication of the assigned sleep stage labels includes providing, by the server system, the indication to the mobile phone for display by the mobile phone.
  • FIG. 1 is a flow diagram that illustrates an example of a method for estimating sleep stages from sensor data.
  • FIG. 2 is a flow diagram that illustrates an example of a method for using heart rate variability frequency analysis to determine sleep stage patterns.
  • FIG. 3 is a chart illustrating an example of raw EKG data, filtered signals, and detected heart beats generated by a body data recorder (BDR) such as the one shown in FIGS. 11A and 11B .
  • FIG. 4 is a chart illustrating a zoomed-in view of a portion of the data from the chart of FIG. 3 .
  • FIG. 5 is a chart illustrating sleep stages with respect to approximate entropy (ApEn) measures.
  • FIG. 6 is a chart illustrating an example of heart rate variability (HRV) low-frequency vs. high-frequency (LF/HF) ratios and indicators of sleep stages determined from EKG signals.
  • FIG. 7 illustrates examples of charts showing HRV frequency spectra for a healthy subject for wake stages (top left), REM Stages (top right), slow wave sleep (SWS) stages (bottom left), and S2 sleep stages (bottom right).
  • FIG. 8 illustrates examples of charts showing HRV frequency spectra for a child with ADHD for wake stages (top left), REM Stages (top right), slow wave sleep (SWS) stages (bottom left), and S2 sleep stages (bottom right).
  • FIG. 9 is a chart illustrating an example of staging (Light-S1 & S2, Deep-SWS, REM, Wake) for a healthy subject using the techniques disclosed herein, versus results of manual staging.
  • the chart demonstrates effective staging by the disclosed techniques, including for REM and SWS stages.
  • FIG. 10 is a chart illustrating correlation of motion data with Sleep/Wake detection during sleep.
  • FIGS. 11A and 11B are illustrations of an example of a wearable body data recorder.
  • FIG. 12 is an illustration of an example of a T-shirt housing wearable sensors, for use with a body data recorder.
  • FIG. 13 is a conceptual drawing illustrating example of the components of a sleep monitoring system.
  • sensors that can be housed in a comfortable, wearable device.
  • the techniques disclosed herein allow clinically validated home sleep monitoring of this type, and can also provide an effective consumer electronic device for assessing personalized health.
  • sensors including EEGs, heart rate (e.g., EKGs or PPGs—Photoplethysmograms) and heart rate variability (HRV), oxygen saturation, activity sensors, EMGs, etc., can be used to enable automatic sleep staging mechanisms.
  • the selection and combination of sensors coupled with effective processing methods can impact the accuracy and quality of the sleep analysis and staging results. Multiple sensors can be included in a wearable device or system after taking into account their accuracy, cost, form-factor, and invasiveness.
  • Heart rate and heart rate variability signals can be used in a staging process closely reflecting several of the signals in the EEGs.
  • EEGs are the key signals used for staging in sleep labs.
  • HRV information integrates sympathetic and parasympathetic activity of the autonomic nervous system that varies across sleep stages and therefore can be an effective indicator of sleep-wake physiology.
  • the technology provides many different features and advantages. For example, it allows a system to take an EKG signal or heart-rate signal and determine sleep stages from it.
  • the sleep stage detected can include REM, deep or slow wave sleep, and light sleep.
  • the sleep stages can be determined automatically by the system, within technician scoring error for manual sleep studies. Additional aspects of heart rate can be used. For example, randomness of the heart rate signal, HRV frequency analysis, and/or approximate entropy measures can be used to determine the sleep stages. As another example, relative increases in heart rate and/or absolute value of the heart rate signal may be used by the system to classify sleep into stages. The system may also convert EKG signals to heart rate measures, which can then be further processed.
  • the heart rate data used can be obtained from any appropriate heart-rate sensing device, pulse rate sensing device, or EKG sensing device.
  • EKG data is of low quality, or there are gaps or noise in the data
  • the techniques allow the conversion from EKG signals to heart beats and heart rate even when one or more peaks or valleys in the EKG signal are missing.
  • the system can determine likelihoods of missed beats, and then supplement the detected beats with additional beats that were likely obscured or missing in the raw data.
  • Heart rate data and other data can be provided to a trained classifier that provides outputs indicative of likelihoods that the input signals correspond to different sleep stages.
  • a sleep stage classifier can be trained to fuse different sensor signals into a single estimate. This process may train the system using any of various signals used in polysomnograms (PSGs) such as EEGs (electroencephalograms), EMGs (electromyograms), EOGs (electrooculograms), and pulse-oximetry. Other signals, such as motion data, sound data, breathing rate data, and so on can also be used as signals to the classifier.
  • Examples of data tagged with sleep stage labels can be used to train the classifier to estimate sleep stages from various sets of data.
  • the classifier or a decision framework can learn to fuse arbitrary sets of signals and functions that are separately related to sleep into a single sleep stage prediction.
  • the sleep stage information allows a number of other measurements to be determined. For example, a sleep onset latency, a count of awakenings and duration of awakenings during a sleep session, and a wake time can be determined.
  • One of the valuable results is to identify REM states and wake states.
  • the sleep staging can be enhanced using motion sensing, especially for the detection of wake stages and REM stages.
  • FIG. 1 is a flow diagram that illustrates an example of a method 100 for estimating sleep stages from sensor data.
  • the operations of the method 100 may be performed by a wearable body data recorder, such as the one described below and illustrated in FIGS. 11A and 11B .
  • the operations may be performed by another device, such as a phone or computer, that receives sensor data from the wearable device.
  • EKG data for a subject is acquired while the subject is sleeping.
  • sensor data indicating motion of the subject can also be acquired.
  • method 100 can take as input an EKG signal sampled at any of multiple rates. Sampling at 100 Hz to 2 kHz or more is highly effective for heart beat detection. At lower rates, beat detection is feasible with the possibility of some missed or extra beats during noisy phases in the signal. In these cases, extra processing can be used to filter out extra beats or probabilistically detect the possibility of missed beats. A signal with some noise can be filtered using a 6th order low pass Butterworth filter to remove the noise and reveal the QRS complexes.
  • in step 110, motion quantity and quality are determined, and events in epochs are determined.
  • Data from one or more motion sensors such as an accelerometer or gyroscope, can be recorded by a body data recorder.
  • the timing of movement data can be recorded so it can be correlated with the EKG signal and/or heartbeat sensor signal detected at the same time.
  • Other sensor data can also be detected and aligned with heart data. For example, data from a microphone, a light sensor, and other sensors can be detected and correlated with heart data for use in determining sleep stages.
  • heart beats and R-R intervals are determined from the sensor data.
  • the R-R intervals represent periods between the peaks of adjacent R waves of the EKG signal.
  • the EKG signal can be processed by a beat detection method that examines the signal for the QRS complexes. This can be achieved using peak-detection where the R-peaks are detected above a threshold value with a minimum threshold separation in time to prevent qualification of invalid peaks.
  • the derivative of the signal is first computed by finding the difference of every sample from the previous sample in the EKG time series. The peak detection can then be applied to this derivative signal using peak and separation thresholds customized for the derivative signal.
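As an illustrative sketch of this derivative-and-threshold approach — the threshold value and the minimum-separation parameter below are assumptions for demonstration, not values from the specification:

```python
import numpy as np

def detect_r_peaks(ekg, fs, deriv_thresh, min_sep_s=0.25):
    """Detect R waves by thresholding the first difference of the EKG.

    `deriv_thresh` and `min_sep_s` are illustrative parameters that would
    be tuned per device; the returned indices are positions in the
    derivative signal where a qualifying rising edge occurs.
    """
    deriv = np.diff(ekg)           # difference of each sample from the previous
    min_sep = int(min_sep_s * fs)  # minimum peak separation, in samples
    peaks = []
    for i in range(1, len(deriv) - 1):
        # local maximum of the derivative that exceeds the threshold
        if deriv[i] > deriv_thresh and deriv[i] >= deriv[i - 1] and deriv[i] > deriv[i + 1]:
            # enforce the minimum separation to reject invalid peaks
            if not peaks or i - peaks[-1] >= min_sep:
                peaks.append(i)
    return peaks
```

The separation check mirrors the "minimum threshold separation in time" mentioned above, which prevents a single QRS complex from being counted twice.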
  • the results of beat detection can then be used to determine the R-R intervals, i.e., the time between consecutive beats.
  • the heartbeat can be computed as a time-series where the heart rate for an interval of i seconds is 60/i bpm.
  • the heart rate is computed every second (i.e., 1 Hz). This can be achieved by averaging the R-R intervals for beats detected within each second and then computing the beats-per-minute (bpm) corresponding to the average R-R interval.
  • One-second time periods with no R-R intervals can be replaced with the preceding second's BPM computation.
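The per-second conversion described above can be sketched as follows; beat timestamps in seconds are assumed as input, and seconds with no R-R intervals reuse the preceding second's value:

```python
def heart_rate_1hz(beat_times, duration_s):
    """Compute a 1 Hz heart-rate series (bpm) from beat timestamps in seconds.

    For each one-second bin, the R-R intervals ending in that second are
    averaged and converted to bpm (60 / average interval); empty seconds
    carry forward the previous second's value. Illustrative sketch only.
    """
    # R-R intervals paired with the time of the beat that ends each interval
    rr = [(t1, t1 - t0) for t0, t1 in zip(beat_times, beat_times[1:])]
    hr = []
    prev = 0.0
    for sec in range(duration_s):
        intervals = [d for (t, d) in rr if sec <= t < sec + 1]
        if intervals:
            prev = 60.0 / (sum(intervals) / len(intervals))
        hr.append(prev)  # empty seconds repeat the prior value
    return hr
```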
  • a probabilistic method can be used for beat detection.
  • overlapping slider windows can be iterated over.
  • the slider window length can be set to the maximum distance between a peak and a valley in the EKG beat signal.
  • computations can be made to determine the range, maximum, and minimum values.
  • the differences of each window's range, maximum, and minimum from the average range, maximum value, and minimum value across all the slider windows can be computed. These differences can be converted into probabilities that the window contains a peak, a valley, or a peak-valley range, based on statistical observations from the source.
  • a weighted mean of the range, peak, and valley probabilities can be used to combine these individual probabilities, and to assign a probability that this window is a beat.
  • a peak detection on these probabilities with a separation threshold, the minimum beat separation, can be used to filter out the overlapping windows.
  • the beat-detection data is converted to heart rate values, and various checks or filtering can be performed.
  • An additional iteration over all the detected beats can be performed.
  • the average or median recent heart rate can be computed and compared, via a ratio, to the current beat-to-beat heart rate. If the ratio is very close to 0.5, it is highly likely that a beat has been missed, and a beat can be inserted at the expected peak or valley. A similar method can be used for sparse extra beats. The above method allows detection of beats when, occasionally, either a peak or valley is not captured in the beat data, or an entire beat is not properly captured due to device limitations.
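A minimal sketch of the ratio check for missed beats; the tolerance and the six-beat history length are assumptions, not values from the text:

```python
def fill_missed_beats(beat_times, ratio_tol=0.1):
    """Insert a beat where the instantaneous rate is roughly half the
    recent average, suggesting a single missed beat.

    The ratio compared is (recent average interval / current gap): a
    doubled gap yields a ratio near 0.5. `ratio_tol` and the six-beat
    history are illustrative choices.
    """
    out = [beat_times[0]]
    for prev, cur in zip(beat_times, beat_times[1:]):
        tail = out[-6:]                                   # recent accepted beats
        recent = [b - a for a, b in zip(tail, tail[1:])]  # recent R-R intervals
        gap = cur - prev
        if recent:
            avg = sum(recent) / len(recent)
            # current-to-recent heart-rate ratio == avg / gap
            if abs(avg / gap - 0.5) < ratio_tol:
                out.append(prev + gap / 2.0)  # insert the likely missed beat midway
        out.append(cur)
    return out
```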
  • the heart rate data which in some implementations is computed once per second, can be checked for inconsistencies and invalid data such as outages in the heart rate stream due to any device malfunctions or incorrect beat detections. This check can be performed for all possible heart rate sources that include EKG interpretations, pulse rate recording, or heart rate reported by third-party devices. Invalid heart rate data can be detected by checking for default values that the heart rate reverts to such as zero for certain devices, and spikes and fluctuations to values that are not within the range of possible heart rate values. Invalid detection can be customized to the source EKG.
  • a 6th order low pass Butterworth filter is used to preprocess the heart rate signal for specific subroutines in the method. In some implementations, this is implemented using a cutoff frequency of 0.01 Hz. Alternatively, in some implementations, the filter is bypassed in favor of customized filters and thresholds for different EKG or heart rate sources to account for the differences between these devices.
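The 6th-order low-pass Butterworth preprocessing with a 0.01 Hz cutoff might look like the following sketch for a 1 Hz heart-rate series; the zero-phase filtering via `sosfiltfilt` is our choice here, not something specified by the text:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def lowpass_hr(signal_1hz, cutoff_hz=0.01, order=6):
    """Apply a 6th-order low-pass Butterworth filter to a 1 Hz
    heart-rate series, using the 0.01 Hz cutoff mentioned above.

    Second-order sections (sos) are used for numerical stability at
    such a low cutoff, and sosfiltfilt gives zero-phase output.
    """
    nyquist = 0.5  # Nyquist frequency for a 1 Hz sample rate
    sos = butter(order, cutoff_hz / nyquist, btype="low", output="sos")
    return sosfiltfilt(sos, np.asarray(signal_1hz, dtype=float))
```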
  • FIGS. 3 and 4 illustrate examples of raw EKG data, filtered signals, and detected heart beats. Detection of the EKG data, as well as generation of the filtered signals and detected heartbeats can be performed by a wearable body data recorder (BDR) such as the one shown in FIGS. 11A and 11B .
  • the processing techniques discussed herein allow the small, battery-operated body data recorder to produce analysis results that can effectively identify sleep stages while minimizing power consumption, network communication bandwidth, and computation while maintaining a high degree of accuracy in assigning sleep stages. At the sampling rates needed for effective sleep stage determination, e.g., often 200 Hz-2000 Hz, a significant amount of data is generated.
  • the body data recorder may communicate with other devices over a low-power or low-bandwidth wireless connection, such as Bluetooth, which can significantly constrain the amount of data that can be reliably transmitted.
  • the body data recorder may be designed to avoid wired connections with outside devices to avoid inconveniencing the subject, since wires connecting to other devices could potentially alter the subject's behavior and negatively affect the measurements. Further, transferring significant amounts of data is power inefficient for the battery-operated body data recorder. By performing at least the initial processing of EKG and heartbeat data locally within the wearable body data recorder, the amount of data that needs to be stored or transmitted and the amount of power consumed during transmission of data is minimized.
  • the processing techniques discussed herein allow the results of beat detection, frequency analysis, and approximate entropy calculations, among others, to be stored and transmitted instead.
  • the body data recorder may send this data for further processing, e.g., by a mobile device or a remote computer system. In some implementations, the body data recorder itself completes the processing discussed here to generate sleep stage labels.
  • in step 125, the data is analyzed to detect relative increases in heart rate.
  • Analysis of heart rate data reveals that an increase in heart rate is often correlated with REM and/or Wake states. These increases, especially for REM stages, can be rather subtle and the absolute value may be lower than during other stages or even the average heartbeat for the night, since the heart rate may drop through the night due to a subject's circadian rhythm. This pattern can be detected by analyzing the increase with respect to the values before and after the detected increase.
  • the time series can be divided into epochs to analyze the numerical trends in the data.
  • the epoch size is 60 seconds.
  • the epochs can be iterated over while calculating, for each epoch, the mean, variance, maximum value, and minimum value for heart rates in the epoch.
  • reference values before and after each epoch can be created by using a threshold time before and after to analyze the neighborhood. In some implementations, this threshold is 10 minutes. The minimum of all the epoch averages in the last ten minutes before the epoch can be used to find the "previous" reference, and the minimum of all the epoch averages in the 10 minutes following the epoch can be used as the "post" reference.
  • the number of values averaged is truncated when the epoch does not have the full time period before or after it owing to being an early or late epoch. Additionally, for the first and last epoch, the previous reference and post reference respectively, are set to the epoch's own heart rate average.
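The neighborhood-reference computation above can be sketched as follows, using 10 one-minute epochs to approximate the 10-minute threshold; truncation at the edges and the first/last-epoch special case follow the description:

```python
def epoch_references(epoch_means, window=10):
    """For each epoch's mean heart rate, compute the 'previous' and 'post'
    references: the minimum epoch average over up to `window` epochs
    before and after, truncated near the edges. The first and last
    epochs use their own mean as the previous and post reference,
    respectively. Illustrative sketch.
    """
    n = len(epoch_means)
    refs = []
    for i, m in enumerate(epoch_means):
        prev_ref = min(epoch_means[max(0, i - window):i]) if i > 0 else m
        post_ref = min(epoch_means[i + 1:i + 1 + window]) if i < n - 1 else m
        refs.append((prev_ref, post_ref))
    return refs
```

A subtle relative rise in heart rate then shows up as an epoch mean exceeding both its previous and post references, even when its absolute value is below the nightly average.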
  • in step 130, heart rate variability (HRV) is analyzed.
  • REM stages have been analyzed and observed to have more "chaos," i.e., less predictability in heart rate values than the other stages. This phenomenon can mathematically be described as tending toward higher entropy and has been investigated by the scientific community in analyses of heart health and heart failure.
  • FIG. 2 illustrates an example of a process 200 for using heart rate variability frequency analysis to determine sleep stage patterns.
  • this process 200 can involve a power spectral analysis performed after a Fast Fourier Transform (FFT) of the HRV signal for each sleep stage.
  • This reveals the phenomenon of SWS or deep sleep stages (S3 and S4) generally demonstrating a decrease in Low Frequency (LF) components and an increase in High Frequency (HF) components when compared to the other stages of Wake, REM, and S2.
  • This phenomenon is due to the increased vagal activity during SWS stages and the activation of the parasympathetic nervous system (PSNS) over the sympathetic nervous system (SNS) during other stages. This can also be identified as the cardio-pulmonary-coupling (CPC). Incidentally, the lack of this type of activity has been noted in post-myocardial infarction patients.
  • HRV is derived from heart rate data.
  • a Fast Fourier Transform is performed on windows of the data.
  • the window size is 5 minutes.
  • the value of N in the embodiment is the number of HRV samples in this time window.
  • a power spectral density measurement is determined.
  • the power is computed at 300 frequencies (i.e., the window size in samples), evenly spaced between 0 and 1 Hz.
  • in step 220, power measurements are summed over certain frequency ranges.
  • two ranges may be defined, with one range defined as a low-frequency or LF Range, and another range defined as a high-frequency or HF Range.
  • the LF Range is 0.04-0.15 Hz
  • the HF Range is defined as 0.15-0.4 Hz.
  • the LF Power can be computed by summing the powers in the power spectral density for the frequencies that lie in the LF Range (i.e., the corresponding elements in P_yy for the range of frequency elements in f).
  • the HF Power can similarly be computed for the HF Range. For each window, the LF/HF ratio and the LF percentage, i.e., LF/(LF+HF), can be computed.
  • the LF/HF Ratio is computed.
  • the LF/HF ratio can be obtained by computing a ratio of the power in the LF Range divided by the power in the HF Range. This ratio may be computed for each of various segments of the data, for example, over each epoch or window of the data set. In some implementations, the ratio is calculated for equal periods of 0.5 seconds to 5 seconds, and may be computed for overlapping windows or segments of the data.
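A sketch of the band-power computation for a 5-minute (300-sample) window of 1 Hz HRV data, using the LF range (0.04-0.15 Hz) and HF range (0.15-0.4 Hz) defined above; the simple periodogram here stands in for whatever power spectral density estimator an implementation actually uses:

```python
import numpy as np

def lf_hf_ratio(hrv_1hz):
    """Return (LF/HF ratio, LF%) for a window of 1 Hz HRV samples.

    Uses an FFT-based periodogram; LF = 0.04-0.15 Hz, HF = 0.15-0.4 Hz,
    per the ranges given in the description.
    """
    x = np.asarray(hrv_1hz, dtype=float)
    n = len(x)
    spectrum = np.fft.rfft(x - x.mean())      # remove DC before the FFT
    power = (np.abs(spectrum) ** 2) / n       # periodogram power estimate
    freqs = np.fft.rfftfreq(n, d=1.0)         # frequencies for 1 Hz sampling
    lf = power[(freqs >= 0.04) & (freqs < 0.15)].sum()
    hf = power[(freqs >= 0.15) & (freqs <= 0.4)].sum()
    return lf / hf, lf / (lf + hf)            # LF/HF and LF% = LF/(LF+HF)
```

A low LF/HF ratio over a window would then contribute toward an SWS (deep sleep) prediction, consistent with the increased vagal activity noted above.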
  • in step 135, approximate entropy can be computed.
  • An approximate entropy algorithm can be used to estimate this heart rate characteristic.
  • this method can operate over 300 samples of the 1 Hz heart rate series derived from the higher-rate EKG signal (i.e., 5-minute windows).
  • the heart rate time series within the epoch, together with two parameters, m, the length of a compared run of data, and r, a threshold related to the amount of variation in values, can be used by the method to compute the approximate entropy.
  • each smaller sub-window of values (e.g., 5 in this example) can be shifted over every other sequence of the same length, subtracting the corresponding elements and counting the number of sliding windows where the maximum difference in corresponding elements is less than a comparison threshold.
  • An averaging of the logarithmic sums of these values for all subwindows when combined over two different window sizes can be used to compute the approximate entropy and reveal the lack of predictability in the REM and Wake stages.
  • the mathematical steps can be executed as follows.
  • a time series of data u(1), u(2), …, u(N) is considered for entropy analysis.
  • a sequence of vectors x(1), x(2), …, x(N−m+1) is created, where
  • x(i) = [u(i), u(i+1), …, u(i+m−1)]
  • the sequence of vectors is used to compute, for each i, 1 ≤ i ≤ N−m+1, the quantity C_i^m(r): the fraction of vectors x(j) whose maximum element-wise difference from x(i) is at most r.
  • the value ApEn representing approximate entropy, can be computed for each window in the sleep session time series for heart rate as an indicator of unpredictability for stage prediction.
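The approximate entropy computation can be sketched as follows; the defaults for m and r are common choices for ApEn in general, not values taken from the text:

```python
import numpy as np

def approx_entropy(u, m=2, r=0.2):
    """Approximate entropy ApEn(m, r) of a heart-rate window.

    Follows the vector-comparison steps above: form length-m vectors
    x(i) = [u(i) .. u(i+m-1)], count how many vectors lie within
    max-norm distance r of each x(i), average the logarithms, and take
    the difference between window sizes m and m+1.
    """
    u = np.asarray(u, dtype=float)

    def phi(m):
        n = len(u) - m + 1
        x = np.array([u[i:i + m] for i in range(n)])  # the x(i) vectors
        # C_i: fraction of vectors within distance r of x(i) (self-match included)
        c = [np.mean(np.max(np.abs(x - xi), axis=1) <= r) for xi in x]
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)
```

A perfectly regular series yields ApEn near zero, while the less predictable heart rate of REM and Wake stages yields higher values, which is the discriminating property used here.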
  • other chaos theory methods, such as detrended fluctuation analysis (DFA), can be used to create a comparable chaos indicator for the heart rate time series.
  • in step 140, cost function factors and weights are generated. Since multiple parameters can contribute to the prediction of each stage, such as REM, SWS, etc., each of the contributing parameters can be represented as a factor varying between 0 and 1, facilitating direct comparison, weighting, and probabilistic manipulation.
  • the following factors are determined and used in combination to predict sleep stages.
  • the mean of the window heart rate time series (meanHR) can be computed.
  • the meanHR is compared to a fixed reference HRRestReference, which represents the expected heart rate value for resting heart rate during sleep, and to the mean of heart rates in the window time series that fall in a low percentile, HRLowReference.
  • the resting heart rate reference can be set to a different value depending on the age of the subject. In some implementations, this low percentile is 50.
  • the differences of the meanHR from these references, respectively, are converted into factors, an HR Rest Comparison Factor and an HR Low Comparison Factor, using a scoring function F.
  • these parameters are computed using F(meanHR ⁇ HRRestReference, 0.5, 10) and F(meanHR ⁇ HRLowReference, 0.5, 10) respectively. Similar factors can be computed using the same scoring function to score the mean of all the Epoch ValueIncreaseEstimation for all epochs in the window under consideration, resulting in the WindowValueIncreaseEstimation factor.
  • the ApEn High Factor is computed using F(ApEn, w_ApEn, c_w,ApEn).
  • the ApEnHighFactor is computed as F(ApEn, 0.5, ApEnMean) and the ApEnLowComparisonFactor is computed as g(ApEn − ApEnMeanLow25Percentile, 0, ApEnMeanLow50Percentile), where ApEnMeanLow25Percentile and ApEnMeanLow50Percentile are the means of the lowest 25% and 50% of all ApEn values, respectively.
  • the WindowValueIncreaseEstimationFactor is computed as F(WindowValueIncreaseEstimation, 0.5, 3). In other implementations, such as for data with more peaks in the heart rate values, it is computed as F(WindowValueIncreaseEstimation, 0.5, 5).
  • in step 145, probability scores indicating the likelihoods of different sleep stages are calculated. These scores can be determined using input from multiple types of sensors. For example, information from motion sensors can be combined with information determined from heart rate data to produce a more accurate result. Different probabilistic factors are created: for REM or Wake probability, by combining the WindowValueIncreaseEstimationFactor and the High Entropy factor as a mean; and for SWS, by combining the LF/HF ratio factor and low entropy factors using a geometric mean. In another embodiment, the SWS probability is directly assigned as the LF/HF ratio factor. A chart illustrating an example of LF/HF ratios and indicators of sleep stages determined using those ratios is shown in FIG. 6.
  • the REM or Wake probability can be considered to indicate that the stage is either Wake or REM, where another discriminator can be used to distinguish between the two. Additionally, to discriminate between Wake and REM, additional factors can be created to predict Wake states, which often demonstrate more dramatic spikes from resting heart rates (55-66 bpm in adults) to sudden wake heart rates of more than 80 bpm, and periods of much higher entropy and variance. In devices where accelerometers and gyroscopes are available, these can be used to compute posture and activity levels over windows, and the presence of activity can enhance the probability of Wake states, since REM stages, in contrast, involve the highest level of inactivity and physical paralysis.
  • the HRV frequency analysis can also be similarly converted into factors by computing the means of the lowest 25th and 50th percentiles of all the LF/HF values.
  • the difference of the LF/HF value for each window from the lowest 50th percentile mean (d_w,50) can be found, in addition to the difference between the lowest 25th percentile and 50th percentile means (d_25,50).
  • the LF/HF Low Factor is computed as F(d_w,50, w_Freq, d_25,50).
  • a predictive probability that the window is a SWS stage (SWSpx) can be estimated by combining this factor and the minimum value of the ApEn measure for the window, using a geometric mean of the ApEn Low Factor and the LF/HF Low Factor with equal weights. This can be expressed as SWSpx = √(ApEnLowFactor × LFHFLowFactor).
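The equally weighted geometric mean described above reduces to a one-line computation:

```python
import math

def sws_probability(apen_low_factor, lf_hf_low_factor):
    """SWS stage probability as the equally weighted geometric mean of
    the ApEn Low Factor and the LF/HF Low Factor, per the description
    above. Both inputs are factors in [0, 1], so the result is too."""
    return math.sqrt(apen_low_factor * lf_hf_low_factor)
```

Because it is a geometric mean, either factor being near zero pulls the SWS probability toward zero, so both low entropy and a low LF/HF ratio are needed to predict deep sleep.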
  • this factor is computed by weighing the difference between 0.5 and the LF/HF ratio, with higher differences generating a higher SWS probability.
  • a predictive probability of the window being a REM or Wake stage, REMWakepx, can be computed by first using a scoring aggregation, such as the weighted mean of the ApEn High Factor and the WindowValueIncreaseEstimation Factor, which can then be enhanced by an additional manipulation.
  • an additional REMvsWakepx can be created to extract the difference in these stages with respect to sensor data.
  • this factor uses different thresholds to exploit the general trends of higher heart rates and entropy in Wake than in REM. Additionally, the stage duration factor is taken into account, with Wake being long periods or micro-awakenings of a few seconds or a minute, and REM usually lasting between 5 minutes and an hour.
  • when the sensor data includes motion sensor data, it can be used to effectively set or bolster the REM vs. Wake probabilities. Detected movement or motion, whether in bed or while walking, implies a Wake state, since REM is a highly physically paralyzed state. This motion probability can be used to discriminate between Wake and REM.
  • a REMWakeValueHighFactor can be created to determine how high the window values are, using a geometric mean of the HR Rest Comparison Factor and the HR Low Comparison Factor. Further, if this factor is higher than a threshold, the REMWakepx can be enhanced by an additional manipulation.
  • examples of HRV frequency spectra are shown in FIGS. 7 and 8. Each of these figures illustrates examples of charts showing HRV frequency spectra for wake stages (top left), REM stages (top right), slow wave sleep (SWS) stages (bottom left), and S2 sleep stages (bottom right).
  • FIG. 7 shows data for a healthy subject
  • FIG. 8 shows data for a child with ADHD.
  • stage decisions and stage assignments are made using a rule-based framework.
  • a top layer decision-making algorithm weighs and resolves the computed predictive probabilities of stages in each window, comparing the current window's prediction with the previous/ongoing stage, the probability of transition, and rules that govern the possible transitions and the amount of stable prediction time required to actually confirm a change in stage.
  • Each window can be assigned a stage of Light (S1 or S2), Deep (Slow Wave Sleep—S3 or S4), REM or Wake.
  • SWSpx slow wave sleep probability
  • REMWakepx REM or Wake probability
  • REMvsWakepx REM vs Wake probability
  • Rules can be enforced necessitating the detection of a light sleep stage before REM or SWS stages, and ignoring Wake fluctuations within REM when shorter than a certain time threshold.
  • the previous stage is considered and assigned a probability to continue and the stage probabilities of the current window are compared to each other.
  • the highest probability stage, if higher than a threshold, e.g., 0.5 in some implementations, is considered as a possible stage for the window. If the highest probability stage is the same as the previous stage, the method assigns that stage and continues to the next window. If the stage is different from the previous stage, it is considered a possible stage transition and checks are made to determine if the transition is feasible.
  • the probability REMvsWakepx is checked. If it is above a threshold, such as 0.5 in some implementations, the stage is considered a Wake stage; otherwise, a REM stage.
  • depending on the application the method is being used for, additional checks on patterns can be performed to minimize stage fluctuations, and additional rules can be added that are relevant for the application cases.
  • the windows used for these computations can range from discrete windows to sample by sample overlapping windows based on the processor speed and memory available on the system they are implemented on.
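A minimal sketch of the top-layer, rule-based stage assignment described above, assuming simplified transition rules and a 0.5 threshold; the probability names follow the text, but the confirmation-time requirement is reduced to simply continuing the previous stage when no confident prediction exists.

```python
# Transitions permitted under the rules described above: a light stage must
# be detected before REM or deep sleep (SWS). These sets are an assumption,
# distilled from the rules named in the text.
ALLOWED_TRANSITIONS = {
    "Wake": {"Light", "Wake"},
    "Light": {"Light", "Deep", "REM", "Wake"},
    "Deep": {"Deep", "Light", "Wake"},
    "REM": {"REM", "Light", "Wake"},
}

def decide_stage(window_probs: dict, previous: str,
                 threshold: float = 0.5, rem_vs_wake_px: float = 0.5) -> str:
    """window_probs maps candidate stages (possibly the combined 'REMWake')
    to probabilities for one window; returns the stage assigned."""
    stage, p = max(window_probs.items(), key=lambda kv: kv[1])
    if p < threshold:
        return previous                   # no confident prediction: continue
    # Resolve the combined REM-or-Wake probability into a concrete stage.
    if stage == "REMWake":
        stage = "Wake" if rem_vs_wake_px > threshold else "REM"
    if stage == previous:
        return stage
    # Candidate transition: accept it only if the rules allow it.
    if stage in ALLOWED_TRANSITIONS.get(previous, set()):
        return stage
    return previous
```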
  • the stage prediction is made by a stage classifier trained on truth data, e.g., examples of inputs having actual corresponding sleep stages labelled.
  • classifiers include, for example, neural networks, maximum entropy classifiers, support vector machines, and decision trees.
  • Methods and systems are described for classifying segments of a set of signals to a set of discrete decisions about said signals in the form of sleep stages.
  • the method includes a procedure for training the system to predict sleep stages based on a set of input signals.
  • the input signals may include sensor data or other parameters calculated from sensor data, e.g., values indicating a subject's movement, heart rate, HRV, etc.
  • Examples of these sets of signals that have been labeled with a truth signal can be used to train the classifier.
  • the trained classifier can then be used to evaluate a set of input signals using the trained system to predict sleep stage.
  • one implementation uses raw heart rate, smoothed heart rate, LF/HF, and approximate entropy as input signals to the classifier.
  • Signals can be passed along with ground truth to the training system to generate a sleep stage evaluation engine, which can evaluate new input signals to produce a prediction of sleep stage.
  • a method for signal classifier training is described.
  • the purpose of the training step is to supply a set of ground truth data for some number of subjects, along with arbitrary data that corresponds with the ground truth data in time.
  • the system described will build a set of histograms over this data which will be used during evaluation to compute a probability distribution describing sleep stage at some point in time.
  • Let N s be the number of subjects to be used for training, for which ground truth data is available.
  • Let N f be the number of input functions to be used for training and classification.
  • Let F ij (t) be a set of N f functions for each individual i which correlate with each S i (t) in time. These functions are arbitrary, but could represent sensor data taken during the sleep study, either directly or as a variation on it.
  • Build N f ×4 histograms H js with N bins, where each bin value is equal to the number of samples of function j, over all individuals i, for which F ij (t) falls into the bin for a given sleep stage s.
  • Stage s is known by looking up the ground truth in S i (t).
  • the bin's index is computed as
  • the histograms represent a trained system which can be used for classification. From an intuitive point of view, they represent, for any given sleep stage s and function j, how the function's value at any time t correlates to that sleep stage.
  • New training data can be added to an already trained system, so long as the values of the new training data fall within Fmin j and Fmax j of the already trained system. If they do not, the system can be retrained using the old and new training data.
  • the input is the same set of input functions F ij (t), except without ground truth S i (t).
  • Let A s be the stage count, i.e., the number of training samples labeled with stage s.
  • P s (t) can then be used to estimate sleep stage in an individual at some point in time.
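The training and evaluation steps above can be sketched as follows; the linear bin-index mapping, the Laplace smoothing, and the use of four stages (Light, Deep, REM, Wake) are assumptions, since the exact formulas are elided in the text.

```python
import numpy as np

STAGES = ["Light", "Deep", "REM", "Wake"]

def train(F, S, n_bins=20):
    """F: per-subject arrays shaped (N_f, n_samples) of input functions F_ij(t);
    S: per-subject arrays (n_samples,) of ground-truth stage indices, aligned
    in time. Returns per-function minima/maxima and the N_f x 4 histograms."""
    n_f = F[0].shape[0]
    fmin = np.array([min(f[j].min() for f in F) for j in range(n_f)])
    fmax = np.array([max(f[j].max() for f in F) for j in range(n_f)])
    H = np.zeros((n_f, len(STAGES), n_bins))
    for f, s in zip(F, S):
        for j in range(n_f):
            # Linear min-max mapping of each value onto the bins (the exact
            # bin-index formula is elided in the text; this is an assumption).
            idx = ((f[j] - fmin[j]) / (fmax[j] - fmin[j]) * (n_bins - 1)).astype(int)
            for b, stage in zip(idx, s):
                H[j, stage, b] += 1
    return fmin, fmax, H

def evaluate(H, fmin, fmax, x):
    """x: values of the N_f input functions at one time t.
    Returns P_s(t), a probability distribution over the four stages."""
    n_f, n_stages, n_bins = H.shape
    counts = H.sum(axis=2)                    # per-stage sample counts A_s
    scores = np.ones(n_stages)
    for j in range(n_f):
        b = int(np.clip((x[j] - fmin[j]) / (fmax[j] - fmin[j]) * (n_bins - 1),
                        0, n_bins - 1))
        # Laplace smoothing (an assumption) avoids zero probabilities, and
        # dividing by the stage count keeps frequent stages from dominating.
        scores *= (H[j, :, b] + 1.0) / (counts[j] + n_bins)
    return scores / scores.sum()
```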
  • Let F i0 (t) be raw heart rate as received from a heart rate monitoring device.
  • Let F i2 (t) be LF/HF of F i0 (t), where LF is the low frequency component of the Fourier transformation of F i0 (t) and HF is the high frequency component.
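A sketch of computing LF/HF from an evenly sampled heart-rate series; the band edges (LF 0.04-0.15 Hz, HF 0.15-0.40 Hz) are the conventional HRV bands, assumed here since the text does not specify them.

```python
import numpy as np

def lf_hf_ratio(heart_rate: np.ndarray, fs: float = 1.0) -> float:
    """LF/HF of an evenly sampled heart-rate series (fs in Hz)."""
    hr = heart_rate - heart_rate.mean()          # remove the DC component
    spectrum = np.abs(np.fft.rfft(hr)) ** 2      # power spectrum
    freqs = np.fft.rfftfreq(len(hr), d=1.0 / fs)
    lf = spectrum[(freqs >= 0.04) & (freqs < 0.15)].sum()   # low frequency
    hf = spectrum[(freqs >= 0.15) & (freqs <= 0.40)].sum()  # high frequency
    return lf / hf if hf > 0 else float("inf")
```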
  • other sensor data can be used as input. This includes but is not limited to: audio, inertial measurements, EEG, temperature, and any mathematical transformations of these measurements. Breath detection can be performed using an audio signal, which can be transformed into breathing rate and used as an input to the system described. Inertial measurements could be used directly, as there is likely to be more movement during the wake phases, which makes the signal directly useful.
  • EEG is used by sleep technicians to produce the ground truth input to this system, thus EEG signals are also directly applicable.
  • the intermediate probabilities and factors described above such as Apen Low Factor, SWSpx, REMWakepx, LF/HF Low Factor, WindowValueIncreaseEstimation Factor are also used to train and classify stages.
  • functions representing EEG based stage probabilities, for increased delta waves or alpha waves observed, and EOG based stage probabilities for REM stages are created and used as an input to the stage classifier. This can allow the same fusion method described herein to predict sleep stages using new sensors as the technology to make them suitable to a particular application emerges.
  • a sleep stage probability function over time can be used as input, where the value of this function changes based on known constraints of normal human sleep. For example, deep sleep is more likely to occur earlier in sleep, and thus a deep sleep probability function would have a high value when sleep is first detected, and trail off toward zero over time or as deep sleep phases are detected.
  • This relative-time sleep stage function can indicate that REM periods, especially longer ones, are much more likely in the second half of the sleep session, with some smaller REM periods in the first half.
  • a function representing durations of a sleep stage in the time neighborhood can be used to assign the probabilities of longer deep sleep sessions, with possible interruptions by other stages, in the first half of the night, and similar sessions of REM in the second half. Additionally, it can be used to encode the fact that REM sessions of greater than 1 hour are highly unlikely and may suggest another stage such as Wake.
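A sketch of such time-dependent prior functions; the exponential decay, the 90-minute half-life, and the linear REM ramp are illustrative assumptions, not values from the text.

```python
def deep_sleep_prior(minutes_since_sleep_onset: float,
                     half_life_min: float = 90.0) -> float:
    """Deep-sleep prior that starts high and trails toward zero over time,
    reflecting that deep sleep is more likely early in the night."""
    return 0.5 ** (minutes_since_sleep_onset / half_life_min)

def rem_prior(minutes_since_sleep_onset: float,
              total_sleep_min: float = 480.0) -> float:
    """REM prior that grows through the night: REM, especially longer REM,
    is much more likely in the second half of the sleep session."""
    frac = min(1.0, minutes_since_sleep_onset / total_sleep_min)
    return 0.2 + 0.8 * frac   # small but nonzero early, highest late
```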
  • a final decision for sleep stage at some time can be made.
  • a stage is chosen for each point in time as the max over s of P s (t).
  • constraints can be applied when making a decision. Constraints can include those which do not allow specific transitions to occur since they are unlikely based on knowledge of human sleep behavior. For example, a transition from awake to deep sleep is unlikely, so even when such a deep phase is computed to have the highest probability using this method, the next highest probability stage is chosen due to this constraint.
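The constrained max-over-s decision described above can be sketched as follows; the disallowed-transition set is reduced to the single Wake-to-Deep example given in the text, and real implementations would carry more rules.

```python
def choose_stage(P: dict, previous: str,
                 disallowed=frozenset({("Wake", "Deep")})) -> str:
    """Pick the maximum-probability stage from P (stage -> P_s(t)), skipping
    transitions that are unlikely in normal human sleep. When the top stage
    is disallowed, fall back to the next-highest-probability stage."""
    for stage in sorted(P, key=P.get, reverse=True):
        if (previous, stage) not in disallowed:
            return stage
    return previous   # every candidate disallowed: keep the previous stage
```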
  • a neural network can be used to compute a final decision for sleep stage.
  • the input to this network would be the probability functions P s (t) for some time t, as well as some number of decisions for previous times.
  • This network would be trained using the same ground truth data used to train the classification system described.
  • the output from the network would be a sleep stage decision for a single point in time.
  • the methods of combining multiple inputs can be used to classify finer stages such as S1, S2, SWS, REM, and Wake as data is available. They can also be used to train and classify simpler, high-precision Wake vs Sleep classification. Additionally, this can be performed by combining all the sleep-specific stages of light, deep, and REM into a single SLEEP category after classification.
  • FIG. 9 shows the method's performance compared to sleep studies in a sleep clinic.
  • the system takes as input the two-lead EKG signal in the PSG Sleep Studies.
  • the techniques disclosed herein and results of sleep lab manual analysis were found to agree a majority of the time, for example, 95% of the time for Wake, 82% of the time for REM, and 62% of the time for SWS.
  • the source of the data can be from PSGs in a sleep laboratory or sensors worn on the body.
  • the source of the sensor data is a wearable device.
  • the methods can be used on any device providing EKG signals or derived heart rate.
  • the data could also be pulse rate from devices such as pulse oximeters, pulse-meters or pulse monitors in the form of watches such as the Apple Watch, Samsung Gear watches, and other fitness monitors. Some of these devices may use photoplethysmography (PPGs) to estimate pulse rate.
  • Any off-the-shelf motion sensor with even a single axis accelerometer can be used to enable the methods disclosed herein, along with light sensors and microphones with varying sensitivities when available.
  • Any EKG source or derived heart rate sensor or pulse-rate monitor such as wearable watches can be used by the methods disclosed herein to analyze sleep events and detect sleep staging.
  • the microphone audio signals when available, can be used to detect sleep apnea events and breathing patterns and rates.
  • This can be implemented using pattern matching methods or neural networks trained on data from patients with obstructive sleep apnea. In this process, stretches or windows of audio data are normalized to a fixed time length and tested on neural networks trained for apnea detection.
  • the above sleep stage classification functions can be improved using factors that represent that apnea events are much less likely during deep sleep. This data is available in several databases of clinical trials.
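The audio preprocessing described above (normalizing stretches of audio to a fixed time length before testing them on a trained network) can be sketched as follows; linear interpolation and amplitude normalization are assumptions, since the text does not specify the normalization method.

```python
import numpy as np

def normalize_audio_window(window: np.ndarray, target_len: int = 1024) -> np.ndarray:
    """Resample a variable-length stretch of audio to a fixed time length,
    then scale it to unit peak amplitude, ready for an apnea-detection
    network expecting fixed-size inputs."""
    src = np.linspace(0.0, 1.0, num=len(window))
    dst = np.linspace(0.0, 1.0, num=target_len)
    resampled = np.interp(dst, src, window)      # linear time normalization
    peak = np.max(np.abs(resampled))
    return resampled / peak if peak > 0 else resampled
```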
  • HRV can further be used to bolster these methods, tracking the characteristics in heart rate patterns and HRV frequency spectra when the subject stops breathing, restarts breathing, and during other apnea-related events.
  • accelerometer or gyroscope data can be used to improve discrimination between Wake and REM stages. In addition, they can be used to approximate sleep and wake states based on motion detection. Step detection, where the posture of the person has changed to an upright position, and detection of walking signatures can confirm the subject to be awake, except in the extreme case of sleep walking. These can be used to enhance REMvsWakepx. Movement while lying down or sitting can be used to determine whether a person is asleep or merely lying down or sitting still. Movement variances and magnitudes can also be used to determine how restful a subject's sleeping patterns are. This can be achieved by aggregating motion sensor data, accelerometer and gyroscope data, in epochs over the sleep session.
  • this epoch size is 30 seconds.
  • the variances, maximums, minimums, and averages are computed for each epoch and then aggregated into bigger windows.
  • this larger window can be 5 minutes.
  • This hierarchical comparison and aggregation can be used for larger windows, such as combining 5-minute windows into 30-minute windows. This enables movements and their effects to be analyzed by logic and reason in these larger windows. This allows a small movement while changing posture, preceded and followed by no movement, to be categorized as a posture change rather than a long wake state. Multiple occurrences of movement are therefore weighted higher than single spikes of movement, which may occur while changing position or from other movement in the sleep environment unrelated to the subject.
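The epoch aggregation above can be sketched as follows, using the 30-second epochs and 5-minute windows mentioned in the text; the particular statistics kept per window, and the variance-based activity test, are illustrative assumptions.

```python
import numpy as np

def aggregate_epochs(samples: np.ndarray, fs: float,
                     epoch_s: float = 30.0, window_s: float = 300.0) -> dict:
    """Aggregate motion-sensor magnitudes into 30-second epochs, then combine
    epochs into larger 5-minute windows, as described above."""
    per_epoch = int(epoch_s * fs)
    n_epochs = len(samples) // per_epoch
    epochs = samples[:n_epochs * per_epoch].reshape(n_epochs, per_epoch)
    # Per-epoch statistics: variances, maximums, minimums, averages.
    stats = {
        "var": epochs.var(axis=1), "max": epochs.max(axis=1),
        "min": epochs.min(axis=1), "mean": epochs.mean(axis=1),
    }
    per_window = int(window_s / epoch_s)     # 10 epochs per 5-minute window
    n_win = n_epochs // per_window
    keep = n_win * per_window
    # Counting active epochs per window weights repeated movement higher
    # than a single spike, e.g., one brief posture change.
    active = stats["var"][:keep] > stats["var"].mean()
    return {
        "window_max": stats["max"][:keep].reshape(n_win, per_window).max(axis=1),
        "window_active_epochs": active.reshape(n_win, per_window).sum(axis=1),
    }
```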
  • motion data can include raw acceleration data. From this data, and potentially other sensors, various cardinal postures can be detected. From the accelerometer data and/or the detected postures, a sleep/wake prediction can be generated. This information may be used to validate staging results using heart rate, as a factor in the calculations for stage likelihoods, as input to a sleep stage classifier, or in another manner.
  • the source of the data is a miniature wearable device 1100 , referred to herein as a Body Data Recorder (BDR).
  • the BDR has been reduced to practice as a dual-sided, 4-layer circuit board designed to achieve a miniature form factor (about 1 square inch), housing 3-axis accelerometers, 3-axis gyroscopes, microphones, heart rate sensors, temperature sensors, and light sensors, among others.
  • An image of a U.S. quarter dollar coin is included in FIGS. 11A and 11B as a visual size reference showing the scale of the device 1100 .
  • the combination of sensors has been selected since the data provided can be used for detection of sleep staging with acceptable scoring accuracy, and for recording ambient signals of light and sound to assess the suitability of the environment and in tracking adherence to clinician recommendations.
  • the device also supports detecting and quantifying apnea events and snoring using a built-in microphone. Specifically, the above features can be achieved and provided in a small form factor at an affordable price. The small form factor allows the device to be worn in a location such as the chest all night or day for multiple nights without causing discomfort. This can be critical to the target application.
  • the body data recorder has been designed to be small, low power, and to fit comfortably in clothing, where the contacts can even be provided using conductive polymer.
  • the BDR can operate independent of other devices, without any cords or cables connected to a computer or other device not worn by the patient.
  • the parts selected and populated on the dual-sided board for this embodiment include the following specific components, though alternatives will be apparent to those skilled in the art: an AD8232, which provides single-lead heart rate monitoring and serves as an integrated conditioning block for ECG and instrumentation amplification, sampling up to 2 kHz; an Invensense 6-degree-of-freedom Inertial Measurement Unit (MPU6500); a Knowles MEMS omni-directional microphone with a 100 Hz to 10 kHz frequency range and −18 dB ±3 dB sensitivity; a TEMT-6000 high-photo-sensitivity ambient light sensor from Vishay Electronics; a TMP102 temperature sensor from Texas Instruments; a U.FL connector for delocalization of sensor pads via a miniature RF connector for high-frequency signals up to 6 GHz, manufactured by Hirose Electric Group; communications via microSD and low-power Bluetooth (BLE) from Microchip; a 1000 mAh Li-polymer battery; sensor cables and sensor pads (3M); and 3-D printed prototype packaging.
  • a circuit board 1101, on which data storage, sensors, processing devices, etc. are mounted, can be placed in a protective housing.
  • the two EKG sensor pads 1102 can be worn below the pectoral muscles (i.e., on the rib cage—left and right). This location has been selected after experimentation to reduce muscle activation artifacts in the EKG signal.
  • the system produces similar results when worn with the right electrode on the pectorals and the left under the pectorals, and with several other electrode placements.
  • these sensor pads are provided via conductive polymer that can be built into a t-shirt or garment.
  • the device can support both 3-lead and 2-lead EKG sensor pads, where the 3rd EKG lead is connected to the thigh for higher EKG amplitude.
  • the device is equipped with Wi-Fi to communicate with routers and mobile phones to transfer data.
  • Wi-Fi can be used in this embodiment to upload sensor or processed data to servers for storage or to execute further processing to achieve goals such as stage classification or apnea event detection.
  • the light sensor is a TAOS TFL2561FN, and the microphone an Invensense INMP401.
  • a digital signal processor from Microchip is used to ensure high sampling rate logging to the microSD card of all sensors, long device life and upload and transfer via Wi-Fi.
  • An ESP8266 Wi-Fi module is used for the Wi-Fi communications.
  • the methods described herein can be implemented completely or partially on these or other similar devices, or on mobile devices, such as smartphones.
  • the processing can be shared among these devices and even a remote server to aid processing power and device battery life.
  • a comprehensive sleep monitoring and analytics system can include a device that records sleep data, a mechanism for the subject to enter data, a communications link for data and entries to be transferred to a computer system, the methods disclosed herein to convert the data into sleep results, such as sleep stages and events, and methods to generate clinical reports from the data and results.
  • a body data recorder 1200 may include a t-shirt 1202 or other garment that houses multiple sensors.
  • the t-shirt can be worn by the subject during sleep sessions, or even all day if the subject chooses.
  • a system 1300 for measuring sleep information and performing sleep staging can include a BDR 1302 with multiple sensors, a mobile device 1304 that communicates with the BDR 1302 , and a server 1306 .
  • the data from multiple sensors of the BDR 1302 is recorded and can be transferred to a mobile application of the mobile device 1304 via a wireless link such as Bluetooth or low energy Bluetooth.
  • the mobile application also enables the subject to make the diary entries of sleep times, moods, habits, and sleep satisfaction questionnaires.
  • the mobile application also enables the subject to take reaction time tests.
  • the mobile application can upload the data to the server 1306 over a link such as Wi-Fi or cell links.
  • the data can be transferred from the device or the mobile app via Bluetooth or Wi-Fi to a computer system or via a cable connection.
  • the server 1306 can host the methods to translate received data into sleep results, i.e., sleep stages (hypnograms) and sleep events (e.g., apnea detection, sleep walking, snoring).
  • raw sensor data is provided to the server 1306 .
  • the BDR 1302 may perform at least some of the processing of the heart beat data, which reduces the amount of data that must be transferred.
  • the BDR 1302 and the server 1306 may together perform the functions described with respect to FIGS. 1 and 2 .
  • the methods to generate sleep results and reports can be housed on the computer system. They can additionally or alternatively be hosted on or communicated to the mobile application of the mobile device 1304 or even on the BDR 1302 .
  • the results, i.e., hypnograms, long term views, sleep events, etc., can be displayed daily and over long periods of time for the subject to view, along with their Sleep Score, Subjective Sleep Score, and any other sub-scores. These can also be plotted against performance and habits to enable the subject to identify trends in their sleep and correlations between different sleep variables (e.g., total sleep time vs. coffee intake).
  • the system may be prescribed to the subject by a clinician to be used for a certain period of time, usually multiple nights (e.g., 7-10 days).
  • the subject can take the system to their home environment, or elsewhere, and wear the device while sleeping and interact with the app.
  • the data is processed by the mobile application and uploaded to the server over the communications link, where the data is converted into a clinical report that is delivered to the clinician's office.
  • the report can include adherence tracking to assist in behavior changes and therapy and efficacy tracking.
  • the wearable device can also be worn beyond sleep sessions or even all day, where coupled with analytical processes, activity and performance can also be monitored.
  • occurrences of stress and anxiety are also detected and scored. Stress and anxiety can be distinguished from high intensity physical exercise using a combination of HRV analysis and neural networks. For specific cases such as developmental disorders, patterns such as rocking, seizures etc. can be detected and even mapped by time and location when location systems are available to the analytics.
  • the clinician can view the hypnograms revealing the sleep stages as a part of a report generated. These can include hypnograms for multiple days.
  • based on these, the clinician can infer whether the subject's sleep architecture has abnormalities, e.g., a lack of SWS sleep, low REM sleep, a large number of awakenings, apnea events, etc.
  • the clinician may use the results of staging and events to offer the subject actionable suggestions (e.g., Cognitive Behavioral Therapy—CBT). These may include actions such as decreasing the amount of time spent in bed to reduce laying in bed awake, lowering the temperature of the room, increasing exercise in the morning, etc.
  • the methods herein can be implemented on devices such as smartwatches, smartphones or health monitors that provide pulse rate or heart rate and possibly motion data.
  • the methods can be used to determine stages and events, score sleep quality and display it to the user as part of a health monitoring application.
  • the user can adjust their lifestyle patterns based on this sleep information displayed over multiple days or longer periods of time. Additionally, this information can be supplied to clinicians through patient-clinician correspondence or integration into Electronic Health Records (EHRs). This can be even more relevant for consumer devices with FDA clearance or similar certification aimed more at clinical data.
  • the results of staging in consumer and other applications can be used by a system for various applications. Some of these may include sending a message to a third-party system about what sleep stage the subject is in, waking up consumers who wish to lucid dream during certain REM or dreaming stages, and displaying sleep stages to the subject the next morning or in real-time. Displays can be made to the subject for the results of the methods disclosed herein using any computer system with a screen such as a laptop or mobile phone.
  • Embodiments of the invention and all of the functional operations described in this specification may be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them.
  • Embodiments of the invention may be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus.
  • the computer readable medium may be a non-transitory computer readable storage medium, a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them.
  • data processing apparatus encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers.
  • the apparatus may include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
  • a propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
  • a computer program (also known as a program, software, software application, script, or code) may be written in any form of programming language, including compiled or interpreted languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
  • a computer program does not necessarily correspond to a file in a file system.
  • a program may be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code).
  • a computer program may be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • the processes and logic flows described in this specification may be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output.
  • the processes and logic flows may also be performed by, and apparatus may also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
  • a processor will receive instructions and data from a read only memory or a random access memory or both.
  • the essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data.
  • a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks.
  • a computer need not have such devices.
  • a computer may be embedded in another device, e.g., a tablet computer, a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few.
  • Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media, and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks.
  • the processor and the memory may be supplemented by, or incorporated in, special purpose logic circuitry.
  • embodiments of the invention may be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user may provide input to the computer.
  • Other kinds of devices may be used to provide for interaction with a user as well; for example, feedback provided to the user may be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form, including acoustic, speech, or tactile input.
  • Embodiments of the invention may be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user may interact with an implementation of the invention, or any combination of one or more such back end, middleware, or front end components.
  • the components of the system may be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • the computing system may include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network.
  • the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • In each instance where an HTML file is mentioned, other file types or formats may be substituted. For instance, an HTML file may be replaced by an XML, JSON, plain text, or other type of file. Moreover, where a table or hash table is mentioned, other data structures (such as spreadsheets, relational databases, or structured files) may be used.

Abstract

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for determining sleep stages and sleep events using sensor data. In some implementations, sensor data is obtained over a time period while a person is sleeping. The time period is divided into a series of intervals. Heart rate and changes in the heart rate are analyzed over the intervals. Based on the analysis of the heart rate changes, sleep stage labels are assigned to different portions of the time period. An indication of the assigned sleep stage labels is provided.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority to U.S. Provisional Application No. 62/211,261, filed on Aug. 28, 2015. The entire contents of U.S. Provisional Application No. 62/211,261 are incorporated herein by reference.
  • GOVERNMENT RIGHTS
  • The subject matter disclosed herein was made with government support under the award/contract/grant number 1416220, awarded by the National Science Foundation. The U.S. government may have certain rights in the subject matter disclosed herein.
  • TECHNICAL FIELD
  • The present disclosure relates to systems to monitor and analyze the quality and quantity of a person's sleep.
  • BACKGROUND
  • A person's sleep can be assessed with a polysomnogram (PSG), which is a multi-channel procedure carried out in a sleep laboratory. Typically, the procedure requires labor-intensive technician support, resulting in an expensive process. The studies are typically performed for a single night in a Sleep Laboratory and sometimes also during the day to study daytime sleepiness, e.g., with a Multiple Sleep Latency Test (MSLT). The results of the sleep study are primarily (i) indices related to apnea events, such as an Apnea-Hypopnea Index (AHI), and (ii) sleep staging outputs that indicate the stages of sleep that occurred.
  • Sleep stages reported as a result of a sleep study often follow either the Rechtschaffen and Kales (R&K) scoring system or the American Academy of Sleep Medicine (AASM) system established in 2007. In the R&K system the stages of sleep are S1, S2, S3, S4, REM (Rapid Eye Movement), and Wake. In the AASM system, S3 and S4 were combined into a single stage, N3, with the stages of sleep being N1, N2, N3, REM, and Wake. However, a typical PSG requires multiple EEG (electroencephalogram) channels, an EOG (electrooculogram), an EKG (electrocardiogram), an EMG (electromyogram), and analysis of data from other sensors. As a result, a PSG can be a rather invasive procedure and is typically administered for only a single session or two.
  • SUMMARY
  • The present application describes methods and systems for automated fusion of sensor data to determine sleep staging and sleep metrics. In some implementations, a computer-implemented method includes obtaining sensor data from biometric and environmental sensors, and using sensor fusion to predict stages of sleep, detect sleep events, determine sleep metrics and score the sleep sessions. For example, sleep stages can be determined from heart rate measurements and inertial sensors.
  • The techniques described in the present application can extend the benefits of sleep analysis with clinically validated mechanisms to achieve similar results as those obtained in a sleep lab in the comfort of the home environment over multiple nights. The home environment is the normal sleep environment of the subject, providing a more realistic assessment of the subject's sleep behavior. Recording over several nights permits a more accurate analysis that can be used to extract long-term sleep patterns.
  • The systems described herein can use a variety of techniques to analyze sleep-related sensor data and estimate sleep stages and sleep events from the data. The stages of a person's sleep can be determined using, for example, one or more of heart rate data, heart rate variability (HRV) data, and sensor data indicating motion. In some implementations, sleep stages including REM, slow wave sleep or deep sleep (N3), light sleep (N1, N2) and Wake are estimated using processes including but not limited to sensor signal pattern matching, learning, HRV frequency analysis, sensor fusion, approximate entropy detection, and/or rule-based decision making.
  • The processing techniques discussed herein can address several technical challenges in the implementation of a small wearable body data recorder that is not tethered to external computers. In particular, the processing techniques discussed herein allow the small, battery-operated body data recorder to produce analysis results that can effectively identify sleep stages while minimizing power consumption, network communication bandwidth, and computation, while maintaining a high degree of accuracy in assigning sleep stages. At the EKG sampling rates needed for effective sleep stage determination, e.g., often between 200 Hz and 2000 Hz, a significant amount of data is generated. The body data recorder may communicate with other devices over a low-power or low-bandwidth wireless connection, such as Bluetooth, which can significantly constrain the amount of data that can be reliably transmitted. The body data recorder may be designed to avoid wired connections with outside devices to avoid inconveniencing the subject, since wires connecting to other devices could potentially alter the subject's behavior and negatively affect the measurements. Further, transferring significant amounts of data is power inefficient for the battery-operated body data recorder. By performing at least the initial processing of EKG and heartbeat data locally within the wearable body data recorder, the amount of data that needs to be stored or transmitted and the amount of power consumed during transmission of data is minimized. Rather than storing or sending data for every EKG sample, the processing techniques discussed herein allow the results of beat detection, frequency analysis, and approximate entropy calculations, among others, to be stored and transmitted instead. The body data recorder may send this data for further processing, e.g., by a mobile device or a remote computer system.
  • In some implementations, the body data recorder itself completes the processing discussed here to generate sleep stage labels. The use of a trained classifier, together with the generated histograms and the other values described, provides an accurate technique for determining sleep stages while also providing the computational efficiency that allows the techniques to be implemented on a small, power- and processing-constrained device.
  • In some implementations, the estimation of sleep stages is made by a sleep stage classifier that has been trained using examples of sensor data and corresponding data indicating actual sleep stages.
  • Methods and systems are described for classifying segments of a set of signals to a set of discrete decisions about said signals in the form of sleep stages. The method includes a procedure for training the system using examples of input signals representing sensor measurements and corresponding data indicating actual sleep stages. The method can include evaluating a set of input signals using the trained system to predict sleep stages.
  • In some implementations, a computer-implemented method may use raw heart rate, smoothed heart rate, LF/HF, and approximate entropy as input signals to said system. Signals can be passed along with ground truth to the training system to generate a sleep stage evaluation engine, which can evaluate new input signals to produce a prediction of sleep stage.
  • In some implementations, sensor data may be obtained from a wearable device that houses sensors to capture EKG data and sensors to capture motion data such as accelerometers and gyroscopes.
  • In some implementations, instead of using EKG data, an estimation of heart rate is directly taken as input to the method. Both EKG data and heart rate data can be taken from sources that are worn or not in contact with the subject. For example, a sensor using cardioballistic EKG technology may be included in bedding that does not require skin contact.
  • In some implementations, EKG data, motion data, breathing rate data and sound data, are provided as input to the engine. In some implementations, the heart rate data may be obtained in the form of pulse-rate data from a wrist-worn device or pulse-oximeter.
  • In some implementations, sensor data or biometric data is obtained from polysomnography data recorded in a sleep laboratory.
  • In one general aspect, a method performed by one or more computing devices includes: obtaining, by the one or more computing devices, sensor data generated by one or more sensors over a time period while a person is sleeping; dividing, by the one or more computing devices, the time period into a series of intervals; analyzing, by the one or more computing devices, heart rate and the changes in the heart rate of the person indicated by the sensor data over the intervals; based on the analysis of the heart rate changes, assigning, by the one or more computing devices, sleep stage labels to different portions of the time period; and providing, by the one or more computing devices, an indication of the assigned sleep stage labels.
  • Other embodiments include corresponding systems, apparatus, and computer programs, configured to perform the actions of the methods, encoded on computer storage devices. A system of one or more computing devices can be so configured by virtue of software, firmware, hardware, or a combination of them installed on the system that in operation cause the system to perform the actions. One or more computer programs can be so configured by virtue of having instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions. Implementations can include one or more of the features discussed below.
  • In some implementations, the sleep stage labels include wake, rapid eye movement (REM) sleep, light sleep, and deep or slow-wave sleep.
  • In some implementations, assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to REM sleep, and (ii) an indication whether the interval is classified as REM sleep.
  • In some implementations, assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to light sleep, and (ii) an indication whether the interval is classified as light sleep.
  • In some implementations, assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to slow-wave sleep, and (ii) an indication whether the interval is classified as slow-wave sleep.
  • In some implementations, assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to a wake stage, and (ii) an indication whether the interval is classified as a wake stage.
  • In some implementations, assigning the sleep stage labels includes determining, for each of the intervals, (i) a likelihood that the interval corresponds to the person being awake or asleep, and (ii) an indication whether the person is classified as being awake or asleep.
  • In some implementations, the sensor data includes EKG signal data from an EKG sensor.
  • In some implementations, analyzing the changes in the heart rate of the person includes determining heart rate variability characteristic scores for different portions of the sleep session, and the sleep stage labels are assigned based at least in part on the heart rate variability scores.
  • In some implementations, analyzing the changes in the heart rate of the person includes determining measures of randomness of heartbeat data for different sliding windows of the time period, wherein the sleep stage labels are assigned based at least in part on the measures of randomness.
  • In some implementations, the measure of randomness of heartbeat data is computed by determining a measure of approximate entropy based on the heartbeat data.
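For illustration, the approximate entropy measure referenced above can be sketched as follows. This is a standard Pincus-style ApEn computation over a heartbeat-derived series; the embedding dimension m, the tolerance r, and the default r = 0.2 × standard deviation are common-practice assumptions, as the disclosure does not fix these parameters.

```python
import numpy as np

def approximate_entropy(series, m=2, r=None):
    """Approximate entropy (ApEn) of a 1-D series; higher values
    indicate a more irregular signal. The defaults m=2 and
    r = 0.2 * std are common choices, assumed here."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std()

    def phi(m):
        # All overlapping length-m templates of the series.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # For each template, the fraction of templates within
        # Chebyshev distance r (self-matches included, as in Pincus).
        counts = np.array([
            np.count_nonzero(np.abs(templates - t).max(axis=1) <= r)
            for t in templates
        ])
        return np.mean(np.log(counts / (n - m + 1)))

    return phi(m) - phi(m + 1)
```

A regular heartbeat series yields a lower ApEn than an erratic one, which is what makes the measure useful for separating sleep stages.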
  • In some implementations, analyzing the heart rate of the person includes evaluating the value of the EKG signal in each interval with respect to the values in a neighborhood of the interval, wherein the neighborhood for each interval is a time window that extends from a first time threshold that precedes the interval to a second time threshold that follows the interval.
  • In some implementations, the obtained sensor data corresponds to a sleep session of the person; and analyzing the heart rate of the person includes evaluating an absolute value of the EKG signal with respect to the rest of the heart rate values for the sleep session and predetermined thresholds.
  • In some implementations, analyzing the changes in the heart rate of the person includes: generating a heart-rate variability (HRV) signal; performing a frequency analysis of the HRV signal; and examining a ratio of low frequency components of the HRV signal to high frequency components of the HRV signal.
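One way to compute the LF/HF ratio described above is sketched below using Welch's method. The band boundaries (0.04–0.15 Hz for LF, 0.15–0.40 Hz for HF, following common HRV conventions) and the 1 Hz evenly resampled HRV series are assumptions, since the passage does not specify them.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

# Standard HRV band conventions; these boundaries and the 1 Hz
# sampling rate are assumptions, not values taken from the text.
LF_BAND = (0.04, 0.15)
HF_BAND = (0.15, 0.40)

def lf_hf_ratio(hrv_series, fs=1.0):
    """LF/HF spectral power ratio of an evenly sampled HRV signal."""
    x = np.asarray(hrv_series, dtype=float)
    # Welch periodogram of the mean-removed signal.
    freqs, psd = welch(x - x.mean(), fs=fs, nperseg=min(256, len(x)))

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return trapezoid(psd[mask], freqs[mask])

    return band_power(*LF_BAND) / band_power(*HF_BAND)
```

A signal dominated by low-frequency oscillation yields a ratio above 1, while high-frequency (respiratory-band) dominance drives it below 1.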
  • In some implementations, the sensor data includes movement data from one or more motion sensors; and the method includes determining correlations of the movement data with sensor data indicating heart beats of the person, wherein the sleep stage labels are assigned based at least in part on the movement data and the correlations between the movement data and the sensor data indicating the heart beats of the person.
  • In some implementations, assigning a sleep stage label to a particular portion of the time period includes: obtaining likelihood scores from multiple different sleep stage analysis functions; and performing a sleep stage classification decision for the particular portion of the time period based on a combination of the likelihood scores from the multiple different sleep stage analysis functions.
  • In some implementations, multiple sleep stage functions and likelihoods are fused into a single sleep stage label by evaluating a classification function.
  • In some implementations, assigning a sleep stage label to a particular portion of the time period includes: providing multiple distinct signals that are separately correlated to sleep stages as input to a sleep stage classifier; and obtaining a single sleep stage label for the particular portion of the time period based on output of the sleep stage classifier.
  • In some implementations, the method includes: obtaining data sets indicating signals or function outputs, the data sets being labeled with sleep stage labels corresponding to the data sets; and training a sleep stage classifier based on the data sets to produce output indicating likelihoods that input data corresponds to a sleep stage from among a set of sleep stages.
  • In some implementations, the method includes: determining, for each of the signals or function outputs, a signal range histogram for the signal or function output; and using the signal range histograms to train a sleep stage classifier.
  • In some implementations, the method includes determining a signal range histogram for a signal, wherein the signal range histogram indicates, for each particular signal range of multiple signal ranges, a count of examples which have a particular sleep label assigned and have a value of the signal in the particular signal range.
  • In some implementations, determining the signal range histogram for the signal includes determining counts for each sleep label of the multiple sleep labels, for each signal range of the multiple signal ranges.
  • In some implementations, the method includes training or using a sleep stage classifier configured to receive, as input, signals indicating measurements during a sleep session of a person; and wherein the sleep stage classifier is configured to generate, for each interval of the sleep session, (i) a sleep stage probability distribution that indicates a likelihood for each of multiple different sleep stages, and (ii) a sleep stage label.
  • In some implementations, the sleep stage classifier is configured to generate the sleep stage probability distribution and the sleep stage label for a particular interval by: for each of the signals provided as input to the sleep stage classifier, accessing a histogram for the signal and determining a count indicated by the histogram for each sleep stage label corresponding to a value of the signal during the particular interval; and computing, for each of the sleep stages, a total count across all signals.
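A minimal sketch of the histogram-based counting scheme described above, assuming a small fixed label set and caller-supplied bin edges (neither is specified in this summary): training tallies, per signal and per value bin, how often each sleep stage occurred; classification sums those counts across all input signals and normalizes them into a probability distribution.

```python
import numpy as np
from collections import defaultdict

STAGES = ["wake", "rem", "light", "deep"]  # label set assumed from the text

class HistogramSleepStager:
    """Sketch of the signal-range-histogram classifier described above.
    Bin edges per signal are an assumption; the text does not fix bin
    widths or the exact combination rule."""

    def __init__(self, bin_edges):
        # bin_edges: dict of signal name -> 1-D array of bin boundaries
        self.bin_edges = bin_edges
        # counts[signal][bin_index][stage] -> training count
        self.counts = {s: defaultdict(lambda: dict.fromkeys(STAGES, 0))
                       for s in bin_edges}

    def train(self, examples):
        # examples: iterable of ({signal_name: value}, stage_label)
        for signals, stage in examples:
            for name, value in signals.items():
                b = int(np.digitize(value, self.bin_edges[name]))
                self.counts[name][b][stage] += 1

    def classify(self, signals):
        # Sum, for each stage, the histogram counts across all signals,
        # then normalize into a probability distribution and pick a label.
        totals = dict.fromkeys(STAGES, 0)
        for name, value in signals.items():
            b = int(np.digitize(value, self.bin_edges[name]))
            for stage in STAGES:
                totals[stage] += self.counts[name][b][stage]
        z = sum(totals.values()) or 1
        probs = {s: c / z for s, c in totals.items()}
        return probs, max(probs, key=probs.get)
```

The classifier thus produces both a per-interval probability distribution and a single sleep stage label, matching the two outputs described above.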
  • In some implementations, data used to train a sleep stage classifier and to classify sleep stage labels includes (i) signals including data used in polysomnograms and breathing rate data, and (ii) time-correlations of sleep stages among the signals.
  • In some implementations, analyzing the changes in the heart rate of the person includes analyzing overlapping windows of the sensor data.
  • In some implementations, the sensor data includes heartbeat data from a heartbeat sensor.
  • In some implementations, the heartbeat sensor is an infrared sensor or an optical sensor.
  • In some implementations, the heartbeat sensor is a pulse rate sensor.
  • In some implementations, analyzing the changes in the heart rate of the person includes sampling the EKG signal at a rate between 100 Hz and 2 kHz; and applying a low-pass filter to the sampled signal.
  • In some implementations, dividing the time period into a series of intervals includes dividing the time period into adjacent periods each having a same duration.
  • In some implementations, dividing the time period into a series of intervals includes defining overlapping sliding windows having the same duration, with a sliding window being centered at each sample of heart rate data.
  • In some implementations, analyzing the changes in heart rate includes detecting heartbeats indicated by the sensor data.
  • In some implementations, detecting the heartbeats includes: detecting peaks in an EKG signal; determining a derivative signal from the EKG signal by determining, for each sample of the EKG signal, a difference between a current sample and the immediately previous sample; and identifying R waves based on applying one or more thresholds to the derivative signal.
  • In some implementations, detecting the heartbeats includes: identifying local maxima and local minima in an EKG signal; computing ranges between the maximum value of the EKG signal and minimum value of the EKG signal within each of multiple windows of the EKG signal; and assigning, to each of the multiple windows, a probability of the window representing a beat.
  • In some implementations, detecting heartbeats includes: evaluating data indicating a set of heartbeats determined by a heartbeat detection process; determining, based on the evaluation, a likelihood score indicating a likelihood that a beat was omitted from the detected heartbeats by the heartbeat detection process; and, based on the likelihood score, labeling additional beats where a missed beat is computed to be likely.
  • In some implementations, analyzing the changes in heart rate includes determining R-R intervals for the detected heartbeats, the R-R intervals indicating an amount of time between adjacent R wave peaks in the EKG signal.
  • In some implementations, analyzing the changes in heart rate includes averaging the R-R intervals for beats detected within each second and then computing a measure of beats-per-minute corresponding to the average R-R interval.
  • In some implementations, analyzing the changes in heart rate includes determining an average heart rate for each of multiple windows of the EKG signal.
  • In some implementations, each of the multiple windows has a same duration, and the duration of each window is between 0.5 seconds and 5 seconds.
  • In some implementations, obtaining sensor data generated by one or more sensors over the time period while a person is sleeping includes obtaining sensor data for a sleep session that represents a single night.
  • In some implementations, obtaining the sensor data, dividing the time period into a series of intervals, and analyzing the heart rate and the changes in the heart rate are performed by a wearable, battery-operated device, wherein the wearable device includes sensors that detect the sensor data.
  • In some implementations, assigning the sleep stage labels and providing the indication of the assigned sleep stage labels are performed by the wearable device.
  • In some implementations, the method includes providing, by the wearable device, results of analyzing the heart rate and the changes in the heart rate to a second device for transmission to a server system; and wherein assigning the sleep stage labels and providing the indication of the assigned sleep stage labels are performed by the server system.
  • In some implementations, the second device is a mobile phone, and providing the indication of the assigned sleep stage labels includes providing, by the server system, the indication to the mobile phone for display by the mobile phone.
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to limitations that solve any or all disadvantages noted in any part of this disclosure.
  • The details of one or more embodiments of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages of the subject matter will become apparent from the description, the drawings, and the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a flow diagram that illustrates an example of a method for estimating sleep stages from sensor data.
  • FIG. 2 is a flow diagram that illustrates an example of a method for using heart rate variability frequency analysis to determine sleep stage patterns.
  • FIG. 3 is a chart illustrating an example of raw EKG data, filtered signals, and detected heart beats generated by a body data recorder (BDR) such as the one shown in FIGS. 11A and 11B.
  • FIG. 4 is a chart illustrating a zoomed-in view of a portion of the data from the chart of FIG. 3.
  • FIG. 5 is a chart illustrating sleep stages with respect to approximate entropy (ApEn) measures.
  • FIG. 6 is a chart illustrating an example of heart rate variability (HRV) low-frequency vs. high-frequency (LF/HF) ratios and indicators of sleep stages determined from EKG signals.
  • FIG. 7 illustrates examples of charts showing HRV frequency spectra for a healthy subject for wake stages (top left), REM Stages (top right), slow wave sleep (SWS) stages (bottom left), and S2 sleep stages (bottom right).
  • FIG. 8 illustrates examples of charts showing HRV frequency spectra for a child with ADHD for wake stages (top left), REM Stages (top right), slow wave sleep (SWS) stages (bottom left), and S2 sleep stages (bottom right).
  • FIG. 9 is a chart illustrating an example of staging (Light-S1 & S2, Deep-SWS, REM, Wake) for a healthy subject using the techniques disclosed herein, versus results of manual staging. The chart demonstrates effective staging of the disclosed techniques, including for REM and SWS stages.
  • FIG. 10 is a chart illustrating correlation of motion data with Sleep/Wake detection during sleep.
  • FIGS. 11A and 11B are illustrations of an example of a wearable body data recorder.
  • FIG. 12 is an illustration of an example of a T-shirt housing wearable sensors, for use with a body data recorder.
  • FIG. 13 is a conceptual drawing illustrating example of the components of a sleep monitoring system.
  • Like reference numbers and designations in the various drawings indicate like elements.
  • DETAILED DESCRIPTION
  • It is highly desirable to be able to extract high-accuracy sleep stage information and to compute sleep architectures for a subject in a home environment using sensors that can be housed in a comfortable, wearable device. The techniques disclosed herein allow clinically validated home sleep monitoring of this type, and can also support an effective consumer electronic device to assess personalized health. Several sensors, including EEGs, heart rate (e.g., EKGs or PPGs—Photoplethysmograms) and heart rate variability (HRV), oxygen saturation, activity sensors, EMGs, etc., can be used to enable automatic sleep staging mechanisms. The selection and combination of sensors, coupled with effective processing methods, can impact the accuracy and quality of the sleep analysis and staging results. Multiple sensors can be included in a wearable device or system after taking into account their accuracy, cost, form-factor, and invasiveness.
  • Heart rate and heart rate variability signals can be used in a staging process because they closely reflect several of the signals in EEGs, which are the key signals used for staging in sleep labs. HRV information integrates sympathetic and parasympathetic activity of the autonomic nervous system, which varies across sleep stages, and therefore can be an effective indicator of sleep-wake physiology.
  • The technology provides many different features and advantages. For example, it allows a system to take an EKG signal or heart-rate signal and determine sleep stages from it. The sleep stage detected can include REM, deep or slow wave sleep, and light sleep. The sleep stages can be determined automatically by the system, within technician scoring error for manual sleep studies. Additional aspects of heart rate can be used. For example, randomness of the heart rate signal, HRV frequency analysis, and/or approximate entropy measures can be used to determine the sleep stages. As another example, relative increases in heart rate and/or absolute value of the heart rate signal may be used by the system to classify sleep into stages. The system may also convert EKG signals to heart rate measures, which can then be further processed. The heart rate data used can be obtained from any appropriate heart-rate sensing device, pulse rate sensing device, or EKG sensing device. When EKG data is of low quality, or there are gaps or noise in the data, the techniques allow the conversion from EKG signals to heart beats and heart rate even when one or more peaks or valleys in the EKG signal are missing. The system can determine likelihoods of missed beats, and then supplement the detected beats with additional beats that were likely obscured or missing in the raw data.
  • Heart rate data and other data can be provided to a trained classifier that provides outputs indicative of likelihoods that the input signals correspond to different sleep stages. A sleep stage classifier can be trained to fuse different sensor signals into a single estimate. This process may train the system using any of various signals used in polysomnograms (PSGs) such as EEGs (electroencephalograms), EMGs (electromyograms), EOGs (electrooculograms), and pulse-oximetry. Other signals, such as motion data, sound data, breathing rate data, and so on can also be used as signals to the classifier. Examples of data tagged with sleep stage labels, e.g., from polysomnograms or other sources, can be used to train the classifier to estimate sleep stages from various sets of data. In some implementations, the classifier or a decision framework can learn to fuse arbitrary sets of signals and functions that are separately related to sleep into a single sleep stage prediction.
  • The sleep stage information allows a number of other measurements to be determined. For example, a sleep onset latency, a count of awakenings and duration of awakenings during a sleep session, and a wake time can be determined. One of the valuable results is to identify REM states and wake states. The sleep staging can be enhanced using motion sensing, especially for the detection of wake stages and REM stages.
  • FIG. 1 is a flow diagram that illustrates an example of a method 100 for estimating sleep stages from sensor data. The operations of the method 100 may be performed by a wearable body data recorder, such as the one described below and illustrated in FIGS. 11A and 11B. In addition, or as an alternative, the operations may be performed by another device, such as a phone or computer, that receives sensor data from the wearable device.
  • In step 105, EKG data for a subject is acquired while the subject is sleeping. Optionally, sensor data indicating motion of the subject can also be acquired. In some implementations, method 100 can take as input an EKG signal sampled at any of multiple rates. Sampling at 100 Hz to 2 kHz or more is highly effective for heart beat detection. At lower rates, beat detection is feasible with the possibility of some missed or extra beats during noisy phases in the signal. In these cases, extra processing can be used to filter out extra beats or probabilistically detect the possibility of missed beats. A signal with some noise can be filtered using a 6th-order low-pass Butterworth filter to remove the noise and reveal the QRS complexes.
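The Butterworth filtering mentioned in step 105 could be implemented as below with SciPy. The 40 Hz cutoff is an illustrative assumption chosen to retain QRS-complex energy while removing higher-frequency noise; the passage specifies the filter order but not a cutoff for the raw EKG.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_ekg(ekg, fs, cutoff_hz=40.0, order=6):
    """6th-order low-pass Butterworth filtering of a raw EKG signal.
    The 40 Hz cutoff is an assumption, not taken from the text."""
    b, a = butter(order, cutoff_hz / (fs / 2.0), btype="low")
    # Zero-phase (forward-backward) filtering, so R-peak timing is
    # not shifted by the filter's phase delay.
    return filtfilt(b, a, np.asarray(ekg, dtype=float))
```

Zero-phase filtering is a deliberate choice here: a one-pass filter would delay the R-peaks and bias the R-R intervals computed later.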
  • In step 110, motion quantity and quality are determined, and events in epochs are determined. Data from one or more motion sensors, such as an accelerometer or gyroscope, can be recorded by a body data recorder. The timing of movement data can be recorded so it can be correlated with the EKG signal and/or heartbeat sensor signal detected at the same time. Other sensor data can also be detected and aligned with heart data. For example, data from a microphone, a light sensor, and other sensors can be detected and correlated with heart data for use in determining sleep stages.
  • In step 115, heart beats and R-R intervals are determined from the sensor data. The R-R intervals represent periods between the peaks of adjacent R waves of the EKG signal. The EKG signal can be processed by a beat detection method that examines the signal for the QRS complexes. This can be achieved using peak-detection where the R-peaks are detected above a threshold value with a minimum threshold separation in time to prevent qualification of invalid peaks. In some implementations, where the EKG source shows inversion of the signal, or signals that are biased towards the peaks, the derivative of the signal is first computed by finding the difference of every sample from the previous sample in the EKG time series. The peak detection can then be applied on this derived signal using peak and separation thresholds customized for the derivative signal.
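The threshold-based R-peak detection in step 115 can be sketched as follows. The amplitude threshold (60% of the signal maximum) and the 300 ms minimum separation are illustrative assumptions; the text only requires a threshold value and a minimum separation in time.

```python
import numpy as np

def detect_r_peaks(ekg, fs, height_frac=0.6, min_separation_s=0.3):
    """Peak detection with an amplitude threshold and a minimum
    separation in time. The 60%-of-maximum threshold and 300 ms
    separation are illustrative assumptions."""
    x = np.asarray(ekg, dtype=float)
    threshold = height_frac * x.max()
    min_sep = int(min_separation_s * fs)
    peaks = []
    for i in range(1, len(x) - 1):
        # A local maximum above the threshold qualifies only if it is
        # at least min_sep samples after the last accepted peak.
        if x[i] >= threshold and x[i] >= x[i - 1] and x[i] > x[i + 1]:
            if not peaks or i - peaks[-1] >= min_sep:
                peaks.append(i)
    return np.array(peaks)
```

For an inverted or biased EKG source, the same routine can be applied to the first-difference of the signal with thresholds tuned for that derivative, as the passage describes.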
  • The results of beat detection can then be used to determine the R-R intervals, i.e., the time between consecutive beats. Iterating through the intervals, the heart rate can be computed as a time series where the heart rate for an interval of i seconds is 60/i bpm. In some implementations, the heart rate is computed every second (i.e., at 1 Hz). This can be achieved by averaging the R-R intervals for beats detected within each second and then computing the beats-per-minute (bpm) corresponding to the average R-R interval. One-second time periods with no R-R intervals can be replaced with the preceding second's bpm computation.
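The per-second heart rate computation just described might look like the following; the default rate used before the first detected beat is an assumption, since the text only specifies carrying the preceding second's value forward.

```python
import numpy as np

def heart_rate_per_second(beat_times_s, duration_s):
    """1 Hz heart-rate series from beat timestamps (in seconds):
    average the R-R intervals ending within each second, convert the
    average interval i to 60/i bpm, and carry the previous second's
    value through seconds with no beats. The 60 bpm default used
    before the first beat is an assumption."""
    beats = np.asarray(beat_times_s, dtype=float)
    rr = np.diff(beats)            # R-R intervals, in seconds
    rr_end = beats[1:]             # second in which each interval ends
    hr = np.zeros(duration_s)
    last = 60.0
    for sec in range(duration_s):
        mask = (rr_end >= sec) & (rr_end < sec + 1)
        if mask.any():
            last = 60.0 / rr[mask].mean()  # i-second interval -> 60/i bpm
        hr[sec] = last
    return hr
```

For example, beats arriving every 0.5 s produce a constant 120 bpm series.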
  • In some implementations, where the EKG signal may have imperfections, such as missing peaks and valleys, a probabilistic method can be used for beat detection. In some implementations, overlapping slider windows can be iterated over. The slider window can be set to be the maximum distance between a peak and valley in the EKG beat signal. In each slider window, computations can be made to determine the range, max, min values. The differences of the range, max, and mins can be computed from the average range, maximum value, and minimum value in all the slider windows. These differences can be converted into probabilities that it is a peak, valley, or peak-valley range based on the statistical observations from the source. A weighted mean of the range, peak, and valley probabilities can be used to combine these individual probabilities, and to assign a probability that this window is a beat. A peak detection on these probabilities with a separation threshold, the minimum beat separation, can be used to filter out the overlapping windows.
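A sketch of the probabilistic sliding-window beat scoring described above. The mapping from differences to probabilities and the 0.5/0.25/0.25 weighting of the range, peak, and valley terms are assumptions; the passage only requires that each window's statistics be compared to the all-window averages and combined by a weighted mean.

```python
import numpy as np

def beat_probabilities(ekg, window, weights=(0.5, 0.25, 0.25)):
    """Assign each overlapping slider window a probability of
    containing a beat. The window length should be set to the maximum
    peak-to-valley distance of a beat, per the text."""
    x = np.asarray(ekg, dtype=float)
    n = len(x)
    maxs = np.array([x[i:i + window].max() for i in range(n - window)])
    mins = np.array([x[i:i + window].min() for i in range(n - window)])
    ranges = maxs - mins

    def to_prob(values):
        # Difference from the all-window average, rescaled into [0, 1];
        # this particular mapping is an illustrative assumption.
        diff = values - values.mean()
        span = np.ptp(diff) or 1.0
        return np.clip(diff / span + 0.5, 0.0, 1.0)

    w_range, w_peak, w_valley = weights
    # Weighted mean of the range, peak, and valley probabilities.
    return (w_range * to_prob(ranges)
            + w_peak * to_prob(maxs)
            + w_valley * to_prob(-mins))
```

A peak detection over these probabilities with a minimum-beat-separation threshold, as described above, would then filter out the overlapping windows.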
  • In step 120, the beat-detection data is converted to heart rate values, and various checks or filtering can be performed. An additional iteration over all the detected beats can be performed. In this iteration, the average or median recent heart rate can be computed and compared, via a ratio, to the current beat-to-beat heart rate. If the ratio is very close to 0.5, it is highly likely that a beat has been missed, and a beat can be inserted at the peak or valley. A similar method can be used for sparse extra beats detected. The above method allows for detection of beats when occasionally either a peak or valley is not captured in the beat data, or the entire beat is not properly captured due to device limitations.
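The missed-beat check in step 120 can be sketched as follows. The running-median window length, the tolerance around the 0.5 ratio, and inserting the recovered beat at the midpoint of the gap are illustrative assumptions; the text specifies only the ratio test itself.

```python
import numpy as np

def fill_missed_beats(beat_times, window=8, tol=0.1):
    """Insert a beat into any R-R gap whose implied rate is about half
    the recent rate (ratio near 0.5), i.e., one beat was likely missed.
    Window length, tolerance, and midpoint placement are assumptions."""
    beats = np.asarray(beat_times, dtype=float)
    rr = np.diff(beats)
    out = [beats[0]]
    recent = []
    for i, interval in enumerate(rr):
        if len(recent) >= 3:
            # Ratio of the recent typical R-R interval to the current
            # one; a value near 0.5 means the current gap spans ~2 beats.
            ratio = float(np.median(recent[-window:])) / interval
            if abs(ratio - 0.5) < tol:
                out.append(out[-1] + interval / 2.0)
        out.append(beats[i + 1])
        recent.append(interval)
    return np.array(out)
```

For instance, a steady 60 bpm train with one dropped beat has its gap recognized (ratio = 0.5) and the missing beat restored at the midpoint.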
  • The heart rate data, which in some implementations is computed once per second, can be checked for inconsistencies and invalid data, such as outages in the heart rate stream due to device malfunctions or incorrect beat detections. This check can be performed for all possible heart rate sources, including EKG interpretations, pulse rate recordings, and heart rate reported by third-party devices. Invalid heart rate data can be detected by checking for default values that the heart rate reverts to, such as zero for certain devices, and for spikes and fluctuations to values that are not within the range of possible heart rates. Invalid-data detection can be customized to the EKG source.
  • Since the nature of the heart rate signal can fluctuate substantially between different EKG or heart rate sources, a 6th order low pass Butterworth filter is used to preprocess the heart rate signal for specific subroutines in the method. In some implementations, this is implemented using a cutoff frequency of 0.01 Hz. Alternately, in some implementations, the filter is bypassed with customized filters and thresholds for different EKG or heart rate sources to account for the differences between these devices.
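A minimal sketch of this preprocessing step, assuming a 1 Hz heart rate series (so the 0.01 Hz cutoff is normalized against a 0.5 Hz Nyquist frequency) and SciPy's standard Butterworth design; the function name is hypothetical:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def smooth_heart_rate(hr, fs=1.0, cutoff_hz=0.01, order=6):
    # 6th-order low-pass Butterworth with a 0.01 Hz cutoff, as described
    # above; the cutoff is normalized against the Nyquist frequency.
    b, a = butter(order, cutoff_hz / (fs / 2.0), btype="low")
    # filtfilt applies the filter forward and backward for zero phase lag
    return filtfilt(b, a, hr)
```

Device-specific filters could be substituted here, as the text notes, by swapping the cutoff or filter design per EKG or heart rate source.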
  • FIGS. 3 and 4 illustrate examples of raw EKG data, filtered signals, and detected heart beats. Detection of the EKG data, as well as generation of the filtered signals and detected heartbeats, can be performed by a wearable body data recorder (BDR) such as the one shown in FIGS. 11A and 11B. The processing techniques discussed herein allow the small, battery-operated body data recorder to produce analysis results that can effectively identify sleep stages while minimizing power consumption, network communication bandwidth, and computation, while maintaining a high degree of accuracy in assigning sleep stages. At the sampling rates needed for effective sleep stage determination, e.g., often 200 Hz-2000 Hz, a significant amount of data is generated. The body data recorder may communicate with other devices over a low-power or low-bandwidth wireless connection, such as Bluetooth, which can significantly constrain the amount of data that can be reliably transmitted. The body data recorder may be designed to avoid wired connections with outside devices to avoid inconveniencing the subject, since wires connecting to other devices could potentially alter the subject's behavior and negatively affect the measurements. Further, transferring significant amounts of data is power inefficient for the battery-operated body data recorder. By performing at least the initial processing of EKG and heartbeat data locally within the wearable body data recorder, the amount of data that needs to be stored or transmitted and the amount of power consumed during transmission are minimized. Rather than storing or sending data for every EKG sample, the processing techniques discussed herein allow the results of beat detection, frequency analysis, and approximate entropy calculations, among others, to be stored and transmitted instead. The body data recorder may send this data for further processing, e.g., by a mobile device or a remote computer system.
In some implementations, the body data recorder itself completes the processing discussed here to generate sleep stage labels.
  • Referring again to FIG. 1, in step 125, the data is analyzed to detect relative increases in heart rate. Analysis of heart rate data reveals that an increase in heart rate is often correlated with REM and/or Wake states. These increases, especially for REM stages, can be rather subtle, and the absolute value may be lower than during other stages or even the average heart rate for the night, since the heart rate may drop through the night due to a subject's circadian rhythm. This pattern can be detected by analyzing the increase with respect to the values before and after the detected increase.
  • The time series can be divided into epochs to analyze the numerical trends in the data. In some implementations, the epoch size is 60 seconds. The epochs can be iterated over while calculating, for each epoch, the mean, variance, maximum value, and minimum value for heart rates in the epoch. Reference values before and after each epoch can be created by using a threshold time before and after to analyze the neighborhood. In some implementations, this threshold is 10 minutes. The minimum of all the epoch averages in the 10 minutes before the epoch can be used as the "previous" reference, and the minimum of all the epoch averages in the 10 minutes following the epoch can be used as the "post" reference. The number of values averaged is truncated when the epoch does not have the full time period before or after it, owing to being an early or late epoch. Additionally, for the first and last epoch, the previous reference and post reference, respectively, are set to the epoch's own heart rate average.
  • The difference of the average heart rate for the epoch and the previous reference, if positive, is used as the previous neighborhood value increase, and the difference of the average heart rate for the epoch and the post reference, if positive, is used as the post neighborhood value increase. If negative, zero is used as the neighborhood value increase. The average of these two values, i.e., the post and previous deviations, is recorded as the EpochValueIncreaseEstimation.
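The neighborhood-increase estimate described in the two paragraphs above can be sketched as follows; the function name and the list-based representation of per-epoch averages are assumptions:

```python
# Sketch of the EpochValueIncreaseEstimation computation. epoch_means
# holds per-epoch (60 s) average heart rates; the 10-minute neighborhood
# corresponds to 10 epochs.
def epoch_value_increase(epoch_means, neighborhood=10):
    estimates = []
    n = len(epoch_means)
    for i, m in enumerate(epoch_means):
        # Minimum of the epoch averages in the neighborhood before/after;
        # the first and last epochs use their own average as reference.
        prev = min(epoch_means[max(0, i - neighborhood):i]) if i > 0 else m
        post = min(epoch_means[i + 1:i + 1 + neighborhood]) if i < n - 1 else m
        prev_inc = max(0.0, m - prev)   # increase vs. preceding neighborhood
        post_inc = max(0.0, m - post)   # increase vs. following neighborhood
        estimates.append((prev_inc + post_inc) / 2.0)
    return estimates
```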
  • In step 130, heart rate variability (HRV) is analyzed. In addition to the possible neighborhood increase in heart rate, REM stages have been analyzed and observed to have more "chaos," i.e., less predictability, than the heart rate values in the other stages. This phenomenon can mathematically be described as tending to have higher entropy, and has been investigated by the scientific community in analyses of heart health and heart failure.
  • Analysis of sleep data to detect disorders reveals a dramatic reduction in slow wave sleep (SWS) for patients with sleep disorders, especially those with sleep apnea. This demonstrates the value of discriminating between light sleep (S1 or S2) and SWS within the NREM stages, so that SWS quality and its improvement can be tracked in subjects with sleep disorders. Heart rate variability can be used to extract the characteristics of this stage. Further detail about HRV analysis is discussed with respect to FIG. 2.
  • FIG. 2 illustrates an example of a process 200 for using heart rate variability frequency analysis to determine sleep stage patterns. In this process 200, a power spectral analysis can be performed after a Fast Fourier Transform (FFT) of the HRV signal for each sleep stage. This reveals the phenomenon of SWS or deep sleep stages (S3 and S4) generally demonstrating a decrease in Low Frequency (LF) components and an increase in High Frequency (HF) components when compared to the other stages of Wake, REM, and S2. The amount of reduction and the change in the ratio of LF/HF (where LF is the summation of the powers in the LF range, and HF is the summation of the powers in the HF range) was found to vary from subject to subject. This phenomenon is due to the increased vagal activity during SWS stages and the activation of the parasympathetic nervous system (PSNS) over the sympathetic nervous system (SNS) during other stages. This can also be identified as cardio-pulmonary coupling (CPC). Incidentally, the lack of this type of activity has been noted in post-myocardial infarction patients.
  • In step 205, HRV is derived from heart rate data. For the HRV frequency analysis, the HRV is first computed for the heart rate time series as the difference between heart rates at two adjacent periods, e.g., HRV(i) = HR(i) − HR(i−1), resulting in an HRV time series of length N−1 for a heart rate time series of length N at a sampling rate of fs.
  • In step 210, a Fast Fourier Transform (FFT) is performed on windows of the data. The time series can again be divided into sub-windows, and in each window the time domain HRV can be converted to the frequency domain representation using an N-point FFT, where Y = fft(window HRV, N). In some implementations the window size is 5 minutes. The value of N in this embodiment is the number of HRV samples in this time window.
  • In step 215, a power spectral density measurement is determined. The power spectral density, e.g., a measurement of the energy at various frequencies, can be computed using the complex conjugate, where Pyy = (Y × Y*)/N represents the power computed at the frequencies f = (fs/N) × [0, 1, . . . , N−1]. In some implementations, the power is computed at 300 frequencies, i.e., the window size, evenly spaced between 0 and 1 Hz.
  • In step 220, power measurements are summed over certain frequency ranges. For example, two ranges may be defined, with one range defined as a low-frequency or LF Range, and another range defined as a high-frequency or HF Range. In some implementations the LF Range is 0.04-0.15 Hz, and the HF Range is defined as 0.15-0.4 Hz. The LF Power can be computed by summing the powers in the power spectral density for the frequencies that lie in the LF Range (i.e. the corresponding elements in Pyy for the range of frequency elements in f). The HF Power can similarly be computed for the HF Range. For each window the LF/HF, and LF % i.e., LF/(LF+HF) can be computed.
  • In step 225, the LF/HF Ratio is computed. The LF/HF ratio can be obtained by computing a ratio of the power in the LF Range divided by the power in the HF Range. This ratio may be computed for each of various segments of the data, for example, over each epoch or window of the data set. In some implementations, the ratio is calculated for equal periods of 0.5 seconds to 5 seconds, and may be computed for overlapping windows or segments of the data.
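Steps 205 through 225 can be condensed into a single sketch, assuming a 1 Hz heart rate window and NumPy's FFT; the function name is hypothetical, and the default band edges are the LF and HF ranges stated above:

```python
import numpy as np

def lf_hf_ratio(hr_window, fs=1.0, lf=(0.04, 0.15), hf=(0.15, 0.4)):
    hrv = np.diff(hr_window)           # step 205: HRV(i) = HR(i) - HR(i-1)
    n = len(hrv)
    y = np.fft.fft(hrv, n)             # step 210: N-point FFT
    pyy = (y * np.conj(y)).real / n    # step 215: power spectral density
    f = fs * np.arange(n) / n          # frequencies f = (fs/N) * [0..N-1]
    # step 220: sum power in the LF and HF bands
    lf_power = pyy[(f >= lf[0]) & (f < lf[1])].sum()
    hf_power = pyy[(f >= hf[0]) & (f < hf[1])].sum()
    return lf_power / hf_power         # step 225: LF/HF ratio
```

LF % would follow as `lf_power / (lf_power + hf_power)` over the same bands.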
  • Referring again to FIG. 1, in step 135, approximate entropy can be computed. An approximate entropy algorithm can be used to estimate this heart rate characteristic. In some implementations this method can operate over 300 values of 1 Hz heart rate values derived from the higher rate EKG signal (i.e., 5-minute windows). The method uses the heart rate time series within the epoch and two parameters: m, the length of the compared run of data, and r, a threshold related to the amount of variation in values. Within these windows, each sub-window of m values, e.g., 5 in this example, can be shifted over each other sequence of the same length, subtracting the corresponding elements and counting the number of sliding windows where the maximum difference in corresponding elements is less than a comparison threshold. An averaging of the logarithmic sums of these values for all sub-windows, combined over two different window sizes, can be used to compute the approximate entropy and reveal the lack of predictability in the REM and Wake stages. The mathematical steps can be executed as follows.
  • A time series of data u(1), u(2), . . . , u(N) is considered for entropy analysis. A sequence of vectors is created x(1), x(2), . . . , x(N−m+1) where

  • x(i) = [u(i), u(i+1), . . . , u(i+m−1)]
  • The sequence of vectors is used to create, for each i, 1 ≤ i ≤ N−m+1,

  • C_i^m(r) = (number of x(j) such that d[x(i), x(j)] < r)/(N − m + 1)
  • in which d[x, x*] is defined as
  • d[x, x*] = max_a |u(a) − u*(a)|,
  • φ^m(r) = Σ_{i=1}^{N−m+1} log(C_i^m(r))/(N − m + 1), and
  • ApEn = φ^m(r) − φ^{m+1}(r)
  • The value ApEn, representing approximate entropy, can be computed for each window in the sleep session time series for heart rate as an indicator of unpredictability for stage prediction. Alternatively, other chaos theory methods, such as discrete fluctuation analysis (DFA), can be used to create a comparable chaos indicator for the heart rate time series. A chart illustrating examples of sleep stages and ApEn values is shown in FIG. 5.
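A standard approximate-entropy implementation consistent with the definitions above (vectors x(i), counts C_i^m(r) under the maximum-difference distance d, and ApEn = φ^m(r) − φ^{m+1}(r)) might look like:

```python
import numpy as np

def apen(u, m=2, r=0.2):
    """Approximate entropy of series u with run length m and tolerance r."""
    u = np.asarray(u, dtype=float)
    n = len(u)

    def phi(m):
        # Vectors x(i) = [u(i), ..., u(i+m-1)]
        x = np.array([u[i:i + m] for i in range(n - m + 1)])
        # C_i^m(r): fraction of vectors within max-difference r of x(i)
        # (self-matches are included, so every C_i^m(r) > 0)
        c = np.array([
            np.sum(np.max(np.abs(x - xi), axis=1) < r) / (n - m + 1.0)
            for xi in x
        ])
        return np.sum(np.log(c)) / (n - m + 1.0)

    return phi(m) - phi(m + 1)
```

A perfectly regular series scores near zero, while an unpredictable series (as expected in REM and Wake) scores higher.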
  • In step 140, cost function factors and weights are generated. Since multiple parameters can contribute to the prediction of each stage such as REM, SWS, etc., each of the contributing parameters can be represented as a factor varying between 0 and 1 facilitating direct comparison, weighting, and probabilistic manipulation.
  • In some implementations, the following factors are determined and used in combination to predict sleep stages. For each window, the mean of the window heart rate time series (meanHR) can be computed. The meanHR is compared to a fixed reference, HRRestReference, which represents the expected resting heart rate during sleep, and to the mean of heart rates in the window time series that fall in a low percentile, HRLowReference. A different resting heart rate reference can be used based on the age of the subject. In some implementations this low percentile is 50. The differences of the meanHR and these references, respectively, are converted into factors, an HR Rest Comparison Factor and an HR Low Comparison Factor, using a scoring function that is defined by:
  • F(c, w, c_w) = (1 − r^(−c))/(1 + r^(−c))
  • where
  • r = exp((log(1 − w) − log(1 + w))/(−c_w)),
  • and where w is the scoring function's value at the reference input c_w, so that F(c_w, w, c_w) = w.
  • In some implementations, these factors are computed using F(meanHR − HRRestReference, 0.5, 10) and F(meanHR − HRLowReference, 0.5, 10), respectively. Similar factors can be computed using the same scoring function to score the mean of all the EpochValueIncreaseEstimation values for all epochs in the window under consideration, resulting in the WindowValueIncreaseEstimation factor.
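Because the scoring-function equation is garbled in this text, the following sketch assumes a saturating score with F(0) = 0 and F(c_w, w, c_w) = w, which is consistent with the stated usage F(meanHR − HRRestReference, 0.5, 10); the exact closed form is a reconstruction, not taken verbatim from the disclosure:

```python
import math

def score(c, w, cw):
    """Saturating score: 0 at c = 0, w at the reference input cw,
    approaching 1 for large positive c."""
    # r is chosen so that the score passes through w at c = cw
    r = math.exp((math.log(1.0 - w) - math.log(1.0 + w)) / -cw)
    return (1.0 - r ** -c) / (1.0 + r ** -c)
```

For example, with w = 0.5 and cw = 10, a heart rate 10 bpm above the resting reference scores exactly 0.5.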
  • The difference between ApEn and the mean of the lower percentile of ApEn values can be used to compute the ApEn Low Comparison Factor, using another scoring function defined as:
  • y(x, μ, σ) = exp(−0.5 × ((x − μ)/σ)^2)/(√(2π) × σ) and g(x, μ, σ) = y(x, μ, σ)/y(μ, μ, σ)
  • The ApEn High Factor is computed using F(ApEn, wApEn, cwApEn).
  • Various parameters may be used in the functions F( ) and g( ) discussed above. The values used for the thresholds may be determined using statistical measures of the various ApEn values. In some implementations the ApEnHighFactor is computed as F(ApEn, 0.5, ApEnMean), and the ApEnLowComparisonFactor is computed as g(ApEn − ApEnMeanLow25Percentile, 0, ApEnMeanLow50Percentile), where ApEnMeanLow25Percentile and ApEnMeanLow50Percentile are the means of the lowest 25% and 50% of all ApEn values, respectively. In some implementations, WindowValueIncreaseEstimationFactor = F(WindowValueIncreaseEstimation, 0.5, 3). In other implementations, such as for data with more peaks in the heart rate values, the value WindowValueIncreaseEstimationFactor is computed as F(WindowValueIncreaseEstimation, 0.5, 5).
  • In step 145, probability scores indicating the likelihoods for different sleep stages are calculated. These scores can be determined using input from multiple types of sensors. For example, the information from motion sensors can be combined with information determined from heart rate information to produce a more accurate result. Different probabilistic factors are created: for REM or Wake probability, combining the WindowValueIncreaseEstimationFactor and the High Entropy factor as a mean; and for SWS, combining the LF/HF ratio factor and low entropy factors using a geometric mean. In another embodiment the SWS probability is directly assigned as the LF/HF ratio factor. A chart illustrating an example of LF/HF ratios and indicators of sleep stages determined using those ratios is shown in FIG. 6.
  • The REM or Wake probability can be considered to indicate that the stage is either Wake or REM, where another discriminator can be used to distinguish between the two. Additionally, to discriminate between Wake and REM, additional factors can be created to predict Wake states, which often demonstrate more dramatic spikes from resting heart rates (55-66 bpm in adults) to sudden wake heart rates of more than 80 bpm, and periods of much higher entropy and variance. In devices where accelerometers and gyroscopes are available, these can be used to compute posture and activity levels over windows, and the presence of activity can be used to enhance the probability of Wake states, since REM stages, in contrast, involve the highest level of inactivity and physical paralysis.
  • The HRV frequency analysis can also be similarly converted into factors by computing the means of the lowest 25 and 50 percentiles of all the LF/HF values. The difference of the LF/HF value for each window from the lowest 50 percentile mean (dw,50) can be found, in addition to the difference between the lowest 25 percentile and 50 percentile means (d25,50). The LF/HF Low Factor is computed as F(dw,50, wFreq, d25,50). A predictive probability can be estimated for the probability of the window being a SWS stage (SWSpx) by combining this factor and the minimum value of the ApEn measure for the window, using a geometric mean of the ApEn Low Factor and the LF/HF Low Factor with equal weights. This can be expressed with the following equation:
  • x̄ = (∏_{i=1}^{n} x_i^{w_i})^(1/Σ_{i=1}^{n} w_i)
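The weighted geometric mean used to combine factors (e.g., the ApEn Low Factor and LF/HF Low Factor into SWSpx) can be sketched as:

```python
import math

def weighted_geometric_mean(values, weights):
    """Weighted geometric mean: (prod x_i^w_i)^(1 / sum w_i),
    computed in log space for numerical stability."""
    total = sum(weights)
    log_sum = sum(w * math.log(x) for x, w in zip(values, weights))
    return math.exp(log_sum / total)
```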
  • Besides computing factors to determine the lowest LF/HF ratios in the data, other factors can be created to evaluate how low the ratio is when evaluated as an absolute value. In some implementations, this reference value is 0.5, and values above 0.5 are not considered to be valid for SWS stage detection. This factor, the LF/HF Absolute Low Factor, is computed by weighting the difference between 0.5 and the LF/HF ratio, with higher differences generating a higher SWS probability. A predictive probability of the window being a REM or Wake stage, REMWakepx, can be computed by using a scoring aggregation such as the weighted mean of the ApEn High Factor and the WindowValueIncreaseEstimation Factor.
  • Additionally, to discriminate between the similar characteristics of REM and Wake, an additional REMvsWakepx can be created to extract the difference in these stages with respect to sensor data. This factor uses different thresholds to exploit the general trends of higher heart rates and entropy in Wake than in REM. Additionally, the stage duration factor is taken into account, with Wake being long periods or micro-awakenings of a few seconds or a minute, and REM usually lasting between 5 minutes and an hour. If the sensor data includes motion sensor data, it can be used to effectively set or bolster the REM vs. Wake probabilities. Detecting movement or motion, in bed or while walking, implies Wake states, as REM is a highly physically paralyzed state. This motion probability can be used to discriminate between Wake and REM. Another factor, the REMWake Value High Factor, can be created to determine how high the window values are, using a geometric mean of the HR Rest Comparison Factor and the HR Low Comparison Factor. Further, if this factor is higher than a threshold, the REMWakepx can be enhanced by a manipulation such as

  • REMWakepx = REMWakepx^(REMWake Value High Factor)
  • Examples of HRV frequency spectra are shown in FIGS. 7 and 8. Each of these figures illustrates examples of charts showing HRV frequency spectra for wake stages (top left), REM stages (top right), slow wave sleep (SWS) stages (bottom left), and S2 sleep stages (bottom right). FIG. 7 shows data for a healthy subject, while FIG. 8 shows data for a child with ADHD.
  • In step 150, stage decisions and stage assignments are made using a rule-based framework. A top layer decision-making algorithm weighs and resolves the computed predictive probabilities of stages in each window, comparing the current window's prediction with the previous/ongoing stage, the probability of transition, and rules that govern the possible transitions and the amount of stable prediction time required to actually confirm a change in stage. Each window can be assigned a stage of Light (S1 or S2), Deep (Slow Wave Sleep—S3 or S4), REM or Wake. This can be done by processing each window's stage predictive probabilities such as slow wave sleep probability (SWSpx), REM or Wake probability (REMWakepx), and REMvsWakepx, and can additionally be customized based on their experimental values and trends evaluated against technician scored stages.
  • Rules can be enforced necessitating the detection of a light sleep stage before REM or SWS stages, and ignoring Wake fluctuations within REM when less than a certain time threshold. For each window, the previous stage is considered and assigned a probability to continue, and the stage probabilities of the current window are compared to each other. The highest probability stage, if higher than a threshold, e.g., 0.5 in some implementations, is considered as a possible stage for the window. If the highest probability stage is the same as the previous stage, the method makes the assignment to the stage and continues to the next window. If the stage is different from the previous stage, it is considered as a possible stage transition, and checks are made to determine if the transition is feasible. If it is a REMWake stage prediction, the probability of REMVsWakepx is checked. If this is above a threshold, such as 0.5 in some implementations, the stage is considered a Wake stage; otherwise, a REM stage. Based on the application the method is being used for, additional checks on patterns can be performed to minimize stage fluctuations and add rules that are relevant for the application cases. The windows used for these computations can range from discrete windows to sample-by-sample overlapping windows, based on the processor speed and memory available on the system they are implemented on.
  • In some implementations, the stage prediction is made by a stage classifier trained on truth data, e.g., examples of inputs having actual corresponding sleep stages labelled. Examples of such classifiers include, for example, neural networks, maximum entropy classifiers, support vector machines, and decision trees. Methods and systems are described for classifying segments of a set of signals into a set of discrete decisions about said signals in the form of sleep stages. The method includes a procedure for training the system to predict sleep stages based on a set of input signals. The input signals may include sensor data or other parameters calculated from sensor data, e.g., values indicating a subject's movement, heart rate, HRV, etc. Examples of these sets of signals that have been labeled with a truth signal, e.g., data that indicates correct stages corresponding to the examples, can be used to train the classifier. The trained classifier can then be used to evaluate a set of input signals to predict sleep stage. In some implementations, raw heart rate, smoothed heart rate, LF/HF, and approximate entropy are used as input signals to the classifier. Signals can be passed along with ground truth to the training system to generate a sleep stage evaluation engine, which can evaluate new input signals to produce a prediction of sleep stage.
  • First, a method for signal classifier training is described. The purpose of the training step is to supply a set of ground truth data for some number of subjects, along with arbitrary data that corresponds with the ground truth data in time. The system described will build a set of histograms over this data which will be used during evaluation to compute a probability distribution describing sleep stage at some point in time.
  • Let Ns be the number of subjects to be used for training, for which ground truth data is available.
  • Let
  • S_i(t) = 0 if Wake; 1 if REM; 2 if Light; 3 if Deep
  • be a function representing the ground truth sleep stage for multiple subjects with i representing the ith subject. More stages can be added as needed.
  • Let Nf be the number of input functions to be used for training and classification.
  • Let Fij(t) be a set of Nf functions for each individual i which correlate with each Si(t) in time. These functions are arbitrary, but could represent, directly or as a variation, sensor data taken during the sleep study.
  • Now define N_f values Fmin_j and Fmax_j, where

  • Fmin_j = min F_ij(t) for all i, t

  • Fmax_j = max F_ij(t) for all i, t
  • These are the minimums and maximums of each function j computed over all subjects.
  • Finally, we define N_f × 4 histograms H_js with N bins, where each bin value is equal to the sum of the number of samples of j for all individuals i in which F_ij(t) falls into the bin for a given sleep stage s.
  • Stage s is known by looking up the ground truth in Si(t). The bin's index is computed as

  • b_i = floor((F_ij(t) − Fmin_j)/(Fmax_j − Fmin_j) × N)
  • These histograms represent a trained system which can be used for classification. From an intuitive point of view, they represent, for any given sleep stage s and function j, how the function's value at any time t correlates with a given sleep stage.
  • The number of input functions is also arbitrary, but should be fixed between the training and classification steps. New training data can be added to an already trained system, so long as the values of the new training data fall within Fmin_j and Fmax_j of the already trained system. If they do not, the system can be retrained using the old and new training data.
  • Now a method for classification of a set of functions into a sleep stage probability distribution is described. For classification, the input is the same set of input functions Fij(t), except without ground truth Si(t).
  • We define stage count as
  • C_s(t) = Σ_j H_js(floor((F_ij(t) − Fmin_j)/(Fmax_j − Fmin_j) × N)), summed over the N_f functions j
  • Finally, we compute a distribution describing the probability of any sleep stage at some time as

  • P_s(t) = C_s(t)/Σ_s C_s(t)
  • Ps(t) can then be used to estimate sleep stage in an individual at some point in time.
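The training and classification procedure above can be sketched end to end as follows; the fixed bin and stage counts, and the clamping of values at the maximum into the top bin, are implementation assumptions:

```python
import numpy as np

N_BINS = 10
N_STAGES = 4  # 0 = wake, 1 = REM, 2 = light, 3 = deep

def train(functions, stages):
    """functions: (N_f, T) array of input signals F_ij(t);
    stages: (T,) ground-truth labels S_i(t).
    Returns (histograms H_js, Fmin_j, Fmax_j)."""
    fmin = functions.min(axis=1)
    fmax = functions.max(axis=1)
    hist = np.zeros((len(functions), N_STAGES, N_BINS))
    for j, f in enumerate(functions):
        # b = floor((F - Fmin_j)/(Fmax_j - Fmin_j) * N), top value clamped
        bins = np.minimum(
            ((f - fmin[j]) / (fmax[j] - fmin[j]) * N_BINS).astype(int),
            N_BINS - 1)
        for b, s in zip(bins, stages):
            hist[j, s, b] += 1
    return hist, fmin, fmax

def classify(hist, fmin, fmax, sample):
    """sample: (N_f,) function values at one time t; returns P_s(t)."""
    counts = np.zeros(N_STAGES)
    for j, v in enumerate(sample):
        b = min(int((v - fmin[j]) / (fmax[j] - fmin[j]) * N_BINS), N_BINS - 1)
        counts += hist[j, :, b]   # stage counts C_s(t)
    return counts / counts.sum()  # P_s(t) = C_s(t) / sum_s C_s(t)
```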
  • In some implementations, the following Fij(t) are used as input:
  • Let Fi0(t) be raw heart rate as received from a heart rate monitoring device.
  • Let Fi1(t) be approximate entropy of Fi0(t).
  • Let Fi2(t) be LF/HF of Fi0(t) where LF is the low frequency component of the Fourier transformation of Fi0(t) and HF the high frequency component.
  • Let Fi3(t)=ExpAvg(Fi0(t))−Median(Fi0(t)) where ExpAvg(F) is computed as an exponential average of F starting from the first sample, and using a weighting of 0.99.
  • In some implementations, other sensor data can be used as input. This includes but is not limited to: audio, inertial measurements, EEG, temperature, and any mathematical transformations of these measurements. Breath detection can be performed using an audio signal, which can be transformed into breathing rate and used as an input to the system described. Inertial measurements could be used directly, as there is likely to be more movement during the wake phases, which makes the signal directly useful. In some implementations, EEG is used by sleep technicians to produce the ground truth input to this system, so EEG signals are also directly applicable. In some implementations, the intermediate probabilities and factors described above, such as the ApEn Low Factor, SWSpx, REMWakepx, LF/HF Low Factor, and WindowValueIncreaseEstimation Factor, are also used to train and classify stages. In some implementations, functions representing EEG-based stage probabilities, for increased delta waves or alpha waves observed, and EOG-based stage probabilities for REM stages are created and used as an input to the stage classifier. This can allow the same fusion method described herein to predict sleep stages using new sensors as the technology to make them suitable to a particular application emerges.
  • In addition, functions not directly tied to sensor data can also be used. A sleep stage probability function over time can be used as input, where the value of this function changes based on known constraints of normal human sleep. For example, deep sleep is more likely to occur earlier in sleep, and thus a deep sleep probability function would have a high value when sleep is first detected, and trail off toward zero over time or as deep sleep phases are detected. This relative time sleep stage function can indicate that REM, especially longer REM periods are much more likely in the second half of the sleep session, with some smaller REM periods in the first half.
  • Additionally, for people in general, or for an individual, an absolute time function can be created, since a person's circadian rhythm causes similar stages to occur night to night. Therefore, for a particular person the probability of having deep sleep between 4 am and 5 am may be very low. A function representing durations of a sleep stage in the time neighborhood can be used to assign the probabilities of longer deep sleep sessions, with possible other stage interruptions, in the first half of the night, and similar sessions of REM in the second half. Additionally, it can be used to encode the fact that REM sessions of greater than 1 hour are highly unlikely and may suggest a different stage such as Wake.
  • Once the probability function Ps(t) has been computed using the described method, a final decision for sleep stage at some time can be made. In some implementations, a stage is chosen for each point in time as the max over s of Ps(t). In another embodiment, constraints can be applied when making a decision. Constraints can include those which do not allow specific transitions to occur since they are unlikely based on knowledge of human sleep behavior. For example, a transition from awake to deep sleep is unlikely, so even when such a deep phase is computed to have the highest probability using this method, the next highest probability stage is chosen due to this constraint.
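The constrained final decision can be sketched as follows: choose the highest-probability stage, falling back to the next-best stage when the transition from the previous stage is disallowed. The forbidden-transition set here is illustrative; the text names only wake-to-deep as an unlikely transition:

```python
WAKE, REM, LIGHT, DEEP = 0, 1, 2, 3
FORBIDDEN = {(WAKE, DEEP)}  # (previous stage, candidate stage); illustrative

def decide_stage(probs, previous):
    """probs: list of P_s(t) for s = 0..3; previous: last decided stage.
    Returns the highest-probability stage whose transition is allowed."""
    for s in sorted(range(len(probs)), key=lambda s: probs[s], reverse=True):
        if (previous, s) not in FORBIDDEN:
            return s
    return previous  # every candidate forbidden; keep the current stage
```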
  • In another embodiment, a neural network can be used to compute a final decision for sleep stage. The input to this network would be the probability functions Ps(t) for some time t, as well as some number of decisions for previous times. This network would be trained using the same ground truth data used to train the classification system described. The output from the network would be a sleep stage decision for a single point in time.
  • The methods of combining multiple inputs can be used to classify finer stages such as S1, S2, SWS, REM, and Wake as data is available. They can also be used to train and classify a simpler, high-precision Wake vs. Sleep classification. Additionally, this can be performed by combining, after classification, all the sleep-specific stages of light, deep, and REM into a SLEEP category.
  • FIG. 9 shows the method's performance compared to sleep studies in a sleep clinic. To produce the results illustrated, the system takes as input the two-lead EKG signal in the PSG Sleep Studies. In some implementations, the techniques disclosed herein and results of sleep lab manual analysis were found to agree a majority of the time, for example, 95% of the time for Wake, 82% of the time for REM, and 62% of the time for SWS.
  • The performance of the embodiments reduced to practice is comparable to the inter-rater reliability of different human scorers as reported by the AASM Visual Scoring Task Force, which reported 78-94% agreement for REM (3 studies), 68-89% for Wake (3 studies), and 69% for SWS (2 studies). Another study by the task force, examining 3 technicians rescoring 20 studies after a median of 6.5 months, resulted in (self) agreements of 89-93% for Wake, 72-88% for REM, and 55-75% for SWS. Thus, the disclosed algorithm is as effective as the human technicians studied in these reports.
  • The source of the data, for the method and system disclosed herein, can be from PSGs in a sleep laboratory or sensors worn on the body. In some implementations of this invention, the source of the sensor data is a wearable device. The methods can be used on any device providing EKG signals or derived heart rate. The data could also be pulse rate from devices such as pulse oximeters, pulse-meters or pulse monitors in the form of watches such as the Apple Watch, Samsung Gear watches, and other fitness monitors. Some of these devices may use photoplethysmography (PPGs) to estimate pulse rate. Methods for determining posture and sleep duration and quality using motion sensing can take input from any quality accelerometer and gyroscope combination. Any off-the-shelf motion sensor with even a single axis accelerometer can be used to enable the methods disclosed herein, along with light sensors and microphones with varying sensitivities when available. Any EKG source or derived heart rate sensor or pulse-rate monitor such as wearable watches can be used by the methods disclosed herein to analyze sleep events and detect sleep staging.
  • The microphone audio signals, when available, can be used to detect sleep apnea events and breathing patterns and rates. This can be implemented using pattern-matching methods or neural networks trained on data from patients with obstructive sleep apnea; such data is available in several databases of clinical trials. In this process, stretches or windows of audio data are normalized to a fixed time length and tested on neural networks trained for apnea detection. The sleep stage classification functions described above can be improved using factors reflecting that apnea events are much less likely during deep sleep. Additionally, heart rate variability (HRV) can be used to bolster these methods by tracking the characteristic changes in heart rate patterns and HRV frequency spectra when the subject stops breathing, resumes breathing, or experiences other apnea-related events.
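As a rough sketch of the window-normalization step, a variable-length audio window can be resampled to a fixed length before being fed to an apnea-detection network. The function name and the 512-sample target length are illustrative assumptions; the specification states only that windows are normalized to a fixed time length:

```python
import numpy as np

def normalize_audio_window(window, target_len=512):
    """Resample a variable-length audio window to a fixed length.

    target_len is an assumed value; the specification only states that
    windows are normalized to a fixed time length before being tested
    on a neural network trained for apnea detection.
    """
    window = np.asarray(window, dtype=float)
    if len(window) < 2:
        raise ValueError("window too short to resample")
    # Map both the source window and the target grid onto [0, 1] and
    # linearly interpolate the source samples at the target positions.
    src = np.linspace(0.0, 1.0, num=len(window))
    dst = np.linspace(0.0, 1.0, num=target_len)
    return np.interp(dst, src, window)
```

Fixed-length windows of this kind can then be batched uniformly for training or inference, regardless of the original window durations.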
  • Referring to FIG. 10, accelerometer or gyroscope data can be used to improve discrimination between Wake and REM stages. In addition, such data can be used to approximate sleep and wake states based on motion detection. Step detection indicating that the person's posture has changed to upright, together with detection of walking signatures, can confirm that the subject is awake except in the extreme case of sleepwalking. These can be used to enhance REMvsWakepx. Movement while lying down or sitting can be used to determine whether a person is asleep or merely lying or sitting still. Movement variances and magnitudes can also be used to determine how restful a subject's sleeping patterns are. This can be achieved by aggregating motion sensor (accelerometer and gyroscope) data in epochs over the sleep session. In some implementations, this epoch size is 30 seconds. The variances, maxima, minima, and averages are computed for each epoch and then aggregated into bigger windows. In some implementations, this larger window is 5 minutes. The hierarchical comparison and aggregation can continue to still larger windows, such as combining 5-minute windows into 30-minute windows. This enables movements and their effects to be analyzed with contextual logic in these larger windows, allowing, for example, a small movement while changing posture, preceded and followed by no movement, to be categorized as a posture change rather than a long wake state. Multiple occurrences of movement are therefore weighted higher than single spikes of movement, which may occur while changing position or from other movement in the sleep environment unrelated to the subject.
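The epoch aggregation described above can be sketched as follows. The 30-second epochs and 5-minute windows come from the text; the 50 Hz sampling rate and the function names are assumptions for illustration:

```python
import numpy as np

def epoch_stats(signal, fs=50, epoch_s=30):
    """Split a 1-D motion signal into fixed epochs (30 s in some
    implementations) and compute per-epoch statistics."""
    n = fs * epoch_s
    stats = []
    for i in range(0, len(signal) - n + 1, n):
        e = np.asarray(signal[i:i + n], dtype=float)
        stats.append({"var": float(e.var()), "max": float(e.max()),
                      "min": float(e.min()), "mean": float(e.mean())})
    return stats

def aggregate_windows(stats, epochs_per_window=10):
    """Combine epoch statistics into larger windows (ten 30 s epochs
    form a 5-minute window); 30-minute windows can be built the same
    way from the 5-minute results."""
    out = []
    for i in range(0, len(stats) - epochs_per_window + 1, epochs_per_window):
        grp = stats[i:i + epochs_per_window]
        out.append({"var": max(s["var"] for s in grp),   # peak restlessness
                    "max": max(s["max"] for s in grp),
                    "min": min(s["min"] for s in grp),
                    "mean": sum(s["mean"] for s in grp) / len(grp)})
    return out
```

Logic applied over the larger windows can then, for instance, discount a single isolated movement spike flanked by quiet epochs as a posture change rather than a wake state.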
  • As shown in FIG. 10, motion data can include raw acceleration data. From this data, and potentially other sensors, various cardinal postures can be detected. From the accelerometer data and/or the detected postures, a sleep/wake prediction can be generated. This information may be used to validate staging results using heart rate, as a factor in the calculations for stage likelihoods, as input to a sleep stage classifier, or in another manner.
  • Referring to FIGS. 11A and 11B, in some implementations, the source of the data is a miniature wearable device 1100, referred to herein as a Body Data Recorder (BDR). The BDR has been reduced to practice as a dual-sided, 4-layer circuit board designed to achieve a miniature form factor (~1 inch²), housing 3-axis accelerometers, 3-axis gyroscopes, microphones, heart rate sensors, and temperature and light sensors, among others. An image of a U.S. quarter dollar coin is included in FIGS. 11A and 11B as a visual size reference showing the scale of the device 1100.
  • For the example illustrated, the combination of sensors has been selected because the data provided can be used for detection of sleep staging with acceptable scoring accuracy, for recording ambient signals of light and sound to assess the suitability of the environment, and to track adherence to clinician recommendations. The device also supports detecting and quantifying apnea events and snoring using a built-in microphone. Specifically, the above features can be achieved and provided in a small form factor at an affordable price. The small form factor allows the device to be worn in a location such as the chest all night or day for multiple nights without causing discomfort. This can be critical to the target application.
  • The body data recorder has been designed to be small, low power, and to fit comfortably in clothing, where the contacts can even be provided using conductive polymer. The BDR can operate independent of other devices, without any cords or cables connected to a computer or other device not worn by the patient. The parts selected and populated on the dual-sided board for this embodiment include the following specific components, though alternatives will be apparent to those skilled in the art: an AD8232, which provides single-lead heart rate monitoring and serves as an integrated conditioning block for ECG and instrumentation amplification, sampling up to 2 kHz; an Invensense 6-degree-of-freedom Inertial Measurement Unit (MPU6500); a Knowles MEMS omnidirectional microphone with a 100 Hz–10 kHz range and −18 dB±3 dB sensitivity; a TEMT-6000 high-photo-sensitivity ambient light sensor from Vishay Electronics; a TMP102 temperature sensor from Texas Instruments; a U.FL connector for delocalization of sensor pads via a miniature RF connector for high-frequency signals up to 6 GHz, manufactured by Hirose Electric Group; communications via microSD and low-power Bluetooth (BLE) from Microchip; a 1000 mAh Li-polymer battery; sensor cables and sensor pads (3M); and 3-D printed prototype packaging. A circuit board 1101 on which data storage, sensors, processing devices, etc. are mounted can be placed in a protective housing 1104.
  • The two EKG sensor pads 1102 can be worn below the pectoral muscles (i.e., on the rib cage, left and right). This location has been selected after experimentation to reduce muscle activation artifacts in the EKG signal. The system produces similar results when worn with the right electrode on the pectorals and the left under the pectorals, and for several other electrode placements. In some implementations, these sensor pads are provided via conductive polymer that can be built into a t-shirt or garment.
  • In another embodiment, the device can support both 3-lead and 2-lead EKG sensor pads, with the third EKG lead connected at the thigh for higher EKG amplitude. In this embodiment, the device is equipped with Wi-Fi to communicate with routers or mobile phones to transfer data. Wi-Fi can be used in this embodiment to upload sensor or processed data to servers for storage or for further processing, such as stage classification or apnea event detection. In this embodiment the light sensor is a TAOS TFL2561FN and the microphone an Invensense INMP401. A digital signal processor from Microchip is used to ensure high-sampling-rate logging of all sensors to the microSD card, long device life, and upload and transfer via Wi-Fi. An ESP8266 Wi-Fi module is used for the Wi-Fi communications.
  • The methods described herein can be implemented completely or partially on these or other similar devices, or on mobile devices, such as smartphones. The processing can be shared among these devices and even a remote server to aid processing power and device battery life.
  • A comprehensive sleep monitoring and analytics system can include a device that records sleep data, a mechanism for the subject to enter data, a communications link for data and entries to be transferred to a computer system, the methods disclosed herein to convert the data into sleep results, such as sleep stages and events, and methods to generate clinical reports from the data and results.
  • Referring to FIG. 12, in some implementations, a body data recorder 1200 may include a t-shirt 1202 or other garment that houses multiple sensors. The t-shirt can be worn by the subject during sleep sessions, or even all day if the subject chooses.
  • Referring to FIG. 13, a system 1300 for measuring sleep information and performing sleep staging can include a BDR 1302 with multiple sensors, a mobile device 1304 that communicates with the BDR 1302, and a server 1306. The data from multiple sensors of the BDR 1302 is recorded and can be transferred to a mobile application of the mobile device 1304 via a wireless link such as Bluetooth or low energy Bluetooth. The mobile application also enables the subject to make the diary entries of sleep times, moods, habits, and sleep satisfaction questionnaires. The mobile application also enables the subject to take reaction time tests.
  • The mobile application can upload the data to the server 1306 over a link such as Wi-Fi or cell links. Alternatively, the data can be transferred from the device or the mobile app via Bluetooth or Wi-Fi to a computer system, or via a cable connection. The server 1306 can host the methods to translate received data into sleep results, i.e., sleep stages (hypnograms) and sleep events (e.g., apnea detection, sleep walking, snoring).
  • Examples of hypnograms are shown in chart 1310. In some implementations, raw sensor data is provided to the server 1306. In other implementations, to conserve power and manage bandwidth constraints, the BDR 1302 may perform at least some of the processing of the heart beat data, which reduces the amount of data that must be transferred. Thus the BDR 1302 and the server 1306 may together perform the functions described with respect to FIGS. 1 and 2.
  • The methods to generate sleep results and reports can be hosted on the computer system. They can additionally or alternatively be hosted on or communicated to the mobile application of the mobile device 1304, or even on the BDR 1302. In consumer applications, the results, i.e., hypnograms, long-term views, sleep events, etc., can be displayed daily and over long periods of time for the subject to view, along with their Sleep Score, Subjective Sleep Score, and any other sub-scores. These can also be plotted against performance and habits to enable the subject to identify trends in their sleep and correlations between different sleep variables (e.g., total sleep time vs. coffee intake). In a clinical use of this embodiment, the system may be prescribed to the subject by a clinician for a certain period of time, usually multiple nights (e.g., 7-10 days). The subject can take the system to their home environment, or elsewhere, wear the device while sleeping, and interact with the app. The data is processed by the mobile application and uploaded to the server over the communications link, where it is converted into a clinical report that is delivered to the clinician's office. The report can include adherence tracking to assist in behavior changes and therapy and efficacy tracking.
  • The wearable device can also be worn beyond sleep sessions or even all day, where coupled with analytical processes, activity and performance can also be monitored. In some implementations, occurrences of stress and anxiety are also detected and scored. Stress and anxiety can be distinguished from high intensity physical exercise using a combination of HRV analysis and neural networks. For specific cases such as developmental disorders, patterns such as rocking, seizures etc. can be detected and even mapped by time and location when location systems are available to the analytics.
  • In the clinical setting, the clinician can view the hypnograms revealing the sleep stages as part of a generated report. These can include hypnograms for multiple days. Based on these, the clinician can infer whether the subject's sleep architecture has abnormalities, e.g., a lack of SWS sleep, low REM sleep, a large number of awakenings, apnea events, etc. The clinician may use the results of staging and events to offer the subject actionable suggestions (e.g., Cognitive Behavioral Therapy (CBT)). These may include actions such as decreasing the amount of time spent in bed to reduce lying in bed awake, lowering the temperature of the room, increasing exercise in the morning, etc.
  • The methods herein can be implemented on devices such as smartwatches, smartphones or health monitors that provide pulse rate or heart rate and possibly motion data. In these cases, the methods can be used to determine stages and events, score sleep quality and display it to the user as part of a health monitoring application. The user can adjust their lifestyle patterns based on this sleep information displayed over multiple days or longer periods of time. Additionally, this information can be supplied to clinicians through patient-clinician correspondence or integration into Electronic Health Records (EHRs). This can be even more relevant for consumer devices with FDA clearance or similar certification aimed more at clinical data.
  • The results of staging in consumer and other applications can be used by a system for various applications. Some of these may include sending a message to a third-party system about what sleep stage the subject is in, waking up consumers who wish to lucid dream during certain REM or dreaming stages, and displaying sleep stages to the subject the next morning or in real-time. Displays can be made to the subject for the results of the methods disclosed herein using any computer system with a screen such as a laptop or mobile phone.
  • Embodiments of the invention and all of the functional operations described in this specification may be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the invention may be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer-readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium may be a non-transitory computer readable storage medium, a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus may include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal that is generated to encode information for transmission to suitable receiver apparatus.
  • A computer program (also known as a program, software, software application, script, or code) may be written in any form of programming language, including compiled or interpreted languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program may be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program may be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
  • The processes and logic flows described in this specification may be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows may also be performed by, and apparatus may also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
  • Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer may be embedded in another device, e.g., a tablet computer, a mobile telephone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media, and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory may be supplemented by, or incorporated in, special purpose logic circuitry.
  • To provide for interaction with a user, embodiments of the invention may be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user may provide input to the computer. Other kinds of devices may be used to provide for interaction with a user as well; for example, feedback provided to the user may be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form, including acoustic, speech, or tactile input.
  • Embodiments of the invention may be implemented in a computing system that includes a back end component, e.g., as a data server, or that includes a middleware component, e.g., an application server, or that includes a front end component, e.g., a client computer having a graphical user interface or a Web browser through which a user may interact with an implementation of the invention, or any combination of one or more such back end, middleware, or front end components. The components of the system may be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
  • The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • While this specification contains many specifics, these should not be construed as limitations on the scope of the invention or of what may be claimed, but rather as descriptions of features specific to particular embodiments of the invention. Certain features that are described in this specification in the context of separate embodiments may also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment may also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
  • Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems may generally be integrated together in a single software product or packaged into multiple software products.
  • In each instance where an HTML file is mentioned, other file types or formats may be substituted. For instance, an HTML file may be replaced by an XML, JSON, plain-text, or other type of file. Moreover, where a table or hash table is mentioned, other data structures (such as spreadsheets, relational databases, or structured files) may be used.
  • Thus, particular embodiments of the invention have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims may be performed in a different order and still achieve desirable results.

Claims (25)

What is claimed is:
1. A method performed by one or more computing devices, the method comprising:
obtaining, by the one or more computing devices, sensor data generated by one or more sensors over a time period while a person is sleeping;
dividing, by the one or more computing devices, the time period into a series of intervals;
analyzing, by the one or more computing devices, heart rate and the changes in the heart rate of the person indicated by the sensor data over the intervals;
based on the analysis of the heart rate and the heart rate changes, assigning, by the one or more computing devices, sleep stage labels to different portions of the time period; and
providing, by the one or more computing devices, an indication of the assigned sleep stage labels.
2. The method of claim 1, wherein obtaining the sensor data, dividing the time period into a series of intervals, and analyzing the heart rate and the changes in the heart rate are performed by a wearable, battery-operated device, wherein the wearable device includes sensors that detect the sensor data.
3. The method of claim 2, further comprising providing, by the wearable device, results of analyzing the heart rate and the changes in the heart rate to a second device for transmission to a server system; and
wherein assigning the sleep stage labels and providing the indication of the assigned sleep stage labels are performed by the server system.
4. The method of claim 1, wherein the sensor data comprises movement data from one or more motion sensors; and
wherein the method comprises determining correlations of the movement data with sensor data indicating heart beats of the person, wherein the sleep stage labels are assigned based at least in part on the correlations between the movement data and the sensor data indicating the heart beats.
5. The method of claim 1, wherein assigning a sleep stage label to a particular portion of the time period comprises:
obtaining likelihood scores from multiple different sleep stage analysis functions; and
performing a sleep stage classification decision for the particular portion of the time period based on a combination of the likelihood scores from the multiple different sleep stage analysis functions.
6. The method of claim 1, wherein the sleep stage labels include labels corresponding to stages for wake, rapid eye movement (REM) sleep, light sleep, and deep or slow-wave sleep.
7. The method of claim 1, wherein assigning the sleep stage labels comprises:
determining, for each of the intervals, (i) a likelihood that the interval corresponds to REM sleep, and (ii) an indication whether the interval is classified as REM sleep;
determining, for each of the intervals, (i) a likelihood that the interval corresponds to light sleep, and (ii) an indication whether the interval is classified as light sleep;
determining, for each of the intervals, (i) a likelihood that the interval corresponds to slow-wave sleep, and (ii) an indication whether the interval is classified as slow-wave sleep; or
determining, for each of the intervals, (i) a likelihood that the interval corresponds to a wake stage, and (ii) an indication whether the interval is classified as a wake stage.
8. The method of claim 1, wherein assigning the sleep stage labels comprises determining, for each of the intervals, (i) a likelihood that the interval corresponds to the person being awake or asleep, and (ii) an indication whether the person is classified as being awake or asleep.
9. The method of claim 1, wherein the sensor data comprises at least one selected from a group consisting of: (i) EKG signal data from an EKG sensor, and (ii) heartbeat data from a heartbeat sensor.
10. The method of claim 1, wherein dividing the time period into a series of intervals comprises dividing the time period into adjacent periods each having a same duration.
11. The method of claim 10, wherein dividing the time period into a series of intervals comprises defining overlapping sliding windows having the same duration, with a sliding window being centered at each sample of heart rate data.
12. The method of claim 1, wherein analyzing the changes in the heart rate of the person comprises determining heart rate variability characteristic scores for different portions of the sleep session, and wherein the sleep stage labels are assigned based at least in part on the heart rate variability scores.
13. The method of claim 1, wherein analyzing the changes in the heart rate of the person comprises determining measures of randomness of heartbeat data for different sliding windows of the time period, wherein the sleep stage labels are assigned based at least in part on the measures of randomness.
14. The method of claim 13, wherein the measures of randomness of heartbeat data are computed by determining a measure of approximate entropy based on the heartbeat data.
15. The method of claim 1, wherein analyzing the heart rate of the person comprises evaluating the value of an EKG signal in each interval with respect to the values in a neighborhood of the interval, wherein the neighborhood for each interval is a time window that extends from a first time threshold that precedes the interval to a second time threshold that follows the interval.
16. The method of claim 1, wherein the obtained sensor data corresponds to a sleep session of the person; and
wherein analyzing the heart rate of the person comprises evaluating an absolute value of the heart rate signal with respect to the rest of the heart rate values for the sleep session and predetermined thresholds.
17. The method of claim 1, wherein analyzing the changes in the heart rate of the person comprises:
generating a heart-rate variability (HRV) signal; and
performing a frequency analysis of the HRV signal; and
examining a ratio of low frequency components of the HRV signal to high frequency components of the HRV signal.
18. The method of claim 1, wherein assigning a sleep stage label to a particular portion of the time period comprises:
obtaining likelihood scores from multiple different sleep stage analysis functions; and
performing a sleep stage classification decision for the particular portion of the time period based on a combination of the likelihood scores from the multiple different sleep stage analysis functions.
19. The method of claim 1, wherein assigning a sleep stage label to a particular portion of the time period comprises:
providing multiple signals that are separately correlated to sleep stages as input to a sleep stage classifier; and
obtaining a single sleep stage label for the particular portion of the time period based on output of the sleep stage classifier.
20. The method of claim 1, further comprising:
obtaining data sets indicating signals or function outputs, the data sets being labeled with sleep stage labels corresponding to the data sets; and
training the sleep stage classifier based on the data sets to produce output indicating likelihoods that input data corresponds to a sleep stage from among a set of sleep stages.
21. The method of claim 20, further comprising:
determining, for each of the signals or function outputs, a signal range histogram for the signal or function output; and
using the signal range histograms to train the sleep stage classifier.
22. The method of claim 1, wherein analyzing the changes in heart rate comprises detecting heartbeats indicated by the sensor data.
23. The method of claim 22, wherein detecting the heartbeats comprises:
detecting peaks in an EKG signal;
determining a derivative signal from the EKG signal by determining, for each sample of the EKG signal, a difference between the current sample and the immediately previous sample;
identifying R waves based on applying one or more thresholds to the derivative signal; and
detecting the heartbeats based on the identified R waves.
24. The method of claim 22, wherein detecting the heartbeats comprises:
identifying local maxima and local minima in an EKG signal;
computing ranges between the maximum value of the EKG signal and minimum value of the EKG signal within each of multiple windows of the EKG signal; and
assigning, to each of the multiple windows, a probability of the window representing a beat.
25. A system comprising:
one or more computing devices;
one or more computer-readable media storing instructions that, when executed by the one or more computing devices, cause the one or more computing devices to perform operations comprising:
obtaining, by the one or more computing devices, sensor data generated by one or more sensors over a time period while a person is sleeping;
dividing, by the one or more computing devices, the time period into a series of intervals;
analyzing, by the one or more computing devices, heart rate and the changes in the heart rate of the person indicated by the sensor data over the intervals;
based on the analysis of the heart rate and the heart rate changes, assigning, by the one or more computing devices, sleep stage labels to different portions of the time period; and
providing, by the one or more computing devices, an indication of the assigned sleep stage labels.
US15/249,108 2015-08-28 2016-08-26 Determining sleep stages and sleep events using sensor data Active US10321871B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/249,108 US10321871B2 (en) 2015-08-28 2016-08-26 Determining sleep stages and sleep events using sensor data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562211261P 2015-08-28 2015-08-28
US15/249,108 US10321871B2 (en) 2015-08-28 2016-08-26 Determining sleep stages and sleep events using sensor data

Publications (2)

Publication Number Publication Date
US20170055898A1 true US20170055898A1 (en) 2017-03-02
US10321871B2 US10321871B2 (en) 2019-06-18

Family

ID=56877153

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/249,108 Active US10321871B2 (en) 2015-08-28 2016-08-26 Determining sleep stages and sleep events using sensor data

Country Status (2)

Country Link
US (1) US10321871B2 (en)
WO (1) WO2017040331A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SG10201608507PA (en) * 2016-10-11 2018-05-30 Nat Univ Singapore Determining Sleep Stages
EP3637432B1 (en) * 2018-09-21 2022-03-23 Tata Consultancy Services Limited System and method for non-apnea sleep arousal detection
JP2022542580A (en) * 2019-07-25 2022-10-05 インスパイア・メディカル・システムズ・インコーポレイテッド Sleep detection for sleep-disordered breathing (SDB) care
EP3875026A1 (en) * 2020-03-03 2021-09-08 Koninklijke Philips N.V. Sleep apnea detection system and method
US11666271B2 (en) 2020-12-09 2023-06-06 Medtronic, Inc. Detection and monitoring of sleep apnea conditions
CN117615710A (en) * 2021-07-27 2024-02-27 优化睡眠株式会社 Sleep arousal determination system, sleep arousal determination method, and computer program

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4999772A (en) * 1989-01-24 1991-03-12 Eden Tec Corporation Sleep screening system with time based modulated data
US20040073098A1 (en) * 2002-01-07 2004-04-15 Widemed Ltd. Self-adaptive system for the analysis of biomedical signals of a patient
US20050115561A1 (en) * 2003-08-18 2005-06-02 Stahmann Jeffrey E. Patient monitoring, diagnosis, and/or therapy systems and methods
US20050143617A1 (en) * 2003-12-31 2005-06-30 Raphael Auphan Sleep and environment control method and system
US20060111635A1 (en) * 2004-11-22 2006-05-25 Koby Todros Sleep staging based on cardio-respiratory signals
US20060149144A1 (en) * 1997-01-27 2006-07-06 Lynn Lawrence A System and method for automatic detection of a plurality of SPO2 time series pattern types
US20140221780A1 (en) * 2011-07-22 2014-08-07 President And Fellows Of Harvard College Complexity based methods and systems for detecting depression
US20160310696A1 (en) * 2013-12-18 2016-10-27 Koninklijke Philips N.V. System and method for enhancing sleep slow wave activity based on cardiac characteristics or respiratory characteristics

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL100080A (en) 1991-11-19 1994-12-29 Sleep Disorders Diagnostic And Monitor system for determining the sleep stages of a person
US5259390A (en) 1992-02-03 1993-11-09 Queen's University Method and apparatus to monitor sleep behaviour
US5732696A (en) 1992-03-17 1998-03-31 New York University Polysomnograph scoring
CA2349561C (en) 1998-10-30 2009-11-10 Walter Reed Army Institute Of Research System and method for predicting human cognitive performance using data from an actigraph
US7204250B1 (en) 1999-12-16 2007-04-17 Compumedics Limited Bio-mask
KR20040047754A (en) 2001-06-13 2004-06-05 컴퓨메딕스 리미티드 Methods and apparatus for monitoring consciousness
AU2003263571A1 (en) 2002-09-19 2004-04-08 Ramot At Tel Aviv University Ltd. Method, apparatus and system for characterizing sleep
US6878121B2 (en) 2002-11-01 2005-04-12 David T. Krausman Sleep scoring apparatus and method
IL155955A0 (en) 2003-05-15 2003-12-23 Widemed Ltd Adaptive prediction of changes of physiological/pathological states using processing of biomedical signal
US7190995B2 (en) 2003-06-13 2007-03-13 The Regents Of The University Of Michigan System and method for analysis of respiratory cycle-related EEG changes in sleep-disordered breathing
JP4390535B2 (en) 2003-11-26 2009-12-24 横河電機株式会社 Sleep stage estimation method and apparatus using the method
US7324845B2 (en) 2004-05-17 2008-01-29 Beth Israel Deaconess Medical Center Assessment of sleep quality and sleep disordered breathing based on cardiopulmonary coupling
EP1781171A4 (en) 2004-06-24 2009-10-21 Vivometrics Inc Systems and methods for monitoring cough
US20060060198A1 (en) 2004-09-17 2006-03-23 Acoba, Llc Method and system of scoring sleep disordered breathing
WO2006121455A1 (en) 2005-05-10 2006-11-16 The Salk Institute For Biological Studies Dynamic signal processing
DE102005048496A1 (en) 2005-10-07 2007-04-12 Inmeditec Medizintechnik Gmbh Medical diagnostics measuring mat e.g. for monitoring sleep, determines pressure distribution on surface using pressure sensors and has temperature and moisture sensors
US8016776B2 (en) 2005-12-02 2011-09-13 Medtronic, Inc. Wearable ambulatory data recorder
JP2007195823A (en) 2006-01-27 2007-08-09 Daikin Ind Ltd Sleep information providing system
WO2008132736A2 (en) 2007-05-01 2008-11-06 Hypnocore Ltd. Method and device for characterizing sleep
CN102065753B (en) 2008-04-14 2015-01-21 伊塔马医疗有限公司 Non-invasive method and apparatus for determining light- sleep and deep-sleep stages
US8355769B2 (en) 2009-03-17 2013-01-15 Advanced Brain Monitoring, Inc. System for the assessment of sleep quality in adults and children
EP2460464A1 (en) 2010-12-03 2012-06-06 Koninklijke Philips Electronics N.V. Sleep disturbance monitoring apparatus
WO2012112186A1 (en) * 2011-02-15 2012-08-23 The General Hospital Corporation Systems and methods to monitor and quantify physiological stages
JP5720295B2 (en) 2011-02-22 2015-05-20 オムロンヘルスケア株式会社 Sleep evaluation apparatus and display method in sleep evaluation apparatus
US8948861B2 (en) 2011-03-31 2015-02-03 Toyota Motor Engineering & Manufacturing North America, Inc. Methods and systems for determining optimum wake time
WO2012153263A1 (en) 2011-05-11 2012-11-15 Koninklijke Philips Electronics N.V. Sleep stage annotation device
JP5879833B2 (en) 2011-09-06 2016-03-08 ソニー株式会社 Information processing apparatus, information processing method, and program
WO2013054712A1 (en) 2011-10-14 2013-04-18 株式会社タニタ Sleep assessment system and sleep assessment apparatus
US10492720B2 (en) * 2012-09-19 2019-12-03 Resmed Sensor Technologies Limited System and method for determining sleep stage
US9633175B2 (en) 2013-02-05 2017-04-25 Big Health Ltd Interactive system for sleep improvement
CN116328142A (en) 2013-07-08 2023-06-27 瑞思迈传感器技术有限公司 Method and system for sleep management
US20150112157A1 (en) * 2013-10-23 2015-04-23 Quanttus, Inc. Arrhythmia detection
US9655559B2 (en) 2014-01-03 2017-05-23 Vital Connect, Inc. Automated sleep staging using wearable sensors

Cited By (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10376670B2 (en) 2013-07-08 2019-08-13 Resmed Sensor Technologies Limited Methods and systems for sleep management
US11364362B2 (en) 2013-07-08 2022-06-21 Resmed Sensor Technologies Limited Methods and systems for sleep management
US11648373B2 (en) 2013-07-08 2023-05-16 Resmed Sensor Technologies Limited Methods and systems for sleep management
US20150186312A1 (en) * 2013-12-27 2015-07-02 Petari Incorporation Apparatus and method for sensing object state
US10966666B2 (en) 2016-02-01 2021-04-06 Verily Life Sciences Llc Machine learnt model to detect REM sleep periods using a spectral analysis of heart rate and motion
US10470719B2 (en) * 2016-02-01 2019-11-12 Verily Life Sciences Llc Machine learnt model to detect REM sleep periods using a spectral analysis of heart rate and motion
US10026294B2 (en) 2016-03-09 2018-07-17 Honda Motor Co., Ltd. Information processing system, terminal, information processing method, information processing method of terminal, and program
US20170263106A1 (en) * 2016-03-09 2017-09-14 Honda Motor Co., Ltd. Information processing system, terminal, information processing method, information processing method of terminal, and program
US10049558B2 (en) * 2016-03-09 2018-08-14 Honda Motor Co., Ltd. Information processing system, terminal, information processing method, information processing method of terminal, and program
US11877861B2 (en) 2016-09-06 2024-01-23 Fitbit, Inc. Methods and systems for labeling sleep states
US11207021B2 (en) * 2016-09-06 2021-12-28 Fitbit, Inc. Methods and systems for labeling sleep states
US20180064388A1 (en) * 2016-09-06 2018-03-08 Fitbit, Inc. Methods and systems for labeling sleep states
JP7104076B2 (en) 2017-06-29 2022-07-20 コーニンクレッカ フィリップス エヌ ヴェ Methods and devices for determining sleep statistics
JP2020525146A (en) * 2017-06-29 2020-08-27 Koninklijke Philips N.V. Method and apparatus for determining sleep statistics
US11723579B2 (en) 2017-09-19 2023-08-15 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement
WO2019058280A1 (en) 2017-09-20 2019-03-28 Johnson & Johnson Consumer Inc. Healthcare caregiver behavior coaching system and method
EP3485804A1 (en) * 2017-11-20 2019-05-22 Kinpo Electronics, Inc. Wearable device capable of recognizing doze-off stage and recognition method thereof
TWI698223B (en) * 2017-11-20 2020-07-11 金寶電子工業股份有限公司 Wearable device capable of recognizing doze-off stage and recognition method thereof
US11717686B2 (en) 2017-12-04 2023-08-08 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to facilitate learning and performance
US11478603B2 (en) 2017-12-31 2022-10-25 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to enhance emotional response
US11273283B2 (en) 2017-12-31 2022-03-15 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to enhance emotional response
US11318277B2 (en) 2017-12-31 2022-05-03 Neuroenhancement Lab, LLC Method and apparatus for neuroenhancement to enhance emotional response
US11363996B2 (en) * 2018-02-11 2022-06-21 Xi'an Jiaotong University Early warning method, device and system of sudden death
US11839485B2 (en) 2018-03-02 2023-12-12 Nitto Denko Corporation Method, computing device and wearable device for sleep stage detection
WO2019168474A1 (en) * 2018-03-02 2019-09-06 Nitto Denko Corporation Method, computing device and wearable device for sleep stage detection
CN111867450A (en) * 2018-03-07 2020-10-30 皇家飞利浦有限公司 Sleep apnea detection system and method
WO2019170734A1 (en) * 2018-03-07 2019-09-12 Koninklijke Philips N.V. Sleep apnea detection system and method
EP3536225A1 (en) * 2018-03-07 2019-09-11 Koninklijke Philips N.V. Sleep apnea detection system and method
WO2019185392A1 (en) * 2018-03-30 2019-10-03 Koninklijke Philips N.V. System and method for non-invasive determination of blood pressure dip based on trained prediction models
US11844593B2 (en) * 2018-03-30 2023-12-19 Koninklijke Philips N.V. System and method for non-invasive determination of blood pressure dip based on trained prediction models
US11364361B2 (en) 2018-04-20 2022-06-21 Neuroenhancement Lab, LLC System and method for inducing sleep by transplanting mental states
WO2019246234A1 (en) * 2018-06-19 2019-12-26 Hilmisson Hugi Systems and methods for evaluation of health situation or condition
US11452839B2 (en) 2018-09-14 2022-09-27 Neuroenhancement Lab, LLC System and method of improving sleep
CN109394188A (en) * 2018-11-27 2019-03-01 中山大学 Abnormal respiration detection method, device and equipment based on heart rate variability
CN109394188B (en) * 2018-11-27 2022-03-08 中山大学 Method, device and equipment for detecting respiratory anomaly based on heart rate variability
IT201900004689A1 (en) * 2019-03-28 2020-09-28 Microbiomed S R L Electronic device and method for automatically detecting behavioral disorders in a subject during the REM sleep phase and/or nocturnal apneas
US11786694B2 (en) 2019-05-24 2023-10-17 NeuroLight, Inc. Device, method, and app for facilitating sleep
CN110251119A (en) * 2019-05-28 2019-09-20 深圳和而泰家居在线网络科技有限公司 Classification model acquisition method, HRV data classification method, device and related product
CN112006652A (en) * 2019-05-29 2020-12-01 深圳市睿心由科技有限公司 Sleep state detection method and system
US11937938B1 (en) * 2019-07-26 2024-03-26 Apple Inc. Methods for assessing sleep conditions
EP4025120A4 (en) * 2019-09-05 2023-08-30 Emory University Systems and methods for detecting sleep activity
WO2021046342A1 (en) * 2019-09-05 2021-03-11 Emory University Systems and methods for detecting sleep activity
CN111067503A (en) * 2019-12-31 2020-04-28 深圳安视睿信息技术股份有限公司 Sleep staging method based on heart rate variability
CN111481173A (en) * 2020-04-15 2020-08-04 上海贝氪若宝健康科技有限公司 Body sign signal detection method, medium, equipment and system
CN111631682A (en) * 2020-04-23 2020-09-08 平安国际智慧城市科技股份有限公司 Physiological feature integration method and device based on trend-removing analysis and computer equipment
CN113456030A (en) * 2021-08-05 2021-10-01 成都云卫康医疗科技有限公司 Sleep staging method based on heart rate monitoring data
WO2023025770A1 (en) * 2021-08-26 2023-03-02 Pprs Sas Sleep stage determining system
CN114041753A (en) * 2021-11-16 2022-02-15 上海市第六人民医院 Sleep staging method and device, computer equipment and storage medium
CN114652274A (en) * 2022-04-19 2022-06-24 无锡市人民医院 Intelligent sleep monitoring system for three-dimensional multi-dimensional data
CN115844335A (en) * 2023-01-29 2023-03-28 广东工业大学 Sleep staging method and system based on feature overlapping and generalized decision forest
CN115868941A (en) * 2023-03-03 2023-03-31 深圳市魔样科技有限公司 Information management method for intelligent ring

Also Published As

Publication number Publication date
US10321871B2 (en) 2019-06-18
WO2017040331A1 (en) 2017-03-09

Similar Documents

Publication Publication Date Title
US10321871B2 (en) Determining sleep stages and sleep events using sensor data
US10561321B2 (en) Continuous monitoring of a user's health with a mobile device
US11678838B2 (en) Automated detection of breathing disturbances
Haoyu et al. An IoMT cloud-based real time sleep apnea detection scheme by using the SpO2 estimation supported by heart rate variability
US10582890B2 (en) Visualizing, scoring, recording, and analyzing sleep data and hypnograms
US11246520B2 (en) Using heartrate information to classify PTSD
US20190076031A1 (en) Continuous monitoring of a user's health with a mobile device
Ramirez-Villegas et al. Heart rate variability dynamics for the prognosis of cardiovascular risk
Kwon et al. Attention-based LSTM for non-contact sleep stage classification using IR-UWB radar
Sannino et al. Monitoring obstructive sleep apnea by means of a real-time mobile system based on the automatic extraction of sets of rules through differential evolution
Li et al. A deep learning-based algorithm for detection of cortical arousal during sleep
Nakayama et al. Obstructive sleep apnea screening by heart rate variability-based apnea/normal respiration discriminant model
Altini et al. Cardiorespiratory fitness estimation using wearable sensors: Laboratory and free-living analysis of context-specific submaximal heart rates
US11109809B2 (en) Methods and systems for adaptable presentation of sensor data
Kaneriya et al. Markov decision-based recommender system for sleep apnea patients
US20230248320A1 (en) Detection of User Temperature and Assessment of Physiological Symptoms with Respiratory Diseases
Pärkkä Analysis of personal health monitoring data for physical activity recognition and assessment of energy expenditure, mental load and stress
Pan et al. Wrist movement analysis for long-term home sleep monitoring
Assaf et al. Sleep detection using physiological signals from a wearable device
US20220125376A1 (en) Sleep apnea syndrome determination apparatus, sleep apnea syndrome determination method, and sleep apnea syndrome determination program
Roomkham The potential of personal devices in large-scale sleep studies
권현빈 Non-Contact Sleep Monitoring Using Impulse Radio Ultra-Wideband Radar Based on Long Short-Term Memory Network
Kohzadi et al. Developing an apnea-hypopnea diagnostic model using SVM
Justino Machine learning models for mental stress classification based on multimodal biosignal input
Dai Smart Sensing and Clinical Predictions with Wearables: From Physiological Signals to Mental Health

Legal Events

Date Code Title Description
AS Assignment

Owner name: AWARABLES, INC., MARYLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BANDYOPADHYAY, AMRIT;BLANKENSHIP, GILMER;UPENDER, RAGHU;AND OTHERS;SIGNING DATES FROM 20160830 TO 20160831;REEL/FRAME:039680/0898

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

Year of fee payment: 4