CN109833029B - Sleep staging method, system and terminal equipment - Google Patents

Sleep staging method, system and terminal equipment

Info

Publication number
CN109833029B
CN109833029B
Authority
CN
China
Prior art keywords
sleep
intelligent terminal
user
orientation
distance
Prior art date
Legal status
Active
Application number
CN201711202811.XA
Other languages
Chinese (zh)
Other versions
CN109833029A (en)
Inventor
宋雨
贺超
Current Assignee
Shenzhen Chuangda Yunrui Intelligent Technology Co ltd
Original Assignee
Shenzhen Chuangda Yunrui Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Chuangda Yunrui Intelligent Technology Co ltd filed Critical Shenzhen Chuangda Yunrui Intelligent Technology Co ltd
Priority to CN201711202811.XA priority Critical patent/CN109833029B/en
Priority to PCT/CN2018/084633 priority patent/WO2019100660A1/en
Publication of CN109833029A publication Critical patent/CN109833029A/en
Application granted
Publication of CN109833029B publication Critical patent/CN109833029B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 — Measuring for diagnostic purposes; Identification of persons

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Telephone Function (AREA)

Abstract

The application is applicable to the technical field of medical detection and provides a sleep staging method, a sleep staging system and a terminal device. The sleep staging method comprises the following steps: when the intelligent terminal is in a non-standing state, acquiring the orientation relation between a sleep apparatus worn by a user and the intelligent terminal; if the orientation relation meets a first preset condition, acquiring the distance between the sleep apparatus and the intelligent terminal; and if the distance is within the preset distance range, judging that the user is in the waking period. In this process, when the intelligent terminal is in the non-standing state, if it is detected that the orientation relation between the intelligent terminal and the sleep apparatus worn by the user meets the first preset condition and the distance between the two is smaller than the preset distance, it can be judged that the user is using the intelligent terminal and is therefore in the waking period, which improves the accuracy of the result of judging the sleep period the user is in.

Description

Sleep staging method, system and terminal equipment
Technical Field
The application belongs to the technical field of medical detection, and particularly relates to a sleep staging method, a sleep staging system and terminal equipment.
Background
With the continuous acceleration of the pace of life, various sleep problems keep appearing, and people pay more and more attention to the sleep quality of each sleep period during sleep. In sleep staging, according to the polysomnography (PSG) based electroencephalogram sleep staging criteria (AASM criteria) promulgated by the American Academy of Sleep Medicine (AASM), sleep is generally classified into: the W period (wakefulness), the NREM period (non-rapid eye movement) and the REM period (rapid eye movement); the NREM period can be further divided into a light sleep period and a deep sleep period. If the user lies still without relatively large eye movement or body movement, a single-channel sleep apparatus that judges only from the acquired brain wave data whether the user is in the waking period, the light sleep period or the rapid eye movement period is prone to error, which reduces the accuracy of the staging result for the user's whole sleep process.
Disclosure of Invention
In view of this, embodiments of the present application provide a sleep staging method, a sleep staging system and a terminal device, so as to solve the problem in the prior art that, when the sleep stage of a user is analyzed, a sleep apparatus cannot accurately determine whether the user is in the waking period.
A first aspect of an embodiment of the present application provides a sleep staging method, including:
when the intelligent terminal is in a non-standing state, acquiring the orientation relation between a sleep instrument worn by a user and the intelligent terminal;
judging whether the orientation relation meets a first preset condition or not, wherein the orientation relation comprises the relation between the face orientation of a user and the orientation of a display screen of the intelligent terminal when the user wears the sleep apparatus;
if the orientation relation meets a first preset condition, acquiring the distance between the sleep instrument and the intelligent terminal;
and if the distance is within the range of the preset distance, judging that the user is in the waking period.
A second aspect of embodiments of the present application provides a sleep staging system comprising:
the first acquisition unit is used for acquiring the orientation relation between the sleep instrument worn by the user and the intelligent terminal when the intelligent terminal is in a non-standing state;
the judging unit is used for judging whether the orientation relation meets a first preset condition or not, wherein the orientation relation comprises the relation between the face orientation of the user and the orientation of a display screen of the intelligent terminal when the user wears the sleep apparatus;
the second acquisition unit is used for acquiring the distance between the sleep instrument and the intelligent terminal when the orientation relation meets a first preset condition;
and a waking period determination unit configured to determine that the user is in a waking period when the distance is within a range of a preset distance.
A third aspect of embodiments of the present application provides a terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of any one of the sleep staging methods when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any one of the sleep staging methods described above.
In the embodiments provided by the application, when the intelligent terminal is in a non-standing state, the orientation relation between a sleep apparatus worn by the user and the intelligent terminal is acquired, wherein the orientation relation comprises the relation between the orientation of the user's face and the orientation of the display screen of the intelligent terminal; whether the orientation relation meets a first preset condition is then judged; when the first preset condition is met, the distance between the sleep apparatus and the intelligent terminal is acquired, and if the distance is within the preset distance range, the user wearing the sleep apparatus can be judged to be in the waking period. In this process, when the intelligent terminal is in the non-standing state, if it is detected that the orientation relation between the intelligent terminal and the sleep apparatus meets the first preset condition (for example, the two are in a face-to-face state) and the distance between them is smaller than the preset distance, it can be judged that the user is using the intelligent terminal and is therefore in the waking period, which improves the accuracy of the result of judging the sleep period the user is in.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without inventive effort.
Fig. 1 is a schematic flowchart illustrating an implementation of a sleep staging method according to an embodiment of the present application;
Fig. 2 is a schematic flowchart illustrating an implementation of step S12 according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a sleep staging system according to the second embodiment of the present application;
Fig. 4 is a schematic diagram of a terminal device provided in the third embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In the embodiments provided by the application, when the intelligent terminal is in a non-standing state, the orientation relation between a sleep apparatus worn by the user and the intelligent terminal is acquired, wherein the orientation relation comprises the relation between the orientation of the user's face and the orientation of the display screen of the intelligent terminal; whether the orientation relation meets a first preset condition is then judged; when the first preset condition is met, the distance between the sleep apparatus and the intelligent terminal is acquired, and if the distance is within the preset distance range, the user wearing the sleep apparatus can be judged to be in the waking period.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Example one:
step S11, when the intelligent terminal is in a non-standing state, acquiring the orientation relation between the sleep apparatus worn by the user and the intelligent terminal, wherein the orientation relation comprises the relation between the orientation of the face of the user and the orientation of the display screen of the intelligent terminal when the sleep apparatus is worn by the user;
the sleep instrument is worn by a user during sleep in the embodiment provided by the application, and the face orientation of the user and the front orientation of the sleep instrument are in the same direction when the user wears the sleep instrument. In order to improve the accuracy of judging whether the user is in the waking period, when the intelligent terminal is in a non-standing state, the orientation relation between the sleep instrument worn by the user and the intelligent terminal is detected and acquired.
The intelligent terminal is provided with an application program for controlling the sleep apparatus worn by the user, and the application program is used for controlling the operation mode of the sleep apparatus and recording the sleep-related data of the user monitored by the sleep apparatus.
Optionally, the smart terminal includes a smart phone, a tablet computer, and the like.
The orientation relation comprises the relation between the orientation of the user's face, or the orientation of the front of the sleep apparatus, and the orientation of the display screen of the intelligent terminal. For example, when the intelligent terminal is a smartphone, the display screen of the smartphone may be in a face-to-face relationship with the front of the sleep apparatus.
Step S12, judging whether the orientation relation meets a first preset condition;
the method and the device for judging the orientation relation between the obtained face orientation of the user and the orientation of the display screen of the intelligent terminal meet the requirements of a first preset condition or not. The first preset condition comprises that when a user wears the sleep apparatus, the front face of the sleep apparatus faces or the face of the user faces to the display screen of the intelligent terminal; or both may be substantially face-to-face.
Step S13, if the orientation relation meets a first preset condition, acquiring the distance between the sleep instrument and the intelligent terminal;
In the embodiments provided by the application, if the orientation relation meets the first preset condition, the distance between the sleep apparatus and the intelligent terminal is acquired. The distance between the two may be detected and calculated by an infrared detection device, or may be calculated from the signal strength of the Bluetooth connection between them.
And step S14, if the distance is within the range of the preset distance, determining that the user is in the waking period.
In this step, after the distance between the intelligent terminal and the sleep apparatus worn by the user is acquired, it is judged whether that distance is within the preset distance range; if so, the user is judged to be in a state of using the intelligent terminal, and the user is therefore determined to be in the waking period.
Optionally, the preset distance may be 0.5 meter, or may be another value, and the size of the specific preset distance may be set by a user, which is not limited herein.
In the embodiments provided by the application, when the intelligent terminal is in a non-standing state, the orientation relation between a sleep apparatus worn by the user and the intelligent terminal is acquired, wherein the orientation relation comprises the relation between the orientation of the user's face and the orientation of the display screen of the intelligent terminal; whether the orientation relation meets a first preset condition is then judged; when the first preset condition is met, the distance between the sleep apparatus and the intelligent terminal is acquired, and if the distance is within the preset distance range, the user wearing the sleep apparatus can be judged to be in the waking period. In this process, when the intelligent terminal is in the non-standing state, if it is detected that the orientation relation between the intelligent terminal and the sleep apparatus meets the first preset condition (for example, the two are in a face-to-face state) and the distance between them is smaller than the preset distance, it can be judged that the user is using the intelligent terminal and is therefore in the waking period, which improves the accuracy of the result of judging the sleep period the user is in.
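To make the flow of steps S11 to S14 concrete, the following Python sketch ties the three checks together under the assumption of a simple boolean interface; the function name is_user_awake, its parameters and the 0.5 metre default are illustrative assumptions, not part of the patented implementation.

```python
# Minimal sketch of the decision flow in steps S11 to S14. All names and the
# example preset distance are illustrative assumptions, not values fixed by the patent.

PRESET_DISTANCE_M = 0.5  # example preset distance; the embodiment notes it can be user-set


def is_user_awake(terminal_non_standing: bool,
                  orientation_face_to_face: bool,
                  distance_m: float,
                  preset_distance_m: float = PRESET_DISTANCE_M) -> bool:
    """Return True when the user can be judged to be in the waking period.

    terminal_non_standing    -- S11: the intelligent terminal is in a non-standing state
    orientation_face_to_face -- S12: the orientation relation meets the first preset condition
    distance_m               -- S13: distance between the sleep apparatus and the terminal
    """
    if not terminal_non_standing:
        return False                        # the terminal is at rest, no evidence of use
    if not orientation_face_to_face:
        return False                        # first preset condition not met
    return distance_m <= preset_distance_m  # S14: within the preset distance -> waking period


if __name__ == "__main__":
    print(is_user_awake(True, True, 0.3))   # True  -> judged to be in the waking period
    print(is_user_awake(True, True, 1.2))   # False -> no waking judgement from this rule
```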
Optionally, in another embodiment provided herein: step S12, the determining whether the orientation relation satisfies a first preset condition includes:
step S21, detecting the orientation of the front of the sleep meter and the orientation of the display screen of the intelligent terminal;
In this step, the orientation of the front of the sleep apparatus and the orientation of the display screen of the intelligent terminal are detected. The two are assumed in advance to lie in the same coordinate system, and the angle of each relative to the same coordinate axis of that coordinate system is detected.
Step S22, determining the orientation relation between the two according to the detection result;
Specifically, the orientation relation of the two is determined according to the detected angles of the two relative to the same coordinate axis, where the orientation relation includes a relative (face-to-face) relationship.
Step S23, judging whether the front of the sleep apparatus is in a relative state with the display screen of the intelligent terminal according to the orientation relation;
in step S24, if the two are in a relative state, it is determined that the orientation relationship satisfies a first preset condition.
And when the orientation of the two is judged to be in a relative relationship according to the determined orientation relationship, namely the two are in a face-to-face state, judging that the orientation relationship of the two meets a first preset condition.
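The face-to-face judgment of steps S21 to S24 can be illustrated with a short Python sketch, assuming the orientation of each surface is available as an outward normal vector in the shared coordinate system mentioned above; the function is_face_to_face and the 30-degree tolerance are illustrative assumptions rather than values from the patent.

```python
# A sketch of the face-to-face check, assuming each orientation is given as an outward
# normal vector in a shared coordinate system (e.g. derived from the devices'
# orientation sensors). The 30-degree tolerance is an illustrative assumption; the
# patent only requires a relative (face-to-face) state.
import math


def is_face_to_face(sleep_apparatus_normal, screen_normal, tolerance_deg: float = 30.0) -> bool:
    """Two surfaces face each other when their outward normals point in roughly
    opposite directions, i.e. the angle between them is close to 180 degrees."""
    dot = sum(a * b for a, b in zip(sleep_apparatus_normal, screen_normal))
    norm = (math.sqrt(sum(a * a for a in sleep_apparatus_normal))
            * math.sqrt(sum(b * b for b in screen_normal)))
    angle_deg = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle_deg >= 180.0 - tolerance_deg


# Example: terminal screen facing up (+z), sleep-apparatus front facing down (-z).
print(is_face_to_face((0.0, 0.0, -1.0), (0.0, 0.0, 1.0)))  # True
```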
Optionally, in another embodiment provided by the present application, before the obtaining of the orientation relationship between the sleep meter worn by the user and the intelligent terminal, the method includes:
Detecting whether an application program used for controlling the sleep instrument in the intelligent terminal is in a running state or not, wherein the running state comprises an operated state and a background running state;
and if the application program is in the operated state, judging that the intelligent terminal is in a non-standing state.
In this step, it is detected whether the application program used for controlling the sleep apparatus in the intelligent terminal is in a running state. If so, it is determined whether the application program is in the operated state or the background running state; if the application program is in the operated state, the intelligent terminal can be directly determined to be in a non-standing state.
Optionally, in another embodiment provided by the present application, before the obtaining of the orientation relationship between the sleep meter worn by the user and the intelligent terminal, the method further includes:
if the application program is in a background running state, acquiring the acceleration of the intelligent terminal;
and when the acceleration meets a second preset condition, judging that the intelligent terminal is in a non-standing state.
In this step, if the application program for controlling the sleep apparatus is in the background running state, the acceleration of the intelligent terminal is acquired through a gyroscope or an acceleration sensor in the intelligent terminal and then evaluated; when the acceleration meets a second preset condition (for example, the acceleration value is greater than 0 or greater than a certain preset value), the intelligent terminal is judged to be in a non-standing state.
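The non-standing-state determination described in the two optional embodiments above can be sketched as follows in Python; the state constants, the function terminal_is_non_standing and the zero acceleration threshold are assumptions made for illustration only.

```python
# Sketch of the non-standing-state check described above. The state names and the
# acceleration threshold are illustrative assumptions; the patent only requires that
# the acceleration satisfy a second preset condition (e.g. greater than 0 or greater
# than some preset value).
from typing import Optional

OPERATED = "operated"      # the controlling application program is being operated
BACKGROUND = "background"  # the application program is running in the background

ACCEL_THRESHOLD = 0.0      # example second preset condition (m/s^2)


def terminal_is_non_standing(app_state: str, acceleration: Optional[float] = None) -> bool:
    if app_state == OPERATED:
        # The user is directly operating the application -> the terminal is not at rest.
        return True
    if app_state == BACKGROUND:
        # Fall back to the gyroscope / acceleration sensor reading.
        return acceleration is not None and acceleration > ACCEL_THRESHOLD
    return False


print(terminal_is_non_standing(OPERATED))         # True
print(terminal_is_non_standing(BACKGROUND, 0.8))  # True  -> acceleration above threshold
print(terminal_is_non_standing(BACKGROUND, 0.0))  # False -> terminal at rest
```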
Optionally, the gyroscope in the intelligent terminal may be further configured to acquire the position, the angle, and the like of the intelligent terminal, so as to determine whether the front of the sleep meter and the display screen of the intelligent terminal are approximately in a face-to-face state, where the approximately face-to-face state indicates a state in which the user can operate the intelligent terminal or view the display screen of the intelligent terminal when the user is in a lying state.
Optionally, in another embodiment provided by the present application, if the distance is within a preset distance range, determining that the user is in a waking period includes:
acquiring the signal intensity of the intelligent terminal and the sleep instrument during Bluetooth connection;
calculating the distance between the intelligent terminal and the sleep instrument according to the signal intensity;
and judging whether the distance is within a preset distance range, and if so, judging that the user is in a waking period.
In this step, when the user prepares to sleep and wears the sleep apparatus, the intelligent terminal is connected with the sleep apparatus through Bluetooth, so that the intelligent terminal can receive in real time the sleep data of the user detected by the sleep apparatus. When the distance between the intelligent terminal and the sleep apparatus is to be calculated, the signal strength of the Bluetooth connection between them is acquired, and the distance is calculated from the signal strength through a preset algorithm; it is then judged whether the distance is within the preset distance range, and if so, the user is judged to be in the waking period.
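Since the embodiment leaves the "preset algorithm" unspecified, the sketch below uses the widely known log-distance path-loss model to turn the Bluetooth signal strength into a distance estimate; the functions rssi_to_distance and user_is_awake_by_distance and their calibration defaults are assumptions for illustration, not the patented algorithm.

```python
# The embodiment only states that the distance is computed from the Bluetooth signal
# strength "through a preset algorithm". One commonly used option is the log-distance
# path-loss model sketched below; tx_power_dbm and the environment exponent n are
# illustrative calibration values, not values taken from the patent.


def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0, n: float = 2.0) -> float:
    """Estimate the distance in metres from a received Bluetooth RSSI value.

    tx_power_dbm -- expected RSSI at 1 m from the transmitter (calibration value)
    n            -- path-loss exponent (about 2 in free space, larger indoors)
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))


def user_is_awake_by_distance(rssi_dbm: float, preset_distance_m: float = 0.5) -> bool:
    """Judge the waking period once the orientation condition has already been met."""
    return rssi_to_distance(rssi_dbm) <= preset_distance_m


print(round(rssi_to_distance(-65.0), 2))  # about 2.0 m with the default calibration
print(user_is_awake_by_distance(-50.0))   # True: strong signal -> closer than 0.5 m
```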
Example two:
Fig. 3 shows a block diagram of a sleep staging system provided in the second embodiment of the present application, which corresponds to the sleep staging method described in the above embodiment; for convenience of illustration, only the parts relevant to this embodiment are shown.
Referring to fig. 3, the apparatus includes: a first acquisition unit 31, a determination unit 32, a second acquisition unit 33, a waking period determination unit 34, wherein
The first obtaining unit 31 is configured to obtain an orientation relationship between the sleep monitor worn by the user and the intelligent terminal when the intelligent terminal is in a non-standing state;
the judging unit 32 is configured to judge whether the orientation relationship meets a first preset condition, where the orientation relationship includes a relationship between the orientation of the face of the user and the orientation of the display screen of the intelligent terminal when the user wears the sleep apparatus;
the second obtaining unit 33 is configured to obtain a distance between the sleep apparatus and the intelligent terminal when the orientation relationship satisfies a first preset condition;
a waking period determination unit 34 configured to determine that the user is in a waking period when the distance is within a range of a preset distance.
Optionally, the determining unit 32 includes:
the detection module is used for detecting the orientation of the front face of the sleep meter and the orientation of a display screen of the intelligent terminal; determining the orientation relation between the two according to the detection result;
the state judging module is used for judging whether the front surface of the sleep instrument is in a relative state with the display screen of the intelligent terminal according to the orientation relation; and if the two are in opposite states, judging that the orientation relation meets a first preset condition.
Optionally, the awake period determination unit 34 includes:
the signal intensity acquisition module is used for acquiring the signal intensity when the intelligent terminal is connected with the sleep instrument through the Bluetooth;
the calculation module is used for calculating the distance between the intelligent terminal and the sleep instrument according to the signal intensity;
and the distance judgment module is used for judging whether the distance is within a preset distance range, and if so, judging that the user is in a waking period.
Optionally, the sleep staging system further comprises:
the application program detection unit is used for detecting whether an application program used for controlling the sleep meter in the intelligent terminal is in a running state or not, wherein the running state comprises an operated state and a background running state;
and the first judging unit is used for judging that the intelligent terminal is in a non-standing state if the application program is in an operated state.
Optionally, the sleep staging system further comprises:
the second judgment unit is used for acquiring the acceleration of the intelligent terminal if the application program is in a background running state; and when the acceleration meets a second preset condition, judging that the intelligent terminal is in a non-standing state.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Example three:
fig. 4 is a schematic diagram of a terminal device according to an embodiment of the present application. As shown in fig. 4, the terminal device 4 of this embodiment includes: a processor 40, a memory 41 and a computer program 42 stored in said memory 41 and executable on said processor 40. The processor 40, when executing the computer program 42, implements the steps in the various sleep staging method embodiments described above, such as steps S11-S14 shown in fig. 1. Alternatively, the processor 40, when executing the computer program 42, implements the functions of the modules/units in the above-mentioned device embodiments, such as the functions of the modules 31 to 34 shown in fig. 3.
Illustratively, the computer program 42 may be partitioned into one or more modules/units that are stored in the memory 41 and executed by the processor 40 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 42 in the terminal device 4. For example, the computer program 42 may be divided into: the device comprises a first acquisition unit, a judgment unit, a second acquisition unit and a waking period judgment unit, wherein the specific functions of the units are as follows:
the first acquisition unit is used for acquiring the orientation relation between the sleep instrument worn by the user and the intelligent terminal when the intelligent terminal is in a non-standing state;
the judging unit is used for judging whether the orientation relation meets a first preset condition or not, wherein the orientation relation comprises the relation between the face orientation of the user and the orientation of a display screen of the intelligent terminal when the user wears the sleep apparatus;
the second acquisition unit is used for acquiring the distance between the sleep instrument and the intelligent terminal when the orientation relation meets a first preset condition;
and a waking period determination unit configured to determine that the user is in a waking period when the distance is within a range of a preset distance.
Further, the judging unit includes:
the detection module is used for detecting the orientation of the front face of the sleep meter and the orientation of a display screen of the intelligent terminal; determining the orientation relation between the two according to the detection result;
the state judging module is used for judging whether the front surface of the sleep instrument is in a relative state with the display screen of the intelligent terminal according to the orientation relation; and if the two are in opposite states, judging that the orientation relation meets a first preset condition.
Further, the awake period determination unit includes:
the signal intensity acquisition module is used for acquiring the signal intensity when the intelligent terminal is connected with the sleep instrument through the Bluetooth;
the calculation module is used for calculating the distance between the intelligent terminal and the sleep instrument according to the signal intensity;
and the distance judgment module is used for judging whether the distance is within a preset distance range, and if so, judging that the user is in a waking period.
Further, the sleep staging system further comprises:
the application program detection unit is used for detecting whether an application program used for controlling the sleep meter in the intelligent terminal is in a running state or not, wherein the running state comprises an operated state and a background running state;
and the first judging unit is used for judging that the intelligent terminal is in a non-standing state if the application program is in an operated state.
Further, the sleep staging system further comprises:
the second judgment unit is used for acquiring the acceleration of the intelligent terminal if the application program is in a background running state; and when the acceleration meets a second preset condition, judging that the intelligent terminal is in a non-standing state.
The terminal device 4 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 40, a memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of a terminal device 4 and does not constitute a limitation of terminal device 4 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., the terminal device may also include input-output devices, network access devices, buses, etc.
The Processor 40 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 41 may be an internal storage unit of the terminal device 4, such as a hard disk or a memory of the terminal device 4. The memory 41 may also be an external storage device of the terminal device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 4. Further, the memory 41 may also include both an internal storage unit and an external storage device of the terminal device 4. The memory 41 is used for storing the computer program and other programs and data required by the terminal device. The memory 41 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the methods of the embodiments described above may be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals according to legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A sleep staging method, comprising:
when the intelligent terminal is in a non-standing state, acquiring the orientation relation between a sleep instrument worn by a user and the intelligent terminal; the intelligent terminal is provided with an application program for controlling a sleep apparatus worn by a user, and the application program is used for controlling the running mode of the sleep apparatus and recording data related to the sleep of the user monitored by the sleep apparatus;
judging whether the orientation relation meets a first preset condition or not, wherein the orientation relation comprises the relation between the face orientation of a user and the orientation of a display screen of the intelligent terminal when the user wears the sleep apparatus; the first preset condition comprises that when a user wears the sleep instrument, the front face of the sleep instrument or the face of the user faces to the display screen of the intelligent terminal;
if the orientation relation meets a first preset condition, acquiring the distance between the sleep instrument and the intelligent terminal through infrared detection equipment or a Bluetooth technology;
and if the distance is within the range of the preset distance, judging that the user is in the waking period.
2. The sleep staging method of claim 1, wherein the determining whether the orientation relationship satisfies a first preset condition comprises:
detecting the orientation of the front of the sleep meter and the orientation of a display screen of the intelligent terminal;
determining the orientation relation between the two according to the detection result;
judging whether the front surface of the sleep instrument is in a relative state with a display screen of the intelligent terminal or not according to the orientation relation;
and if the two are in opposite states, judging that the orientation relation meets a first preset condition.
3. The sleep staging method according to claim 1 or 2, prior to the acquiring the orientation relationship between the sleep meter worn by the user and the smart terminal, comprising
Detecting whether an application program used for controlling the sleep instrument in the intelligent terminal is in a running state or not, wherein the running state comprises an operated state and a background running state;
and if the application program is in the operated state, judging that the intelligent terminal is in a non-standing state.
4. The sleep staging method as claimed in claim 3, further comprising, before the acquiring the orientation relationship between the sleep meter worn by the user and the smart terminal:
if the application program is in a background running state, acquiring the acceleration of the intelligent terminal;
when the acceleration meets a second preset condition, judging that the intelligent terminal is in a non-standing state; the second preset condition is that the acceleration value is greater than 0 or greater than a preset numerical value.
5. The sleep staging method of claim 1, wherein the determining that the user is in a wake session if the distance is within a preset distance range comprises:
acquiring the signal intensity of the intelligent terminal and the sleep instrument during Bluetooth connection;
calculating the distance between the intelligent terminal and the sleep instrument according to the signal intensity;
and judging whether the distance is within a preset distance range, and if so, judging that the user is in a waking period.
6. A sleep staging system, comprising:
the first acquisition unit is used for acquiring the orientation relation between the sleep instrument worn by the user and the intelligent terminal when the intelligent terminal is in a non-standing state; the intelligent terminal is provided with an application program for controlling a sleep apparatus worn by a user, and the application program is used for controlling the running mode of the sleep apparatus and recording data related to the sleep of the user monitored by the sleep apparatus;
the judging unit is used for judging whether the orientation relation meets a first preset condition or not, wherein the orientation relation comprises the relation between the face orientation of the user and the orientation of a display screen of the intelligent terminal when the user wears the sleep apparatus; the first preset condition comprises that when a user wears the sleep instrument, the front face of the sleep instrument or the face of the user faces to the display screen of the intelligent terminal;
the second acquisition unit is used for acquiring the distance between the sleep instrument and the intelligent terminal through infrared detection equipment or a Bluetooth technology when the orientation relation meets a first preset condition;
and a waking period determination unit configured to determine that the user is in a waking period when the distance is within a range of a preset distance.
7. The sleep staging system of claim 6, wherein the determining unit includes:
the detection module is used for detecting the orientation of the front face of the sleep meter and the orientation of a display screen of the intelligent terminal; determining the orientation relation between the two according to the detection result;
the state judging module is used for judging whether the front surface of the sleep instrument is in a relative state with the display screen of the intelligent terminal according to the orientation relation; and if the two are in opposite states, judging that the orientation relation meets a first preset condition.
8. The sleep staging system of claim 6, wherein the awake period determination unit includes:
the signal intensity acquisition module is used for acquiring the signal intensity when the intelligent terminal is connected with the sleep instrument through the Bluetooth;
the calculation module is used for calculating the distance between the intelligent terminal and the sleep instrument according to the signal intensity;
and the distance judgment module is used for judging whether the distance is within a preset distance range, and if so, judging that the user is in a waking period.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 5 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN201711202811.XA 2017-11-27 2017-11-27 Sleep staging method, system and terminal equipment Active CN109833029B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201711202811.XA CN109833029B (en) 2017-11-27 2017-11-27 Sleep staging method, system and terminal equipment
PCT/CN2018/084633 WO2019100660A1 (en) 2017-11-27 2018-04-26 Sleep stage classification method and system, and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711202811.XA CN109833029B (en) 2017-11-27 2017-11-27 Sleep staging method, system and terminal equipment

Publications (2)

Publication Number Publication Date
CN109833029A CN109833029A (en) 2019-06-04
CN109833029B true CN109833029B (en) 2021-04-30

Family

ID=66631318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711202811.XA Active CN109833029B (en) 2017-11-27 2017-11-27 Sleep staging method, system and terminal equipment

Country Status (2)

Country Link
CN (1) CN109833029B (en)
WO (1) WO2019100660A1 (en)

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8192376B2 (en) * 2003-08-18 2012-06-05 Cardiac Pacemakers, Inc. Sleep state classification
US8784293B2 (en) * 2008-10-07 2014-07-22 Advanced Brain Monitoring, Inc. Systems and methods for optimization of sleep and post-sleep performance
CN101815127B (en) * 2010-04-12 2014-08-20 中兴通讯股份有限公司 Mobile terminal and method for adjusting visual effect of screen thereof
CN102752458A (en) * 2012-07-19 2012-10-24 北京理工大学 Driver fatigue detection mobile phone and unit
KR101892233B1 (en) * 2012-08-03 2018-08-27 삼성전자주식회사 Method and apparatus for alarm service using context aware in portable terminal
CN103892796A (en) * 2012-12-30 2014-07-02 青岛海尔软件有限公司 Wrist-mounted sleep monitoring system
CN103006225A (en) * 2013-01-11 2013-04-03 湖南纳雷科技有限公司 Sleep monitoring instrument capable of monitoring breathing state in sleep
CN104378491B (en) * 2013-08-14 2019-02-15 中兴通讯股份有限公司 A kind of method and device of screen that lighting mobile terminal
CN103795865B (en) * 2014-02-12 2017-05-31 惠州Tcl移动通信有限公司 A kind of mobile terminal
US9821162B2 (en) * 2014-04-25 2017-11-21 Medtronic, Inc. Deep brain stimulation for sleep disorders
CN104299025A (en) * 2014-09-22 2015-01-21 武汉天喻信息产业股份有限公司 Non-contact smart card with loss preventing function and working method thereof
US20160106950A1 (en) * 2014-10-19 2016-04-21 Curzio Vasapollo Forehead-wearable light stimulator having one or more light pipes
CN104574817A (en) * 2014-12-25 2015-04-29 清华大学苏州汽车研究院(吴江) Machine vision-based fatigue driving pre-warning system suitable for smart phone
CN104896656B (en) * 2015-05-11 2018-04-06 小米科技有限责任公司 Method and device for starting air conditioner
CN105231997A (en) * 2015-10-10 2016-01-13 沈阳熙康阿尔卑斯科技有限公司 Sleep quality judging method and sleep instrument
CN105433904A (en) * 2015-11-24 2016-03-30 小米科技有限责任公司 Sleep state detection method, device and system
JP6322659B2 (en) * 2016-02-17 2018-05-09 株式会社コーエーテクモゲームス Information processing program and information processing apparatus
CN105852810B (en) * 2016-04-05 2019-01-29 福州市马尾区小微发明信息科技有限公司 A kind of sleep control method
CN105760739A (en) * 2016-04-22 2016-07-13 上海与德通讯技术有限公司 Iris-recognition-based unlocking method and system thereof
CN105997004B (en) * 2016-06-17 2019-03-22 美的集团股份有限公司 A kind of method and sleep monitoring device of sleep prompting
CN106333672A (en) * 2016-09-12 2017-01-18 杨代辉 EEG-based fatigue monitoring and rapid restoration head-mounted device for people working under high pressure
CN106333691A (en) * 2016-10-27 2017-01-18 深圳市万机创意电子科技有限公司 The method and device for judging sleep state, rest state and motion state of human body
CN106667436A (en) * 2016-12-19 2017-05-17 深圳创达云睿智能科技有限公司 Sleep diagnosis method and system

Also Published As

Publication number Publication date
CN109833029A (en) 2019-06-04
WO2019100660A1 (en) 2019-05-31

Similar Documents

Publication Publication Date Title
KR102655878B1 (en) Electronic Device which calculates Blood pressure value using Pulse Wave Velocity algorithm and the Method for calculating Blood Pressure value thereof
CN107595245B (en) Sleep management method, system and terminal equipment
CN110262947B (en) Threshold warning method and device, computer equipment and storage medium
US8781991B2 (en) Emotion recognition apparatus and method
EP2879095A1 (en) Method, apparatus and terminal device for image processing
KR102399533B1 (en) Electronic device and method for providing stress index corresponding to activity of user
US9924861B2 (en) System and methods for assessing vision using a computing device
US10653353B2 (en) Monitoring a person for indications of a brain injury
EP3699808B1 (en) Facial image detection method and terminal device
CN110827967A (en) Platform, method and storage medium for identifying space occupation proportion of pharmacy
CN108770046B (en) Method for saving electric quantity of smart watch
Abulkhair et al. Using mobile platform to detect and alerts driver fatigue
US9968287B2 (en) Monitoring a person for indications of a brain injury
CN107729144B (en) Application control method and device, storage medium and electronic equipment
CN107195163B (en) A kind of alarm method, device and wearable device
CN109833029B (en) Sleep staging method, system and terminal equipment
CN113303777A (en) Heart rate value determination method and device, electronic equipment and medium
CN105807888A (en) Electronic equipment and information processing method
Grützmacher et al. Towards energy efficient sensor nodes for online activity recognition
CN110809083B (en) Mobile terminal information reminding method, mobile terminal and storage medium
CN108837271B (en) Electronic device, output method of prompt message and related product
CN110908505B (en) Interest identification method, device, terminal equipment and storage medium
CN114708641A (en) Sleep detection method and device, computer readable storage medium and terminal equipment
CN108399085B (en) Electronic device, application management method and related product
US10289196B2 (en) Techniques for ocular control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant