CN110151137B - Sleep state monitoring method, device, equipment and medium based on data fusion - Google Patents

Sleep state monitoring method, device, equipment and medium based on data fusion Download PDF

Info

Publication number
CN110151137B
CN110151137B CN201910451708.1A
Authority
CN
China
Prior art keywords
sleep state
state
motion
variation
characteristic value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910451708.1A
Other languages
Chinese (zh)
Other versions
CN110151137A (en)
Inventor
刘有群
朱侃杰
马娜
陈恋恋
殷明君
徐桢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Neuis Technology Co ltd
Original Assignee
Shenzhen Ruyi Exploration Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ruyi Exploration Technology Co ltd
Priority to CN201910451708.1A
Publication of CN110151137A
Application granted
Publication of CN110151137B
Legal status: Active
Anticipated expiration


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4806 Sleep evaluation
    • A61B5/4809 Sleep detection, i.e. determining whether a subject is asleep or not

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Anesthesiology (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a sleep state monitoring method, device, equipment, and medium based on data fusion. The method comprises: collecting at least one axis acceleration variation triggered by turning-over motion within a preset first time period; judging whether a suspected sleep state has been entered according to the collected axis acceleration variation; if the suspected sleep state has been entered, communicating with an intelligent terminal to obtain at least one first judgment parameter; and judging whether a sleep state has been entered according to the first judgment parameter. The intelligent wearable device collects the user's activity to judge whether the user has entered the suspected sleep state, and an intelligent terminal in communication with the wearable device then collects further data to judge whether the user has entered the sleep state. Communicating with the intelligent terminal eliminates the interference of low-activity behavior within the suspected sleep state and yields a more accurate sleep judgment, significantly improving the accuracy of sleep judgment and increasing user stickiness.

Description

Sleep state monitoring method, device, equipment and medium based on data fusion
Technical Field
The invention relates to the field of intelligent wearable devices, and in particular to a sleep state monitoring method, device, equipment, and medium based on data fusion.
Background
Existing sleep condition detection methods include polysomnography, pressure-sensor-based detection, turning-over-based detection, and the like. Polysomnography is the "gold standard" among sleep detection methods: physiological signals of the human body are acquired through various sensors and different patterns are recorded for analysis. The method can provide many physiological parameters of the subject, but the equipment is expensive, complex to operate, and requires professional operators; meanwhile, the subject must attach various sensors to the body, which imposes a considerable psychological burden.
The pressure-sensor-based sleep detection method analyzes sleep quality by collecting, through a pressure sensor, the vibrations caused by the human body during sleep. Although the method imposes no physiological or psychological burden on the subject, the placement of the sensor greatly affects measurement accuracy, and the method is only suitable for measuring a single person.
The turning-over-based sleep detection method works by collecting, through an acceleration sensor worn on the arm, the acceleration change data of each axis during turning-over movements in sleep, and then applying corresponding algorithms to the data to obtain sleep quality indexes. The method is accurate, simple to operate, and convenient to wear. However, a turning-over-based method alone has difficulty determining whether a motionless user is asleep or merely in a low-activity state, for example leaning on a sofa watching television or a computer, or lying on a bed using a mobile phone. In such cases the data produced by turning-over detection closely resembles the data produced during sleep, and it is difficult to distinguish whether the user is actually asleep.
Disclosure of Invention
In order to solve the technical problem in the prior art that a sleep state is difficult to distinguish from a low-activity state, embodiments of the present invention provide a sleep state monitoring method, apparatus, device, and medium based on data fusion.
In one aspect, the present invention provides a sleep state monitoring method based on data fusion, including:
collecting at least one axis acceleration variation triggered by turning-over motion within a preset first time period;
judging whether a suspected sleep state has been entered according to the collected axis acceleration variation;
if the suspected sleep state has been entered, communicating with an intelligent terminal to obtain at least one first judgment parameter;
and judging whether a sleep state has been entered according to the first judgment parameter.
In another aspect, the present invention provides a sleep state monitoring device based on data fusion, the device comprising:
an axis acceleration variation module, configured to collect at least one axis acceleration variation triggered by turning-over motion within a preset first time period;
a suspected sleep state judging module, configured to judge whether a suspected sleep state has been entered according to the collected axis acceleration variation;
a first judgment parameter obtaining module, configured to communicate with an intelligent terminal to obtain at least one first judgment parameter if the suspected sleep state has been entered;
and a sleep state judging module, configured to judge whether a sleep state has been entered according to the first judgment parameter.
In another aspect, the present invention provides a device comprising a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the sleep state monitoring method based on data fusion.
In another aspect, the present invention provides a computer storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded by a processor to execute the sleep state monitoring method based on data fusion.
The invention provides a sleep state monitoring method, device, equipment, and medium based on data fusion. The intelligent wearable device collects the user's activity to judge whether the user has entered a suspected sleep state; an intelligent terminal in communication with the wearable device then collects further data to judge whether the user has entered a sleep state. Communicating with the intelligent terminal eliminates the interference of low-activity behavior within the suspected sleep state and yields a more accurate sleep judgment, thereby significantly improving the accuracy of sleep judgment and increasing user stickiness.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic illustration of an implementation environment provided by the present invention;
FIG. 2 is a flowchart of a sleep state monitoring method based on data fusion according to the present invention;
FIG. 3 is a flowchart of judging whether to enter a suspected sleep state according to the collected axis acceleration variation according to the present invention;
FIG. 4 is a flowchart of analyzing the axis acceleration variation collected within the first time period to obtain the activity amount it indicates according to the present invention;
FIG. 5 is another flowchart of analyzing the axis acceleration variation collected within the first time period to obtain the activity amount it indicates according to the present invention;
FIG. 6 is a flowchart of communicating with an intelligent terminal to obtain at least one first judgment parameter and judging whether to enter a sleep state according to the first judgment parameter;
FIG. 7 is another flowchart of communicating with an intelligent terminal to obtain at least one first judgment parameter and judging whether to enter a sleep state according to the first judgment parameter;
FIG. 8 is another flowchart of communicating with an intelligent terminal to obtain at least one first judgment parameter and judging whether to enter a sleep state according to the first judgment parameter;
FIG. 9 is a flowchart of obtaining motion state parameters within a preset time according to the present invention;
FIG. 10 is a flowchart of calculating the dispersion of the motion state characteristic values within the preset time according to the motion state characteristic values within the preset time according to the present invention;
FIG. 11 is a flowchart of judging whether the user has gone to sleep according to the motion state parameter, the light intensity average value, and the user engagement;
FIG. 12 is a block diagram of a sleep state monitoring device based on data fusion according to the present invention;
fig. 13 is a hardware structural diagram of an apparatus for implementing the method provided by the embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or server that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In order to make the objects, technical solutions and advantages disclosed in the embodiments of the present invention more clearly apparent, the embodiments of the present invention are described in further detail below with reference to the accompanying drawings and the embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the embodiments of the invention and are not intended to limit the embodiments of the invention.
Referring to fig. 1, the implementation environment includes an intelligent wearable device 01 and an intelligent terminal 02, which are communicatively connected.
The intelligent wearable device 01 may be smart glasses, a smart bracelet, a smart watch, a smart wristband, a smart ring, smart clothing, or the like.
The intelligent terminal 02 may include physical devices, and may also include software running on the physical devices, such as applications. For example, the intelligent terminal 02 may run management software associated with the intelligent wearable device 01.
The intelligent wearable device 01 completes sleep state monitoring of the user wearing it by obtaining second information from the intelligent terminal 02 and combining it with first information collected by the intelligent wearable device 01 itself.
Referring to fig. 2, a sleep state monitoring method based on data fusion according to an embodiment of the present invention is shown. The method may take the intelligent wearable device in the above implementation environment as its execution subject, and includes:
S101, collecting at least one axis acceleration variation triggered by turning-over motion within a preset first time period.
Specifically, the axis acceleration variation can be collected through the G-sensor built into the intelligent wearable device. A G-sensor (accelerometer) is an acceleration sensor that senses changes in acceleration. Forces acting on the device during motion, such as shaking, falling, rising, and lowering, are converted into electrical signals by the G-sensor, which a microprocessor then computes and analyzes to complete the functions designed in the program.
The axis acceleration variation reflects the intensity of motion of the user wearing the intelligent wearable device, that is, the user's activity amount. If the activity amount is large, the user is evidently awake; if the activity amount within the preset first time period is small, it is reasonable to suspect that the user is in a relatively quiet state. At this point the user may be engaged in a low-activity behavior, such as lying down looking at a mobile phone or sitting watching television, may be about to fall asleep, or may already be asleep.
This is exactly the principle of the prior-art sleep detection method based on turning-over detection: it infers that the user has entered a sleep state because the axis acceleration variation is small. However, the user may instead be engaged in a low-activity behavior, in which case the inference fails and the accuracy of sleep monitoring suffers. Effectively distinguishing low activity from sleep has therefore been a difficult problem in the prior art.
S103, judging whether a suspected sleep state has been entered according to the collected axis acceleration variation.
Specifically, judging whether to enter the suspected sleep state according to the collected axis acceleration variation includes, as shown in fig. 3:
S1031, performing data analysis on the axis acceleration variation collected within the first time period to obtain the activity amount it indicates.
In a possible embodiment, the G-sensor collects axis acceleration variations in three directions, namely the X axis, the Y axis, and the Z axis. Performing data analysis on the axis acceleration variation collected within the first time period to obtain the activity amount it indicates then includes, as shown in fig. 4 (an illustrative code sketch follows the steps below):
S1, obtaining a comprehensive variation for each moment, where the comprehensive variation is the square root of the sum of the squares of the axis acceleration variations in the X-axis, Y-axis, and Z-axis directions.
S3, obtaining target comprehensive variations, where a target comprehensive variation is a comprehensive variation whose value is greater than a preset variation threshold.
S5, adding the target comprehensive variations within the first time period to obtain the activity amount.
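The following Python sketch illustrates steps S1, S3, and S5 above. It is a minimal illustration only: the sample format, the function and variable names, and the variation threshold value are assumptions made for the example and are not taken from this disclosure.

```python
import math

def activity_amount(samples, variation_threshold=0.05):
    """samples: list of (dx, dy, dz) axis acceleration variations within the first time period.
    Returns the activity amount: the sum of the comprehensive variations that exceed the threshold."""
    total = 0.0
    for dx, dy, dz in samples:
        # Comprehensive variation: square root of the sum of the squares of the three axis variations (S1).
        comprehensive = math.sqrt(dx * dx + dy * dy + dz * dz)
        # Keep only target comprehensive variations above the preset threshold (S3).
        if comprehensive > variation_threshold:
            total += comprehensive  # accumulate into the activity amount (S5)
    return total
```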
In another possible embodiment, the frequency of activity may also be taken into account. A user is active frequently in the awake state, whereas in the sleep state occasional activities such as turning over occur only during deep sleep, with an obvious time interval separating them from the awake state. The frequency of activity therefore has clear significance for sleep judgment, and taking it into account can noticeably improve the accuracy of the suspected-sleep-state judgment. Specifically, performing data analysis on the axis acceleration variation collected within the first time period to obtain the activity amount it indicates includes, as shown in fig. 5 (an illustrative code sketch follows the steps below):
and S2, acquiring comprehensive variable quantity of each moment, wherein the comprehensive variable quantity is the root of the square sum of the axial acceleration variable quantities in the X axis direction, the Y axis direction and the Z axis direction.
And S4, acquiring a target comprehensive variation, wherein the target comprehensive variation is a comprehensive variation with a numerical value larger than a preset variation threshold.
S6, acquiring corresponding weight of the target comprehensive variation, wherein the weight reflects the user activity frequency at the moment when the target comprehensive variation occurs.
The weight reflects the frequency of the user activity, and in one embodiment, the weight is obtained by:
calculating a reciprocal value corresponding to each target comprehensive variation, wherein the reciprocal value is the reciprocal of the difference between the acquisition time corresponding to the target comprehensive variation and the acquisition time corresponding to other target comprehensive variations which are nearest to the target comprehensive variation;
and normalizing each reciprocal value to obtain a weight value corresponding to each target comprehensive variable quantity.
And S8, carrying out weighted summation on each target comprehensive variable quantity in the first time period according to the corresponding weight of the target comprehensive variable quantity to obtain the activity quantity.
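The sketch below illustrates the frequency-weighted variant (steps S2 to S8). The timestamped sample format, the handling of a lone target variation, and the threshold are assumptions made for illustration.

```python
import math

def weighted_activity_amount(samples, variation_threshold=0.05):
    """samples: list of (t, dx, dy, dz), where t is the acquisition time in seconds.
    Each target comprehensive variation is weighted by the normalized reciprocal of the
    time gap to its nearest neighbouring target variation, so denser activity weighs more."""
    targets = []
    for t, dx, dy, dz in samples:
        comprehensive = math.sqrt(dx * dx + dy * dy + dz * dz)
        if comprehensive > variation_threshold:
            targets.append((t, comprehensive))
    if not targets:
        return 0.0
    if len(targets) == 1:
        return targets[0][1]  # single event: no neighbour to compare with, full weight (assumption)
    # Reciprocal of the gap to the nearest other target variation (S6).
    reciprocals = []
    for i, (t, _) in enumerate(targets):
        nearest_gap = min(abs(t - other_t) for j, (other_t, _) in enumerate(targets) if j != i)
        reciprocals.append(1.0 / max(nearest_gap, 1e-6))  # guard against identical timestamps
    norm = sum(reciprocals)
    weights = [r / norm for r in reciprocals]
    # Weighted summation of the target comprehensive variations (S8).
    return sum(w * c for w, (_, c) in zip(weights, targets))
```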
S1033, judging whether the activity amount is smaller than a lower activity-amount threshold.
Specifically, the lower activity-amount threshold may be set according to actual conditions, and may or may not be related to the user's behavioral habits.
S1035, if the activity amount is smaller than the lower activity-amount threshold, determining that a suspected sleep state has been entered.
S105, if the suspected sleep state has been entered, communicating with the intelligent terminal to obtain at least one first judgment parameter.
S107, judging whether a sleep state has been entered according to the first judgment parameter.
In the sleep state monitoring method based on data fusion disclosed in the embodiment of the present invention, the intelligent wearable device collects the user's activity to judge whether the user has entered a suspected sleep state, and an intelligent terminal in communication with the wearable device then collects further data to judge whether the user has entered a sleep state. Communicating with the intelligent terminal eliminates the interference of low-activity behavior within the suspected sleep state and yields a more accurate sleep judgment, thereby significantly improving the accuracy of sleep judgment and increasing user stickiness.
In a possible implementation, communicating with the intelligent terminal to obtain at least one first judgment parameter and judging whether to enter the sleep state according to the first judgment parameter includes, as shown in fig. 6:
S10, obtaining the human eye state.
Before the eye state is judged, it may first be determined whether blinking has occurred near the current time point; if no blinking has occurred, the eye state is judged further.
S20, if the eye state is an open state, judging that the user has not entered a sleep state.
S30, if the eye state is a closed state, judging whether the eyeballs are in a rotating state.
S40, if so, judging that a sleep state has been entered.
Closed eyes combined with rotating eyeballs are consistent with the alternation of REM (rapid eye movement) and NREM (non-rapid eye movement) periods during sleep, and it is therefore judged that a sleep state has been entered.
In another possible implementation, communicating with the intelligent terminal to obtain at least one first judgment parameter and judging whether to enter the sleep state according to the first judgment parameter includes, as shown in fig. 7:
S100, obtaining a sound signal collected by the intelligent terminal.
S200, filtering the sound signal to remove interfering sound waves.
The purpose is to remove the sound of electrical appliances such as air conditioners, televisions, and purifiers, as well as the sound made during turning over by the bed striking the wall or by the bed itself.
S300, obtaining the frequency and fluctuation pattern of the sound signal.
S400, judging whether the user has gone to sleep according to the frequency and fluctuation pattern of the sound signal.
When a person is in a sleep state, the breathing frequency is low and the corresponding sound signal is smooth and regular, which can serve as data for directly judging that the user is asleep.
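The sketch below shows one way steps S100 to S400 could be realized, assuming NumPy and SciPy are available. The band edges, smoothing window, peak spacing, and the breathing-rate and regularity thresholds are all assumptions chosen for typical resting respiration; none of these values is specified in this disclosure.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def is_breathing_like(audio, fs):
    """audio: 1-D microphone samples; fs: sampling rate in Hz.
    Returns True when the sound envelope is slow and regular, as expected during sleep."""
    # 1. Band-pass filter to suppress appliance hum and impulsive noises (bed knocks, etc.) (S200).
    b, a = butter(4, [100, 1000], btype="band", fs=fs)
    filtered = filtfilt(b, a, audio)
    # 2. Amplitude envelope, smoothed over roughly 0.5 s.
    win = int(0.5 * fs)
    envelope = np.convolve(np.abs(filtered), np.ones(win) / win, mode="same")
    # 3. Breathing cycles: envelope peaks at least 2 s apart (S300).
    peaks, _ = find_peaks(envelope, distance=2 * fs)
    if len(peaks) < 3:
        return False
    intervals = np.diff(peaks) / fs                   # seconds between breaths
    rate = 60.0 / intervals.mean()                    # breaths per minute
    regularity = intervals.std() / intervals.mean()   # low value = smooth and regular
    # 4. Judge sleep-like breathing from frequency and regularity (S400).
    return 8 <= rate <= 22 and regularity < 0.3
```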
In another possible implementation, communicating with the intelligent terminal to obtain at least one first judgment parameter and judging whether to enter the sleep state according to the first judgment parameter includes, as shown in fig. 8:
S301, obtaining motion state parameters within a preset time.
Specifically, the motion state parameters characterize the motion state of the intelligent terminal held by the user, and thereby indirectly reflect the motion state of the user.
Specifically, obtaining the motion state parameters within the preset time includes, as shown in fig. 9:
S3011, obtaining motion state characteristic values within the preset time.
Specifically, the motion state characteristic values may be obtained by collecting angular velocity through a gyroscope in the intelligent terminal, or by sensing the user's motion through a magnetic sensor and an acceleration sensor. They may be acceleration values, velocity values, magnetic flux values, and so on.
S3013, calculating the dispersion of the motion state characteristic values within the preset time according to the motion state characteristic values within the preset time.
Specifically, calculating the dispersion of the motion state characteristic values within the preset time according to the motion state characteristic values within the preset time includes, as shown in fig. 10 (an illustrative code sketch follows the steps below):
S30131, obtaining the first motion characteristic value, the second motion characteristic value, and the third motion characteristic value collected at each moment within the preset time.
S30133, obtaining an effective characteristic value according to the sum of the squares of the first, second, and third motion characteristic values.
S30135, calculating the average effective characteristic value from the effective characteristic values.
S30137, representing the dispersion of the motion state characteristic values by the mean of the squared differences between each effective characteristic value and the average effective characteristic value.
S3015, taking the dispersion of the motion state characteristic values within the preset time as the motion state parameter.
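The sketch below follows steps S30131 to S30137 literally: the effective value is taken as the sum of the squares of the three per-moment characteristic values (the description does not state whether a square root is applied, so none is taken here), and the dispersion is the mean squared deviation of the effective values from their average. The sample format and names are assumptions.

```python
def motion_state_dispersion(samples):
    """samples: list of (f1, f2, f3) motion characteristic values collected over the preset time,
    e.g. gyroscope angular velocities about three axes.
    Returns the dispersion used as the motion state parameter."""
    # Effective characteristic value per moment: sum of squares of the three characteristic values (S30133).
    effective = [f1 * f1 + f2 * f2 + f3 * f3 for f1, f2, f3 in samples]
    # Average effective characteristic value (S30135).
    mean_eff = sum(effective) / len(effective)
    # Dispersion: mean of the squared differences from the average (S30137).
    return sum((e - mean_eff) ** 2 for e in effective) / len(effective)
```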
S303, obtaining the light intensity average value within the preset time.
Specifically, the light intensity characterizes the illumination of the location of the mobile phone held by the user. Obtaining the light intensity average value within the preset time includes (an illustrative code sketch follows these steps):
S3031, removing singular points from the light intensity collected within the preset time to obtain effective light intensity values.
Specifically, a singular point in the light intensity may arise because the intelligent terminal's screen suddenly lights up. For example, when the user's mobile phone is lying on a desk and a WeChat message arrives, the screen lights up as a reminder and an abnormally bright reading is collected. The phone therefore checks whether a message received at that moment caused the screen to light up; if so, that reading is removed.
S3033, calculating the average of the effective light intensity values.
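The sketch below illustrates steps S3031 and S3033. How screen-wake events are detected is outside the scope of the snippet, so they are passed in as an assumed `screen_wake_times` input, and the matching window is likewise an assumption.

```python
def average_light_intensity(samples, screen_wake_times, window=1.0):
    """samples: list of (t, lux) ambient light readings within the preset time.
    screen_wake_times: times at which the terminal's screen lit up (e.g. an incoming message).
    Readings within `window` seconds of a screen-wake event are treated as singular points
    and removed (S3031); the remaining effective values are averaged (S3033)."""
    effective = [
        lux for t, lux in samples
        if not any(abs(t - w) <= window for w in screen_wake_times)
    ]
    if not effective:
        return None  # no usable readings in this window
    return sum(effective) / len(effective)
```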
S305, obtaining the user engagement within the preset time, where the user engagement characterizes the degree to which the user is operating the intelligent terminal.
Specifically, the user engagement may be calculated from the standard deviation of the screen proximity and the screen brightness state, where the screen proximity represents how close the user's face is to the screen.
S307, judging whether the user has gone to sleep according to the motion state parameter, the light intensity average value, and the user engagement.
Specifically, judging whether the user has gone to sleep according to the motion state parameter, the light intensity average value, and the user engagement includes, as shown in fig. 11 (an illustrative code sketch follows the steps below):
s3071, obtaining a motion state parameter weight, a light intensity average value weight and a user participation degree weight.
Specifically, the motion state parameter weight, the light intensity average value weight and the user engagement weight can be set according to actual needs.
S3073, according to the motion state parameter weight, the light intensity average value weight and the user participation weight, carrying out weighted summation on the motion state parameter, the light intensity average value and the user participation to obtain a sleep state score.
S3075, if the sleep score is smaller than a preset threshold value, judging to enter a sleep state.
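The sketch below illustrates steps S3071 to S3075. The weight values, the score threshold, and the assumption that the three inputs have been normalized to [0, 1] are illustrative choices only; the description states merely that the weights are set according to actual needs.

```python
def enters_sleep(motion_dispersion, avg_light, engagement,
                 w_motion=0.4, w_light=0.3, w_engagement=0.3,
                 score_threshold=0.2):
    """All three inputs are assumed normalized to [0, 1]; small values mean little motion,
    a dark environment, or little interaction with the terminal."""
    # Weighted summation to obtain the sleep state score (S3073).
    score = (w_motion * motion_dispersion
             + w_light * avg_light
             + w_engagement * engagement)
    # A score below the preset threshold is judged as having entered sleep (S3075).
    return score < score_threshold
```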
In the embodiment of the present invention, the sleep state score used to judge whether a sleep state has been entered is obtained by jointly considering the motion state parameter, the light intensity average value, and the user engagement. Compared with using any single variable alone, related experiments show that judging the sleep state from a score that combines the three variables has outstanding accuracy, so the combination of the motion state parameter, the light intensity average value, and the user engagement is strongly indicative of the sleep state.
Through communication with the intelligent terminal, the embodiment of the present invention can accurately judge whether a sleep state has been entered from sound, from the human eye state, or from the combination of motion state parameters, light intensity average value, and user engagement; these three feasible implementations may of course also be used in combination. Using the intelligent terminal connected to the intelligent wearable device to assist monitoring and to generate comprehensive data for judging whether the user is in a sleep state can effectively rule out common low-activity behaviors, such as the user lying down playing with a mobile phone, thereby significantly improving the accuracy of sleep judgment.
Further, the method also includes:
S109, if it is judged that a sleep state has been entered, obtaining a user operation instruction, or communicating with the intelligent terminal to obtain at least one second judgment parameter.
S1011, judging whether an awake state has been entered according to the operation instruction or the second judgment parameter.
In the prior art, whether a user wakes during sleep is judged by a single G-sensor or based on heart rate. For a period after waking, the user's activity is similar to that during sleep, and heart-rate acquisition devices are often switched off because of their heavy power consumption, so the user's entry into the awake state cannot be detected in time. Extensive research on user behavior shows that most users operate their wristband or mobile phone when they wake up. The embodiment of the present invention therefore judges whether the user is awake by obtaining a user operation instruction or by communicating with the intelligent terminal to obtain at least one second judgment parameter.
Specifically, the user operation instruction may be any instruction operating the intelligent wearable device, and the second judgment parameter may likewise point to any instruction operating the intelligent terminal.
The sleep state monitoring method based on data fusion disclosed in the embodiment of the present invention makes full use of communication with the intelligent terminal, improving the accuracy of sleep judgment and the timeliness of wake detection, thereby increasing user stickiness.
An embodiment of the present invention further provides a sleep state monitoring device based on data fusion. As shown in fig. 12, the device includes:
an axis acceleration variation module 401, configured to collect at least one axis acceleration variation triggered by turning-over motion within a preset first time period;
a suspected sleep state judging module 403, configured to judge whether a suspected sleep state has been entered according to the collected axis acceleration variation;
a first judgment parameter obtaining module 405, configured to communicate with the intelligent terminal to obtain at least one first judgment parameter if the suspected sleep state has been entered;
a sleep state judging module 407, configured to judge whether a sleep state has been entered according to the first judgment parameter.
Specifically, the embodiments of the sleep state monitoring device and the sleep state monitoring method based on data fusion are based on the same inventive concept.
The embodiment of the present invention further provides a computer storage medium. The computer storage medium may store a plurality of instructions suitable for being loaded by a processor to execute the steps of the sleep state monitoring method based on data fusion according to the embodiment of the present invention, which are not described again here.
Further, fig. 13 shows a hardware structure diagram of a device for implementing the method provided by the embodiment of the present invention; the device may participate in constituting or containing the apparatus provided by the embodiment of the present invention. As shown in fig. 13, the device 10 may include one or more processors 102 (shown as 102a, 102b, ..., 102n; the processors 102 may include, but are not limited to, processing devices such as a microprocessor MCU or a programmable logic device FPGA), a memory 104 for storing data, and a transmission device 106 for communication functions. It may further include: a display, an input/output interface (I/O interface), a Universal Serial Bus (USB) port (which may be included as one of the ports of the I/O interface), a network interface, a power supply, and/or a camera. It will be understood by those skilled in the art that the structure shown in fig. 13 is only illustrative and does not limit the structure of the electronic device. For example, device 10 may include more or fewer components than shown in fig. 13, or have a different configuration than shown in fig. 13.
It should be noted that the one or more processors 102 and/or other data processing circuitry described above may be referred to generally herein as "data processing circuitry". The data processing circuitry may be embodied in whole or in part in software, hardware, firmware, or any combination thereof. Further, the data processing circuitry may be a single, stand-alone processing module, or incorporated in whole or in part into any of the other elements in the device 10 (or mobile device). As referred to in the embodiments of the application, the data processing circuit acts as a processor control (e.g. selection of a variable resistance termination path connected to the interface).
The memory 104 may be used to store software programs and modules of application software, such as program instructions/data storage devices corresponding to the method described in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the software programs and modules stored in the memory 104, so as to implement the above-mentioned sleep state monitoring method based on data fusion. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, memory 104 may further include memory located remotely from processor 102, which may be connected to device 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of such networks may include wireless networks provided by the communication provider of the device 10. In one example, the transmission device 106 includes a network adapter (NIC) that can be connected to other network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 can be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
The display may be, for example, a touch screen type Liquid Crystal Display (LCD) that may enable a user to interact with a user interface of the device 10 (or mobile device).
It should be noted that: the precedence order of the above embodiments of the present invention is only for description, and does not represent the merits of the embodiments. And specific embodiments thereof have been described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the device and server embodiments, since they are substantially similar to the method embodiments, the description is simple, and the relevant points can be referred to the partial description of the method embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A sleep state monitoring method based on data fusion, characterized by comprising the following steps:
collecting at least one axis acceleration variation triggered by turning-over motion within a preset first time period;
obtaining a comprehensive variation for each moment according to the collected axis acceleration variation, wherein the comprehensive variation is the square root of the sum of the squares of the axis acceleration variations in the X-axis, Y-axis, and Z-axis directions;
obtaining a target comprehensive variation, wherein the target comprehensive variation is a comprehensive variation whose value is greater than a preset variation threshold;
obtaining a reciprocal value corresponding to the target comprehensive variation and normalizing the reciprocal value to determine a weight corresponding to the target comprehensive variation, wherein the reciprocal value is the reciprocal of the difference between the acquisition time corresponding to the target comprehensive variation and the acquisition time corresponding to the other target comprehensive variation nearest to it;
performing a weighted summation of the target comprehensive variations within the first time period according to their corresponding weights to determine an activity amount;
judging whether a suspected sleep state has been entered according to the activity amount;
if the suspected sleep state has been entered, communicating with an intelligent terminal to obtain at least one first judgment parameter;
and judging whether a sleep state has been entered according to the first judgment parameter, wherein the first judgment parameter comprises a human eye state parameter, a sound signal parameter, or the combined three parameters of a motion state parameter, a light intensity average value, and a user engagement, and the sound signal parameter is a sound signal parameter collected by the intelligent terminal.
2. The method of claim 1, further comprising:
if the sleep state is judged to be entered, acquiring a user operation instruction, or communicating with the intelligent terminal to acquire at least one second judgment parameter;
and judging whether to enter the waking state according to the operation instruction or the second judgment parameter.
3. The method according to claim 1, wherein the communicating with the intelligent terminal to obtain at least one first determination parameter, and determining whether to enter the sleep state according to the first determination parameter comprises:
acquiring the state of human eyes;
if the eye state is open, judging that the user does not enter the sleep state;
if the eye state is a closed state, judging whether the eyeball is in a rotating state; and if so, judging to enter a sleep state.
4. The method according to claim 1, wherein the communicating with the intelligent terminal to obtain at least one first determination parameter, and determining whether to enter the sleep state according to the first determination parameter comprises:
acquiring a sound signal acquired by an intelligent terminal;
filtering the sound signal to remove interfering sound waves;
acquiring the frequency and fluctuation pattern of the sound signal;
and judging whether the user has gone to sleep according to the frequency and fluctuation pattern of the sound signal.
5. The method according to claim 1, wherein the communicating with the intelligent terminal to obtain at least one first determination parameter, and determining whether to enter the sleep state according to the first determination parameter comprises:
acquiring motion state parameters within preset time;
acquiring the average value of the light intensity within the preset time;
acquiring the user engagement within the preset time, wherein the user engagement characterizes the degree to which the user operates the intelligent terminal;
and judging whether the user has gone to sleep according to the motion state parameters, the light intensity average value, and the user engagement.
6. The method of claim 5, wherein the obtaining the motion state parameter within the preset time comprises:
acquiring a characteristic value of the motion state within the preset time;
calculating the discrete degree of the motion state characteristic value in the preset time according to the motion state characteristic value in the preset time;
and taking the discrete degree of the motion state characteristic value in the preset time as the motion state parameter.
7. The method according to claim 6, wherein the calculating the discrete degree of the characteristic value of the motion state in the preset time according to the characteristic value of the motion state in the preset time comprises:
calculating a first motion characteristic value, a second motion characteristic value and a third motion characteristic value which are acquired at each moment in the preset time;
obtaining an effective characteristic value according to the square sum of the first motion characteristic value, the second motion characteristic value and the third motion characteristic value;
calculating an average effective characteristic value according to each effective characteristic value;
and representing the dispersion of the motion state characteristic values by the mean of the squared differences between each effective characteristic value and the average effective characteristic value.
8. A sleep state monitoring device based on data fusion, the device comprising:
an axis acceleration variation module, configured to collect at least one axis acceleration variation triggered by turning-over motion within a preset first time period;
a comprehensive variation obtaining module, configured to obtain a comprehensive variation for each moment, wherein the comprehensive variation is the square root of the sum of the squares of the axis acceleration variations in the X-axis, Y-axis, and Z-axis directions;
a target comprehensive variation obtaining module, configured to obtain a target comprehensive variation, wherein the target comprehensive variation is a comprehensive variation whose value is greater than a preset variation threshold;
a weight determining module, configured to obtain a reciprocal value corresponding to the target comprehensive variation and normalize the reciprocal value to determine a weight corresponding to the target comprehensive variation, wherein the reciprocal value is the reciprocal of the difference between the acquisition time corresponding to the target comprehensive variation and the acquisition time corresponding to the other target comprehensive variation nearest to it;
an activity amount determining module, configured to perform a weighted summation of the target comprehensive variations within the first time period according to their corresponding weights to determine an activity amount;
a suspected sleep state judging module, configured to judge whether a suspected sleep state has been entered according to the activity amount;
a first judgment parameter obtaining module, configured to communicate with an intelligent terminal to obtain at least one first judgment parameter if the suspected sleep state has been entered;
and a sleep state judging module, configured to judge whether a sleep state has been entered according to the first judgment parameter, wherein the first judgment parameter comprises a human eye state parameter, a sound signal parameter, or the combined three parameters of a motion state parameter, a light intensity average value, and a user engagement, and the sound signal parameter is a sound signal parameter collected by the intelligent terminal.
9. A sleep state monitoring device based on data fusion, characterized in that the device comprises a processor and a memory, wherein the memory stores at least one instruction, at least one program, a set of codes or a set of instructions, and the at least one instruction, the at least one program, the set of codes or the set of instructions is loaded and executed by the processor to realize a sleep state monitoring method based on data fusion according to any one of claims 1-7.
10. A computer storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded by a processor and executes a method of sleep state monitoring based on data fusion according to any one of claims 1-7.
CN201910451708.1A 2019-05-28 2019-05-28 Sleep state monitoring method, device, equipment and medium based on data fusion Active CN110151137B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910451708.1A CN110151137B (en) 2019-05-28 2019-05-28 Sleep state monitoring method, device, equipment and medium based on data fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910451708.1A CN110151137B (en) 2019-05-28 2019-05-28 Sleep state monitoring method, device, equipment and medium based on data fusion

Publications (2)

Publication Number Publication Date
CN110151137A CN110151137A (en) 2019-08-23
CN110151137B true CN110151137B (en) 2022-03-11

Family

ID=67629682

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910451708.1A Active CN110151137B (en) 2019-05-28 2019-05-28 Sleep state monitoring method, device, equipment and medium based on data fusion

Country Status (1)

Country Link
CN (1) CN110151137B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113495609A (en) * 2020-04-01 2021-10-12 华为技术有限公司 Sleep state judgment method and system, wearable device and storage medium
CN111839465A (en) * 2020-07-30 2020-10-30 歌尔科技有限公司 Sleep detection method and device, intelligent wearable device and readable storage medium
CN114343587A (en) * 2020-09-29 2022-04-15 Oppo广东移动通信有限公司 Sleep monitoring method and device, electronic equipment and computer readable medium
CN112432640B (en) * 2020-10-12 2023-05-09 深圳数联天下智能科技有限公司 Method, device, electronic equipment and medium for judging use state
CN114424927A (en) * 2020-10-29 2022-05-03 华为技术有限公司 Sleep monitoring method and device, electronic equipment and computer readable storage medium
CN112401838B (en) * 2020-11-16 2023-07-14 上海创功通讯技术有限公司 Method for detecting sleep state by wearable device and wearable device
CN113724737A (en) * 2021-08-30 2021-11-30 康键信息技术(深圳)有限公司 Method and device for monitoring sleep state, electronic equipment and storage medium
CN115988134A (en) * 2022-12-26 2023-04-18 北京奇艺世纪科技有限公司 User state monitoring method, device, equipment and medium based on mobile equipment
CN116095922B (en) * 2023-03-13 2023-08-18 广州易而达科技股份有限公司 Lighting lamp control method and device, lighting lamp and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015178011A (en) * 2015-06-09 2015-10-08 セイコーエプソン株式会社 sleep evaluation device and program
CN104994455A (en) * 2015-07-03 2015-10-21 深圳市前海安测信息技术有限公司 Headset volume adjusting method conducive to improving sleep quality and headset
CN105045386A (en) * 2015-06-30 2015-11-11 广东美的制冷设备有限公司 Sleeping state monitoring method, sleeping state monitoring terminal and air conditioner system
CN105407217A (en) * 2015-10-26 2016-03-16 南京步步高通信科技有限公司 Mobile terminal music playing method and mobile terminal
CN105559751A (en) * 2015-12-14 2016-05-11 安徽华米信息科技有限公司 Method, device and wearable device for monitoring states of light activities
CN107831985A (en) * 2017-11-13 2018-03-23 广东欧珀移动通信有限公司 A kind of method, mobile terminal and the storage medium of mobile terminal screen control
CN107943527A (en) * 2017-11-30 2018-04-20 西安科锐盛创新科技有限公司 The method and its system of electronic equipment is automatically closed in sleep
CN108009572A (en) * 2017-11-22 2018-05-08 中国地质大学(武汉) Mobile device fall detection method and its model forming method and mobile equipment


Also Published As

Publication number Publication date
CN110151137A (en) 2019-08-23

Similar Documents

Publication Publication Date Title
CN110151137B (en) Sleep state monitoring method, device, equipment and medium based on data fusion
CN110151136B (en) Method, device, equipment and medium for monitoring sleep state of conditional reference heart rate
CN106645978B (en) The wearing state detection method and detection device of intelligent wearable device
US10212994B2 (en) Smart watch band
US10134256B2 (en) Portable monitoring devices and methods of operating the same
US8768648B2 (en) Selection of display power mode based on sensor data
US8781791B2 (en) Touchscreen with dynamically-defined areas having different scanning modes
US8751194B2 (en) Power consumption management of display in portable device based on prediction of user input
US20170109990A1 (en) Method and system for fall detection
CN107049255A (en) A kind of wearable intelligent equipment and its sleep algorithm
CN103793042B (en) A kind of system and method body motion information interaction and shown
CN104615851B (en) A kind of Sleep-Monitoring method and terminal
CN101558368B (en) Device and method for deciding necessity of brainwave identification
US20150277572A1 (en) Smart contextual display for a wearable device
JP2015058096A (en) Exercise support device, exercise support method, and exercise support program
WO2022021707A1 (en) Sleep monitoring method and apparatus, and smart wearable device and readable storage medium
CN113679339A (en) Sleep monitoring method, device, system and storage medium
US7008387B2 (en) Portable device for collecting information about living body
CN106326672B (en) Sleep detection method and system
WO2014066703A2 (en) Smart contextual display for a wearable device
CN113520339B (en) Sleep data validity analysis method and device and wearable device
US20160361011A1 (en) Determining resting heart rate using wearable device
Fei et al. A wearable health monitoring system
CN109907747A (en) User Status monitoring method, device and wearable device
CN112120715A (en) Pressure monitoring and relieving system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230427

Address after: 809, Block A, Zhongguan Times Square, No. 4168 Liuxian Avenue, Pingshan Community, Taoyuan Street, Nanshan District, Shenzhen City, Guangdong Province, 518055

Patentee after: Guangdong Neuis Technology Co.,Ltd.

Address before: 518101 1805, building 1, Longguang century building, zone n23, Haiwang community, Xin'an street, Bao'an District, Shenzhen, Guangdong Province

Patentee before: SHENZHEN RUYI EXPLORATION TECHNOLOGY CO.,LTD.
