CN114631809A - Head wearing equipment, eye fatigue monitoring method and device and storage medium - Google Patents

Head wearing equipment, eye fatigue monitoring method and device and storage medium

Info

Publication number
CN114631809A
Authority
CN
China
Prior art keywords: data, eye, eye fatigue, information, fatigue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210261964.6A
Other languages
Chinese (zh)
Inventor
张云鹏
王计平
杨斐
熊大曦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SUZHOU KEYI-SKY SEMICONDUCTOR TECHNOLOGIES Inc
Original Assignee
SUZHOU KEYI-SKY SEMICONDUCTOR TECHNOLOGIES Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SUZHOU KEYI-SKY SEMICONDUCTOR TECHNOLOGIES Inc filed Critical SUZHOU KEYI-SKY SEMICONDUCTOR TECHNOLOGIES Inc
Priority to CN202210261964.6A priority Critical patent/CN114631809A/en
Publication of CN114631809A publication Critical patent/CN114631809A/en
Pending legal-status Critical Current


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1103 Detecting eye twinkling
    • A61B5/1116 Determining posture transitions
    • A61B5/1118 Determining activity level
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/24 Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 Modalities, i.e. specific diagnostic methods
    • A61B5/369 Electroencephalography [EEG]
    • A61B5/372 Analysis of electroencephalograms
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means specially adapted to be attached to or worn on the body surface
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6814 Head
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/7455 Details of notification to user characterised by tactile indication, e.g. vibration or electrical stimulation

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Ophthalmology & Optometry (AREA)
  • Geometry (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The application discloses a head-worn device, an eye fatigue monitoring method and apparatus, and a storage medium, wherein the method comprises the following steps: acquiring target blink data and eye-use duration of a target object through a first tilt sensor, a second tilt sensor and a forehead electroencephalogram sensor, the target blink data comprising left-eye blink data and right-eye blink data; acquiring head posture data of the target object through a third tilt sensor; performing left-eye fatigue detection based on the left-eye blink data, the eye-use duration and the head posture data to obtain left-eye fatigue information of the target object; performing right-eye fatigue detection based on the right-eye blink data, the eye-use duration and the head posture data to obtain right-eye fatigue information of the target object; and performing eye fatigue analysis based on the left-eye fatigue information and the right-eye fatigue information to obtain eye fatigue information and eye difference information of the target object. The technical solution provided by the application improves the accuracy of eye fatigue monitoring while distinguishing the fatigue degrees of the left and right eyes.

Description

Head wearing equipment, eye fatigue monitoring method and device and storage medium
Technical Field
The specification relates to the technical field of wearable devices, and in particular to a head-worn device, an eye fatigue monitoring method and apparatus, and a storage medium.
Background
In daily life, when people engage in close-range viewing for work or study, such as staring at a computer screen or a book for a long time, their eyes become dry and tired, severe asthenopia may occur, and daily work and study may even be seriously affected. How to monitor eye fatigue effectively has therefore drawn increasing attention.
In existing eye fatigue monitoring methods, a signal collector acquires the myoelectric signals generated by the eyeball movement of a user, and eye feature data of the user are determined from the myoelectric signals, so that the eye fatigue state is determined.
Disclosure of Invention
In view of the above problems in the prior art, an object of the present application is to provide a head-worn device, an eye fatigue monitoring method and apparatus, and a storage medium that can improve the accuracy of eye fatigue monitoring while distinguishing the fatigue degrees of the left and right eyes.
In order to achieve the above purpose, the present application provides the following solutions:
a head-worn device, the device comprising: dress body and nose and hold in the palm the structure, the nose hold in the palm the structure with dress this body coupling, the left and right sides that the nose held in the palm the structure includes first inclination sensor and second inclination sensor respectively, it includes to dress the body: the controller is respectively electrically connected with the first inclination angle sensor, the second inclination angle sensor, the forehead electroencephalogram sensor, the third inclination angle sensor and the communication module.
The application also discloses an eye fatigue monitoring method, implemented based on the above head-worn device, the method comprising:
acquiring target blink data and eye-use duration of a target object through the first tilt sensor, the second tilt sensor and the forehead electroencephalogram sensor, the target blink data comprising left-eye blink data and right-eye blink data;
acquiring head posture data of the target object through the third tilt sensor;
performing left-eye fatigue detection based on the left-eye blink data, the eye-use duration and the head posture data to obtain left-eye fatigue information of the target object;
performing right-eye fatigue detection based on the right-eye blink data, the eye-use duration and the head posture data to obtain right-eye fatigue information of the target object; and
performing eye fatigue analysis based on the left-eye fatigue information and the right-eye fatigue information to obtain eye fatigue information and eye difference information of the target object.
The application also discloses an eye fatigue monitoring apparatus, implemented based on the above head-worn device, the apparatus comprising:
a first acquisition module, configured to acquire target blink data and eye-use duration of a target object through the first tilt sensor, the second tilt sensor and the forehead electroencephalogram sensor, the target blink data comprising left-eye blink data and right-eye blink data;
a second acquisition module, configured to acquire head posture data of the target object through the third tilt sensor;
a left-eye fatigue detection module, configured to perform left-eye fatigue detection based on the left-eye blink data, the eye-use duration and the head posture data to obtain left-eye fatigue information of the target object;
a right-eye fatigue detection module, configured to perform right-eye fatigue detection based on the right-eye blink data, the eye-use duration and the head posture data to obtain right-eye fatigue information of the target object; and
an eye fatigue analysis module, configured to perform eye fatigue analysis based on the left-eye fatigue information and the right-eye fatigue information to obtain eye fatigue information and eye difference information of the target object.
The present application also discloses a computer-readable storage medium, in which at least one instruction or at least one program is stored, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the eye fatigue monitoring method as described above.
The head-worn device, the eye fatigue monitoring method and apparatus, and the storage medium provided by the application have the following technical effects:
In the technical solution, target blink data and eye-use duration of a target object are collected through the first tilt sensor, the second tilt sensor and the forehead electroencephalogram sensor, the target blink data comprising left-eye blink data and right-eye blink data, and head posture data of the target object are collected through the third tilt sensor. On the one hand, left-eye fatigue detection is performed based on the left-eye blink data, the eye-use duration and the head posture data to obtain left-eye fatigue information of the target object, and right-eye fatigue detection is performed based on the right-eye blink data, the eye-use duration and the head posture data to obtain right-eye fatigue information of the target object, so that the fatigue degrees of the left and right eyes are distinguished. On the other hand, eye fatigue analysis is performed based on the left-eye fatigue information and the right-eye fatigue information to obtain the eye fatigue information and eye difference information of the target object, which improves the accuracy of eye fatigue monitoring.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings used in the description of the embodiment or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
FIG. 1 is a schematic diagram of an application environment provided by an embodiment of the present application;
fig. 2 is a block diagram of a head-mounted device according to an embodiment of the present disclosure;
fig. 3 is a block diagram of another head-mounted device according to an embodiment of the present disclosure;
fig. 4 is a block diagram of another head-mounted device according to an embodiment of the present disclosure;
FIG. 5 is a schematic flow chart of a method for monitoring eye fatigue provided by an embodiment of the present application;
fig. 6 is a schematic flowchart of a process for acquiring target blink data and eye duration of a target object through a first tilt sensor, a second tilt sensor and a forehead electroencephalogram sensor according to an embodiment of the present application;
fig. 7 is a schematic flow chart illustrating a process of performing eye use state analysis based on electroencephalogram data, first inclination data, and second inclination data to obtain target blink data within a preset time period according to the embodiment of the present application;
FIG. 8 is a schematic flow chart of a left eye fatigue detection, a right eye fatigue detection and an eye fatigue analysis provided in an embodiment of the present application;
fig. 9 is a schematic flowchart of a training method for a left eye fatigue detection network, a right eye fatigue detection network, and an eye fatigue analysis network according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a network for monitoring eye fatigue, provided by an embodiment of the present application;
FIG. 11 is a schematic flow chart of another method for monitoring eye fatigue provided by an embodiment of the present application;
fig. 12 is an eyestrain monitoring device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
In the several embodiments provided in the present application, the described system embodiments are only illustrative, for example, the division of the above modules is only one logical function division, and there may be other division manners in actual implementation, for example, a plurality of modules or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of modules or units through some interfaces, and may be in an electrical or other form.
Referring to fig. 1, fig. 1 is a schematic diagram of an application environment according to an embodiment of the present application. As shown in fig. 1, the application environment includes a head-mounted device 01 and a mobile terminal 02 of the target object.
Specifically, the head-mounted device 01 may include: a controller, a first tilt sensor, a second tilt sensor, a forehead electroencephalogram sensor, a third tilt sensor, a communication module, a storage module and the like. The head-mounted device 01 may be configured to acquire target blink data and eye-use duration of a target object through the first tilt sensor, the second tilt sensor and the forehead electroencephalogram sensor, the target blink data comprising left-eye blink data and right-eye blink data; acquire head posture data of the target object through the third tilt sensor; perform left-eye fatigue detection based on the left-eye blink data, the eye-use duration and the head posture data to obtain left-eye fatigue information of the target object; perform right-eye fatigue detection based on the right-eye blink data, the eye-use duration and the head posture data to obtain right-eye fatigue information of the target object; perform eye fatigue analysis based on the left-eye fatigue information and the right-eye fatigue information to obtain eye fatigue information and eye difference information of the target object; and push the eye fatigue information and the eye difference information to the mobile terminal 02.
Specifically, the mobile terminal 02 may include physical devices such as a smart phone, a vehicle-mounted terminal, a computer (e.g., a desktop, tablet or notebook computer), a digital assistant, a smart voice interaction device (e.g., a smart speaker) or a smart wearable device, and may also be software running on such physical devices, such as a computer program. Specifically, the mobile terminal 02 is configured to receive the eye fatigue information and the eye difference information pushed by the head-mounted device 01.
In practical application, the head-mounted device 01 and the mobile terminal 02 can perform information interaction, and the head-mounted device 01 pushes the eye fatigue information and the eye difference information to the mobile terminal 02 in real time to realize eye fatigue monitoring.
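To make the end-to-end flow concrete, the following is a minimal toy sketch of the monitoring pipeline described above. All function names, field names, and threshold values are illustrative assumptions for exposition only; the patent leaves the actual detection logic to trained networks and experimentally tuned conditions.

```python
from dataclasses import dataclass


@dataclass
class FatigueInfo:
    level: float  # 0.0 (rested) .. 1.0 (severely fatigued)


def detect_eye_fatigue(blink_freq_hz, eye_use_minutes, head_pitch_deg):
    """Toy per-eye fatigue score combining the three inputs fed into each
    detection step (blink data, eye-use duration, head posture). The
    weights and cut-offs below are purely illustrative."""
    score = 0.0
    if blink_freq_hz < 0.2:                      # fewer than ~12 blinks/min
        score += 0.4
    score += min(eye_use_minutes / 120.0, 0.4)   # saturates after 2 h of use
    if abs(head_pitch_deg) > 30:                 # strongly bowed head posture
        score += 0.2
    return FatigueInfo(level=min(score, 1.0))


def analyze_eyes(left: FatigueInfo, right: FatigueInfo):
    """Overall fatigue plus left/right difference, mirroring the final
    'eye fatigue analysis' step."""
    overall = (left.level + right.level) / 2
    difference = abs(left.level - right.level)
    return overall, difference


left = detect_eye_fatigue(blink_freq_hz=0.15, eye_use_minutes=90, head_pitch_deg=35)
right = detect_eye_fatigue(blink_freq_hz=0.30, eye_use_minutes=90, head_pitch_deg=35)
overall, diff = analyze_eyes(left, right)
```

In a real deployment, `overall` and `diff` would be the values pushed over the communication module to the mobile terminal 02.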
The following describes a head-worn device provided by an embodiment of the present application. As shown in fig. 2, the head-worn device may include: a wearable body 1 and a nose pad structure 2, the nose pad structure 2 being connected with the wearable body 1, and the left and right sides of the nose pad structure 2 respectively comprising a first tilt sensor 3 and a second tilt sensor 4. The wearable body 1 includes: a controller 5, a forehead electroencephalogram sensor 6, a third tilt sensor 7 and a communication module 8, wherein the controller 5 is electrically connected with the first tilt sensor 3, the second tilt sensor 4, the forehead electroencephalogram sensor 6, the third tilt sensor 7 and the communication module 8 respectively.
In a specific embodiment, the nose pad structure may be a flexible nose pad structure, and when the target object wears the head-worn device, the flexible nose pad structure can fit against the two sides of the nose bridge above the canthi of the target object.
In a specific embodiment, the first tilt sensor and the second tilt sensor may be three-axis tilt sensors. Specifically, the first tilt sensor and the second tilt sensor may be respectively configured to acquire tilt data on the two sides of the nose bridge above the canthi of the target object.
In a specific embodiment, the third tilt sensor may comprise a tilt sensor and a gyroscope. Specifically, the third tilt sensor may be used to collect head posture data of the target object.
In a specific embodiment, the forehead electroencephalogram sensor may comprise a forehead single-channel electroencephalogram sensor, which can be used to collect electroencephalogram signal data of the target object.
In a specific embodiment, the communication module may include, but is not limited to, a Bluetooth communication module and a WiFi communication module.
It can be seen from the above embodiment that the first tilt sensor and the second tilt sensor in the nose pad structure can collect tilt data from the two sides of the nose bridge above the canthi of the target object and distinguish the blink data of the left eye from that of the right eye, so that the fatigue of the left eye and the right eye can be monitored separately.
In an alternative embodiment, as shown in fig. 3, the wearable body 1 may further include a vibration module 9, and the controller 5 is electrically connected with the vibration module 9.
In a specific embodiment, the vibration module is used for performing vibration reminding on the target object.
In an alternative embodiment, as shown in fig. 4, the wearable body 1 may further include an illumination sensor 10 and a ranging sensor 11, and the controller 5 is electrically connected with the illumination sensor 10 and the ranging sensor 11 respectively.
In a particular embodiment, the illumination sensor may be used to acquire illumination data of the environment in which the target object is located.
In a specific embodiment, the ranging sensor may comprise an infrared laser ranging sensor. Specifically, the ranging sensor may be used to collect the eye-use distance of the target object.
In an embodiment of the present specification, the wearable body may further include: and the storage module is electrically connected with the controller.
It can be seen from the above embodiments that the illumination sensor and the ranging sensor can collect the illumination data of the environment where the target object is located and the eye-use distance, enabling further eye-use monitoring.
The following describes an eye fatigue monitoring method provided by an embodiment of the present application, implemented based on the above head-worn device; fig. 5 is a schematic flow chart of the method. It is noted that the present specification provides the method steps as described in the embodiments or flowcharts, but more or fewer steps may be included based on routine or non-inventive labor. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution. In an actual system or product, the steps may be executed sequentially or in parallel (e.g., in a parallel-processor or multi-threaded environment) according to the methods shown in the embodiments or figures. Specifically, as shown in fig. 5, the method may include:
s501, acquiring target blink data and eye using duration of a target object through a first inclination angle sensor, a second inclination angle sensor and a forehead electroencephalogram sensor, wherein the target blink data comprise left eye blink data and right eye blink data.
In this specification, the target blink data may be statistical data associated with the blink actions of the target object, and the eye-use duration may be the total blink duration of the target object within a preset statistical time period. Specifically, the target blink data may include left-eye blink data and right-eye blink data, where the left-eye blink data may be statistical data associated with the left-eye blink actions of the target object, and the right-eye blink data may be statistical data associated with the right-eye blink actions of the target object.
In an embodiment of the present specification, as shown in fig. 6, the acquiring target blink data and eye duration of the target object by the first tilt sensor, the second tilt sensor and the forehead electroencephalogram sensor may include:
s601, respectively acquiring first inclination angle data and second inclination angle data of a target object in a preset time period through a first inclination angle sensor and a second inclination angle sensor.
In the embodiment of the present specification, the preset time period may be set in combination with the monitoring accuracy of the eyestrain. Alternatively, the preset time period may be 60 seconds.
In practical application, when the target object blinks, a slight vibration is transmitted to the nose pad structure, and the first tilt sensor and the second tilt sensor in the nose pad structure can capture the resulting angle changes in the three-dimensional (X, Y and Z axis) directions. Specifically, a first initial signal acquired by the first tilt sensor is filtered and amplified to obtain a first amplified signal, and the first amplified signal undergoes A/D (analog-to-digital) conversion to obtain the first tilt data; likewise, a second initial signal acquired by the second tilt sensor is filtered and amplified to obtain a second amplified signal, and the second amplified signal undergoes A/D conversion to obtain the second tilt data.
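As a concrete illustration of the filter, amplify, and A/D conversion chain described above, here is a toy software model of one sensor channel. The moving-average window, gain, ADC resolution, and reference voltage are illustrative assumptions, not values given in the patent.

```python
def condition_signal(raw, gain=4.0, window=3, adc_bits=10, vref=3.3):
    """Toy filter -> amplify -> A/D chain for one tilt-sensor channel.

    All parameters are illustrative; the patent does not specify them.
    """
    # Moving-average low-pass filter to suppress high-frequency noise.
    filtered = []
    for i in range(len(raw)):
        seg = raw[max(0, i - window + 1):i + 1]
        filtered.append(sum(seg) / len(seg))
    # Amplify, clipping at the ADC reference voltage.
    amplified = [min(v * gain, vref) for v in filtered]
    # Quantize to integer ADC codes (the A/D conversion step).
    scale = (2 ** adc_bits - 1) / vref
    return [round(v * scale) for v in amplified]
```

The same chain would be applied independently to the first and second tilt sensors to produce the first and second tilt data.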
Specifically, the first inclination angle data may be three-dimensional inclination angle data on the left side of the nose pad structure, the second inclination angle data may be three-dimensional inclination angle data on the right side of the nose pad structure, where the three-dimensional inclination angle data may include X-axis inclination angle data, Y-axis inclination angle data, and Z-axis inclination angle data, and correspondingly, the first inclination angle data may include: the first X-axis tilt data, the first Y-axis tilt data, and the first Z-axis tilt data, and the second tilt data may include: second X-axis tilt angle data, second Y-axis tilt angle data, and second Z-axis tilt angle data.
And S602, acquiring electroencephalogram signal data of the target object in a preset time period through the forehead electroencephalogram sensor.
Specifically, the electroencephalogram signal data may be amplitude statistics of the forehead electroencephalogram signal, and the amplitude statistics may include an amplitude standard deviation. In practical application, the sampling frequency of the forehead electroencephalogram signal can be set according to the sampling precision; in an optional embodiment, the sampling frequency may be 1 kHz.
S603, taking the duration for which the electroencephalogram signal data exceed the blink signal threshold within the preset time period as the eye-use duration.
Specifically, whether the current electroencephalogram data exceed the blink signal threshold is judged at a preset judgment frequency; when the current electroencephalogram data exceed the blink signal threshold, the state corresponding to the current electroencephalogram data is judged to be a blink state, and the total duration of the blink state within the preset time period is taken as the eye-use duration.
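Assuming the electroencephalogram data are reduced to one amplitude standard deviation per second, the eye-use duration computation of S603 can be sketched as follows (function and variable names are illustrative):

```python
def eye_use_duration(eeg_std_per_second, blink_threshold):
    """Count the seconds within the statistical window whose forehead-EEG
    amplitude standard deviation exceeds the blink-signal threshold;
    that count is the eye-use duration in seconds."""
    return sum(1 for v in eeg_std_per_second if v > blink_threshold)
```

For example, with a 60-second window this returns how many of those seconds were judged to be in the blink state.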
In a specific embodiment, the method for determining the blink signal threshold value may include:
1) acquiring first electroencephalogram data of a target object in an eye closing process and second electroencephalogram data of the target object in an eye blinking process;
specifically, the first electroencephalogram signal data may be an amplitude standard deviation of a forehead electroencephalogram signal in an eye closing process of the target object, and the second electroencephalogram signal data may be an amplitude standard deviation of the forehead electroencephalogram signal in an eye blinking process of the target object. In an optional embodiment, the first electroencephalogram signal data is acquired in the process of closing the eye of the target object for 10 seconds, and the second electroencephalogram signal data is acquired in the process of blinking the eye of the target object for 10 seconds.
2) Determining a blink signal threshold based on the first brain electrical signal data and the second brain electrical signal data.
Specifically, the mean value of the first electroencephalogram signal data and the second electroencephalogram signal data can be calculated, and the mean value is used as the blink signal threshold.
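The calibration procedure above can be sketched as follows; `statistics.pstdev` stands in for whatever amplitude statistic the device firmware actually computes, and the sample lists are hypothetical calibration recordings.

```python
import statistics


def blink_signal_threshold(closed_eye_samples, blinking_samples):
    """Calibrate the blink-signal threshold as the mean of the amplitude
    standard deviations recorded while the target object keeps the eyes
    closed and while the target object blinks."""
    closed_std = statistics.pstdev(closed_eye_samples)   # first EEG data
    blink_std = statistics.pstdev(blinking_samples)      # second EEG data
    return (closed_std + blink_std) / 2
```

In the optional embodiment above, each sample list would come from a 10-second calibration recording at the chosen sampling frequency.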
S604, analyzing the eye using state based on the electroencephalogram signal data, the first inclination angle data and the second inclination angle data to obtain target blink data in a preset time period.
In a specific embodiment, the eye state analysis is performed based on the electroencephalogram data, the first inclination angle data and the second inclination angle data to obtain the blink data for the left eye and the blink data for the right eye, and specifically, the blink data for the left eye may include: a blink frequency for the left eye and a duration of each blink intensity for a plurality of blink intensities for the left eye; the right eye blink data may include: the blink frequency for the right eye and the duration of each blink intensity at various blink intensities for the right eye.
In a specific embodiment, as shown in fig. 7, the performing the eye state analysis based on the electroencephalogram data, the first inclination data, and the second inclination data to obtain the target blink data within the preset time period may include:
s701, traversing the electroencephalogram signal data at a plurality of moments in a preset time period.
Specifically, the plurality of times may be set in combination with the accuracy of monitoring the eye fatigue. In an alternative embodiment, the plurality of time instants may comprise every second within 60 seconds.
S702, under the condition that the current traversed electroencephalogram signal data are larger than the blink signal threshold, carrying out inclination change analysis on the first inclination data and the second inclination data at the corresponding moment of the current traversed electroencephalogram signal data to obtain an inclination change analysis result.
And S703, determining a target blinking state corresponding to the currently traversed electroencephalogram data according to the inclination change analysis result.
In a specific embodiment, in the case that the first tilt angle data includes first X-axis tilt angle data, first Y-axis tilt angle data, and first Z-axis tilt angle data, and the second tilt angle data includes second X-axis tilt angle data, second Y-axis tilt angle data, and second Z-axis tilt angle data, performing tilt angle change analysis on the first tilt angle data and the second tilt angle data at the time instant corresponding to the currently traversed electroencephalogram signal data to obtain a tilt angle change analysis result may include: comparing the first X-axis tilt angle data with the second X-axis tilt angle data at that time instant to obtain a first comparison result; comparing the first Y-axis tilt angle data with the second Y-axis tilt angle data at that time instant to obtain a second comparison result; comparing the first Z-axis tilt angle data with the second Z-axis tilt angle data at that time instant to obtain a third comparison result; and obtaining the tilt angle change analysis result based on the first comparison result, the second comparison result, and the third comparison result.
Accordingly, the target blink state may include a left-eye blink or a right-eye blink, and determining the target blink state corresponding to the currently traversed electroencephalogram signal data according to the tilt angle change analysis result may include: in the case that the tilt angle change analysis result satisfies a first angle change condition, determining that the target blink state corresponding to the currently traversed electroencephalogram signal data is a left-eye blink; and in the case that the tilt angle change analysis result satisfies a second angle change condition, determining that the target blink state is a right-eye blink. In practical applications, the first angle change condition and the second angle change condition can be set in combination with a large amount of experimental data.
S704, adding the currently traversed electroencephalogram data into the first data group under the condition that the target blinking state is the blinking of the left eye.
S705, under the condition that the target blinking state is right eye blinking, adding the currently traversed electroencephalogram data into a second data group.
And S706, after traversing the electroencephalogram signal data at multiple moments in the preset time period, performing statistical analysis on the electroencephalogram signal data in the first data group to obtain the blink frequency of the left eye and the first duration time corresponding to the multiple blink intensities of the left eye.
And S707, performing statistical analysis on the electroencephalogram signal data in the second data group to obtain the blink frequency of the right eye and the second duration time corresponding to the plurality of blink intensities of the right eye.
S708, the blink frequency and the first duration of the left eye are used as the blink data of the left eye.
And S709, taking the blinking frequency and the second duration of the right eye as right eye blinking data.
Specifically, the number of blinks of the left eye and the number of blinks of the right eye in the target blink state corresponding to the electroencephalogram data at multiple moments in the preset time period are respectively counted, and then the blink frequency of the left eye and the blink frequency of the right eye are obtained.
Specifically, multiple intensity thresholds can be set between the blink signal threshold and the second electroencephalogram signal data, and in practical application, the numerical values of the multiple intensity thresholds can be set in combination with the blink intensity distinguishing precision. In an alternative embodiment, the plurality of intensity thresholds may include a first intensity threshold and a second intensity threshold, where the first intensity threshold is smaller than the second intensity threshold, and when the electroencephalogram data is greater than the blink signal threshold and is less than or equal to the first intensity threshold, the target blink state corresponding to the electroencephalogram data is considered as a low-intensity blink, when the electroencephalogram data is greater than the first intensity threshold and is less than or equal to the second intensity threshold, the target blink state corresponding to the electroencephalogram data is considered as a medium-intensity blink, and when the electroencephalogram data is greater than the second intensity threshold and is less than or equal to the second electroencephalogram data, the target blink state corresponding to the electroencephalogram data is considered as a high-intensity blink;
Correspondingly, a first time length during which the target blink state corresponding to the electroencephalogram signal data within the preset time period is a left-eye blink is counted, and the duration of low-intensity left-eye blinking, the duration of medium-intensity left-eye blinking, and the duration of high-intensity left-eye blinking within the first time length are further determined; a second time length during which the target blink state corresponding to the electroencephalogram signal data within the preset time period is a right-eye blink is likewise counted, and the duration of low-intensity right-eye blinking, the duration of medium-intensity right-eye blinking, and the duration of high-intensity right-eye blinking within the second time length are further determined.
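The traversal, grouping, and statistics of steps S701 to S709 can be sketched in pure Python. This is a minimal illustration rather than the patented implementation: the threshold values, the angle-change test inside `blink_state`, and the per-second sample layout are all assumptions made for the example.

```python
# Hypothetical thresholds (the patent leaves these to experimental tuning).
BLINK_THRESHOLD = 30.0   # blink signal threshold for the EEG data
FIRST_INTENSITY = 60.0   # first intensity threshold
SECOND_INTENSITY = 90.0  # second intensity threshold

def blink_state(delta_left, delta_right):
    """Classify a blink from the tilt angle changes of the two sensors.

    delta_left / delta_right are (X, Y, Z) angle differences between the
    first and second tilt angle data; the comparison below is a placeholder
    for the first/second angle change conditions of S703.
    """
    if sum(abs(d) for d in delta_left) > sum(abs(d) for d in delta_right):
        return "left"
    return "right"

def analyze(samples):
    """samples: list of (eeg_value, delta_left, delta_right), one per second."""
    groups = {"left": [], "right": []}
    for eeg, d_l, d_r in samples:
        if eeg > BLINK_THRESHOLD:                      # S702: blink samples only
            groups[blink_state(d_l, d_r)].append(eeg)  # S704 / S705
    result = {}
    for eye, values in groups.items():                 # S706 - S709
        low = sum(1 for v in values if v <= FIRST_INTENSITY)
        mid = sum(1 for v in values if FIRST_INTENSITY < v <= SECOND_INTENSITY)
        high = sum(1 for v in values if v > SECOND_INTENSITY)
        result[eye] = {"blinks": len(values),
                       "durations": (low, mid, high)}  # seconds at 1 Hz sampling
    return result
```

At a 1 Hz sampling rate the per-intensity counts double as the duration statistics, which is why the sketch simply counts samples in each band.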
As can be seen from the above embodiments, the first tilt angle data and the second tilt angle data of the target object are respectively acquired by the first tilt angle sensor and the second tilt angle sensor, and tilt angle change analysis is performed on the first tilt angle data and the second tilt angle data to obtain a tilt angle change analysis result; and based on the inclination angle change analysis result and the electroencephalogram signal data, the blink data of the left eye and the right eye are distinguished, so that the fatigue detection of the left eye and the right eye can be carried out subsequently.
And S502, acquiring head posture data of the target object through a third tilt sensor.
In a specific embodiment, the acquiring the head pose data of the target object through the third tilt sensor may include: acquiring head inclination angle data of a target object through a third inclination angle sensor; and analyzing the head posture of the head inclination angle data to obtain head posture data.
Specifically, the head tilt angle data may be the angle of the current head position relative to the upright head position, and the head posture data may include a level-head state, a head-down state, and a head-up state.
In a specific embodiment, the head tilt angle data corresponding to the upright head position is 0°, and the head posture data at this time is the level-head state; when the head tilt angle data is greater than 0°, the head posture data is the head-down state; and when the head tilt angle data is less than 0°, the head posture data is the head-up state.
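The 0° convention above maps directly to a small classifier; this sketch uses the exact comparisons described in the embodiment:

```python
def head_posture(tilt_deg: float) -> str:
    """Map head tilt angle (relative to the upright calibration position)
    to a posture label, per the convention described above."""
    if tilt_deg > 0:
        return "head-down"
    if tilt_deg < 0:
        return "head-up"
    return "level"
```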
S503, performing left eye fatigue detection based on the left eye blinking data, the eye using duration and the head posture data to obtain left eye fatigue information of the target object.
S504, performing right eye fatigue detection based on the right eye blinking data, the eye using duration and the head posture data to obtain right eye fatigue information of the target object.
And S505, performing eye fatigue analysis based on the left eye fatigue information and the right eye fatigue information to obtain eye fatigue information and eye difference information of the target object.
In this embodiment, the left eye fatigue information may represent the fatigue state of the left eye, the right eye fatigue information may represent the fatigue state of the right eye, the eye fatigue information may represent the fatigue state of both eyes, and the eye difference information may represent the difference in fatigue between the left eye and the right eye.
In a specific embodiment, the left eye fatigue information may include a left eye fatigue value obtained by quantizing the degree of left eye fatigue according to a certain rule, and the right eye fatigue information may include a right eye fatigue value obtained in the same way; the eye fatigue information may include an eye fatigue value obtained by quantizing the degree of binocular fatigue, and the eye difference information may include the difference between the left eye fatigue value and the right eye fatigue value.
In a particular embodiment, a fatigue level corresponding to the eye fatigue value may be determined based on a comparison of the eye fatigue value to an eye fatigue threshold. Optionally, the eye fatigue threshold may include a first fatigue threshold and a second fatigue threshold, wherein the first fatigue threshold is less than the second fatigue threshold. Specifically, when the eye fatigue value is smaller than the first fatigue threshold value, the fatigue grade corresponding to the eye fatigue value is light eye fatigue; when the eye fatigue value is greater than or equal to the first fatigue threshold value and smaller than the second fatigue threshold value, the fatigue grade corresponding to the eye fatigue value is moderate eye fatigue; when the eye fatigue value is equal to or greater than the second fatigue threshold value, the fatigue level corresponding to the eye fatigue value is high eye fatigue.
In practical applications, the first fatigue threshold and the second fatigue threshold may be preset in combination with the accuracy of the eye fatigue monitoring and the range of eye fatigue values.
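The two-threshold grading just described can be written as a short lookup; the default threshold values here are illustrative placeholders, since the patent leaves them to be preset per application:

```python
def fatigue_level(value: float,
                  first_threshold: float = 40.0,
                  second_threshold: float = 70.0) -> str:
    """Grade an eye fatigue value: below the first threshold is light,
    between the thresholds is moderate, at or above the second is high."""
    if value < first_threshold:
        return "light"
    if value < second_threshold:
        return "moderate"
    return "high"
```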
In a specific embodiment, as shown in fig. 8, the performing left eye fatigue detection based on the left eye blinking data, the eye duration and the head posture data to obtain the left eye fatigue information of the target object may include:
and S801, inputting the blink data of the left eye, the eye using duration and the head posture data into a left eye fatigue detection network to perform left eye fatigue detection, and obtaining left eye fatigue information.
The obtaining of the right eye fatigue information of the target object based on the right eye blinking data, the eye duration and the head posture data may include:
s802, inputting the right eye blinking data, the eye using duration and the head posture data into a right eye fatigue detection network to carry out right eye fatigue detection, and obtaining right eye fatigue information.
The performing eye fatigue analysis based on the left eye fatigue information and the right eye fatigue information to obtain the eye fatigue information and the eye difference information of the target object may include:
and S803, inputting the left eye fatigue information and the right eye fatigue information into an eye fatigue analysis network for eye fatigue analysis, and obtaining eye fatigue information and eye difference information.
Specifically, the left eye fatigue detection network may include, but is not limited to, a GRU (gated recurrent unit network), an RNN (recurrent neural network), or an LSTM (long short-term memory network).
Specifically, the right eye fatigue detection network may include, but is not limited to: GRU, RNN, or LSTM.
Specifically, the eye fatigue analysis network may include, but is not limited to: a GRU, an SVM (support vector machine), a BP network (back-propagation neural network), or K-means (the K-means clustering algorithm).
As can be seen from the above embodiments, on one hand, the left eye blinking data, the eye using duration, and the head posture data are input into the left eye fatigue detection network for left eye fatigue detection, and the right eye blinking data, the eye using duration, and the head posture data are input into the right eye fatigue detection network for right eye fatigue detection, so that the fatigue information of the left eye and the right eye can be distinguished, and on the other hand, the left eye fatigue information and the right eye fatigue information are input into the eye fatigue analysis network for eye fatigue analysis, so that the accuracy of eye fatigue analysis can be improved.
In this embodiment, a preset left eye fatigue detection network, a preset right eye fatigue detection network, and a preset eye fatigue analysis network may be jointly trained through sample eye data of a sample object to obtain the left eye fatigue detection network, the right eye fatigue detection network, and the eye fatigue analysis network.
In a specific embodiment, the above-mentioned head-worn device or a similar device may be used to collect raw eye data of the sample object, and the sample eye data is obtained after preprocessing and feature extraction are performed on the raw eye data. Specifically, the preprocessing method may include resampling, filtering, normalization, and the like, and the feature extraction method may include, but is not limited to, feature extraction based on a convolutional neural network and feature extraction similar to steps S601 to S604 and steps S701 to S709.
In a specific embodiment, as shown in fig. 9, the training method for the left eye fatigue detection network, the right eye fatigue detection network, and the eye fatigue analysis network may include:
s901, sample blink data, sample eye using duration and sample head posture data of the sample object are obtained, wherein the sample blink data comprises sample left eye blink data and sample right eye blink data.
S902, acquiring the eye fatigue information for labeling and the eye difference information for labeling corresponding to the sample object.
In practical application, before network training, training data may be determined, specifically, in the embodiment of the present application, sample eye data of a sample object including eye fatigue information for labeling and eye difference information for labeling may be obtained as the training data, where the sample eye data may include sample blink data, sample eye duration, and sample head posture data.
Specifically, the eye fatigue information for labeling may be preset eye fatigue information pre-labeled for the sample eye data, and the eye difference information for labeling may be preset eye difference information pre-labeled for the sample eye data.
And S903, inputting the blink data of the left eye of the sample, the eye using time length of the sample and the head posture data of the sample into a preset left eye fatigue detection network for left eye fatigue detection to obtain sample left eye fatigue information of the sample object.
And S904, inputting the sample right eye blinking data, the sample eye using time length and the sample head posture data into a preset right eye fatigue detection network for right eye fatigue detection, and obtaining sample right eye fatigue information of the sample object.
S905, inputting the sample left eye fatigue information and the sample right eye fatigue information into a preset eye fatigue analysis network for eye fatigue analysis, and obtaining sample eye fatigue information and sample eye difference information of the sample object.
And S906, determining target loss information based on the sample eye fatigue information, the sample eye difference information, the labeling eye fatigue information and the labeling eye difference information.
And S907, training a preset left eye fatigue detection network, a preset right eye fatigue detection network and a preset eye fatigue analysis network based on the target loss information to obtain the left eye fatigue detection network, the right eye fatigue detection network and the eye fatigue analysis network.
In an alternative embodiment, the target loss information may include eye fatigue information loss and eye difference information loss;
accordingly, the determining the target loss information based on the sample eye fatigue information and the sample eye difference information, and the labeling eye fatigue information and the labeling eye difference information may include:
determining the eye fatigue information loss according to the eye fatigue information for labeling and the sample eye fatigue information; and determining the eye difference information loss according to the eye difference information for labeling and the sample eye difference information.
In a specific embodiment, the determining of the eye fatigue information loss according to the eye fatigue information for labeling and the eye fatigue information for samples may include determining the eye fatigue information loss between the eye fatigue information for labeling and the eye fatigue information for samples based on a first preset loss function; the determining the loss of the eye difference information according to the eye difference information for labeling and the eye difference information for samples may include: and determining eye use difference information loss between the labeling eye use difference information and the sample eye use difference information based on a second preset loss function. Optionally, the first preset loss function and the second preset loss function may be the same loss function or different loss functions.
In a particular embodiment, the loss of eye strain information may characterize a difference between the annotation eye strain information and the sample eye strain information, and the loss of eye difference information may characterize a difference between the annotation eye difference information and the sample eye difference information.
In a specific embodiment, the first preset loss function may include, but is not limited to, a cross-entropy loss function, a logistic loss function, an exponential loss function, and the like, and the second preset loss function may likewise include, but is not limited to, a cross-entropy loss function, a logistic loss function, an exponential loss function, and the like.
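Under the assumption that the fatigue values are scalars, the target loss of S906 can be illustrated with a squared-error form; this is one admissible choice, not the patent's prescribed loss, which equally permits cross-entropy, logistic, or exponential losses:

```python
def squared_error(label: float, prediction: float) -> float:
    """One simple instance of a preset loss function."""
    return (label - prediction) ** 2

def target_loss(label_fatigue: float, sample_fatigue: float,
                label_diff: float, sample_diff: float) -> float:
    """Target loss = eye fatigue information loss + eye difference
    information loss, each measuring label-vs-sample disagreement."""
    fatigue_loss = squared_error(label_fatigue, sample_fatigue)
    diff_loss = squared_error(label_diff, sample_diff)
    return fatigue_loss + diff_loss
```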
In an optional embodiment, training the preset left eye fatigue detection network, the preset right eye fatigue detection network and the preset eye fatigue analysis network based on the target loss information to obtain the left eye fatigue detection network, the right eye fatigue detection network and the eye fatigue analysis network may include: updating the network parameters of the preset left eye fatigue detection network, the preset right eye fatigue detection network and the preset eye fatigue analysis network based on the target loss information; repeating, based on the updated networks, the training iteration from step S903 through the updating of the network parameters based on the target loss information, until an eye fatigue monitoring convergence condition is reached; and taking the preset left eye fatigue detection network, the preset right eye fatigue detection network and the preset eye fatigue analysis network obtained when the convergence condition is reached as the left eye fatigue detection network, the right eye fatigue detection network and the eye fatigue analysis network, respectively.
In an alternative embodiment, the eye fatigue monitoring convergence condition may be that the number of training iterations reaches a preset number of iterations. Optionally, the convergence condition may instead be that the target loss information is less than a specified threshold. In the embodiments of this specification, the preset number of iterations and the specified threshold may be set in combination with the training speed and accuracy required in practical applications.
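The iterate-until-converged loop with the two stopping criteria above can be sketched with a toy one-parameter model; the update rule, learning rate, and thresholds are arbitrary illustrative values standing in for the real network update of S903 to S907:

```python
def train(step_fn, loss_fn, param, max_iters=1000, loss_threshold=1e-6):
    """Repeat update steps until either convergence condition is met:
    the target loss falls below loss_threshold, or the iteration count
    reaches max_iters."""
    for i in range(1, max_iters + 1):
        param = step_fn(param)              # one parameter update
        if loss_fn(param) < loss_threshold: # loss-based criterion
            return param, i
    return param, max_iters                 # iteration-count criterion

# Toy example: gradient descent on loss(p) = (p - 3)^2.
loss = lambda p: (p - 3.0) ** 2
step = lambda p: p - 0.1 * 2.0 * (p - 3.0)  # p := p - lr * dloss/dp
```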
In a specific embodiment, as shown in fig. 10, an eye fatigue monitoring network including the left eye fatigue detection network, the right eye fatigue detection network and the eye fatigue analysis network is established, and the target blink data, the eye duration and the head posture data of the target object are input into the eye fatigue monitoring network for eye fatigue monitoring, so as to obtain eye fatigue information and eye difference information of the target object.
As can be seen from the above embodiments, on one hand, the eyestrain monitoring network with high generalization capability is obtained based on the machine learning training of the sample blink data of the sample object and the corresponding eyestrain information for labeling and eye difference information for labeling; on the other hand, the accuracy of monitoring the eye fatigue by the network can be better improved while the training efficiency is improved by performing combined training on the preset left eye fatigue detection network, the preset right eye fatigue detection network and the preset eye fatigue analysis network.
In an optional embodiment, the method may further include: and controlling the vibration module to carry out vibration reminding according to the vibration mode corresponding to the eye fatigue information.
Specifically, the vibration pattern includes a target vibration frequency and a target vibration duration.
In an alternative embodiment, in the case that the eye fatigue information is light eye fatigue, the target vibration frequency may be 2Hz, and the target vibration duration may be 1 second; under the condition that the eye fatigue information is moderate eye fatigue, the target vibration frequency can be 5Hz, and the target vibration duration can be 2 seconds; in the case that the eye fatigue information is high eye fatigue, the target vibration frequency may be 10Hz, and the target vibration duration may include a single vibration duration and an interval vibration duration, and optionally, the single vibration duration may be 2 seconds, and the interval vibration duration may be 5 seconds.
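The level-to-pattern mapping just listed can be held in a simple lookup table; the tuple layout (frequency, single vibration duration, interval duration) is an assumption made for this sketch:

```python
# (frequency_hz, single_vibration_s, interval_s or None for continuous)
VIBRATION_MODES = {
    "light":    (2, 1, None),
    "moderate": (5, 2, None),
    "high":     (10, 2, 5),  # 2 s bursts separated by 5 s pauses
}

def vibration_mode(fatigue_level: str):
    """Look up the vibration pattern for a fatigue level label."""
    return VIBRATION_MODES[fatigue_level]
```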
It can be seen from the above embodiment that, by controlling the vibration module to issue the vibration reminder corresponding to the eye fatigue information, the target object is reminded to pay attention to eye health, excessive fatigue is avoided, and the user experience is improved.
In an alternative embodiment, as shown in fig. 11, the method may further include:
s506, acquiring illumination data of the environment where the target object is located through the illumination sensor.
In particular, the lighting data may include, but is not limited to: illumination, stroboscopic, color rendering, color temperature.
And S507, acquiring the eye distance of the target object through a distance measuring sensor.
Specifically, the eye distance may be the distance from the eyes of the target object, along the gaze direction, to the occluding object in front.
And S508, carrying out angle calibration based on the head inclination angle data and the preset calibration inclination angle data to obtain the eye using angle of the target object.
Specifically, the eye angle here is the angle between the eye gaze direction and the spatial vertical plane.
Specifically, the angle difference between the head tilt angle data and the preset calibration tilt angle data is used as the eye using angle of the target object.
In practical applications, when the target object wears the head-worn device for the first time, the collected data of the head-worn device can be initialized: when the target object assumes a standard sitting posture, the head tilt angle data at that moment is collected through the third tilt sensor and taken as the preset calibration tilt angle data.
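S508's calibration then reduces to a subtraction against the stored calibration tilt angle, as described above:

```python
def eye_angle(head_tilt_deg: float, calibration_tilt_deg: float) -> float:
    """Eye angle = angle difference between the current head tilt angle
    and the calibration tilt angle recorded in the standard sitting posture."""
    return head_tilt_deg - calibration_tilt_deg
```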
S509, acquiring historical eye fatigue information of the target object.
Specifically, the historical eye fatigue information may be eye fatigue information of the target object at the historical time.
In an alternative embodiment, the historical eyestrain information of the target subject may be obtained from a memory module of the head-worn device.
In another optional embodiment, historical eye fatigue information of the target object can be obtained from the cloud.
S510, fatigue change information of the target object is generated based on the historical eye fatigue information and the eye fatigue information.
Specifically, the fatigue change information may represent a change in eye fatigue of the target object.
In a specific embodiment, in the case where the historical eye fatigue information includes the historical eye fatigue value a and the eye fatigue information includes the eye fatigue value b, the fatigue change information is (a-b)/a × 100%.
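The fatigue change formula of S510, (a − b)/a × 100% with a the historical eye fatigue value and b the current one, can be written directly:

```python
def fatigue_change_percent(historical: float, current: float) -> float:
    """Percentage change of eye fatigue relative to the historical value,
    per the formula (a - b) / a * 100%. Positive means fatigue decreased."""
    return (historical - current) / historical * 100.0
```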
And S511, generating eye fatigue monitoring information based on the illumination data, the eye using distance, the eye using angle, the eye using time length and the fatigue change information.
And S512, pushing the eye fatigue monitoring information to the mobile terminal corresponding to the target object through the communication module.
In this embodiment, the mobile terminal corresponding to the target object may include, but is not limited to: smart phones, vehicle terminals, computers (e.g., desktop computers, tablet computers, and notebook computers), digital assistants, smart voice interaction devices (e.g., smart speakers), and smart wearable devices, and may also be software running in the physical devices, such as computer programs and APPs (application programs).
As can be seen from the foregoing embodiments of the present application, according to the technical solutions provided by the embodiments of the present application: first, the head posture data of the target object is collected by the third tilt sensor, the first tilt angle data and the second tilt angle data of the target object are respectively collected by the first tilt sensor and the second tilt sensor, and tilt angle change analysis is performed on the first tilt angle data and the second tilt angle data to obtain a tilt angle change analysis result; based on the tilt angle change analysis result and the electroencephalogram signal data, the blink data of the left eye and the right eye are distinguished, so that fatigue detection can subsequently be performed for each eye separately; second, the left eye blinking data, the eye duration and the head posture data are input into the left eye fatigue detection network for left eye fatigue detection, and the right eye blinking data, the eye duration and the head posture data are input into the right eye fatigue detection network for right eye fatigue detection, so that the fatigue information of the left eye and the right eye can be distinguished; third, jointly training the preset left eye fatigue detection network, the preset right eye fatigue detection network and the preset eye fatigue analysis network improves training efficiency while further improving the accuracy of eye fatigue monitoring by the networks; finally, by controlling the vibration module to issue the vibration reminder corresponding to the eye fatigue information, the target object is reminded to pay attention to eye health, excessive fatigue is avoided, and the user experience is improved.
An embodiment of the present application provides an eyestrain monitoring device, where the above-mentioned device is implemented based on the above-mentioned head-worn apparatus, as shown in fig. 12, the above-mentioned device includes:
the first acquisition module 1210 is configured to acquire target blink data and eye duration of a target object through a first tilt sensor, a second tilt sensor and a forehead electroencephalogram sensor, where the target blink data includes left eye blink data and right eye blink data;
the second acquisition module 1220 is configured to acquire the head posture data of the target object through the third tilt sensor;
the left eye fatigue detection module 1230 is configured to perform left eye fatigue detection based on the left eye blinking data, the eye using duration and the head posture data to obtain left eye fatigue information of the target object;
the right eye fatigue detection module 1240 is used for performing right eye fatigue detection based on the right eye blinking data, the eye duration and the head posture data to obtain the right eye fatigue information of the target object;
and an eyestrain analyzing module 1250 configured to perform eyestrain analysis based on the left-eye eyestrain information and the right-eye eyestrain information to obtain eyestrain information and eye difference information of the target object.
In an embodiment of the present disclosure, the first collecting module 1210 may include:
the device comprises an inclination angle acquisition unit, a first inclination angle sensor and a second inclination angle sensor, wherein the inclination angle acquisition unit is used for respectively acquiring first inclination angle data and second inclination angle data of a target object in a preset time period through the first inclination angle sensor and the second inclination angle sensor;
the electroencephalogram signal data acquisition unit is used for acquiring electroencephalogram signal data of the target object within a preset time period through the forehead electroencephalogram sensor;
the eye duration unit is used for taking the time length of the electroencephalogram signal data larger than the threshold value of the blink signal in the preset time period as the eye duration;
and the eye state analysis unit is used for carrying out eye state analysis based on the electroencephalogram signal data, the first inclination angle data and the second inclination angle data to obtain target blink data in a preset time period.
In a specific embodiment, the eye state analyzing unit may include:
the traversing unit is used for traversing the electroencephalogram data at a plurality of moments in a preset time period;
the inclination angle change analysis unit is used for carrying out inclination angle change analysis on the first inclination angle data and the second inclination angle data at the corresponding moment of the currently traversed electroencephalogram signal data under the condition that the currently traversed electroencephalogram signal data are larger than the blink signal threshold value to obtain an inclination angle change analysis result;
the target blink state determining unit is used for determining a target blink state corresponding to the currently traversed electroencephalogram data according to the inclination angle change analysis result;
the first data group unit is used for adding the currently traversed electroencephalogram data into the first data group under the condition that the target blinking state is the blinking of the left eye;
the second data group unit is used for adding the currently traversed electroencephalogram data into a second data group under the condition that the target blinking state is right eye blinking;
the first statistical analysis unit is used for performing statistical analysis on the electroencephalogram data in the first data group after the electroencephalogram data at a plurality of moments in a preset time period are traversed to obtain the blink frequency of the left eye and the first duration time corresponding to a plurality of blink intensities of the left eye;
the second statistical analysis unit is used for performing statistical analysis on the electroencephalogram signal data in the second data group to obtain the blink frequency of the right eye and second duration time corresponding to the plurality of blink intensities of the right eye;
a left-eye blink data unit for taking the blink frequency and the first duration of the left eye as left-eye blink data;
and a right eye blinking data unit for regarding the blinking frequency and the second duration of the right eye as right eye blinking data.
In a specific embodiment, the second acquiring module 1220 may include:
the head inclination angle data acquisition unit is used for acquiring head inclination angle data of the target object through the third inclination angle sensor;
and the head posture analysis unit is used for carrying out head posture analysis on the head inclination angle data to obtain head posture data.
In a specific embodiment, the left eye fatigue detection module 1230 may include:
the left eye fatigue detection network unit is used for inputting the blink data of the left eye, the eye using duration and the head posture data into the left eye fatigue detection network to carry out left eye fatigue detection so as to obtain left eye fatigue information;
the right eye fatigue detection module 1240 may include:
the right eye fatigue detection network unit is used for inputting the right eye blinking data, the eye using duration and the head posture data into the right eye fatigue detection network to carry out right eye fatigue detection so as to obtain right eye fatigue information;
the eyestrain analysis module 1250 may include:
and the eye fatigue analysis network unit is used for inputting the left eye fatigue information and the right eye fatigue information into the eye fatigue analysis network to carry out eye fatigue analysis so as to obtain the eye fatigue information and the eye difference information.
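The three-network pipeline described above can be sketched as follows. The patent does not specify the network architectures, so the "networks" here are hypothetical stand-in functions; the weights and the averaging/difference rule are illustrative assumptions, shown only to make the data flow between the per-eye detection stage and the analysis stage concrete.

```python
def left_eye_fatigue_net(blink_data, eye_duration_h, posture):
    """Stand-in for the left eye fatigue detection network: combines blink
    frequency, eye using duration (hours) and head posture into a score."""
    freq, _durations = blink_data
    score = 0.5 * freq + 0.3 * eye_duration_h
    if posture == "head_down":
        score += 0.2  # assumed penalty for a sustained head-down posture
    return score

def right_eye_fatigue_net(blink_data, eye_duration_h, posture):
    """Stand-in for the right eye fatigue detection network (same form here)."""
    return left_eye_fatigue_net(blink_data, eye_duration_h, posture)

def eye_fatigue_analysis_net(left_info, right_info):
    """Stand-in for the eye fatigue analysis network: fuses the per-eye
    fatigue information into overall eye fatigue and eye difference info."""
    fatigue = (left_info + right_info) / 2    # overall eye fatigue information
    difference = abs(left_info - right_info)  # eye difference information
    return fatigue, difference
```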
In a specific embodiment, the apparatus may further include:
the sample data acquisition module is used for acquiring sample blink data, sample eye using duration and sample head posture data of a sample object, wherein the sample blink data comprises sample left eye blink data and sample right eye blink data;
the marking data acquisition module is used for acquiring marking eye fatigue information and marking eye difference information corresponding to the sample object;
the preset left eye fatigue detection network module is used for inputting the blink data of the left eye of the sample, the eye using time length of the sample and the head posture data of the sample into a preset left eye fatigue detection network to carry out left eye fatigue detection so as to obtain the left eye fatigue information of the sample object;
the preset right eye fatigue detection network module is used for inputting the sample right eye blinking data, the sample eye using time length and the sample head posture data into a preset right eye fatigue detection network to carry out right eye fatigue detection so as to obtain sample right eye fatigue information of the sample object;
the preset eye fatigue analysis network module is used for inputting the sample left eye fatigue information and the sample right eye fatigue information into a preset eye fatigue analysis network to carry out eye fatigue analysis so as to obtain sample eye fatigue information and sample eye difference information of the sample object;
the target loss information determining module is used for determining target loss information based on the sample eye fatigue information, the sample eye difference information, the labeling eye fatigue information and the labeling eye difference information;
and the training module is used for training a preset left eye fatigue detection network, a preset right eye fatigue detection network and a preset eye fatigue analysis network based on the target loss information to obtain the left eye fatigue detection network, the right eye fatigue detection network and the eye fatigue analysis network.
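The target loss used for the joint training above can be sketched as follows. The patent only states that the loss is determined from the sample and labeled fatigue information and eye difference information, so a weighted sum of squared errors is assumed here; the weights are hypothetical.

```python
def target_loss(sample_fatigue, sample_difference,
                labeled_fatigue, labeled_difference,
                w_fatigue=1.0, w_difference=1.0):
    """Assumed combined objective for jointly training the preset left eye
    fatigue detection network, preset right eye fatigue detection network
    and preset eye fatigue analysis network: squared error on the fatigue
    information plus squared error on the eye difference information."""
    loss_f = (sample_fatigue - labeled_fatigue) ** 2
    loss_d = (sample_difference - labeled_difference) ** 2
    return w_fatigue * loss_f + w_difference * loss_d
```

Because a single scalar loss flows back through all three networks, one backward pass updates them together, which is what allows the joint training to improve efficiency relative to training each network separately.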
In an optional embodiment, the apparatus may further include:
and the vibration reminding module is used for controlling the vibration module to carry out vibration reminding according to the vibration mode corresponding to the eye fatigue information.
In an optional embodiment, the apparatus may further include:
the illumination data acquisition module is used for acquiring illumination data of the environment where the target object is located through the illumination sensor;
the eye distance acquisition module is used for acquiring the eye distance of the target object through the distance measurement sensor;
the eye angle calibration module is used for carrying out angle calibration based on the head inclination angle data and preset calibration inclination angle data to obtain an eye angle of the target object;
the historical eye fatigue information acquisition module is used for acquiring the historical eye fatigue information of the target object;
the fatigue change information generation module is used for generating fatigue change information of the target object based on the historical eye fatigue information and the eye fatigue information;
the eye fatigue monitoring information generating module is used for generating eye fatigue monitoring information based on illumination data, eye distance, eye angle, eye duration and fatigue change information;
and the information pushing module is used for pushing the eye fatigue monitoring information to the mobile terminal corresponding to the target object through the communication module.
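Assembling the eye fatigue monitoring information from the modules above can be sketched as follows. The subtraction-based angle calibration, the fatigue-change rule and the field names are illustrative assumptions; the patent leaves the exact calibration and message format open.

```python
def eye_angle(head_tilt_deg, calibration_tilt_deg):
    """Angle calibration: eye angle relative to the preset calibration posture
    (assumed here to be a simple offset subtraction)."""
    return head_tilt_deg - calibration_tilt_deg

def monitoring_info(illumination_lux, eye_distance_cm, head_tilt_deg,
                    calibration_tilt_deg, eye_duration_min,
                    historical_fatigue, current_fatigue):
    """Combine illumination data, eye use distance, calibrated eye angle,
    eye using duration and fatigue change into one monitoring record that
    the communication module could push to the mobile terminal."""
    fatigue_change = current_fatigue - historical_fatigue
    return {
        "illumination_lux": illumination_lux,
        "eye_distance_cm": eye_distance_cm,
        "eye_angle_deg": eye_angle(head_tilt_deg, calibration_tilt_deg),
        "eye_duration_min": eye_duration_min,
        "fatigue_change": fatigue_change,
    }
```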
The device embodiments and the method embodiments described above are based on the same inventive concept, and details not repeated here can be found in the corresponding method embodiments.
The present application further provides a storage medium, which may be disposed in a head-worn device to store at least one instruction or at least one program for implementing the eye fatigue monitoring method of the method embodiments; the at least one instruction or the at least one program is loaded and executed by a processor to implement the eye fatigue monitoring method provided by the method embodiments.
Optionally, in this embodiment, the storage medium may include, but is not limited to: Read-Only Memory (ROM), Random Access Memory (RAM), and the like, which can store program codes.
As can be seen from the above embodiments of the head-worn device, the eye fatigue monitoring method and device, and the storage medium provided by the present application, the technical solutions of the embodiments provide the following benefits. First, head posture data of the target object is collected by the third tilt sensor, first tilt data and second tilt data of the target object are collected by the first tilt sensor and the second tilt sensor respectively, and tilt change analysis is performed on the first tilt data and the second tilt data to obtain a tilt change analysis result; based on the tilt change analysis result and the electroencephalogram signal data, left eye blink data and right eye blink data are distinguished, which enables separate fatigue detection for the left eye and the right eye. Second, the left eye blink data, the eye using duration and the head posture data are input into the left eye fatigue detection network for left eye fatigue detection, and the right eye blink data, the eye using duration and the head posture data are input into the right eye fatigue detection network for right eye fatigue detection, so that the fatigue information of the left eye and the right eye can be distinguished. Third, jointly training the preset left eye fatigue detection network, the preset right eye fatigue detection network and the preset eye fatigue analysis network improves training efficiency while further improving the accuracy of the networks in monitoring eye fatigue. Finally, vibration reminding corresponding to the eye fatigue information is performed by controlling the vibration module, which reminds the target object to pay attention to eye health, avoids excessive fatigue, and improves the user experience.
It is noted that, while the foregoing method embodiments are described as a series of acts for simplicity of explanation, those skilled in the art will appreciate that the present invention is not limited by the order of acts described, as some steps may, in accordance with the present invention, be performed in other orders or concurrently. Further, the above embodiments may be combined arbitrarily to obtain further embodiments.
In the foregoing embodiments, the descriptions of the embodiments have respective emphasis, and reference may be made to related descriptions of other embodiments for parts that are not described in detail in a certain embodiment. Those of skill in the art will further appreciate that the various illustrative logical blocks, units, and steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate the interchangeability of hardware and software, various illustrative components, elements, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design requirements of the overall system. Those skilled in the art may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the invention.
The foregoing description fully discloses the preferred embodiments of the present invention. It should be noted that those skilled in the art can make modifications to these embodiments without departing from the scope of the appended claims; accordingly, the scope of the appended claims is not limited to the specific embodiments described above.

Claims (13)

1. A head-worn device, characterized in that the device comprises: a wearing body and a nose pad structure, wherein the nose pad structure is connected with the wearing body, and the left and right sides of the nose pad structure comprise a first tilt sensor and a second tilt sensor respectively; the wearing body comprises: a forehead electroencephalogram sensor, a third tilt sensor, a communication module and a controller, and the controller is electrically connected with the first tilt sensor, the second tilt sensor, the forehead electroencephalogram sensor, the third tilt sensor and the communication module respectively.
2. The device of claim 1, wherein the wearing body further comprises: a vibration module, and the controller is electrically connected with the vibration module.
3. The device of claim 1 or 2, wherein the wearing body further comprises: an illumination sensor and a distance measuring sensor, and the controller is electrically connected with the illumination sensor and the distance measuring sensor respectively.
4. An eye fatigue monitoring method, implemented on the basis of the head-worn device according to any one of claims 1 to 3, characterized in that the method comprises:
acquiring target blink data and eye using duration of a target object through the first tilt sensor, the second tilt sensor and the forehead electroencephalogram sensor, wherein the target blink data comprises left eye blink data and right eye blink data;
acquiring head posture data of the target object through a third tilt sensor;
performing left eye fatigue detection based on the left eye blinking data, the eye using duration and the head posture data to obtain left eye fatigue information of the target object;
performing right eye fatigue detection based on the right eye blinking data, the eye using duration and the head posture data to obtain right eye fatigue information of the target object;
and performing eye fatigue analysis based on the left eye fatigue information and the right eye fatigue information to obtain eye fatigue information and eye difference information of the target object.
5. The method of claim 4, wherein the acquiring target blink data and eye using duration of the target object through the first tilt sensor, the second tilt sensor and the forehead electroencephalogram sensor comprises:
respectively acquiring first inclination angle data and second inclination angle data of the target object in a preset time period through the first inclination angle sensor and the second inclination angle sensor;
acquiring electroencephalogram signal data of the target object within the preset time period through the forehead electroencephalogram sensor;
taking, as the eye using duration, the time within the preset time period during which the electroencephalogram signal data is greater than a blink signal threshold;
and analyzing the eye using state based on the electroencephalogram signal data, the first inclination angle data and the second inclination angle data to obtain the target blink data in the preset time period.
6. The method of claim 5, wherein the performing eye state analysis based on the electroencephalographic signal data, the first tilt data, and the second tilt data to obtain the target blink data for the preset time period comprises:
traversing the electroencephalogram data at a plurality of moments in the preset time period;
in the case that the currently traversed electroencephalogram data is greater than the blink signal threshold, performing inclination change analysis on the first inclination data and the second inclination data at the moment corresponding to the currently traversed electroencephalogram data to obtain an inclination change analysis result;
determining a target blinking state corresponding to the currently traversed electroencephalogram data according to the inclination change analysis result;
adding the currently traversed electroencephalogram data into a first data group under the condition that the target blinking state is the blinking of the left eye;
adding the currently traversed electroencephalogram data into a second data group under the condition that the target blinking state is right eye blinking;
after the traversal of the electroencephalogram signal data at the plurality of moments in the preset time period is finished, performing statistical analysis on the electroencephalogram signal data in the first data group to obtain the blink frequency of the left eye and the first duration corresponding to a plurality of blink intensities of the left eye;
performing statistical analysis on the electroencephalogram data in the second data group to obtain the blink frequency of the right eye and second duration time corresponding to the various blink intensities of the right eye;
taking the blink frequency and the first duration of the left eye as the left eye blink data;
and taking the blink frequency and the second duration of the right eye as the right eye blink data.
7. The method of claim 4, wherein the acquiring, by a third tilt sensor, head pose data of the target object comprises:
acquiring head inclination angle data of the target object through a third inclination angle sensor;
and analyzing the head posture of the head inclination angle data to obtain the head posture data.
8. The method of any of claims 4 to 7, wherein the performing left eye fatigue detection based on the left eye blink data, the eye using duration and the head posture data to obtain left eye fatigue information of the target object comprises:
inputting the left eye blinking data, the eye using duration and the head posture data into a left eye fatigue detection network for left eye fatigue detection to obtain left eye fatigue information;
the detecting the right eye fatigue based on the right eye blinking data, the eye using duration and the head posture data to obtain the right eye fatigue information of the target object comprises:
inputting the right eye blinking data, the eye using duration and the head posture data into a right eye fatigue detection network for right eye fatigue detection to obtain the right eye fatigue information;
the performing eye fatigue analysis based on the left eye fatigue information and the right eye fatigue information to obtain eye fatigue information and eye difference information of the target object comprises:
and inputting the left eye fatigue information and the right eye fatigue information into an eye fatigue analysis network for eye fatigue analysis to obtain the eye fatigue information and the eye difference information.
9. The method of claim 8, further comprising:
acquiring sample blink data, sample eye using duration and sample head posture data of a sample object, wherein the sample blink data comprises sample left eye blink data and sample right eye blink data;
acquiring eye fatigue information for marking and eye difference information for marking corresponding to the sample object;
inputting the sample left eye blinking data, the sample eye using duration and the sample head posture data into a preset left eye fatigue detection network for left eye fatigue detection to obtain sample left eye fatigue information of the sample object;
inputting the sample right eye blinking data, the sample eye using time length and the sample head posture data into a preset right eye fatigue detection network for right eye fatigue detection to obtain sample right eye fatigue information of the sample object;
inputting the sample left eye fatigue information and the sample right eye fatigue information into a preset eye fatigue analysis network for eye fatigue analysis to obtain sample eye fatigue information and sample eye difference information of the sample object;
determining target loss information based on the sample eye fatigue information and the sample eye difference information, and the labeling eye fatigue information and the labeling eye difference information;
training the preset left eye fatigue detection network, the preset right eye fatigue detection network and the preset eye fatigue analysis network based on the target loss information to obtain the left eye fatigue detection network, the right eye fatigue detection network and the eye fatigue analysis network.
10. The method of any of claims 4 to 7, further comprising:
and controlling a vibration module to carry out vibration reminding according to a vibration mode corresponding to the eye fatigue information.
11. The method of claim 7, further comprising:
acquiring illumination data of the environment where the target object is located through an illumination sensor;
acquiring the eye use distance of the target object through a distance measuring sensor;
carrying out angle calibration based on the head inclination angle data and preset calibration inclination angle data to obtain the eye using angle of the target object;
acquiring historical eye fatigue information of the target object;
generating fatigue change information of the target object based on the historical eye fatigue information and the eye fatigue information;
generating eye fatigue monitoring information based on the illumination data, the eye using distance, the eye using angle, the eye using time length and the fatigue change information;
and pushing the eye fatigue monitoring information to a mobile terminal corresponding to the target object through a communication module.
12. An eye fatigue monitoring device, implemented on the basis of the head-worn device according to any one of claims 1 to 3, characterized in that the device comprises:
the first acquisition module is used for acquiring target blink data and eye using duration of a target object through a first inclination angle sensor, a second inclination angle sensor and a forehead electroencephalogram sensor, wherein the target blink data comprise left eye blink data and right eye blink data;
the second acquisition module is used for acquiring head posture data of the target object through a third tilt sensor;
the left eye fatigue detection module is used for carrying out left eye fatigue detection on the basis of the left eye blinking data, the eye using duration and the head posture data to obtain left eye fatigue information of the target object;
the right eye fatigue detection module is used for carrying out right eye fatigue detection on the basis of the right eye blinking data, the eye using duration and the head posture data to obtain right eye fatigue information of the target object;
and the eye fatigue analysis module is used for carrying out eye fatigue analysis based on the left eye fatigue information and the right eye fatigue information to obtain eye fatigue information and eye difference information of the target object.
13. A computer-readable storage medium, having at least one instruction or at least one program stored therein, which is loaded and executed by a processor to implement the eye fatigue monitoring method according to any one of claims 4 to 11.
CN202210261964.6A 2022-03-16 2022-03-16 Head wearing equipment, eye fatigue monitoring method and device and storage medium Pending CN114631809A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210261964.6A CN114631809A (en) 2022-03-16 2022-03-16 Head wearing equipment, eye fatigue monitoring method and device and storage medium


Publications (1)

Publication Number Publication Date
CN114631809A true CN114631809A (en) 2022-06-17

Family

ID=81950221

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210261964.6A Pending CN114631809A (en) 2022-03-16 2022-03-16 Head wearing equipment, eye fatigue monitoring method and device and storage medium

Country Status (1)

Country Link
CN (1) CN114631809A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024007364A1 (en) * 2022-07-05 2024-01-11 Tsinghua University Wearable-device-based light environment and posture detection method, apparatus and system



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination