CN115177240A - Light environment and attitude detection method, device and system based on wearable equipment - Google Patents

Light environment and attitude detection method, device and system based on wearable equipment Download PDF

Info

Publication number
CN115177240A
Authority
CN
China
Prior art keywords
illumination
posture
target object
data
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210793682.0A
Other languages
Chinese (zh)
Inventor
林波荣
曾云一
孙弘历
余娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN202210793682.0A priority Critical patent/CN115177240A/en
Priority to PCT/CN2022/106272 priority patent/WO2024007364A1/en
Publication of CN115177240A publication Critical patent/CN115177240A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1116 Determining posture transitions
    • A61B 5/1118 Determining activity level
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Specially adapted to be attached to or worn on the body surface
    • A61B 5/6802 Sensor mounted on worn items
    • A61B 5/6803 Head-worn items, e.g. helmets, masks, headphones or goggles
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6814 Head
    • A61B 5/6815 Ear
    • A61B 5/6821 Eye
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J 1/00 Photometry, e.g. photographic exposure meter

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Otolaryngology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Automation & Control Theory (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The disclosure relates to a wearable-device-based method, apparatus, system and storage medium for detecting the light environment and posture, wherein the method includes: receiving illumination data of ambient light collected by a wearable device and motion data of a target object wearing the wearable device, the wearable device being worn near the eyes of the target object; determining the posture angle of the target object's head from the motion data; and detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result. The wearable device enables long-duration, multi-location sensing of the healthiness of the light environment, and can simultaneously detect and prompt the health effects of light at both the circadian-rhythm and visual-health levels.

Description

Light environment and attitude detection method, device and system based on wearable equipment
Technical Field
The present disclosure relates to the field of light environment and posture detection, and in particular, to a method, an apparatus, and a system for detecting the light environment and posture based on a wearable device.
Background
The eyes are the window to the soul, and the information they receive influences a person's mood and even health. More than 70% of the information a person acquires in a day comes from light. Light not only provides vision; it is also an important zeitgeber (time cue) for the human biological clock, influencing a person's circadian rhythm and working state. The effects of light on vision and on the biological clock are closely related to the illuminance and spectral distribution at the eye. For example, light that is too strong can damage the retina, while light that is too weak easily causes eye fatigue; light in the blue band helps improve daytime alertness, but blue-light exposure at night can induce sleep disorders.
How to identify, in daily life and in a timely manner, the factors that affect human health and biological rhythm, and to prompt the user accordingly, is a key problem to be solved.
Disclosure of Invention
In view of this, the present disclosure provides a method, an apparatus, a system and a storage medium for detecting a light environment and a gesture based on a wearable device.
According to one aspect of the present disclosure, a wearable-device-based method for detecting the light environment and posture is provided, which includes:
receiving illumination data of ambient light collected by a wearable device and motion data of a target object wearing the wearable device, wherein the wearable device is worn near the eyes of the target object;
determining a pose angle of the target object head from the motion data;
and detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result.
In a possible implementation manner, detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result, including:
determining spectral information from the illumination data;
and determining that the main light source of the light environment is natural light illumination or artificial illumination according to the spectral information.
In a possible implementation manner, detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result, including:
determining the illuminance according to the illumination data;
and when the illuminance exceeds a first threshold, generating first prompt information, wherein the first prompt information indicates that the illuminance is too high or indicates how to adjust the illuminance.
In a possible implementation manner, detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result, including:
and determining, according to the posture angle, whether the posture of the target object corresponds to computer-based work or paper-based work, and judging, according to the posture angle, whether the target object is in a poor sitting posture.
In a possible implementation manner, detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result, including:
determining a color temperature according to the illumination data;
and when the posture of the target object is judged to be computer-based work and the color temperature exceeds a second threshold, generating second prompt information, wherein the second prompt information indicates that the color temperature is too high or indicates how to adjust the color temperature.
In a possible implementation manner, detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result, including:
determining a target time period to which the current time belongs;
determining physiological equivalent illuminance according to the illumination data;
and when the physiological equivalent illuminance does not meet the preset condition corresponding to the target time period, generating third prompt information, wherein the third prompt information indicates that the physiological equivalent illuminance is abnormal or indicates how to adjust it.
In a possible implementation manner, detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result, further including:
and generating fourth prompt information when the posture of the target object is judged to be a poor sitting posture, wherein the fourth prompt information indicates the poor sitting posture or indicates how to adjust the sitting posture.
In one possible implementation, the method further includes:
storing the illumination data, the motion data and the detection result;
determining, according to the detection results within a preset time period, statistical information for the light environment and posture detection;
and generating fifth prompt information according to the statistical information.
In one possible implementation, the statistical information includes:
natural light illumination time; and/or
the proportion of the preset time period during which any one or more of the illuminance, the color temperature, the physiological equivalent illuminance and the posture of the target object meet the standard.
In a possible implementation manner, generating a fifth prompting message according to the statistical information includes:
when the natural light illumination time is insufficient, the fifth prompt information suggests receiving more natural light;
when the compliance rate of the illuminance and/or the posture of the target object is low, the fifth prompt information indicates a risk of myopia;
and when the compliance rate of the physiological equivalent illuminance is low, the fifth prompt information indicates a risk to circadian-rhythm health.
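The standard-reaching (compliance) proportion over a stored history could be computed as in this sketch; the record layout and field names are hypothetical, chosen only for illustration:

```python
def compliance_ratio(results, key):
    """Fraction of stored detection results in which `key` met the standard.

    `results` is a hypothetical list of per-sample dicts such as
    {"illuminance_ok": True, "posture_ok": False} accumulated over the
    preset time period; the field names are assumptions for illustration.
    """
    if not results:
        return 0.0
    return sum(1 for r in results if r.get(key)) / len(results)

# Four hypothetical samples from the preset time period.
history = [
    {"illuminance_ok": True, "posture_ok": True},
    {"illuminance_ok": True, "posture_ok": False},
    {"illuminance_ok": False, "posture_ok": False},
    {"illuminance_ok": True, "posture_ok": False},
]
print(compliance_ratio(history, "illuminance_ok"))  # → 0.75
if compliance_ratio(history, "posture_ok") < 0.5:
    print("Myopia risk: posture compliance is low.")
```

A real implementation would read `results` from the storage module rather than an in-memory list.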
According to another aspect of the present disclosure, a light environment and gesture detection apparatus based on a wearable device is provided, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the above-described method when executing the memory-stored instructions.
According to another aspect of the present disclosure, a non-transitory computer-readable storage medium is provided, having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the above-mentioned method.
According to another aspect of the present disclosure, a light environment and gesture detection system based on a wearable device is provided, including:
a wearable device for acquiring illumination data of ambient light and motion data of a target object wearing the wearable device, the wearable device being worn near an eye of the target object;
the light environment and posture detection device described above;
and the communication equipment is used for transmitting the illumination data and the motion data to the light environment and posture detection device.
According to another aspect of the present disclosure, there is provided a computer program product comprising computer readable code, or a non-transitory computer readable storage medium carrying computer readable code, wherein, when the computer readable code runs in a processor of an electronic device, the processor performs the above method.
With the above method and device, the posture and/or the light environment of the target object are detected from the illumination data and motion data collected by the wearable device, so that long-duration, multi-location sensing of the healthiness of the light environment can be achieved, and the health effects of light can be detected and prompted at both the circadian-rhythm and visual-health levels. Being small and convenient to wear, the wearable device can be worn near the eyes of the target object and can evaluate the light environment at the eyes and the posture of the target object over long periods and in many places, without being limited by time or space, so it can be applied to daily work as well as to everyday life at night.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features, and aspects of the disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 shows a schematic diagram of a light environment and gesture detection system according to an embodiment of the present disclosure.
FIG. 2a shows a flow diagram of a light environment and gesture detection method according to an embodiment of the present disclosure.
FIG. 2b shows a flowchart of a light environment and gesture detection method according to an embodiment of the present disclosure.
Fig. 3 shows a spectral diagram of a spectral sensor having 8 spectral response bands.
Fig. 4 shows a schematic diagram of a three-axis acceleration sensor.
FIG. 5 shows a schematic diagram of the pose of a target object.
Fig. 6 shows a schematic diagram of wired transmission of data according to an embodiment of the present disclosure.
Fig. 7 shows a schematic diagram of a wireless data transmission according to an embodiment of the disclosure.
Fig. 8 shows a schematic diagram of wearable device wear according to an embodiment of the present disclosure.
FIG. 9 illustrates a block diagram of an apparatus 1900 for light environment and gesture detection, according to an exemplary embodiment.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
The information received by the eyes affects a person's mood and health, and the light received not only provides vision but also affects the human biological clock. Moreover, the circadian effects of light and visual health are interrelated. On the one hand, both require illumination data at the human eye: for example, light that is too strong in the daytime damages the retina, and blue-light exposure at night induces sleep disorders. On the other hand, both are associated with a person's working state: the light environment can affect the working state, and a substandard light environment and posture can harm visual health. Visual health is an important part of public health and concerns the whole life cycle of people of all ages. Visual impairment affects people's physical and mental health and quality of life, and is a serious social problem.
With the light environment and posture detection method and device of the present disclosure, the wearable device acquires illumination data and motion data to detect the posture of the target object and/or the light environment it is in, so that long-duration, multi-location sensing of the healthiness of the light environment can be achieved, and the health effects of light can be detected and prompted at both the circadian-rhythm and visual-health levels. Being small and convenient to wear, the wearable device can be worn near the eyes of the target object and can evaluate the light environment at the eyes and the posture of the target object over long periods and in many places, without being limited by time or space, so it can be applied to daily work as well as to everyday life at night.
Fig. 1 shows a schematic diagram of a wearable device based light environment and gesture detection system according to an embodiment of the present disclosure. As shown in fig. 1, the system includes:
a wearable device 11 for acquiring illumination data of ambient light and motion data of a target object wearing the wearable device, the wearable device being worn near the eyes of the target object. The wearable device 11 may include a spectral sensor and a motion sensor. The spectral sensor collects the ambient light reaching the wearable device and outputs response values for different spectral bands, which serve as the illumination data. The motion sensor obtains motion data of the head of the target object wearing the device. Because these sensors can be miniaturized, the diameter of the wearable device 11 can be kept within 1 cm, and the device can be worn near the user's eyes. Fig. 8 shows a schematic diagram of wearable device wear according to an embodiment of the present disclosure: the device is fixed near the eyes, with the light-sensor probe of the spectral sensor facing the direction of incoming light. Here, "near the eyes" may refer to any location in the head region close to the eyes, such as the forehead or the earlobe. Ways of wearing the device include, but are not limited to: 1. fixed to a glasses frame by a clip; 2. fixed to the earlobe by a clip; 3. fixed to the front of the forehead by an elastic band. The wearable device may be provided with a switch button; when the user turns it on, the device starts up and begins collecting illumination data and motion data.
And a communication device 12 for transmitting the illumination data and the motion data to the light environment and posture detection apparatus. The communication device 12 may be connected to the wearable device 11 by wire or wirelessly, and may also be integrated into the wearable device 11. The communication device 12 obtains the illumination data and motion data acquired by the wearable device 11 and transmits them to the light environment and posture detection apparatus 13, either over a wired link or over a wireless link such as Bluetooth or Wi-Fi. The light environment and posture detection apparatus 13 may be an electronic device such as a smartphone, tablet, or computer.
Fig. 6 shows a schematic diagram of wired data transmission according to an embodiment of the present disclosure, involving the wearable device 11, the communication device 12, and the light environment and posture detection apparatus 13. Here the communication device 12 is a data cable and the detection apparatus 13 is a smartphone. In the figure, the white dot is the light-sensor probe, which faces the direction of incoming light when worn and collects the ambient light reaching the wearable device. The data cable powers the wearable device and at the same time transmits the data monitored by the wearable device to the light environment and posture detection apparatus, where the data are processed, stored, and displayed.
Fig. 7 shows a schematic diagram of wireless data transmission according to an embodiment of the present disclosure, involving the wearable device 11, the communication device 12, and the light environment and posture detection apparatus 13. Here the communication device 12 comprises a data cable and a Bluetooth module, and the detection apparatus 13 is a smartphone. In the figure, the white dot is the light-sensor probe, which faces the direction of incoming light when worn and collects the ambient light reaching the wearable device. The data cable powers the wearable device and at the same time transmits the monitored data to the Bluetooth module in the communication device. The Bluetooth module transmits the data to the light environment and posture detection apparatus, where the data are processed, stored, and displayed.
The light environment and posture detecting device 13 may include an arithmetic unit, a real-time prompting unit, a storage module, and a historical data prompting unit.
An arithmetic unit, which processes the illumination data and the motion data to obtain the detection result; it may be implemented on a mobile phone, tablet, computer, and so on. From the illumination data output by the spectral sensor in the wearable device 11, the spectral information is determined to judge whether the dominant light source of the current light environment is natural light or artificial lighting; the illuminance, color temperature, and physiological equivalent illuminance of the current ambient light are then calculated from the illumination data to judge whether the light environment meets the standard. From the motion data of the target object's head output by the motion sensor in the wearable device 11, the posture angle of the head can be determined to judge whether the posture meets the standard.
A real-time prompting unit, which gives prompts or suggestions according to the results of the arithmetic unit. For example, visual presentation may be performed through an application on an electronic device such as a mobile phone, tablet, or computer, clearly providing a multi-dimensional real-time evaluation of eye-use habits covering the influence of the light environment on vision and on biological rhythm.
And the storage module is used for storing the data acquired by the wearable equipment 11 and the detection result of the arithmetic unit. For example, the storage may be local to the electronic device, or may be cloud storage.
And a historical data prompting unit, which gives health prompts to the user according to the data and detection results stored in the storage module over a past period of time. For example, visual presentation can be performed through an application on an electronic device such as a mobile phone, tablet, or computer, providing a multi-dimensional long-term evaluation of eye-use habits.
In one possible implementation, fig. 2a shows a flowchart of a wearable device-based light environment and gesture detection method according to an embodiment of the present disclosure, which can be implemented by the light environment and gesture detection apparatus 13 in fig. 1. As shown in fig. 2a, the method comprises:
s201, receiving illumination data of ambient light collected by a wearable device and motion data of a target object wearing the wearable device, wherein the wearable device is worn near the eyes of the target object.
Here, the wearable device may include a spectral sensor and a motion sensor. The illumination data may be the response values for different spectral bands output by the spectral sensor as it collects the ambient light reaching the wearable device (e.g. ambient light near the eyes), where a response value is the magnitude of the signal. The motion data may be the acceleration of the head of the target object wearing the device, obtained by the motion sensor; for example, the three-axis acceleration sensor shown in fig. 4 measures acceleration in space, that is, the change in velocity of an object in space. The motion data collected by the motion sensor can be the accelerations x, y, and z along the three axes of a rectangular coordinate system. Through a miniaturized wearable device, the present disclosure achieves wearable monitoring of multiple parameters such as illuminance, spectral distribution, type of work, and posture. At the same time, because the device is worn near the eyes of the target object, long-duration and multi-location sensing is possible, giving it a wide range of application scenarios.
S202, determining the posture angle of the head of the target object according to the motion data.
Taking the three-axis acceleration sensor shown in fig. 4 as an example: a three-axis rectangular coordinate system is established, and the motion data acquired by the motion sensor are the acceleration components (x, y, z) along the three axes. If the angle between the current acceleration direction a and the xz-plane is φ, and the angle between a and the yz-plane is ω, the two angles are calculated as:

φ = arctan(y / √(x² + z²))

ω = arctan(x / √(y² + z²))

From these, the posture angle of the head of the target object is obtained.
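A minimal sketch of this posture-angle computation, assuming the standard arctangent tilt formulas for the angle φ to the xz-plane and the angle ω to the yz-plane:

```python
import math

def pose_angles(x: float, y: float, z: float) -> tuple:
    """Compute head posture angles (in degrees) from the three acceleration
    components (x, y, z) of a three-axis accelerometer.

    phi:   angle between the acceleration direction and the xz-plane.
    omega: angle between the acceleration direction and the yz-plane.
    """
    phi = math.degrees(math.atan2(y, math.hypot(x, z)))
    omega = math.degrees(math.atan2(x, math.hypot(y, z)))
    return phi, omega

# A sensor reading of (0, 0, 1) in units of g (gravity along z only)
# lies in both the xz- and yz-planes, so both angles are zero.
print(pose_angles(0.0, 0.0, 1.0))  # → (0.0, 0.0)
```

Using `atan2` rather than a plain division avoids a divide-by-zero when the acceleration vector lies along one axis.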
S203, detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result.
Since the wearable device is worn near the eyes of the target object, its measurements can be taken to represent the illumination data received by the eyes and the motion data of the head; the posture and the light environment of the target object can therefore be obtained from the calculated posture angle and the illumination data.
Several exemplary ways of detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data in step S203 to obtain a detection result are introduced below.
In one possible implementation, step S203 may include: determining spectral information according to the illumination data; and determining that the main light source of the light environment is natural light illumination or artificial illumination according to the spectral information.
Here, the illumination data may include the response values for different spectral bands that the spectral sensor outputs by collecting the ambient light reaching the wearable device. Spectral information can then be obtained by fitting the plurality of response values.
Fig. 3 shows the spectra measured by a spectral sensor with 8 spectral response bands; the spectral distribution is obtained by smoothly connecting the response values of the different bands. The wavelengths of visible light perceivable by the human eye lie between 380 and 780 nm, so the horizontal axis of the graph spans 400 to 700 nm (wavelength) and the vertical axis shows the relative sensitivity (response value). Fig. 3 shows a large difference between the spectral distributions of natural light and artificial lighting: the natural light spectrum is continuous, whereas the artificial lighting spectrum shows peaks and valleys; in addition, natural light provides significantly higher illuminance at the eye than artificial lighting, so natural light produces a larger spectral response. These features make it possible to distinguish whether the ambient light is natural or artificial. In one possible case, natural light and artificial lighting are both present; there, if the spectral distribution is closer to a continuous spectrum, natural light dominates, and if it is closer to a peak-and-valley spectrum, artificial lighting dominates.
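One way the continuous-versus-peaky distinction could be operationalised is sketched below; the "raggedness" measure and its 0.5 threshold are illustrative assumptions, not taken from the patent:

```python
def dominant_light_source(responses):
    """Classify the dominant light source from multi-band spectral responses.

    Heuristic sketch: natural daylight has a smooth, continuous spectrum,
    while artificial sources show pronounced peaks and valleys. We measure
    "raggedness" as the mean absolute second difference between adjacent
    bands, normalised by the mean response.
    """
    mean = sum(responses) / len(responses)
    if mean == 0:
        return "unknown"
    second_diffs = [
        abs(responses[i - 1] - 2 * responses[i] + responses[i + 1])
        for i in range(1, len(responses) - 1)
    ]
    raggedness = sum(second_diffs) / len(second_diffs) / mean
    # The 0.5 threshold is illustrative; it would need calibration
    # against measured daylight and lamp spectra.
    return "natural" if raggedness < 0.5 else "artificial"

# A smooth daylight-like curve versus a spiky lamp-like curve (8 bands).
print(dominant_light_source([10, 14, 18, 20, 19, 16, 12, 8]))
print(dominant_light_source([2, 30, 3, 25, 4, 28, 3, 20]))
```

In mixed lighting, the same measure follows the patent's rule: the closer the distribution is to a continuous spectrum, the more natural light dominates.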
In one possible implementation, step S203 may include: determining an illuminance according to the illumination data; and when the illuminance exceeds a first threshold value, generating first prompt information, wherein the first prompt information indicates that the illuminance is too high or indicates an illuminance adjustment manner.
The illuminance of the current ambient light can be calculated from the illumination data. Illuminance refers to the luminous flux of visible light received per unit area and indicates how strongly an object's surface is illuminated. If the crystalline lens of the eye is exposed to a strong light environment, its function is affected, and such damage can lead to cataract; excessive illuminance may also cause discomfort glare and damage to the retina. Therefore, when the illuminance is too high, real-time first prompt information is generated so as to reduce harm in time and avoid long-term damage, for example by suggesting dimming the current light or taking a sun-shading measure. The first threshold value can be flexibly set according to personal conditions and the actual detection scene, provided that human physiological and health standards are met.
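A minimal sketch of the first prompt, assuming a placeholder first threshold of 2000 lx (the disclosure leaves the threshold to be set flexibly per person and scene):

```python
def check_illuminance(lux, first_threshold=2000.0):
    """Generate the first prompt information when illuminance is too high.

    first_threshold is a hypothetical placeholder; per the disclosure it
    should be set according to personal conditions and the detection scene.
    """
    if lux > first_threshold:
        return (f"Illuminance too high ({lux:.0f} lx): dim the current "
                f"light or take a sun-shading measure.")
    return None  # illuminance meets the standard
```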
In one possible implementation, step S203 may include: determining, according to the posture angle, that the posture of the target object is computer work or paper work, and judging, according to the posture angle, whether the target object is in a bad sitting posture.
As shown in fig. 5, after the posture angle of the head of the target object is obtained, the work type of the target object and whether its posture is standard can be determined. In one possible judging manner, the work type of the target object is first judged according to the posture angle of the head: if the head is close to upright (nearly parallel to the vertical direction), the target object is performing computer work; if the head is tilted downward, the target object is performing paper work. The posture is then judged: if the downward tilt angle is too large, the sitting posture is determined to be bad.
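The two-step judgment can be sketched as follows; the 20° and 45° pitch thresholds are assumed values for illustration, not figures from the disclosure:

```python
def classify_work_and_posture(pitch_deg, paper_threshold=20.0,
                              bad_threshold=45.0):
    """Classify work type and sitting posture from the downward head
    pitch angle in degrees (0 = head upright).

    Both thresholds are hypothetical tuning values.
    """
    work_type = "computer work" if pitch_deg < paper_threshold else "paper work"
    bad_sitting = pitch_deg >= bad_threshold  # tilt too large -> poor posture
    return work_type, bad_sitting
```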
In one possible implementation, step S203 may include: determining a color temperature according to the illumination data; and when the posture of the target object is judged to be computer work and the color temperature exceeds a second threshold value, generating second prompt information, wherein the second prompt information indicates that the color temperature is too high or indicates a color temperature adjustment manner.
The color temperature of the current ambient light can be calculated from the illumination data. Blue light corresponds to the highest color temperature and is abundant in computer displays, digital products, LEDs and other light sources. An excessively high color temperature increases the amount of toxins in the macular region of the eye; prolonged viewing can cause visual fatigue and irreversible damage, and an excessively high color temperature can also dampen workers' positive mood. Therefore, when it is judged that the target object is doing computer work and the color temperature is too high, real-time second prompt information is generated so as to reduce harm in time and avoid long-term damage, for example by suggesting a timely rest or choosing an anti-blue-light product. The second threshold value can be flexibly set according to personal conditions and the actual detection scene, provided that human physiological and health standards are met.
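The posture-conditioned color temperature check can be sketched as below; the 6500 K second threshold is an assumed placeholder, to be set flexibly as the disclosure states:

```python
def check_color_temperature(cct_kelvin, work_type, second_threshold=6500.0):
    """Generate the second prompt information when computer work is done
    under an excessively high correlated color temperature.

    second_threshold is a hypothetical placeholder value.
    """
    if work_type == "computer work" and cct_kelvin > second_threshold:
        return (f"Color temperature too high ({cct_kelvin:.0f} K): take a "
                f"timely rest or use an anti-blue-light product.")
    return None
```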
In one possible implementation, step S203 may include: determining a target time interval to which the current time belongs; determining a physiological equivalent illumination according to the illumination data; and generating third prompt information when the physiological equivalent illumination does not meet a preset condition corresponding to the target time interval, wherein the third prompt information indicates that the physiological equivalent illumination is abnormal or indicates a physiological equivalent illumination adjustment manner.
Wherein, the physiological equivalent illumination of the current ambient light can be calculated from the illumination data. Physiological equivalent illuminance is a light metric derived from the effect of irradiance on the human non-visual system, including circadian rhythms, neuroendocrine and neural behavior; simply put, light affects the biological clock. Different physiological equivalent illumination standards apply in the daytime and at night, and the specific time intervals can be determined according to working requirements and are not limited here. In one possible implementation, the nighttime physiological equivalent illumination of a residential building refers to data measured after 20:00, and the physiological equivalent illumination for long-term work refers to data measured during the working hours of 10:00-17:00. In one possible implementation, the physiological equivalent illumination of a place in a public building where people work for a long time should be no lower than 200 lx at 1.2 meters in the main sight-line direction, and the nighttime physiological equivalent illumination of a residential building should be no higher than 50 lx.
Accordingly, it can be determined that 10:00-17:00 is a first target time interval, whose corresponding preset condition is that the physiological equivalent illumination is not lower than a threshold value A, and that 20:00 to 6:00 of the next day is a second target time interval, whose corresponding preset condition is that the physiological equivalent illumination is not higher than a threshold value B. The threshold values A and B can be set according to actual needs with reference to the above standards. To ensure a good rest environment and a comfortable, efficient working environment, when the physiological equivalent illumination does not meet the preset condition corresponding to the target time interval, real-time third prompt information is generated so as to reduce discomfort in time and avoid long-term adverse effects on the biological clock, for example by suggesting a light adjustment.
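The interval-dependent check can be sketched as below, using the example figures above (threshold A = 200 lx for the 10:00-17:00 interval, threshold B = 50 lx for 20:00 to 6:00 of the next day); the prompt wording is illustrative:

```python
from datetime import time

def check_physiological_equivalent_illuminance(eml_lx, now):
    """Generate the third prompt information when the physiological
    equivalent illuminance violates the preset condition of the target
    time interval to which the current time belongs."""
    if time(10, 0) <= now <= time(17, 0):          # first target interval
        if eml_lx < 200:                           # threshold A
            return "Physiological equivalent illuminance too low for work hours"
    elif now >= time(20, 0) or now < time(6, 0):   # second target interval
        if eml_lx > 50:                            # threshold B
            return "Physiological equivalent illuminance too high for night"
    return None
```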
In one possible implementation, step S203 may include: generating fourth prompt information when the posture of the target object is judged to be a bad sitting posture, wherein the fourth prompt information indicates the bad sitting posture or indicates a sitting posture adjustment manner.
If the downward tilt of the head posture angle is too large, the sitting posture is bad. A bad sitting posture not only causes asthenopia due to too short a reading or writing distance, affecting eyesight over time, but can also cause lateral bending of the spine, thereby affecting respiration and digestion; slight displacement maintained over a long period can impair other functions, for example causing thoracic vertebra protrusion and affecting circulatory function. Therefore, real-time fourth prompt information is generated so as to reduce harm in time and avoid long-term adverse effects on the eyes and body, for example by suggesting a sitting posture adjustment or relaxation exercise.
The above-described illuminance, color temperature, physiological equivalent illumination, and the like can be obtained from the illumination data by related techniques.
In one possible implementation, the method may further include: storing the illumination data, the motion data and the detection result; determining statistical information for light environment and posture detection according to the detection results within a preset time period; and generating fifth prompt information according to the statistical information.
All of the illumination data and motion data, all calculated results such as illuminance, color temperature, physiological equivalent illumination and posture angle, and all detection results such as light source type, work type, posture and whether the standards are met can be stored, and the timestamp information corresponding to these data can be recorded at the same time.
The preset time period may be the period from when the device is started to when detection ends, or any other preset time period; all of the above illumination data, motion data, calculation results and detection results within this period can be counted. Based on the stored illumination data, motion data, calculation results and detection results within the preset time period, the detection results of the light environment and the posture can be summarized, and an overall risk prompt can be given.
In one possible implementation, the statistical information may include: natural light illumination time; and/or the proportion of the standard reaching time of any one or more of the illuminance, the color temperature, the physiological equivalent illuminance and the posture of the target object to the preset time period.
For example, the preset time period is 24 hours, the total duration of the natural light illumination time in the preset time period can be counted, the ratio of the total duration of the illumination lower than the first threshold (i.e., the illumination reaches the standard) to the 24 hours can be counted, and the like.
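Such statistics can be sketched as below; the record format is a hypothetical simplification of the stored, timestamped data:

```python
def summarize(records, period_hours=24.0):
    """Summarize natural-light exposure and compliance over a period.

    records: (duration_hours, is_natural_light, meets_standard) tuples,
    a hypothetical simplification of the stored timestamped samples.
    Returns total natural-light hours and the standard-reaching ratio.
    """
    natural_hours = sum(d for d, natural, _ in records if natural)
    ok_ratio = sum(d for d, _, ok in records if ok) / period_hours
    return natural_hours, ok_ratio

# Hypothetical day: 2 h of natural light; 8 of 24 hours meet the standard.
day = [(2.0, True, True), (6.0, False, True), (4.0, False, False)]
```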
In a possible implementation manner, generating the fifth prompt information according to the statistical information includes: when the natural light illumination time is insufficient, the fifth prompt information indicates that more sufficient natural light should be received; when the standard-reaching rate of the illuminance and/or the posture of the target object is low, the fifth prompt information indicates that a myopia risk exists; and when the standard-reaching rate of the physiological equivalent illumination is low, the fifth prompt information indicates that a physiological rhythm health risk exists.
Wherein, according to the ratio of the standard-reaching time to the total monitoring time, the standard-reaching rates of the light environment and the posture can be calculated, and from the sum of the natural light exposure times the natural light exposure duration can be obtained. The user's historical data are then analyzed to generate macroscopic fifth prompt information. In one possible implementation, if the natural light exposure duration is low, it is suggested that more sufficient natural light should be received to maintain health and well-being; if the standard-reaching rate of the illumination or the posture is low, a risk of myopia is indicated; and if the standard-reaching rate of the physiological equivalent illumination is low, a risk of unhealthy physiological rhythm is indicated. The content can be displayed visually through an application program on an electronic device such as a mobile phone, tablet or computer, so that the user can clearly understand the influence of the light environment on vision and biological rhythm, and a multi-dimensional long-term evaluation of eye-use habits can be realized. The multi-dimensional evaluation and prompting help the user maintain a healthy circadian rhythm and protect the user's visual health.
FIG. 2b shows a flow diagram of a light environment and posture detection method according to an embodiment of the present disclosure. As shown in fig. 2b, after the wearable device is turned on, the collected illumination data and motion data may be transmitted to the light environment and posture detection device. After receiving the illumination data and the motion data, the light environment and posture detection device may obtain spectral information according to the illumination data and determine, according to the spectral information, whether the main light source of the light environment is natural light illumination or artificial illumination; obtain a posture angle according to the motion data and determine the posture of the target object according to the posture angle; and obtain the illuminance, color temperature and physiological equivalent illumination according to the illumination data and determine, in combination with the time and the posture of the target object, whether they meet the standards. The system can give real-time reminders for the various substandard conditions, for example through the light environment and posture detection device or the wearable device, and can also store the various detection results, data and their timestamps, obtain statistical information based on the stored data, and give reminders according to the statistical information.
It should be noted that, although a light environment and posture detection method is described above with fig. 2a and 2b as examples, those skilled in the art will understand that the disclosure is not limited thereto. In fact, the user can flexibly set each threshold value according to personal conditions and/or the actual application scene, provided that human physiological and health standards are met. Likewise, the illumination data and the motion data can be used flexibly according to personal conditions and/or the actual application scene, and are not limited to the detection contents given in this example.
In a possible implementation manner, the present disclosure further provides a light environment and posture detection device based on a wearable device, including:
a receiving module, configured to receive illumination data of ambient light collected by the wearable device and motion data of a target object wearing the wearable device, wherein the wearable device is worn near the eyes of the target object;
a determination module, configured to determine a posture angle of the head of the target object according to the motion data;
and a detection module, configured to detect the posture of the target object and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the above-mentioned method. The computer readable storage medium may be a volatile or non-volatile computer readable storage medium.
The embodiment of the present disclosure further provides a light environment and posture detecting device based on wearable device, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to implement the above-described method when executing the memory-stored instructions.
The disclosed embodiments also provide a computer program product, comprising computer readable code or a non-transitory computer readable storage medium carrying computer readable code, wherein when the computer readable code runs in a processor of an electronic device, the processor of the electronic device performs the above method.
FIG. 9 illustrates a block diagram of an apparatus 1900 for light environment and posture detection, according to an exemplary embodiment. For example, the apparatus 1900 may be provided as a server or terminal device. Referring to fig. 9, the apparatus 1900 includes a processing component 1922, which further includes one or more processors, and memory resources, represented by memory 1932, for storing instructions executable by the processing component 1922, for example application programs. The application programs stored in memory 1932 may include one or more modules each corresponding to a set of instructions. Further, the processing component 1922 is configured to execute the instructions to perform the above-described method.
The apparatus 1900 may also include a power component 1926 configured to perform power management of the apparatus 1900, a wired or wireless network interface 1950 configured to connect the apparatus 1900 to a network, and an input/output (I/O) interface 1958. The apparatus 1900 may operate based on an operating system stored in memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the apparatus 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk or C++ and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a field programmable gate array (FPGA), or a programmable logic array (PLA), can be personalized by utilizing state information of the computer-readable program instructions, and the electronic circuitry can execute the computer-readable program instructions to implement aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (13)

1. A light environment and posture detection method based on a wearable device, characterized by comprising the following steps:
receiving illumination data of ambient light collected by a wearable device and motion data of a target object wearing the wearable device, wherein the wearable device is worn near the eyes of the target object;
determining a posture angle of the head of the target object according to the motion data;
and detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result.
2. The method of claim 1, wherein: detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result, wherein the detection result comprises the following steps:
determining spectral information according to the illumination data;
and determining that the main light source of the light environment is natural light illumination or artificial illumination according to the spectral information.
3. The method according to claim 1, wherein detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result comprises:
determining illumination according to the illumination data;
and when the illumination exceeds a first threshold value, generating first prompt information, wherein the first prompt information indicates that the illumination is too high or the illumination adjusting mode.
4. The method according to claim 1, wherein detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result comprises:
and determining, according to the posture angle, that the posture of the target object is computer work or paper work, and judging, according to the posture angle, whether the target object is in a bad sitting posture.
5. The method of claim 4, wherein detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result, comprises:
determining a color temperature according to the illumination data;
and when the posture of the target object is judged to be computer work and the color temperature exceeds a second threshold value, generating second prompt information, wherein the second prompt information indicates that the color temperature is too high or indicates a color temperature adjustment manner.
6. The method according to claim 1, wherein detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result comprises:
determining a target time interval to which the current time belongs;
determining physiological equivalent illumination according to the illumination data;
and generating third prompt information when the physiological equivalent illumination does not accord with the preset condition corresponding to the target time period, wherein the third prompt information indicates that the physiological equivalent illumination is abnormal or the physiological equivalent illumination adjusting mode.
7. The method according to claim 4, wherein the detecting the posture and/or the light environment of the target object according to the posture angle and/or the illumination data to obtain a detection result, further comprises:
and generating fourth prompt information when the posture of the target object is judged to be the bad sitting posture, wherein the fourth prompt information indicates the bad sitting posture or the sitting posture adjustment mode.
8. The method of claim 1, further comprising:
storing the illumination data, the motion data and the detection result;
determining statistical information aiming at the detection of the luminous environment and the posture according to the detection result in the preset time period;
and generating fifth prompt information according to the statistical information.
9. The method of claim 8, wherein the statistical information comprises:
natural light illumination time; and/or
The standard reaching time of any one or more of the illumination, the color temperature, the physiological equivalent illumination and the posture of the target object accounts for the proportion of the preset time period.
10. The method according to claim 8 or 9, wherein generating the fifth prompt information according to the statistical information comprises:
the fifth cue information indicates that more sufficient natural light is being accepted when the natural light illumination time is insufficient;
when the illuminance and/or the posture standard-reaching rate of the target object is low, the fifth prompt message indicates that the myopia risk exists;
and when the standard reaching rate of the physiological equivalent illumination is low, the fifth prompt message indicates that the physiological rhythm health risk exists.
11. A light environment and posture detection apparatus based on a wearable device, characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to implement the method of any one of claims 1 to 10 when executing the memory-stored instructions.
12. A non-transitory computer readable storage medium having stored thereon computer program instructions, wherein the computer program instructions, when executed by a processor, implement the method of any one of claims 1 to 10.
13. A light environment and posture detection system based on a wearable device, characterized by comprising:
a wearable device for acquiring illumination data of ambient light and motion data of a target object wearing the wearable device, the wearable device being worn near an eye of the target object;
the light environment and posture detection apparatus of claim 11;
and the communication equipment is used for transmitting the illumination data and the motion data to the light environment and posture detection device.
CN202210793682.0A 2022-07-05 2022-07-05 Light environment and attitude detection method, device and system based on wearable equipment Pending CN115177240A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210793682.0A CN115177240A (en) 2022-07-05 2022-07-05 Light environment and attitude detection method, device and system based on wearable equipment
PCT/CN2022/106272 WO2024007364A1 (en) 2022-07-05 2022-07-18 Wearable-device-based light environment and posture detection method, apparatus and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210793682.0A CN115177240A (en) 2022-07-05 2022-07-05 Light environment and attitude detection method, device and system based on wearable equipment

Publications (1)

Publication Number Publication Date
CN115177240A true CN115177240A (en) 2022-10-14

Family

ID=83518234

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210793682.0A Pending CN115177240A (en) 2022-07-05 2022-07-05 Light environment and attitude detection method, device and system based on wearable equipment

Country Status (2)

Country Link
CN (1) CN115177240A (en)
WO (1) WO2024007364A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106510719B (en) * 2016-09-30 2023-11-28 歌尔股份有限公司 User gesture monitoring method and wearable device
CN109799624A (en) * 2019-03-15 2019-05-24 北京艾索健康科技有限公司 A kind of intelligence children's protective spectacles and terminal device
CN111930230A (en) * 2020-07-27 2020-11-13 歌尔光学科技有限公司 Gesture detection method, wearable device and computer-readable storage medium
US20220071563A1 (en) * 2020-09-08 2022-03-10 LEDO Network, Inc. Wearable health monitoring system
CN112433614A (en) * 2020-11-25 2021-03-02 歌尔光学科技有限公司 Eyesight protection method, device, equipment and computer readable storage medium
CN114631809A (en) * 2022-03-16 2022-06-17 苏州科医世凯半导体技术有限责任公司 Head wearing equipment, eye fatigue monitoring method and device and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116989820A (en) * 2023-09-27 2023-11-03 厦门精图信息技术有限公司 Intelligent navigation system and method
CN116989820B (en) * 2023-09-27 2023-12-05 厦门精图信息技术有限公司 Intelligent navigation system and method

Also Published As

Publication number Publication date
WO2024007364A1 (en) 2024-01-11

Similar Documents

Publication Publication Date Title
US11237409B2 (en) Wearing detection module for spectacle frame
CN107707763B (en) Near-sighted prevention and control wearable device and near-sighted prevention and control system and method
US11000186B2 (en) Systems and methods for retarding myopia progression
EP3039390B1 (en) System for sensing light exposure of a user
US20200297206A1 (en) System for assessing a health condition of a user
WO2017013051A1 (en) Head-mountable computing device, method and computer program product
CN108154866B (en) Display screen system capable of adjusting brightness in real time and brightness real-time adjusting method thereof
US10909164B2 (en) Method for updating an index of a person
US20200094015A1 (en) Controlling light exposure for circadian phase management
EP3239870B1 (en) A method for monitoring the behavior of a cohort group of members
CN115177240A (en) Light environment and attitude detection method, device and system based on wearable equipment
US20170055867A1 (en) Protection device, wearable device, protection method and display system
CN111708166A (en) Degree adjusting method and device and head-mounted display equipment
CN109299645A (en) Method, apparatus, system and storage medium for sight protectio prompt
EP3112927A1 (en) A vision monitoring module fixed on a spectacle frame
WO2016131927A1 (en) Method, electronic device and system for monitoring a skin surface condition
US20200182687A1 (en) Device for measuring light exposure of a subject
CN105450869B (en) A kind of harmful light reminding method and device
CN105450868B (en) A kind of harmful light reminding method and device
CN109124568A (en) Method and related electronic device that eyesight shows loving care for information are provided
CN106344033A (en) Instant eye monitor
CN112566304B (en) System and method for managing illumination of a region of interest comprising at least one object susceptible to manipulation by a user
TWI633462B (en) Method of providing eye care messages and electronic device thereof
CN108135481B (en) User migraine headache analysis assembly

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination