CN110801229A - Eyesight protection monitoring method, system, equipment and storage medium - Google Patents


Info

Publication number
CN110801229A
CN110801229A CN201911117661.1A
Authority
CN
China
Prior art keywords
target object
user
module
face
vision protection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911117661.1A
Other languages
Chinese (zh)
Inventor
卫利娟
申勇韬
曾世钦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inventec Appliances Shanghai Corp
Original Assignee
Inventec Appliances Shanghai Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inventec Appliances Shanghai Corp filed Critical Inventec Appliances Shanghai Corp
Priority to CN201911117661.1A priority Critical patent/CN110801229A/en
Publication of CN110801229A publication Critical patent/CN110801229A/en
Pending legal-status Critical Current


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1072 Measuring physical dimensions, e.g. size of the entire body or parts thereof, measuring distances on the body, e.g. measuring length, height or thickness
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107 Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1079 Measuring physical dimensions, e.g. size of the entire body or parts thereof, using optical or photographic means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • G08B21/24 Reminder alarms, e.g. anti-loss alarms

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a vision protection monitoring method, system, equipment and storage medium, wherein the method comprises the following steps: acquiring position information of a target object; acquiring position information of at least one face feature point of a user's face; calculating the distance H between the user's eyes and the target object according to the positional relationship between the user's eyes and the face feature points; acquiring vision protection parameters corresponding to the target object, the vision protection parameters at least comprising a threshold distance H0 between the target object and the user's eyes; judging whether the distance H is smaller than the threshold distance H0; and if so, sending warning information. By monitoring the distance between the user's eyes and the target object in real time and promptly reminding the user of poor eye-use habits, the method and system protect eyesight and are applicable to different scenes.

Description

Eyesight protection monitoring method, system, equipment and storage medium
Technical Field
The invention relates to the technical field of intelligent equipment, in particular to a vision protection monitoring method, a system, equipment and a storage medium.
Background
As the variety of electronic products grows, the time people spend using them keeps increasing, and myopia caused by poor sitting-posture habits while using electronic products is becoming more and more serious.
Some existing sitting-posture correction devices acquire posture information through various sensors or face recognition technologies and issue correction reminders by comparison with preset posture information. However, such prior-art devices are usually suited to only a single eye-use scene, whereas in daily life people frequently switch scenes: someone doing homework may watch television for a while, then look at a mobile phone after finishing, and so on. A device that can simultaneously provide vision protection and monitoring across multiple eye-use scenes is lacking. Meanwhile, existing sitting-posture correctors also suffer from inconvenient portability, poor user experience and similar problems.
How to protect and monitor vision across multiple scenes has therefore become a problem demanding an urgent solution.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present invention and therefore may include information that does not constitute prior art known to a person of ordinary skill in the art.
Disclosure of Invention
In view of the problems in the prior art, an object of the present invention is to provide a method, a system, a device and a storage medium for eyesight protection monitoring suitable for different scenes.
The embodiment of the invention provides a vision protection monitoring method, which is characterized by comprising the following steps:
S100: acquiring position information of a target object;
S200: acquiring position information of at least one face feature point of a user's face;
S300: calculating the distance H between the user's eyes and the target object according to the positional relationship between the user's eyes and the face feature points;
S400: acquiring vision protection parameters corresponding to the target object, wherein the vision protection parameters at least comprise a threshold distance H0 between the target object and the user's eyes;
S500: judging whether the distance H is smaller than the threshold distance H0;
If so, S600: sending warning information.
Preferably, the human face feature point includes at least one of a tip of a nose, a left corner of a mouth, a right corner of a mouth, a chin point, a left earlobe, and a right earlobe.
Preferably, before the step S400, the method further includes:
identifying attributes of the target object;
the step S400 further includes:
S410: acquiring vision protection parameters corresponding to the target object according to the identified attributes of the target object and a mapping table between different target objects and vision protection parameters.
Preferably, before the step S400, the method further includes:
acquiring a first image of a target object;
attributes of the target object are identified by image recognition techniques.
Preferably, the S100 step includes:
S101: acquiring a first depth image of the target object, wherein pixel points in the first depth image correspond to depth information;
S102: acquiring first coordinate information of the target object according to the first depth image as the position information of the target object.
Preferably, the S200 step includes:
S201: acquiring a second depth image of the user's face, wherein pixel points in the second depth image correspond to depth information;
S202: identifying at least one face feature point from the second depth image;
S203: acquiring second coordinate information of the face feature point as the position information of the face feature point.
Preferably, the positional relationship between the user's eyes and the face feature points is obtained by:
acquiring a third depth image of the face of the user, wherein pixel points in the third depth image correspond to depth information;
recognizing two-eye pupils and at least one face characteristic point of the face of the user according to the third depth image;
acquiring coordinate information of pupils of two eyes of a face of a user and third coordinate information of the face characteristic points;
recording the position relation between the eyes of the user and the face characteristic points.
Preferably, the duration of each monitoring period is set to T0, and the steps S100 to S500 are executed in each monitoring period, where the step S500 includes:
judging whether the distance H of the current monitoring period is smaller than the threshold distance H0, the current monitoring period being the k-th monitoring period;
if the distance H of the current monitoring period is smaller than the threshold distance H0, acquiring the abnormal duration Tk-1 of the previous monitoring period, where Tk-1 represents the time during the (k-1)-th monitoring period in which the distance H between the user's eyes and the target object is continuously smaller than the threshold distance H0, and calculating the abnormal duration of the current monitoring period as Tk = Tk-1 + T0;
if the distance H of the current monitoring period is greater than or equal to the threshold distance H0, setting the abnormal duration of the current monitoring period to Tk = 0;
judging whether the abnormal duration Tk of the current monitoring period is greater than a second threshold value; if so, continuing to the step S600.
Preferably, the vision protection monitoring method further comprises the steps of: recording the distance H for each monitoring period.
Preferably, before the step S400, the method further includes:
acquiring the ambient illuminance of a target object;
the vision protection parameters in the step S400 further include threshold illuminance corresponding to the target object;
the step S500 further includes S510: judging whether the ambient illuminance is within the threshold illuminance range;
if not, S600: and sending warning information.
The embodiment of the invention provides a vision protection monitoring system, which comprises an identification module, a calculation module, a storage module, a judgment module and an alarm module, wherein:
the identification module is used for acquiring the position information of the target object; and position information of at least one face feature point of the user face;
the calculation module is used for calculating the distance H between the user eyes and the target object according to the position relation between the user eyes and the face characteristic points;
the storage module is used for storing vision protection parameters corresponding to the target object, and the vision protection parameters at least comprise a threshold distance H between the target object and the eyes of the user0
The judging module is used for judging the relationship between the distance H and the threshold distance H0;
and the alarm module is used for sending or not sending warning information according to the judgment result of the judgment module.
Preferably, the identification module comprises an image acquisition module, an image identification module and a three-dimensional space module;
the image acquisition module is used for acquiring an image of a target object and/or a face of a user;
the image recognition module is used for recognizing attributes of the target object and/or human face characteristic points in the image of the human face of the user;
the three-dimensional space module is used for obtaining first coordinate information of a target object and/or second coordinate information of a face characteristic point according to the image of the target object and/or the face of the user obtained by the image obtaining module, wherein the first coordinate information is used as position information of the target object, and the second coordinate information is used as position information of the face characteristic point.
Preferably, the image acquisition module, the image recognition module and the three-dimensional space module are connected through a wireless network;
the image recognition module and the three-dimensional space module acquire the image of the target object and/or the face of the user acquired by the image acquisition module through a wireless network
And the three-dimensional space module sends the position information of the target object and the position information of the face characteristic point back to the image identification module through a wireless network.
An embodiment of the present invention further provides a vision protection monitoring device, including:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the vision protection monitoring method via execution of the executable instructions.
An embodiment of the present invention also provides a computer-readable storage medium storing a program, characterized in that the program, when executed, implements the steps of the eyesight protection monitoring method.
The method and the system of the invention play a role in protecting eyesight by monitoring the distance between the eyes of the user and the target object in real time and reminding the user of poor eye use condition in time, and the method is suitable for different scenes.
Drawings
Other features, objects and advantages of the invention will become apparent from the following detailed description of non-limiting embodiments, read with reference to the accompanying drawings. The drawings are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application, and together with the description serve to explain the principles of the application. It is obvious that the drawings described below show only some embodiments of the invention, and that for a person skilled in the art other drawings can be derived from them without inventive effort.
FIG. 1 is a flow chart of a vision protection monitoring method according to an embodiment of the present invention;
FIG. 2 is a schematic three-dimensional space diagram of the positional relationship between the user's eyes and the human face feature points;
FIG. 3 is a schematic three-dimensional space diagram of calculating a distance between a user's eye and a target object;
FIG. 4 is a flowchart of a method for monitoring eyesight protection according to an embodiment of the invention;
FIG. 5 is a schematic diagram of a vision protection monitoring system according to an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a vision protection monitoring device according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a flowchart of a vision protection monitoring method according to an embodiment of the present invention, specifically, the method includes the following steps:
S100: acquiring position information of a target object;
S200: acquiring position information of at least one face feature point of a user's face;
S300: calculating the distance H between the user's eyes and the target object according to the positional relationship between the user's eyes and the face feature points. The face feature point may include at least one of the tip of the nose, the left mouth corner, the right mouth corner, the chin point, the left earlobe and the right earlobe.
S400: acquiring vision protection parameters corresponding to the target object, wherein the vision protection parameters at least comprise a threshold distance H0 between the target object and the user's eyes;
S500: judging whether the distance H is smaller than the threshold distance H0;
If so, S600: sending warning information.
If not, returning to the step S100 and continuing to monitor, in the next monitoring period, whether the distance H between the user's face and the target object is smaller than the threshold distance H0.
In the step S400, the vision protection parameters corresponding to the target object are acquired, and they at least comprise a threshold distance H0 between the target object and the user's eyes. It can be understood that when the target object is a book, a mobile phone, a tablet computer or a television screen, the reasonable distance between the eyes and the target object, i.e. the distance that causes less harm to vision, differs. For example, the distance between a book or a handheld device and the eyes should be more than 30 cm; when operating a computer, the distance should be more than 60 cm; and the reasonable distance from a television also depends on the size of the display screen, being more than 7 to 9 times the diagonal length of the screen. The reasonable distance between a target object and the user's eyes is defined as the threshold distance H0. On this basis, multi-scene vision protection monitoring can be realized.
The vision protection parameters in the step S400 may be preset, or their specific values may be changed for different users.
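The mapping from an identified target object to its threshold distance H0 can be sketched as a simple lookup table. This is an illustrative assumption, not the patent's implementation; all names and the table layout are hypothetical, and the thresholds follow the distances given above (30 cm for books and handheld devices, 60 cm for computers, 7 to 9 times the diagonal for televisions).

```python
from typing import Optional

# Hypothetical sketch of the target-object -> vision-protection-parameter
# mapping table described above. Thresholds are in centimeters.
THRESHOLD_DISTANCE_CM = {
    "book": 30.0,      # books and handheld devices: more than 30 cm
    "phone": 30.0,
    "tablet": 30.0,
    "computer": 60.0,  # computer operation: more than 60 cm
}

def threshold_distance_cm(target: str, tv_diagonal_cm: Optional[float] = None) -> float:
    """Look up the threshold distance H0 for an identified target object."""
    if target == "tv":
        # The television threshold is derived from the screen diagonal;
        # 7x is the lower bound of the 7-9x range given in the text.
        if tv_diagonal_cm is None:
            raise ValueError("television threshold depends on the screen diagonal")
        return 7.0 * tv_diagonal_cm
    return THRESHOLD_DISTANCE_CM[target]
```

With a table like this, step S410 reduces to one dictionary lookup after the image recognition step has labeled the target object.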
Meanwhile, in different scenes, that is, when the eyesight protection monitoring system does not know the attribute of the target object, the method of the present invention further provides a step of automatically identifying the target object, and specifically before the step S400, the method may further include:
identifying attributes of the target object;
the step S400 further includes:
s410: and acquiring vision protection parameters corresponding to the target object according to the identified attributes of the target object and according to different target objects and a vision protection parameter mapping table.
Preferably, before the step S400, the method further includes:
acquiring a first image of a target object;
the attribute of the target object is identified by an image recognition technique, and the image recognition technique herein may employ a method of identifying different objects by analyzing objects in a picture or a video stream based on computer large-scale image training, but is not limited to this method.
In an embodiment of the present invention, the step S100 includes:
S101: acquiring a first depth image of the target object, wherein pixel points in the first depth image correspond to depth information. The first depth image may be obtained by a depth camera, such as a time-of-flight (TOF) camera: the camera emits modulated near-infrared light, which is reflected when it meets an object, and the camera converts the time difference or phase difference between emission and reflection into the distance of the photographed scene to generate depth information. Combined with the conventional image also obtained, the three-dimensional contour of the object can then be presented as a topographic map in which different colors represent different distances.
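The time-of-flight principle just described can be reduced to one formula: the emitted light travels to the object and back, so the depth at a pixel is half the round-trip time multiplied by the speed of light. A minimal sketch (function name illustrative, not from the patent):

```python
# Minimal sketch of the TOF principle: depth = (speed of light x
# round-trip time) / 2, since the light travels the distance twice.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth_m(round_trip_time_s: float) -> float:
    """Convert the emit/reflect time difference of one pixel into depth in meters."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

In practice TOF sensors measure a phase difference of the modulated light rather than the raw time, but the distance conversion is the same.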
Meanwhile, the depth information may be converted into a three-dimensional coordinate system with the camera as the origin, and in the step S102 the first coordinate information of the target object may be acquired, according to the first depth image obtained in S101, as the position information of the target object.
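The conversion of a depth-image pixel into the camera-centered three-dimensional coordinate system can be sketched with the standard pinhole camera model. The intrinsics fx, fy (focal lengths in pixels) and cx, cy (principal point) are assumptions not given in the text:

```python
# Hedged sketch of back-projecting a depth pixel into the 3-D coordinate
# system with the camera as origin, using a pinhole camera model.
# fx, fy, cx, cy are assumed camera intrinsics.
def pixel_to_camera_xyz(u: float, v: float, depth: float,
                        fx: float, fy: float, cx: float, cy: float):
    """Back-project pixel (u, v) with the given depth into (x, y, z)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

A pixel at the principal point maps straight onto the camera's optical axis, which is a quick sanity check for the intrinsics.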
Similarly, the obtaining of the position information of the at least one face feature point of the user face in step S200 may also be obtained by:
S201: acquiring a second depth image of the user's face, wherein pixel points in the second depth image correspond to depth information;
S202: identifying at least one face feature point from the second depth image;
S203: acquiring second coordinate information of the face feature point as the position information of the face feature point.
The reason the present invention does not directly acquire the position information of the user's eyes is that doing so may interfere with the normal use of the user's eyes, resulting in a series of uncomfortable and inconvenient use scenarios.
It should be noted that, the position relationship between the user' S eyes and the facial feature points in step S300 may be preset by the system, or may be obtained in advance by the following steps:
acquiring a third depth image of the face of the user, wherein pixel points in the third depth image correspond to depth information;
the face feature point positioning technology is carried out on face data by utilizing a face detection technology and a key point positioning technology, and the face feature point positioning method is roughly divided into three types: a detection method based on an Active Shape Model (ASM), a detection method based on cascade shape regression (CPR), a method based on deep learning and the like, so that the method can identify the pupils of the two eyes and at least one face characteristic point of the face of the user according to the third depth image by utilizing the technology; acquiring coordinate information of pupils of two eyes of a face of a user and third coordinate information of the face characteristic points; thereby obtaining the position relation between the eyes and the face characteristic points, wherein the position relation is the distance between the eyes and the face characteristic points or the space relation.
As a further example, the TOF camera in the vision protection monitoring system 100 acquires a third depth image of the user's face and identifies the pupils of both eyes and the chin by image recognition. The center point C1 of the two pupils may be taken as the reference point of the user's eyes, and the coordinate information of C1, together with the third coordinate information of the chin B1 as the face feature point, may be obtained in a three-dimensional coordinate system with the TOF camera A1 of the system as the origin, as shown in FIG. 2, and stored as preset information. The distance m between C1 and B1 may then be regarded as the positional relationship between this specific user's eyes and the face feature point.
When the method of the present invention is used to protect and monitor a user's vision, the vision protection monitoring system 100 is placed between the user's face and the target object 200, so that the TOF camera in the system can obtain the position information of the user's face and of the target object simultaneously, as shown in FIG. 3. With A2 as the origin, the position information of the target object obtained at this time is D, and the position information of the chin feature point of the user's face is B2. According to the positional relationship between the user's eyes and the face feature point, the position information of the center point C2 of the user's two pupils can be obtained, so the distance between D and C2, namely the distance H between the user's eyes and the target object, can be calculated.
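The computation above can be sketched in a few lines, assuming the stored positional relationship is an offset vector from the chin point to the pupil center recorded during calibration. Names are hypothetical, not from the patent:

```python
import math

# Sketch of the FIG. 3 distance computation: reconstruct the eye center
# C2 from the visible chin point B2 plus the stored offset, then take
# the Euclidean distance from C2 to the target-object position D.
def eye_center(chin_b2, eye_minus_chin_offset):
    """Reconstruct C2 from B2 and the stored positional relationship."""
    return tuple(b + o for b, o in zip(chin_b2, eye_minus_chin_offset))

def distance_h(target_d, chin_b2, eye_minus_chin_offset) -> float:
    """Distance H between the user's eyes and the target object."""
    return math.dist(eye_center(chin_b2, eye_minus_chin_offset), target_d)
```

Storing the full offset vector rather than only the scalar distance m keeps the reconstruction well defined when the head is not facing the camera squarely.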
In practical use, in order to achieve the effect of real-time vision protection and monitoring, the duration of each monitoring period may be set to T0, and the steps S100 to S500 are performed in each monitoring period, where the step S500 may further include:
judging whether the distance H of the current monitoring period is smaller than the threshold distance H0, the current monitoring period being the k-th monitoring period, with k ≥ 1;
if the distance H of the current monitoring period is smaller than the threshold distance H0, acquiring the abnormal duration Tk-1 of the previous monitoring period, where Tk-1 represents the time during the (k-1)-th monitoring period in which the distance H between the user's eyes and the target object is continuously smaller than the threshold distance H0, and performing step S501: calculating the abnormal duration of the current monitoring period as Tk = Tk-1 + T0; if the distance H of the current monitoring period is greater than or equal to the threshold distance H0, performing step S502: setting the abnormal duration of the current monitoring period to Tk = 0.
Then step S503 is performed: judging whether the abnormal duration Tk of the current monitoring period is greater than a second threshold value; if so, continuing with the step S600, otherwise continuing with the step S100, i.e. performing the steps S100 to S500 in the (k+1)-th monitoring period. Like the threshold distance H0 in the vision protection parameters, the second threshold value here may be an empirical value, and the eye-use duration that leads to eye fatigue may differ for different target objects. For example, the duration of reading a book should not exceed 60 minutes, while the duration of operating a computer, reading on a mobile device or watching television should not exceed 30 minutes; different second threshold values may also be set for different users.
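The per-period logic of steps S501 to S503 can be sketched as follows (function names are illustrative): the abnormal duration grows by T0 while H stays below H0, resets to zero otherwise, and a warning fires once it exceeds the second threshold value.

```python
# Sketch of the monitoring-period accumulation logic described above.
def update_abnormal_duration(t_prev: float, h: float, h0: float, t0: float) -> float:
    """S501: Tk = Tk-1 + T0 when H < H0; S502: Tk = 0 otherwise."""
    return t_prev + t0 if h < h0 else 0.0

def should_warn(t_k: float, second_threshold: float) -> bool:
    """S503: proceed to S600 when Tk exceeds the second threshold."""
    return t_k > second_threshold
```

Because the counter resets whenever one full period passes at a healthy distance, brief glances at the screen do not accumulate toward a warning.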
In the invention, the distance H of each monitoring period may be recorded for later query by the user. The recorded data can be displayed on a display end, for example the user's mobile phone APP, where the user can check the eye-use situation and analyze the historical data; combined with reasonable vision protection suggestions, the user can adjust eye-use habits and protect eyesight to the maximum extent. The display end may be any product or component with a display function, such as a mobile phone, tablet computer, television, monitor, notebook computer, digital photo frame, media player, watch device, pendant device, earphone or headset device, navigation device, wearable or miniature device, or an embedded device in a system equipped with a display, such as a self-service terminal or an automobile.
In an actual eye-use scene, the light environment is also an important factor affecting eyesight. Therefore, in the vision protection monitoring method of the present invention, before the step S400, the method further includes: acquiring the ambient illuminance of the target object. Illuminance, commonly measured in lux, represents the luminous flux received per unit area of the subject's surface: 1 lux equals 1 lumen per square meter, i.e. the luminous flux of a light source with a luminous intensity of 1 candela, at a distance of one meter, shining perpendicularly on each square meter of the subject.
At this time, the vision protection parameters in the step S400 further include a threshold illuminance corresponding to the target object. It is generally considered that an ambient illuminance between 50 lux and 60 lux is comfortable for the human eye, while at 300 lux the human eye becomes extremely uncomfortable. Likewise, illuminance that is too weak can cause eye fatigue; therefore, the threshold illuminance may be set in the range of 40 lux to 200 lux.
The step S500 further includes S510: judging whether the ambient illuminance is within the threshold illuminance range;
if not, S600: sending warning information;
if so, continuing to monitor in the next monitoring period whether the ambient illuminance meets the requirement, i.e. whether it is within the threshold illuminance range.
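Step S510 reduces to a range check. A minimal sketch, assuming the 40 to 200 lux range suggested above as the threshold illuminance (names illustrative):

```python
# Sketch of step S510: warn (S600) when the measured ambient
# illuminance falls outside the threshold range.
LUX_MIN, LUX_MAX = 40.0, 200.0

def ambient_illuminance_ok(ambient_lux: float,
                           lo: float = LUX_MIN, hi: float = LUX_MAX) -> bool:
    """True when the ambient illuminance is within the threshold range."""
    return lo <= ambient_lux <= hi
```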
The warning information includes at least one of a voice prompt, a screen display prompt or a vibration, but is not limited thereto. As time passes, the types of warnings may even be increased or the warning level raised, so as to fully remind the user in multiple ways, such as visually and audibly.
It should be noted that the step S510 of judging whether the ambient illuminance is within the threshold illuminance range may also be performed periodically, and its period may be the same as that of monitoring the distance H between the target object and the user's eyes. Similarly, the time during which the ambient illuminance continuously stays outside the threshold illuminance range may be recorded, and when this duration exceeds a certain threshold, warning information is sent to the user.
In the above method, the step of acquiring the ambient illuminance is only an example of an added parameter; similarly, a parameter for acquiring the illuminance of the target object itself may be added, to prompt the user when the light of the target object is too strong or too weak, and so on.
The embodiment of the present invention provides a vision protection monitoring system 100, see fig. 5, including an identification module M100, a calculation module M200, a storage module M300, a judgment module M400, and an alarm module M500, wherein:
the identification module M100 is configured to obtain position information of a target object and position information of at least one face feature point of the user's face;
the calculating module M200 is configured to calculate a distance H between the user's eyes and the target object according to a position relationship between the user's eyes and the face feature points;
the storage module M300 is configured to store eyesight protection parameters corresponding to the target object, where the eyesight protection parameters at least include a threshold distance H0 between the target object and the eyes of the user;
the judging module M400 is configured to judge the relationship between the distance H and the threshold distance H0;
the alarm module M500 is configured to send or not send warning information according to the determination result of the determination module.
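A rough, non-authoritative sketch of how the calculation and judgment modules might cooperate is given below. The use of the pupil midpoint, the coordinate values, and the 30 cm threshold are illustrative assumptions, not details taken from this application.

```python
import math

def eye_midpoint(left_pupil, right_pupil):
    """Midpoint of the two pupil coordinates (x, y, z)."""
    return tuple((l + r) / 2 for l, r in zip(left_pupil, right_pupil))

def distance_to_target(left_pupil, right_pupil, target):
    """Calculation module: Euclidean distance H between the eyes and the target."""
    return math.dist(eye_midpoint(left_pupil, right_pupil), target)

def should_warn(h, h0):
    """Judgment module: warn when the measured distance falls below H0."""
    return h < h0

# Illustrative coordinates in centimetres.
H = distance_to_target((0, 3, 0), (6, 3, 0), (3, 3, 20))
print(round(H, 1), should_warn(H, 30.0))  # → 20.0 True (assumed threshold H0 = 30 cm)
```

The alarm module would then emit (or suppress) the warning based on the boolean result, as described for module M500.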
In an embodiment, the identification module M100 may further include an image acquisition module M110, an image identification module M120, and a three-dimensional space module M130;
the image acquisition module M110 is configured to acquire an image of a target object and/or a face of a user; as in the above example, the image acquisition module M110 may be a TOF camera system;
the image recognition module M120 is configured to recognize attributes of the target object and/or human face feature points in the image of the human face of the user;
the three-dimensional space module M130 is configured to obtain first coordinate information of a target object and/or second coordinate information of a face feature point according to an image of the user's face and/or the image of the target object obtained by the image obtaining module, where the first coordinate information is used as position information of the target object, and the second coordinate information is used as position information of the face feature point.
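One plausible way a three-dimensional space module could turn a depth-image pixel (for example from the TOF camera mentioned above) into camera-space coordinates is ordinary pinhole back-projection. The intrinsic parameters below (fx, fy, cx, cy) are hypothetical placeholders, not values from this application.

```python
def pixel_to_3d(u, v, depth, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Back-project a depth-image pixel (u, v) carrying depth value `depth`
    into camera-space coordinates (X, Y, Z) using the pinhole camera model.
    fx, fy are the focal lengths in pixels and (cx, cy) is the principal
    point; all four are illustrative placeholder values."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel at the principal point maps straight onto the optical axis.
print(pixel_to_3d(320, 240, 1.0))  # → (0.0, 0.0, 1.0)
```

Coordinates obtained this way for the target object and for the face feature points would serve as the first and second coordinate information, respectively.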
In an embodiment, the image acquisition module M110, the image recognition module M120 and the three-dimensional space module M130 are connected through a wireless network, including but not limited to at least one of the following: a network protocol based on the Zigbee protocol; a network protocol based on the wireless networking specification Z-Wave; a network protocol based on the Wi-Fi (Wireless Fidelity) protocol; a network protocol based on the BLE (Bluetooth Low Energy) protocol; a network protocol based on the RF (Radio Frequency) 433 protocol, which uses the 433 MHz frequency band; a network protocol based on the RF 2.4G protocol, which uses the 2.4 GHz frequency band; and a network protocol based on the RF 5G protocol, which uses the 5 GHz frequency band.
The image recognition module M120 and the three-dimensional space module M130 acquire, through the wireless network, the image of the target object and/or the face of the user captured by the image acquisition module M110;
The three-dimensional space module M130 transmits the position information of the target object and the position information of the face feature point back to the image recognition module M120 through a wireless network.
Accordingly, the vision protection monitoring system 100 may further include a light detection module for monitoring the ambient illuminance and/or the illuminance of the target object in the respective eye-use scenarios.
An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 6. The electronic device 600 shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 6, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
The storage unit stores program code executable by the processing unit 610, so that the processing unit 610 performs the steps according to various exemplary embodiments of the present invention described in the vision protection monitoring method section above. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM)6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The memory unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any device (e.g., router, modem, etc.) that enables the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
Embodiments of the present invention also provide a computer-readable storage medium for storing a program, where the program, when executed, implements the steps of the aforementioned vision protection monitoring method. In some possible embodiments, aspects of the present invention may also be implemented in the form of a program product comprising program code; when the program product is run on a terminal device, the program code causes the terminal device to perform the steps according to various exemplary embodiments of the present invention described in the vision protection monitoring method section above.
Referring to fig. 7, a program product 800 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read-only memory (CD-ROM) including program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including but not limited to electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
In summary, the present invention provides an eyesight protection monitoring method, system, device and storage medium, the method comprising the following steps: acquiring position information of a target object; acquiring position information of at least one face characteristic point of the face of a user; calculating the distance H between the eyes of the user and the target object according to the position relation between the eyes of the user and the face characteristic points; acquiring vision protection parameters corresponding to the target object, wherein the vision protection parameters at least comprise a threshold distance H0 between the target object and the eyes of the user; judging whether the distance H is smaller than the threshold distance H0; and if so, sending warning information. By monitoring the distance between the eyes of the user and the target object in real time and promptly reminding the user of poor eye-use conditions, the method and the system of the invention play a role in protecting eyesight and are suitable for different scenarios.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments and it is not intended that the invention be limited to these specific details. It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.

Claims (15)

1. A vision protection monitoring method is characterized by comprising the following steps:
S100: acquiring position information of a target object;
S200: acquiring position information of at least one face characteristic point of a face of a user;
S300: calculating the distance H between the eyes of the user and the target object according to the position relation between the eyes of the user and the face characteristic points;
S400: acquiring vision protection parameters corresponding to the target object, wherein the vision protection parameters at least comprise a threshold distance H0 between the target object and the eyes of the user;
S500: judging whether the distance H is smaller than the threshold distance H0;
if so, S600: sending warning information.
2. The vision protection monitoring method of claim 1, wherein the human face feature point comprises at least one of a tip of a nose, a left corner of a mouth, a right corner of a mouth, a chin point, a left earlobe, and a right earlobe.
3. The vision protection monitoring method of claim 1, before the step of S400, further comprising:
identifying attributes of the target object;
the step S400 further includes:
s410: and acquiring vision protection parameters corresponding to the target object according to the identified attributes of the target object and according to different target objects and a vision protection parameter mapping table.
4. The vision protection monitoring method of claim 3, before the step S400, further comprising:
acquiring a first image of a target object;
attributes of the target object are identified by image recognition techniques.
5. The vision protection monitoring method of claim 1, wherein the S100 step includes:
s101: acquiring a first depth image of a target object, wherein pixel points in the first depth image correspond to depth information;
s102: and acquiring first coordinate information of a target object according to the first depth image as the position information of the target object.
6. The vision protection monitoring method of claim 1, wherein the S200 step includes:
s201: acquiring a second depth image of the face of the user, wherein pixel points in the second depth image correspond to depth information;
s202: identifying at least one face feature point from the second depth image;
s203: and acquiring second coordinate information of the face characteristic point as the position information of the face characteristic point.
7. The eyesight protection monitoring method of claim 1, wherein the positional relationship between the user's eyes and the human face feature points is obtained by:
acquiring a third depth image of the face of the user, wherein pixel points in the third depth image correspond to depth information;
recognizing two-eye pupils and at least one face characteristic point of the face of the user according to the third depth image;
acquiring coordinate information of pupils of two eyes of a face of a user and third coordinate information of the face characteristic points;
recording the position relation between the eyes of the user and the face characteristic points.
8. The eyesight protection monitoring method according to claim 1, wherein the duration of each monitoring period is set to T0, and the steps S100 to S500 are performed in each monitoring period, the step S500 including:
judging whether the distance H of the current monitoring period is smaller than the threshold distance H0, the current monitoring period being the kth monitoring period;
if the distance H of the current monitoring period is smaller than the threshold distance H0, acquiring the abnormal duration Tk-1 of the previous monitoring period, where Tk-1 indicates the time during which the distance H between the eyes of the user and the target object was continuously smaller than the threshold distance H0 in the (k-1)th monitoring period, and calculating the abnormal duration of the current monitoring period as Tk = Tk-1 + T0;
if the distance H of the current monitoring period is greater than or equal to the threshold distance H0, setting the abnormal duration of the current monitoring period to Tk = 0;
judging whether the abnormal duration Tk of the current monitoring period is greater than a second threshold value, and if so, continuing to the step S600.
9. The vision protection monitoring method of claim 8, further comprising the steps of: recording the distance H for each monitoring period.
10. The vision protection monitoring method of claim 1, before the step of S400, further comprising:
acquiring the ambient illuminance of a target object;
the vision protection parameters in the step S400 further include threshold illuminance corresponding to the target object;
the step S500 further includes S510: judging whether the ambient illuminance is within the threshold illuminance range;
if not, S600: and sending warning information.
11. A vision protection monitoring system, comprising an identification module, a calculation module, a storage module, a judgment module and an alarm module, wherein:
the identification module is used for acquiring the position information of the target object; and position information of at least one face feature point of the user face;
the calculation module is used for calculating the distance H between the user eyes and the target object according to the position relation between the user eyes and the face characteristic points;
the storage module is used for storing vision protection parameters corresponding to the target object, the vision protection parameters at least comprising a threshold distance H0 between the target object and the eyes of the user;
the judging module is used for judging the relationship between the distance H and the threshold distance H0;
and the alarm module is used for sending or not sending warning information according to the judgment result of the judgment module.
12. The vision protection monitoring system of claim 11, wherein the recognition module includes an image acquisition module, an image recognition module, and a three-dimensional space module;
the image acquisition module is used for acquiring an image of a target object and/or a face of a user;
the image recognition module is used for recognizing attributes of the target object and/or human face characteristic points in the image of the human face of the user;
the three-dimensional space module is used for obtaining first coordinate information of a target object and/or second coordinate information of a face characteristic point according to the image of the target object and/or the face of the user obtained by the image obtaining module, wherein the first coordinate information is used as position information of the target object, and the second coordinate information is used as position information of the face characteristic point.
13. The vision protection monitoring system of claim 12, wherein the image acquisition module, the image recognition module and the three-dimensional space module are connected by a wireless network;
the image recognition module and the three-dimensional space module acquire, through the wireless network, the images of the target object and/or the face of the user captured by the image acquisition module;
and the three-dimensional space module sends the position information of the target object and the position information of the face characteristic point back to the image identification module through a wireless network.
14. A vision protection monitoring device, comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform the steps of the vision protection monitoring method of any one of claims 1 to 10 via execution of the executable instructions.
15. A computer-readable storage medium storing a program which, when executed, performs the steps of the vision protection monitoring method of any one of claims 1 to 10.
CN201911117661.1A 2019-11-15 2019-11-15 Eyesight protection monitoring method, system, equipment and storage medium Pending CN110801229A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911117661.1A CN110801229A (en) 2019-11-15 2019-11-15 Eyesight protection monitoring method, system, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN110801229A true CN110801229A (en) 2020-02-18

Family

ID=69489990



Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113639871A (en) * 2020-04-24 2021-11-12 杭州海康威视***技术有限公司 Target object detection method, device and equipment and storage medium
CN114120357A (en) * 2021-10-22 2022-03-01 中山大学中山眼科中心 Neural network-based myopia prevention method and device
CN115035588A (en) * 2022-05-31 2022-09-09 中国科学院半导体研究所 Eyesight protection prompting method, device, storage medium and program product
CN115116088A (en) * 2022-05-27 2022-09-27 中国科学院半导体研究所 Myopia prediction method, apparatus, storage medium, and program product

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105573627A (en) * 2015-04-28 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Method and device for prompting user to protect eyes through intelligent glasses
CN105759971A (en) * 2016-03-08 2016-07-13 珠海全志科技股份有限公司 Method and system for automatically prompting distance from human eyes to screen
WO2016192565A1 (en) * 2015-06-02 2016-12-08 杭州镜之镜科技有限公司 Individual eye use monitoring system
CN106503645A (en) * 2016-10-19 2017-03-15 深圳大学 Monocular distance-finding method and system based on Android
CN108615012A (en) * 2018-04-27 2018-10-02 Oppo广东移动通信有限公司 Distance reminding method, electronic device and non-volatile computer readable storage medium storing program for executing
CN110349383A (en) * 2019-07-18 2019-10-18 浙江师范大学 A kind of intelligent eyeshield device and method




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200218