CN112954848A - Intelligent light control method and device, computer equipment and storage medium - Google Patents

Intelligent light control method and device, computer equipment and storage medium

Info

Publication number
CN112954848A
CN112954848A (application CN202110260081.9A)
Authority
CN
China
Prior art keywords: user, real-time, iris, information
Prior art date
Legal status
Pending
Application number
CN202110260081.9A
Other languages
Chinese (zh)
Inventor
王乾富
Current Assignee
Shenzhen Daxinglang Electronic Technology Co Ltd
Original Assignee
Shenzhen Daxinglang Electronic Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Daxinglang Electronic Technology Co Ltd filed Critical Shenzhen Daxinglang Electronic Technology Co Ltd
Priority to CN202110260081.9A
Publication of CN112954848A

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00 Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/10 Controlling the intensity of the light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Input (AREA)

Abstract

The application relates to an intelligent light control method and device, computer equipment and a storage medium. A real-time face image of a user is acquired; skin color information of the user is then obtained from the real-time face image through a pre-trained face model; real-time iris information of the user is obtained from the real-time face image through a pre-trained iris model; and finally light control is completed through a pre-trained light control model according to the skin color information and the real-time iris information. By capturing the user's skin color information and iris information, the invention formulates a dimming scheme tailored to the user, so that the color and brightness of the light are adjusted and light control is completed more intelligently.

Description

Intelligent light control method and device, computer equipment and storage medium
Technical Field
The invention relates to the technical field of intelligent light control, in particular to an intelligent light control method, an intelligent light control device, computer equipment and a storage medium.
Background
Existing LED desk lamps have a dimming function. In existing dimming schemes, the user either adjusts the light intensity manually as the environment changes, or the ambient illumination is acquired and the light is changed automatically in response; for example, when the ambient illumination is high, the luminance of the LED desk lamp's light source is increased to adapt to the environment and give the user a better experience. However, existing dimming schemes do not consider the characteristics of the user, and research shows that the eyes of people of different races differ and respond to light differently. If a dimming scheme suited to the user cannot be formulated according to the user's characteristics, a poor experience inevitably results.
Disclosure of Invention
In view of the above, it is necessary to provide an intelligent light control method and device, computer equipment and a storage medium that can formulate a dimming scheme suited to the user according to the user's skin color information and iris information.
In a first aspect, the present invention provides an intelligent light control method, including:
acquiring a real-time face image of a user;
obtaining skin color information of the user through a pre-trained face model according to the real-time face image;
acquiring real-time iris information of the user through a pre-trained iris model according to the real-time face image;
and completing light control through a pre-trained light control model according to the skin color information and the real-time iris information.
In a second aspect, the present invention provides an intelligent light control device, comprising:
the acquisition module is used for acquiring a face image of a user;
the face model module is used for acquiring the skin color information of the user through a pre-trained face model according to the real-time face image;
the iris model module is used for acquiring the real-time iris information of the user through a pre-trained iris model according to the real-time face image;
and the light control module is used for finishing light control through a pre-trained light control model according to the skin color information and the real-time iris information.
In a third aspect, the present invention provides a computer apparatus comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of:
acquiring a real-time face image of a user;
obtaining skin color information of the user through a pre-trained face model according to the real-time face image;
acquiring real-time iris information of the user through a pre-trained iris model according to the real-time face image;
and completing light control through a pre-trained light control model according to the skin color information and the real-time iris information.
In a fourth aspect, the present invention provides a computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of:
acquiring a real-time face image of a user;
obtaining skin color information of the user through a pre-trained face model according to the real-time face image;
acquiring real-time iris information of the user through a pre-trained iris model according to the real-time face image;
and completing light control through a pre-trained light control model according to the skin color information and the real-time iris information.
The application relates to an intelligent light control method and device, computer equipment and a storage medium. A real-time face image of a user is acquired; skin color information of the user is then obtained from the real-time face image through a pre-trained face model; real-time iris information of the user is obtained from the real-time face image through a pre-trained iris model; and finally light control is completed through a pre-trained light control model according to the skin color information and the real-time iris information. By capturing the user's skin color information and iris information, the invention formulates a dimming scheme tailored to the user, so that the color and brightness of the light are adjusted and light control is completed more intelligently.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in describing the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description are only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from the structures shown in these drawings without creative effort.
FIG. 1 is a flow diagram of an intelligent light control method in one embodiment;
FIG. 2 is a flow diagram of an intelligent light control method in one embodiment;
FIG. 3 is a flow diagram of an intelligent light control method in one embodiment;
FIG. 4 is a flow diagram of an intelligent light control method in one embodiment;
FIG. 5 is a block diagram of an embodiment of an intelligent lighting control apparatus;
FIG. 6 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, the present invention provides an intelligent light control method, which includes:
and 102, acquiring a real-time face image of the user.
The method comprises the steps of obtaining a real-time face image of a user through shooting equipment. Illustratively, a real-time facial image of a user can be captured by the intelligent AI camera.
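As a concrete illustration of this acquisition step, the sketch below grabs a frame from a camera and crops the face region. It is a minimal sketch only: the patent does not tie the method to any particular device or detector, so the OpenCV camera index and the Haar-cascade face detector used here are assumptions.

```python
import cv2


def capture_realtime_face(device_index: int = 0):
    """Grab one frame from the camera and return the cropped face region, or None."""
    cap = cv2.VideoCapture(device_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return None
    # Haar-cascade face detector shipped with OpenCV (an assumed choice; any detector works).
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]            # take the first detected face
    return frame[y:y + h, x:x + w]   # the real-time face image passed to the models
```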
Step 104, obtaining the skin color information of the user through a pre-trained face model according to the real-time face image.
The face model is used for obtaining the skin color information of the user according to the real-time face image. After the shooting equipment acquires the real-time face image of the user, the skin color information of the user can be obtained through the trained face model. For example, the face model may be stored in the camera, the terminal, or the cloud. From the real-time face image, the user can be classified, for example, as having a yellow, black, white or brown skin tone.
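A hedged sketch of how such a face model could be laid out and queried for skin color is shown below. The patent does not disclose a network architecture, so the small PyTorch CNN, the three output heads and the four skin-tone classes are illustrative assumptions only.

```python
import torch
import torch.nn as nn

SKIN_CLASSES = ["yellow", "black", "white", "brown"]  # categories named in the description


class FaceModel(nn.Module):
    """Assumed multi-head face model: skin color, age (binned) and gender."""

    def __init__(self, num_skin: int = 4, num_age_bins: int = 10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.skin_head = nn.Linear(32, num_skin)
        self.age_head = nn.Linear(32, num_age_bins)
        self.gender_head = nn.Linear(32, 2)

    def forward(self, x):
        feats = self.backbone(x)
        return self.skin_head(feats), self.age_head(feats), self.gender_head(feats)


def predict_skin_color(model: FaceModel, face_tensor: torch.Tensor) -> str:
    """face_tensor: a (1, 3, H, W) normalized crop of the real-time face image."""
    with torch.no_grad():
        skin_logits, _, _ = model(face_tensor)
    return SKIN_CLASSES[int(skin_logits.argmax(dim=1))]
```

The age and gender heads are reused by the login check described later (steps 202 to 206).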
Step 106, acquiring real-time iris information of the user through a pre-trained iris model according to the real-time face image.
The iris model is used for obtaining the iris information of the user according to the real-time face image. From the real-time face image of the user, the trained iris model obtains the iris information of the user, including but not limited to the movement state of the user's eyeballs and the contraction state of the user's pupils. For example, the iris model may be stored in the camera, the terminal, or the cloud.
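For the later steps it helps to fix what "real-time iris information" contains. The container below simply bundles the fields the description mentions (eyeball movement state, pupil contraction state, iris color, and optionally the gaze position used in fig. 3); the field names and value sets are assumptions, not terms defined by the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class IrisInfo:
    eyeball_movement: str                          # e.g. "active", "slow", "still"
    pupil_contraction: str                         # e.g. "constricted", "normal", "dilated"
    iris_color: Optional[str] = None               # used when training the light control model
    gaze_xy: Optional[Tuple[float, float]] = None  # real-time gaze position, if estimated
```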
Step 108, completing light control through a pre-trained light control model according to the skin color information and the real-time iris information.
The light control model is used for formulating a dimming scheme suited to the user according to the skin color information and the iris information of the user. Illustratively, the light control model is obtained by training a neural network model: the skin color, age and gender output by the face model, together with the pupil contraction state, eyeball movement state and iris color output by the iris model, are used as inputs of the neural network model, and the light control model is obtained through training. For example, the light control model may be stored in the camera, the terminal, or the cloud.
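A minimal sketch of such a light control model is given below, assuming the user features have already been encoded into a fixed-length vector. The two outputs, color temperature and brightness, and their ranges are assumptions; the patent only requires that a neural network maps the face-model and iris-model outputs to a dimming scheme.

```python
import torch
import torch.nn as nn


class LightControlModel(nn.Module):
    """Assumed regressor from encoded user features to a dimming scheme."""

    def __init__(self, in_features: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, 32), nn.ReLU(),
            nn.Linear(32, 2))  # raw scores for [color temperature, brightness]

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        raw = self.net(features)
        color_temp = 2700 + 3800 * torch.sigmoid(raw[:, 0])  # 2700 K .. 6500 K (assumed range)
        brightness = 100 * torch.sigmoid(raw[:, 1])          # 0 .. 100 % (assumed range)
        return torch.stack([color_temp, brightness], dim=1)
```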
In one embodiment, the skin color information and the iris information of the user are obtained through the trained face model and iris model respectively; in particular, the user's eye fatigue state can further be derived from the iris information. From the user's skin color information and eye fatigue information, the trained light control model can formulate a dimming scheme for the user.
The application relates to an intelligent light control method in which a real-time face image of a user is acquired; skin color information of the user is then obtained from the real-time face image through a pre-trained face model; real-time iris information of the user is obtained from the real-time face image through a pre-trained iris model; and finally light control is completed through a pre-trained light control model according to the skin color information and the real-time iris information. By capturing the user's skin color information and iris information, the invention formulates a dimming scheme tailored to the user, so that the color and brightness of the light are adjusted and light control is completed more intelligently.
In one embodiment, as shown in fig. 2, the intelligent light control method is applied to an intelligent light control system, and the method further includes: acquiring the user information used by the user to log in to the system, wherein the user information includes actual age and actual gender.
To provide better security for the user, the intelligent light control system has an account login function: the user logs in to the system with preset user information, and after the login succeeds the system begins to formulate a dimming scheme for the user. Illustratively, the user information includes, but is not limited to, the user's actual age and actual gender.
Before the obtaining of the skin color information of the user through a pre-trained face model according to the real-time face image, the method further comprises the following steps:
Step 202, obtaining the current age and the current gender of the user through a pre-trained face model according to the real-time face image.
To make the login more secure, after the shooting equipment acquires the real-time face image of the user, the trained face model also estimates the user's current age and current gender from that image, and these estimates are compared with the registered user information.
Step 204, judging whether the current age of the user is consistent with the actual age; and judging whether the current gender of the user is consistent with the actual gender.
Whether the current age obtained by the face model is consistent with the actual age, and whether the current gender obtained by the face model is consistent with the actual gender, together determine whether the user passes the login verification.
Step 206, if the current age of the user is consistent with the actual age, and the current gender of the user is consistent with the actual gender, the user successfully logs in the system.
Only when the current age of the user is consistent with the actual age and, at the same time, the current gender is consistent with the actual gender does the user successfully log in to the system. If only one of the two matches, the user cannot log in.
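The sketch below illustrates this two-condition check. The five-year age tolerance is an assumption; the patent only requires that the predicted age and gender both be "consistent" with the registered values.

```python
from dataclasses import dataclass


@dataclass
class UserInfo:
    actual_age: int
    actual_gender: str  # "male" / "female"


def login_allowed(predicted_age: int, predicted_gender: str,
                  registered: UserInfo, age_tolerance: int = 5) -> bool:
    """Both predictions from the face model must match the registered user information."""
    age_ok = abs(predicted_age - registered.actual_age) <= age_tolerance
    gender_ok = predicted_gender == registered.actual_gender
    # Matching only one of the two is not enough (step 206).
    return age_ok and gender_ok
```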
In one embodiment, as shown in fig. 3, after acquiring the real-time iris information of the user through a pre-trained iris model according to the real-time face image, the method further includes:
Step 302, determining the real-time position of the user's gaze according to the real-time iris information based on the iris model.
The iris information includes the movement state of the eyeballs. From the eyeball movement state contained in the user's iris information, the trained iris model can obtain the real-time position of the user's gaze, and a real-time dimming scheme is then formulated for the user according to that gaze position.
Step 304, based on the light control model, completing light control according to the skin color information and the real-time position of the user's gaze.
The light control model can complete light control based on the skin color information of the user and the real-time position of the user's gaze.
In one embodiment, the light control model determines the light color and the light intensity according to the skin color information of the user, and determines the light irradiation range according to the real-time position of the user's gaze, thereby completing the light control.
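A hedged sketch of that embodiment follows: the skin-tone category selects the light color and intensity, and the gaze position steers the irradiation range. The per-category presets and the fixed beam radius are invented for illustration; the patent gives no concrete values.

```python
from typing import Dict, Tuple

# Assumed color/intensity presets per skin-tone category (illustrative values only).
SKIN_PRESETS: Dict[str, Dict[str, int]] = {
    "yellow": {"color_temp_k": 4000, "intensity_pct": 70},
    "black":  {"color_temp_k": 4500, "intensity_pct": 80},
    "white":  {"color_temp_k": 3500, "intensity_pct": 60},
    "brown":  {"color_temp_k": 4200, "intensity_pct": 75},
}


def dimming_scheme(skin_color: str, gaze_xy: Tuple[float, float]) -> dict:
    """Color and intensity from skin color; irradiation range centered on the gaze point."""
    preset = SKIN_PRESETS[skin_color]
    return {**preset, "beam_center": gaze_xy, "beam_radius_cm": 30}
```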
In one embodiment, as shown in fig. 4, after acquiring the real-time iris information of the user through a pre-trained iris model according to the real-time face image, the method further includes:
Step 402, determining real-time eye fatigue information of the user according to the real-time iris information based on the iris model.
To make light control more accurate, the iris model is also used for determining the user's real-time eye fatigue information. Illustratively, the iris model determines the user's real-time eye fatigue information from real-time fatigue evaluation parameters of the user's eye movement characteristics, which include the movement of the eyelids and the movement state of the eyeballs.
In one embodiment, wakefulness corresponds to normally opened eyes, rapid blinking, active eyeballs and concentrated attention; fatigue refers to a tendency of the eyes to close, reduced eyeball activity, a gaze that stays still, or squinting to resist tiredness; severe fatigue means a strong tendency of the eyes to close, with the eyes staying closed for long periods, a reduced degree of opening, and the like.
Step 404, based on the light control model, completing light control according to the skin color information and the real-time eye fatigue information.
The light control model determines the illumination intensity and the illumination color according to the user's skin color information and real-time eye fatigue information, thereby completing the light control.
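The heuristic below sketches one way to turn the three states named above into a light adjustment. The thresholds and the dimmer/warmer offsets are assumptions; the patent does not specify how the fatigue evaluation parameters are combined.

```python
def fatigue_state(blink_rate_per_min: float, eye_openness: float,
                  eyeball_active: bool) -> str:
    """Map simple eyelid/eyeball features to the states described in the text."""
    if eye_openness < 0.3:
        return "severely_fatigued"   # strong tendency to close, eyes barely open
    if blink_rate_per_min < 8 or not eyeball_active:
        return "fatigued"            # eyes tending to close, gaze staying still
    return "awake"                   # normal opening, rapid blinking, active eyeballs


def adjust_for_fatigue(base_intensity_pct: float, base_color_temp_k: float, state: str):
    """Dim and warm the light as fatigue increases (an illustrative policy only)."""
    if state == "fatigued":
        return base_intensity_pct * 0.8, base_color_temp_k - 500
    if state == "severely_fatigued":
        return base_intensity_pct * 0.6, base_color_temp_k - 1000
    return base_intensity_pct, base_color_temp_k
```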
In one embodiment, determining real-time asthenopia information of the user based on the real-time iris information comprises: acquiring age information of a user; and determining real-time eye fatigue information of the user according to the age information and the real-time iris information based on the iris model.
People of different ages differ in their sensitivity to light intensity and illumination color. To make light control more accurate and improve the user experience, the iris model is also used for determining the user's age information; the user's real-time eye fatigue information is then determined from the age information and the real-time fatigue evaluation parameters of the user's eye movement characteristics, so that the light control model can determine the illumination intensity and illumination color and thereby complete the light control.
In one embodiment, the face model is trained based on a neural network model, and the method further comprises: receiving first training face images obtained from different shooting orientations, and determining a first label corresponding to the first training face images, wherein the first label comprises: skin color, age, gender; and according to the first training face image and the corresponding first label, obtaining the face model through the neural network model training.
First training face images acquired by the shooting equipment from different shooting orientations are received, and the first labels such as skin color, age and gender corresponding to the first training face images are determined; the first training face images together with their first labels are used as the training input of the neural network model, and the face model is obtained by training. Illustratively, five face images of each sample, acquired by the intelligent AI camera from different shooting orientations, are received; the first labels such as skin color, age and gender corresponding to each face image of each sample are confirmed; each face image with its corresponding first labels is fed to the neural network model; and the trained face model is obtained through repeated training over many samples.
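A minimal training sketch for such a face model is shown below, assuming the FaceModel class from the earlier sketch and a dataset that yields a face crop with its (skin color, age bin, gender) first labels. The dataset layout, batch size and learning rate are assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader


def train_face_model(model: nn.Module, dataset, epochs: int = 10, lr: float = 1e-3):
    """Joint training of the three heads on (image, skin, age, gender) samples."""
    loader = DataLoader(dataset, batch_size=32, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, skin_lbl, age_lbl, gender_lbl in loader:
            skin_out, age_out, gender_out = model(images)
            # One loss term per first-label component: skin color, age, gender.
            loss = (criterion(skin_out, skin_lbl)
                    + criterion(age_out, age_lbl)
                    + criterion(gender_out, gender_lbl))
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return model
```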
In one embodiment, the iris model is trained based on a neural network model, the method further comprising: receiving a second training face image obtained under different illumination intensities, and determining a second label corresponding to the second training face image, wherein the second label comprises: contraction of the pupil, movement state of the eyeball, iris color; and training the neural network model to obtain the iris model according to the second training face image and the corresponding second label.
Second training face images obtained by the shooting equipment from different shooting orientations are received, and the second labels such as the pupil contraction state, the eyeball movement state and the iris color corresponding to the second training face images are determined; the second training face images together with their second labels are used as the training input of the neural network model, and the iris model is obtained by training. Illustratively, five face images of each sample, acquired by the intelligent AI camera from different shooting orientations, are received; the second labels such as pupil contraction state, eyeball movement state and iris color corresponding to each face image of each sample are confirmed; each face image with its corresponding second labels is fed to the neural network model; and the trained iris model is obtained through repeated training over many samples.
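The iris model can be trained in the same way as the face model above, only with the second labels listed here. The sketch below shows an assumed three-head architecture; the class counts for pupil contraction, eyeball movement and iris color are illustrative.

```python
import torch.nn as nn


class IrisModel(nn.Module):
    """Assumed multi-head iris model trained on the second labels."""

    def __init__(self, feat_dim: int = 32):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.pupil_head = nn.Linear(feat_dim, 3)     # constricted / normal / dilated
        self.movement_head = nn.Linear(feat_dim, 3)  # active / slow / still
        self.color_head = nn.Linear(feat_dim, 5)     # coarse iris-color classes

    def forward(self, x):
        feats = self.backbone(x)
        return self.pupil_head(feats), self.movement_head(feats), self.color_head(feats)
```

Training then mirrors train_face_model above, with one cross-entropy term per second-label head.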
As shown in fig. 5, the present invention provides an intelligent light control device, which includes:
an obtaining module 502, configured to obtain a face image of a user;
a face model module 504, configured to obtain skin color information of the user through a pre-trained face model according to the real-time face image;
an iris model module 506, configured to obtain real-time iris information of the user through a pre-trained iris model according to the real-time face image;
and the light control module 508 is configured to complete light control through a pre-trained light control model according to the skin color information and the real-time iris information.
FIG. 6 shows the internal structure of a computer device in one embodiment. The computer device can be an intelligent light control device, or a terminal or server connected with the intelligent light control device. As shown in fig. 6, the computer device includes a processor, a memory, and a network interface connected by a system bus. The memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium of the computer device stores an operating system and may also store a computer program which, when executed by the processor, causes the processor to implement the intelligent light control method. The internal memory may also store a computer program which, when executed by the processor, causes the processor to perform the intelligent light control method. The network interface is used for communicating with external devices. Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply; a particular computing device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, the intelligent light control method provided by the present application may be implemented in the form of a computer program, and the computer program may be run on a computer device as shown in fig. 6. The memory of the computer equipment can store various program templates forming the intelligent light control device. For example, the acquisition module 502, the face model module 504, the iris model module 506, and the light control module 508.
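The fragment below sketches how the four program modules of fig. 5 could be wired together on such a computer device; the class, the method names and the lamp-driver interface are assumptions introduced only for illustration.

```python
class IntelligentLightController:
    """Assumed glue object combining the modules shown in fig. 5."""

    def __init__(self, acquisition, face_model, iris_model, light_model, lamp):
        self.acquisition = acquisition   # acquisition module 502
        self.face_model = face_model     # face model module 504
        self.iris_model = iris_model     # iris model module 506
        self.light_model = light_model   # light control module 508
        self.lamp = lamp                 # LED driver interface (assumed)

    def step(self):
        face = self.acquisition.capture()                # real-time face image
        skin = self.face_model.predict_skin_color(face)  # skin color information
        iris = self.iris_model.predict(face)             # real-time iris information
        scheme = self.light_model.decide(skin, iris)     # dimming scheme
        self.lamp.apply(scheme)                          # adjust color and brightness
```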
A computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of: acquiring a real-time face image of a user; obtaining skin color information of the user through a pre-trained face model according to the real-time face image; acquiring real-time iris information of the user through a pre-trained iris model according to the real-time face image; and completing light control through a pre-trained light control model according to the skin color information and the real-time iris information.
In one embodiment, the method is applied to an intelligent light control system, and the method further comprises the following steps: acquiring user information used for logging in the system by a user, wherein the user information comprises actual age and actual gender; before the obtaining of the skin color information of the user through a pre-trained face model according to the real-time face image, the method further comprises the following steps: acquiring the current age and the current gender of the user through a pre-trained face model according to the real-time face image; judging whether the current age of the user is consistent with the actual age; judging whether the current gender of the user is consistent with the actual gender; and if the current age of the user is consistent with the actual age and the current gender of the user is consistent with the actual gender, the user successfully logs in the system.
In one embodiment, after obtaining the real-time iris information of the user through a pre-trained iris model according to the real-time face image, the method further includes: determining a real-time location of the user's gaze from the real-time iris information based on the iris model; and based on the light control model, completing light control according to the skin color information and the real-time position of the user's gaze.
In one embodiment, after obtaining the real-time iris information of the user through a pre-trained iris model according to the real-time face image, the method further includes: determining real-time asthenopia information of the user from the real-time iris information based on the iris model; and based on the light control model, completing light control according to the skin color information and the real-time eye fatigue information.
In one embodiment, the determining real-time eye fatigue information of the user according to the real-time iris information comprises: acquiring age information of a user; and determining real-time eye fatigue information of the user according to the age information and the real-time iris information based on the iris model.
In one embodiment, the face model is trained based on a neural network model, and the method further includes: receiving first training face images obtained from different shooting orientations, and determining a first label corresponding to the first training face images, wherein the first label comprises: skin color, age, gender; and according to the first training face image and the corresponding first label, obtaining the face model through the neural network model training.
In one embodiment, the iris model is trained based on a neural network model, the method further comprising: receiving a second training face image obtained under different illumination intensities, and determining a second label corresponding to the second training face image, wherein the second label comprises: contraction of the pupil, movement state of the eyeball, iris color; and training the neural network model to obtain the iris model according to the second training face image and the corresponding second label.
A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of: acquiring a real-time face image of a user; obtaining skin color information of the user through a pre-trained face model according to the real-time face image; acquiring real-time iris information of the user through a pre-trained iris model according to the real-time face image; and completing light control through a pre-trained light control model according to the skin color information and the real-time iris information.
In one embodiment, the method is applied to an intelligent light control system, and the method further comprises the following steps: acquiring user information used for logging in the system by a user, wherein the user information comprises actual age and actual gender; before the obtaining of the skin color information of the user through a pre-trained face model according to the real-time face image, the method further comprises the following steps: acquiring the current age and the current gender of the user through a pre-trained face model according to the real-time face image; judging whether the current age of the user is consistent with the actual age; judging whether the current gender of the user is consistent with the actual gender; and if the current age of the user is consistent with the actual age and the current gender of the user is consistent with the actual gender, the user successfully logs in the system.
In one embodiment, after obtaining the real-time iris information of the user through a pre-trained iris model according to the real-time face image, the method further includes: determining a real-time location of the user's gaze from the real-time iris information based on the iris model; and based on the light control model, completing light control according to the skin color information and the real-time position of the user's gaze.
In one embodiment, after obtaining the real-time iris information of the user through a pre-trained iris model according to the real-time face image, the method further includes: determining real-time asthenopia information of the user from the real-time iris information based on the iris model; and based on the light control model, completing light control according to the skin color information and the real-time eye fatigue information.
In one embodiment, the determining real-time eye fatigue information of the user according to the real-time iris information comprises: acquiring age information of a user; and determining real-time eye fatigue information of the user according to the age information and the real-time iris information based on the iris model.
In one embodiment, the face model is trained based on a neural network model, and the method further includes: receiving first training face images obtained from different shooting orientations, and determining a first label corresponding to the first training face images, wherein the first label comprises: skin color, age, gender; and according to the first training face image and the corresponding first label, obtaining the face model through the neural network model training.
In one embodiment, the iris model is trained based on a neural network model, the method further comprising: receiving a second training face image obtained under different illumination intensities, and determining a second label corresponding to the second training face image, wherein the second label comprises: contraction of the pupil, movement state of the eyeball, iris color; and training the neural network model to obtain the iris model according to the second training face image and the corresponding second label.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when executed. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For the sake of brevity, not all possible combinations of these technical features are described, but any such combination should be considered within the scope of this specification as long as it contains no contradiction.
The above examples express only several embodiments of the present application, and their description is relatively specific and detailed, but they are not to be construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and improvements can be made without departing from the concept of the present application, and these fall within the scope of protection of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An intelligent light control method, characterized in that the method comprises:
acquiring a real-time face image of a user;
obtaining skin color information of the user through a pre-trained face model according to the real-time face image;
acquiring real-time iris information of the user through a pre-trained iris model according to the real-time face image;
and completing light control through a pre-trained light control model according to the skin color information and the real-time iris information.
2. The method of claim 1, wherein the method is applied to a smart light control system, and the method further comprises:
acquiring user information used for logging in the system by a user, wherein the user information comprises actual age and actual gender;
before the obtaining of the skin color information of the user through a pre-trained face model according to the real-time face image, the method further comprises the following steps:
acquiring the current age and the current gender of the user through a pre-trained face model according to the real-time face image;
judging whether the current age of the user is consistent with the actual age; judging whether the current gender of the user is consistent with the actual gender;
and if the current age of the user is consistent with the actual age and the current gender of the user is consistent with the actual gender, the user successfully logs in the system.
3. The method of claim 1, wherein after obtaining the real-time iris information of the user through the pre-trained iris model according to the real-time face image, the method further comprises:
determining a real-time location of the user's gaze from the real-time iris information based on the iris model;
and based on the light control model, completing light control according to the skin color information and the real-time position of the user's gaze.
4. The method of claim 1, wherein after obtaining the real-time iris information of the user through the pre-trained iris model according to the real-time face image, the method further comprises:
determining real-time asthenopia information of the user from the real-time iris information based on the iris model;
and based on the light control model, completing light control according to the skin color information and the real-time eye fatigue information.
5. The method of claim 4, wherein determining real-time eyestrain information for the user from the real-time iris information comprises:
acquiring age information of a user;
and determining real-time eye fatigue information of the user according to the age information and the real-time iris information based on the iris model.
6. The method of claim 1, wherein the face model is trained based on a neural network model, the method further comprising:
receiving first training face images obtained from different shooting orientations, and determining a first label corresponding to the first training face images, wherein the first label comprises: skin color, age, gender;
and according to the first training face image and the corresponding first label, obtaining the face model through the neural network model training.
7. The method of claim 1, wherein the iris model is trained based on a neural network model, the method further comprising:
receiving a second training face image obtained under different illumination intensities, and determining a second label corresponding to the second training face image, wherein the second label comprises: contraction of the pupil, movement state of the eyeball, iris color;
and training the neural network model to obtain the iris model according to the second training face image and the corresponding second label.
8. An intelligent light control device, characterized in that the device comprises:
the acquisition module is used for acquiring a face image of a user;
the face model module is used for acquiring the skin color information of the user through a pre-trained face model according to the real-time face image;
the iris model module is used for acquiring the real-time iris information of the user through a pre-trained iris model according to the real-time face image;
and the light control module is used for finishing light control through a pre-trained light control model according to the skin color information and the real-time iris information.
9. A computer device comprising a memory and a processor, the memory storing a computer program that, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, causes the processor to carry out the steps of the method according to any one of claims 1 to 7.
CN202110260081.9A 2021-03-10 2021-03-10 Intelligent light control method and device, computer equipment and storage medium Pending CN112954848A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110260081.9A CN112954848A (en) 2021-03-10 2021-03-10 Intelligent light control method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112954848A true CN112954848A (en) 2021-06-11

Family

ID=76229236

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110260081.9A Pending CN112954848A (en) 2021-03-10 2021-03-10 Intelligent light control method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112954848A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113365382A (en) * 2021-08-10 2021-09-07 深圳市信润富联数字科技有限公司 Light control method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107480588A (en) * 2017-07-07 2017-12-15 广东欧珀移动通信有限公司 The control method and electronic installation of infrared light supply component
CN107563325A (en) * 2017-08-31 2018-01-09 广东小天才科技有限公司 Method and device for testing fatigue degree and terminal equipment
CN108131791A (en) * 2017-12-04 2018-06-08 广东美的制冷设备有限公司 Control method, device and the server of home appliance
CN110807351A (en) * 2019-08-28 2020-02-18 杭州勒格网络科技有限公司 Intelligent vehicle-mounted fatigue detection system, method and device based on face recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210611