CN117545122A - LED lamp array control method, device, storage medium and equipment - Google Patents

LED lamp array control method, device, storage medium and equipment

Info

Publication number
CN117545122A
Authority
CN
China
Prior art keywords
data
led lamp
model
lamp array
feature extraction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311679437.8A
Other languages
Chinese (zh)
Inventor
赵昌平
李正权
李洁儒
张熙
苏炜
胡夏林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Science & Technology Infrastructure Center
Original Assignee
Guangdong Science & Technology Infrastructure Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Science & Technology Infrastructure Center filed Critical Guangdong Science & Technology Infrastructure Center
Priority to CN202311679437.8A priority Critical patent/CN117545122A/en
Publication of CN117545122A publication Critical patent/CN117545122A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 45/00 Circuit arrangements for operating light-emitting diodes [LED]
    • H05B 45/10 Controlling the intensity of the light
    • H05B 45/20 Controlling the colour of the light
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/105 Controlling the light source in response to determined parameters
    • H05B 47/11 Controlling the light source in response to determined parameters by determining the brightness or colour temperature of ambient light
    • H05B 47/115 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B 47/12 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by detecting audible sound
    • H05B 47/165 Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]

Landscapes

  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The invention discloses an LED lamp array control method, device, storage medium and equipment. Behavior data of a user and state parameters of the environment where the LED lamp array is located are collected; the acquired data are preprocessed with a pre-trained preprocessing model; the preprocessed data are analyzed and their features extracted with a pre-trained enhanced environment perception data driven feature extraction model; based on the results of data analysis and feature extraction, a control strategy for the LED lamp array is determined with a pre-trained intelligent decision model; and the LED lamp array is controlled according to the control strategy. By collecting user behavior data and environmental parameters, performing data processing and feature extraction with independently developed algorithms, and formulating an intelligent control strategy from the results, the invention realizes personalized, intelligent and real-time LED lamp array control. The method and device can be applied to indoor illumination, display, advertising and other fields and have broad market prospects.

Description

LED lamp array control method, device, storage medium and equipment
Technical Field
The invention relates to the field of smart homes, and in particular to an LED lamp array control method, device, storage medium and equipment.
Background
With rising living standards and advances in science and technology, LED lamp arrays have become increasingly popular. Based on LED light-source technology, the LED lamp array breaks with the tradition of single-point decorative lighting; its multi-point lighting readily enables integrated control of different color temperatures, brightness levels and lighting positions.
Conventional LED lamp arrays typically use predetermined parameters and fixed algorithms to control the brightness, color and motion of the array. This approach lacks flexibility and adaptability and cannot meet users' personalized control needs.
Disclosure of Invention
In order to solve the problems, the invention provides a method, a device, a storage medium and equipment for controlling an LED lamp array, which can realize personalized, intelligent and real-time LED lamp array control according to collected user behavior data and environmental conditions.
The embodiment of the invention provides a control method of an LED lamp array, which comprises the following steps:
collecting behavior data of a user and state parameters of an environment where the LED lamp array is located;
preprocessing the acquired data by adopting a pre-trained preprocessing model;
analyzing and extracting features of the preprocessed data by adopting a pre-trained enhanced environment perception data driven feature extraction model;
based on the results of data analysis and feature extraction, determining a control strategy of the LED lamp array by adopting an intelligent decision model obtained through pre-training;
and controlling the LED lamp array according to the control strategy.
Preferably, the preprocessing process comprises:
data cleaning is carried out on the collected data to remove abnormal values, missing values and noise;
normalizing the cleaned data;
and carrying out smoothing treatment on the normalized data.
As a preferred solution, the pre-training process of the enhanced environment perception data driven feature extraction model specifically includes:
acquiring a training data set D = {(xᵢ, yᵢ)}, where xᵢ is an input feature vector and yᵢ is the output value corresponding to the input feature vector xᵢ;
defining a feature extraction model F(x; θ), where F is a network model, x is the input feature vector and θ is the network parameter;
constructing a multi-layer neural network and learning the network parameter θ through forward propagation and back propagation;
measuring the difference between the predicted value of the feature extraction model and the output value in the data set with a loss function;
and minimizing the loss function with a gradient descent optimization algorithm to optimize the network parameter θ, obtaining the enhanced environment perception data driven feature extraction model.
Further, the enhanced environment perception data driven feature extraction model includes an environment perception extraction model and a behavior feature extraction model.
Preferably, the intelligent decision model comprises a brightness adjustment model, a color adjustment model and a motion mode adjustment model.
Preferably, the brightness adjustment model is A = X₁·η₁ + X₂·η₂;
the color adjustment model is B = X₁·η₃ + X₂·η₄;
the motion mode adjustment model is C = X₁·η₅ + X₂·η₆;
where A, B and C are the brightness value, the color value and the motion mode value respectively, different motion mode values correspond to different motion modes, X₁ is the user behavior, X₂ is the environmental state, η₁, η₃ and η₅ are behavior weights, and η₂, η₄ and η₆ are state weights.
As a preferable scheme, the state parameters of the environment where the LED lamp array is located comprise ambient illumination intensity, temperature information and humidity information.
The embodiment of the invention provides an LED lamp array control device, which comprises:
the data acquisition module is used for acquiring behavior data of a user and state parameters of an environment where the LED lamp array is positioned;
the preprocessing module is used for preprocessing the acquired data by adopting a pre-trained preprocessing model;
the feature extraction module is used for analyzing and extracting features of the preprocessed data by adopting a pre-trained enhanced environment perception data driving feature extraction model;
the strategy calculation module is used for determining the control strategy of the LED lamp array by adopting an intelligent decision model obtained by pre-training based on the results of data analysis and feature extraction;
and the control module is used for controlling the LED lamp array according to the control strategy.
Preferably, the preprocessing module is configured to:
data cleaning is carried out on the collected data to remove abnormal values, missing values and noise;
normalizing the cleaned data;
and carrying out smoothing treatment on the normalized data.
As a preferred solution, the pre-training process of the enhanced environment perception data driven feature extraction model specifically includes:
acquiring a training data set D = {(xᵢ, yᵢ)}, where xᵢ is an input feature vector and yᵢ is the output value corresponding to the input feature vector xᵢ;
defining a feature extraction model F(x; θ), where F is a network model, x is the input feature vector and θ is the network parameter;
constructing a multi-layer neural network and learning the network parameter θ through forward propagation and back propagation;
measuring the difference between the predicted value of the feature extraction model and the output value in the data set with a loss function;
and minimizing the loss function with a gradient descent optimization algorithm to optimize the network parameter θ, obtaining the enhanced environment perception data driven feature extraction model.
Further, the enhanced environment perception data driven feature extraction model includes an environment perception extraction model and a behavior feature extraction model.
Preferably, the intelligent decision model comprises a brightness adjustment model, a color adjustment model and a motion mode adjustment model.
Further, the brightness adjustment model is A = X₁·η₁ + X₂·η₂;
the color adjustment model is B = X₁·η₃ + X₂·η₄;
the motion mode adjustment model is C = X₁·η₅ + X₂·η₆;
where A, B and C are the brightness value, the color value and the motion mode value respectively, different motion mode values correspond to different motion modes, X₁ is the user behavior, X₂ is the environmental state, η₁, η₃ and η₅ are behavior weights, and η₂, η₄ and η₆ are state weights.
Preferably, the state parameters of the environment where the LED lamp array is located include ambient illumination intensity, temperature information and humidity information.
The embodiment of the invention also provides a computer readable storage medium, which comprises a stored computer program, wherein when the computer program runs, equipment where the computer readable storage medium is located is controlled to execute the LED lamp array control method according to any one of the above embodiments.
The embodiment of the invention also provides a terminal device, which comprises a processor, a memory and a computer program stored in the memory and configured to be executed by the processor, wherein the LED lamp array control method according to any one of the above embodiments is realized when the processor executes the computer program.
According to the LED lamp array control method, the device, the storage medium and the equipment, behavior data of a user and state parameters of an environment where the LED lamp array is located are collected; preprocessing the acquired data by adopting a pre-trained preprocessing model; analyzing and extracting features of the preprocessed data by adopting a pre-trained enhanced environment perception data driven feature extraction model; based on the results of data analysis and feature extraction, determining a control strategy of the LED lamp array by adopting an intelligent decision model obtained through pre-training; and controlling the LED lamp array according to the control strategy. According to the requirements and environmental conditions of different users, personalized, intelligent and real-time LED lamp array control is realized.
Through the technical scheme, personalized, intelligent and real-time LED lamp array control can be realized, and better lighting experience and energy-saving effects are provided. The LED lamp array control method, the device, the storage medium and the equipment have the following advantages:
Personalized lighting: by collecting user behavior data and environmental states, an illumination scheme suited to the individual user can be formulated from information such as the user's position, actions and illumination requirements, meeting the user's personalized lighting needs.
Intelligent control: an independently developed intelligent decision algorithm comprehensively considers factors such as user behavior, environmental state and equipment capability to formulate the control strategy of the LED lamp array, so that the LED lamp array can intelligently adjust parameters such as brightness, color and mode.
Real-time response: the LED lamp array control system can collect and analyze user behavior and environmental state data in real time, and timely adjust the control strategy of the LED lamp array according to the data analysis result, so that the LED lamp array can quickly respond to the change of the user demand.
High efficiency and energy saving: the brightness and the power of the LED lamp array can be effectively controlled through intelligent control, and the energy-saving effect is realized. Meanwhile, the illumination requirements of users can be met to the greatest extent through an optimization algorithm and a flexible decision model, and the energy utilization efficiency is improved.
Scalability: the LED lamp array control device and the LED lamp array control equipment can control different types and scales of LED lamp arrays by adding or replacing corresponding modules and interfaces, and have certain flexibility and expandability.
In summary, the method, the device, the storage medium and the equipment for controlling the LED lamp array can realize personalized, intelligent and real-time LED lamp array control, and provide better lighting experience and energy-saving effect for users. The invention has innovation and practicability and wide application prospect.
Drawings
Fig. 1 is a schematic flow chart of a method for controlling an LED array according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an LED array control device according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The embodiment of the invention provides an LED lamp array control method, referring to FIG. 1, which is a flow chart of the LED lamp array control method provided by the embodiment of the invention, wherein the method comprises the steps S1-S5:
S1, collecting behavior data of a user and state parameters of the environment where the LED lamp array is located;
S2, preprocessing the acquired data by adopting a pre-trained preprocessing model;
S3, analyzing and extracting features of the preprocessed data by adopting a pre-trained enhanced environment perception data driven feature extraction model;
S4, determining a control strategy of the LED lamp array by adopting an intelligent decision model obtained by pre-training, based on the results of data analysis and feature extraction;
S5, controlling the LED lamp array according to the control strategy.
In this embodiment, the behavior data of the user and the state parameters of the environment where the LED lamp array is located are collected; the acquired data are preprocessed with an independently developed preprocessing model; the preprocessed data are analyzed and their features extracted with the independently developed LED-EEDFE enhanced environment perception data driven feature extraction model; based on the results of data analysis and feature extraction, a control strategy for the LED lamp array is formulated with the independently developed (LED-CID) intelligent decision model; and the LED lamp array control system executes the corresponding control commands according to the formulated control strategy.
According to the scheme, individuation, intellectualization and real-time LED lamp array control can be realized according to the collected user behavior data and the environmental state.
In yet another embodiment provided by the present invention, the preprocessing process includes:
data cleaning is carried out on the collected data to remove abnormal values, missing values and noise;
normalizing the cleaned data;
and carrying out smoothing treatment on the normalized data.
When the embodiment is implemented, the collected data is preprocessed by adopting the LED-DP function of the innovative preprocessing model. The purpose of preprocessing is to reduce noise effects, normalize the data, and reduce data complexity. Specific preprocessing steps include data cleaning, data normalization, data smoothing and the like.
The LED-DP function is an innovative function aimed at preprocessing the acquired data. Its main purpose is to reduce noise effects, normalize the data and reduce data complexity. The specific implementation steps are as follows:
abnormal values, missing values and noise are removed, noise influence is reduced, and accuracy of data is guaranteed. An autonomous innovative denoising algorithm is used, where denoise_value represents the denoised value, P (x, y) represents the value of a certain data point, weight is the weight, mean is the mean. The denoising algorithm formula is as follows: denoised_value= (P (x, y) weight+mean (1-weight)).
The cleaned data are then normalized, reducing data complexity and improving comparability. An independently developed normalization algorithm is used, where normalized_x is the normalized value, log_x is the logarithm of a data point value, and min_log_x and max_log_x are the minimum and maximum of the log-transformed data, respectively. The normalization formula is: normalized_x = (log_x - min_log_x) / (max_log_x - min_log_x).
The normalized data are then smoothed to suppress abrupt fluctuations and residual noise and to improve data stability.
According to the above steps, the following innovative functions are used for data preprocessing.
import numpy as np

def LED_DP(raw_data, weight, window_size):
    denoised_data = []
    # Data cleaning: weighted-mean denoising
    mean = np.mean(raw_data)
    for i in range(len(raw_data)):
        denoised_value = raw_data[i] * weight + mean * (1 - weight)
        denoised_data.append(denoised_value)
    # Data normalization: min-max scaling of the log-transformed values
    min_log_x = np.min(np.log(denoised_data))
    max_log_x = np.max(np.log(denoised_data))
    normalized_data = [(np.log(x) - min_log_x) / (max_log_x - min_log_x)
                       for x in denoised_data]
    # Data smoothing: centred moving average
    smoothed_data = data_smoothing(normalized_data, window_size)
    return smoothed_data

def data_smoothing(normalized_data, window_size):
    smoothed_data = []
    for i in range(len(normalized_data)):
        lower = max(0, i - window_size // 2)
        upper = min(len(normalized_data), i + window_size // 2 + 1)
        smoothed_value = np.mean(normalized_data[lower:upper])
        smoothed_data.append(smoothed_value)
    return smoothed_data
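For illustration only, a minimal usage sketch of the LED_DP function above; the sample readings, weight and window size are assumed values not taken from the description:
sample_raw_data = [120.0, 118.5, 300.0, 119.2, 121.7, 117.9, 119.5]  # hypothetical illumination readings
preprocessed = LED_DP(sample_raw_data, weight=0.8, window_size=3)
print(preprocessed)  # cleaned, normalized to [0, 1] and smoothed
The outlier 300.0 is pulled toward the mean by the weighted cleaning step before the log-based normalization and moving-average smoothing are applied.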
the key point of the implementation steps and innovation functions is that autonomous innovative denoising and normalization algorithms are used, and an LED_DP function is provided to realize preprocessing of data.
In yet another embodiment of the present invention, the enhanced context awareness data-driven feature extraction model pre-training process specifically includes:
acquiring a training data set D = {(xᵢ, yᵢ)}, where xᵢ is an input feature vector and yᵢ is the output value corresponding to the input feature vector xᵢ;
defining a feature extraction model F(x; θ), where F is a network model, x is the input feature vector and θ is the network parameter;
constructing a multi-layer neural network and learning the network parameter θ through forward propagation and back propagation;
measuring the difference between the predicted value of the feature extraction model and the output value in the data set with a loss function;
and minimizing the loss function with a gradient descent optimization algorithm to optimize the network parameter θ, obtaining the enhanced environment perception data driven feature extraction model.
When the embodiment is implemented, the LED-EEDFE enhanced environment perception data driving feature extraction model which is independently innovated is adopted to analyze and extract features of the preprocessed data.
The algorithm of the feature extraction model is as follows:
Let the existing data set be D = {(xᵢ, yᵢ)}, i = 1, 2, …, N, where xᵢ is an input feature vector and yᵢ is the output value corresponding to xᵢ, i.e. the behavior feature.
Define the feature extraction model as F(x; θ), where F is a network model, x is the input feature vector and θ is the network parameter.
Construct a multi-layer neural network and learn the network parameter θ through forward propagation and back propagation.
Minimize the gap between the predicted value F(x; θ) and the true value y; a loss function is used to define this gap.
Update the network parameter θ with an optimization algorithm such as gradient descent so that the loss function is minimized.
This yields the trained enhanced environment perception data driven feature extraction model F(x; θ).
In the data prediction stage, the collected user behavior data and the environmental state parameters are input into a trained feature extraction model, and behavior features are extracted.
The algorithm of the prediction function is as follows:
inputting the behavior data and environmental state parameters to be predicted, denoted x_new;
extracting the behavior features with the enhanced environment perception data driven feature extraction model F(x; θ), denoted f_new = F(x_new; θ).
The model combines user behavior data with environmental state parameters in a novel way; through the pre-trained feature extraction model it can extract useful behavior features from the user behavior data and environmental parameters while preserving data privacy. The training and prediction process of the feature extraction model yields a deep understanding of, and the ability to predict, user behavior.
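For illustration only, a minimal sketch of the pre-training procedure described above, written as a small fully connected network trained by gradient descent in NumPy. The layer sizes, the squared-error loss, the learning rate and the choice of the hidden activation as the extracted feature are assumptions; the description does not specify the network architecture, loss function or feature mapping:
import numpy as np

def train_feature_extractor(X, Y, hidden=16, lr=0.01, epochs=500):
    # X: (N, d_in) input feature vectors; Y: (N, d_out) corresponding output values.
    rng = np.random.default_rng(0)
    d_in, d_out = X.shape[1], Y.shape[1]
    W1 = rng.normal(0.0, 0.1, (d_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.1, (hidden, d_out)); b2 = np.zeros(d_out)
    for _ in range(epochs):
        # Forward propagation
        h = np.tanh(X @ W1 + b1)
        pred = h @ W2 + b2
        # Back propagation of the (assumed) mean squared error loss
        grad_pred = 2.0 * (pred - Y) / len(X)
        grad_W2 = h.T @ grad_pred
        grad_b2 = grad_pred.sum(axis=0)
        grad_h = (grad_pred @ W2.T) * (1.0 - h ** 2)
        grad_W1 = X.T @ grad_h
        grad_b1 = grad_h.sum(axis=0)
        # Gradient descent update of the network parameter theta = (W1, b1, W2, b2)
        W1 -= lr * grad_W1; b1 -= lr * grad_b1
        W2 -= lr * grad_W2; b2 -= lr * grad_b2
    return (W1, b1, W2, b2)

def extract_features(x_new, theta):
    # f_new = F(x_new; theta): here the hidden activation is used as the extracted feature.
    W1, b1, W2, b2 = theta
    return np.tanh(x_new @ W1 + b1)
In practice the trained parameter θ would be stored and reused in the prediction stage, mirroring f_new = F(x_new; θ) above.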
In yet another embodiment provided by the present invention, the enhanced environment perception data driven feature extraction model includes an environment perception extraction model and a behavior feature extraction model.
In this embodiment, training is performed with the LED-LEARN algorithm: the preprocessed data are trained with the LED-LEARN algorithm to achieve behavior recognition and environment perception. In the behavior recognition part, the preprocessed data are trained with a supervised learning method to obtain the parameter θ_behavior of the behavior recognition model. In the environment perception part, the preprocessed data are trained with an unsupervised learning method to obtain the parameter θ_environment of the environment perception model.
Features related to LED lamp array control are then extracted with the trained behavior recognition model and environment perception model. First, the behavior recognition model F_behavior(x; θ_behavior) performs behavior recognition on the preprocessed data to obtain the behavior feature f_behavior. Then, the environment perception model F_environment(x; θ_environment) performs environment perception on the preprocessed data and learns a latent representation of the data to obtain the environment feature f_environment. Finally, the behavior feature f_behavior and the environment feature f_environment are combined to obtain the feature representation f_LED related to LED lamp array control.
A feature prediction function is also defined: given the enhanced environment perception data driven feature extraction model F(x; θ) and new input data x_new, feature extraction and prediction are performed on the new data with the model parameter θ. Feeding the new data into the extraction model yields the corresponding feature representation f_new.
The following functions implement the feature extraction and prediction algorithms.
def LED_EEDFE(preprocessed_data, training_data):
    # Train on the preprocessed data with the LED-LEARN algorithm to realize
    # behavior recognition and environment perception
    theta_behavior = LED_LEARN_behavior(training_data)
    theta_environment = LED_LEARN_environment(training_data)
    # Define the behavior recognition model F_behavior(x; theta_behavior)
    behavior_model = custom_behavior_model(theta_behavior)
    # Define the environment perception model F_environment(x; theta_environment)
    environment_model = custom_environment_model(theta_environment)
    # Behavior feature extraction
    f_behavior = behavior_model(preprocessed_data)
    # Environment feature extraction
    f_environment = environment_model(preprocessed_data)
    # Combine the feature representations
    f_LED = custom_feature_combination(f_behavior, f_environment)
    return f_LED

def feature_prediction(x_new, model_parameters):
    # Feature prediction: extract and predict features for the new input data x_new
    # with the enhanced environment perception data driven feature extraction model F(x; theta)
    f_new = custom_feature_extraction_and_prediction(x_new, model_parameters)
    return f_new
The novelty of these steps and functions lies in the use of the independently developed LED-EEDFE enhanced environment perception data driven feature extraction model, with the LED_EEDFE function provided to perform feature extraction on the preprocessed data. In the LED_EEDFE function, the LED-LEARN algorithm is first used for training to obtain the parameters of the behavior recognition and environment perception models. The behavior recognition model and the environment perception model are then used to extract the corresponding feature representations. Finally, the behavior feature and the environment feature are combined to obtain the feature representation related to LED lamp array control.
In yet another embodiment of the present invention, the intelligent decision model includes a brightness adjustment model, a color adjustment model, and a motion mode adjustment model.
When the embodiment is implemented, based on the results of data analysis and feature extraction, an autonomous innovative LED-CID intelligent decision algorithm is adopted to formulate a control strategy of the LED lamp array. According to the behavior of the user and the state of the environment, the decision algorithm can automatically adjust parameters of the LED lamp array, such as brightness, color, movement mode and the like.
According to the weight of the user behavior and the environment state, the adjustment values of the brightness, the color and the movement mode of the LED are calculated, and the characteristic representation and the LED array parameters are returned together, so that the brightness, the color and the movement mode of the LED can be controlled.
In yet another embodiment of the present invention, the brightness adjustment model is A = X₁·η₁ + X₂·η₂;
the color adjustment model is B = X₁·η₃ + X₂·η₄;
the motion mode adjustment model is C = X₁·η₅ + X₂·η₆;
where A, B and C are the brightness value, the color value and the motion mode value respectively, different motion mode values correspond to different motion modes, X₁ is the user behavior, X₂ is the environmental state, η₁, η₃ and η₅ are behavior weights, and η₂, η₄ and η₆ are state weights.
In this embodiment, the LED-CID intelligent decision algorithm is defined: an independently developed LED-CID intelligent decision algorithm formulates the control strategy of the LED lamp array according to the feature extraction result, the behavior of the user and the state of the environment. The algorithm requires the following weights and parameters to be defined:
brightness = user behavior weight × user behavior + environmental state weight × environmental state;
color = user behavior weight × user behavior + environmental state weight × environmental state;
LED motion mode adjustment formula: motion mode = user behavior weight × user behavior + environmental state weight × environmental state.
That is, the brightness adjustment model is A = X₁·η₁ + X₂·η₂;
the color adjustment model is B = X₁·η₃ + X₂·η₄;
the motion mode adjustment model is C = X₁·η₅ + X₂·η₆;
where A, B and C are the brightness value, the color value and the motion mode value respectively, different motion mode values correspond to different motion modes, X₁ is the user behavior, X₂ is the environmental state, η₁, η₃ and η₅ are behavior weights, and η₂, η₄ and η₆ are state weights.
LED-CID intelligent decision algorithm and calculation of LED lamp array control parameters
# Weights and control-parameter calculation (these statements continue inside the LED_EEDFE
# function, which returns the feature representation together with the LED lamp array parameters)
user_behavior_weight = 0.5
environment_status_weight = 0.5
brightness = user_behavior_weight * user_behavior + environment_status_weight * environment_status
color = user_behavior_weight * user_behavior + environment_status_weight * environment_status
motion = user_behavior_weight * user_behavior + environment_status_weight * environment_status
return f_LED, brightness, color, motion
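For illustration only, a worked example with assumed values (the description does not fix X₁, X₂ or the weights): with user behavior X₁ = 0.8, environmental state X₂ = 0.4 and all weights set to 0.5, each model yields A = B = C = 0.8·0.5 + 0.4·0.5 = 0.6; these values are then mapped to a concrete duty cycle, color and motion mode by the control commands described below.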
The novelty of these steps and functions lies in the use of the independently developed LED-EEDFE enhanced environment perception data driven feature extraction model, with the LED_EEDFE function realizing both feature extraction on the preprocessed data and formulation of the LED lamp array control strategy. In the LED_EEDFE function, the LED-LEARN algorithm is first used for training to obtain the parameters of the behavior recognition and environment perception models. The behavior recognition model and the environment perception model are then used to extract the corresponding feature representations. Finally, the adjustment values of the LED brightness, color and motion mode are calculated from the weights of the user behavior and the environmental state, and the feature representation is returned together with the LED lamp array parameters.
The LED lamp array control system then executes the corresponding control commands according to the formulated control strategy. The control commands are transmitted to the LED lamp array control module through the communication interface with the LED lamp array, realizing personalized, intelligent and real-time control of the LED lamp array. The specific implementation steps are as follows:
Designing the LED lamp array control command: a control command for the LED lamp array is designed according to the formulated control strategy. The command format is "LED" + parameter 1 + parameter 2 + parameter 3, where parameter 1 represents the brightness, parameter 2 the color and parameter 3 the motion mode.
And the communication interface is communicated with the LED lamp array, and communication connection is established with the LED lamp array control module through the communication interface with the LED lamp array. According to the protocol specification of the communication interface, setting communication parameters and communication modes, and ensuring correct and stable communication with the LED lamp array.
Calculating the brightness control command according to the formulated control strategy: using the brightness control formula duty_cycle = (desired_brightness / max_brightness) × 100, the desired brightness value is calculated from the formulated control strategy and input parameters such as the user behavior and the environmental state.
According to the formulated control strategy, calculating a color control command: the formula for color control:
r=(desired_color[0]/max_color_value)*255;
g=(desired_color[1]/max_color_value)*255;
b=(desired_color[2]/max_color_value)*255;
and calculating a desired color value according to the formulated control strategy and input parameters such as user behavior, environmental state and the like.
Creating an LED control command: and combining the brightness value and other parameters according to the format and the instruction of the LED lamp array control command to create the control command of the LED lamp array.
Delivering a control command: and transmitting the LED control command to the LED lamp array control module through a communication interface with the LED lamp array. The accurate transmission and reliability of the command are ensured, so that the individuation, the intellectualization and the real-time control of the LED lamp array are realized.
Controlling the position or rotation of the LED lamp beads: and according to the movement mode parameters in the control command, the movement control of the LED lamp array is realized by adjusting the positions or the rotation of the LED lamp beads. And corresponding control strategies and methods are adopted according to specific designs and technical implementations.
Monitoring the state and feedback information of the LED lamp array: the control system collects and monitors the state and feedback information of the LED lamp array in real time, such as brightness, temperature, power consumption and the like. And the stability and the safety of the working of the LED lamp array are ensured through monitoring results, and the control strategy is dynamically adjusted and optimized according to feedback information.
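For illustration only, a minimal sketch of such a monitoring and feedback loop. The read_led_status and apply_strategy callbacks, the status fields and the temperature threshold are assumptions; the description does not specify the feedback interface or any safety limits:
import time

def monitoring_loop(read_led_status, apply_strategy, interval_s=1.0, max_temperature=70.0):
    # Collect state and feedback information from the LED lamp array in real time
    # and dynamically adjust the control strategy according to the readings.
    while True:
        status = read_led_status()  # e.g. {"brightness": ..., "temperature": ..., "power": ...}
        if status["temperature"] > max_temperature:
            # Assumed safety rule: dim the array when it runs hot to protect the hardware.
            apply_strategy(brightness_scale=0.5)
        else:
            apply_strategy(brightness_scale=1.0)
        time.sleep(interval_s)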
Communication interface function with LED array:
def LED_communication(command):
    # Communicate with the LED lamp array:
    # transmit the control command to the LED lamp array and
    # acquire the state and feedback information of the LED lamp array
    ...

# Calculate the brightness control command
desired_brightness = compute_brightness(user_behavior, environment_status)
max_brightness = get_max_brightness()
duty_cycle = (desired_brightness / max_brightness) * 100

# Calculate the color control command
desired_color = compute_color(user_behavior, environment_status)
max_color_value = get_max_color_value()
r = (desired_color[0] / max_color_value) * 255
g = (desired_color[1] / max_color_value) * 255
b = (desired_color[2] / max_color_value) * 255

# Create the LED control command
LED_command = "LED" + str(duty_cycle) + str((int(r), int(g), int(b))) + str(motion)

# Deliver the control command
LED_communication(LED_command)
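For illustration only: with assumed values duty_cycle = 75.0, (r, g, b) = (255, 180, 120) and motion = 2, the concatenation above would produce the string "LED75.0(255, 180, 120)2"; a production implementation would normally insert explicit delimiters between the fields, a detail the description leaves open.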
in addition, the LED lamp array control device and the LED lamp array control equipment can further comprise a user interface module used for interacting with a user and displaying state information of the LED lamp array. The user interface module may include a display screen, buttons, touch screens, etc., through which a user may set control parameters of the LED array, select a lighting mode, etc. The user interface module is connected with the LED lamp array control module, so that interaction between a user and the LED lamp array is realized.
In order to achieve better user experience and energy efficiency optimization, the LED lamp array control device and the LED lamp array control equipment can further comprise a positioning module and an energy efficiency optimization module. The positioning module is used for tracking the position information of the user in real time, and the energy efficiency optimization module is used for adjusting the brightness and the color of the lamplight according to the position of the user and the layout information of the LED lamp array, so that localized illumination and energy saving optimization are realized. The positioning module and the energy efficiency optimizing module are connected with the LED lamp array control module, so that the transmission and the processing of positioning data and energy efficiency data are realized.
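For illustration only, a minimal sketch of the kind of localized illumination adjustment the positioning module and energy efficiency optimization module could perform together. The inverse-distance dimming rule, the 2-D lamp layout and the function names are assumptions not taken from the description:
import math

def localized_brightness(user_position, lamp_positions, base_brightness=1.0, falloff=0.5):
    # Dim each lamp according to its distance from the user: lamps near the user's
    # position stay bright, distant lamps are dimmed to save energy.
    ux, uy = user_position
    levels = []
    for lx, ly in lamp_positions:
        distance = math.hypot(lx - ux, ly - uy)
        levels.append(base_brightness / (1.0 + falloff * distance))
    return levels

# Hypothetical usage: user at (1.0, 2.0) with three lamps in the layout.
print(localized_brightness((1.0, 2.0), [(0.0, 0.0), (1.0, 2.0), (4.0, 5.0)]))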
In still another embodiment of the present invention, the state parameters of the environment in which the LED array is located include ambient light intensity, temperature information, and humidity information.
When the embodiment is implemented, behavior data of a user and environmental state parameters of the LED lamp array are collected. Through the use of sensor equipment such as cameras, motion sensors, illumination sensors and the like, behavior data such as the position, the action, the illumination requirement and the like of a user are collected in real time, and meanwhile, state parameters such as the illumination intensity, the temperature, the humidity and the like of the environment where the LED lamp array is located are collected. This step is carried out as follows:
according to the requirements, a proper number and type of sensor devices such as cameras, motion sensors, illumination sensors and the like are deployed so as to acquire behavior data of a user and environmental state parameters of the LED lamp array in real time. The sensor device should be arranged in a suitable position to cover the desired area. For example, the camera should be positioned at a location where the user's position and motion can be captured, and the illumination sensor should be positioned at a location where the ambient illumination intensity can be accurately perceived.
And acquiring behavior data such as the position, the action, the illumination requirement and the like of a user in real time through deployed sensor equipment, and acquiring state parameters such as the illumination intensity, the temperature, the humidity and the like of the environment where the LED lamp array is positioned. The sensor device may be connected to the control system via an interface, transmitting the acquired data to the control system.
The sensor device transmits the acquired data to the control system through the interface. The control system receives the data and stores it in a suitable storage medium, such as a local database, cloud storage, or the like. Ensuring the security and integrity of the data.
In this step, the following functions are used to collect and store data:
def data_collection():
    user_position = get_user_position()                        # acquire the user position
    user_action = get_user_action()                            # acquire the user action
    user_illumination_demand = get_user_illumination_demand()  # acquire the user's illumination demand
    environment_illumination = get_environment_illumination()  # acquire the ambient illumination intensity
    environment_temperature = get_environment_temperature()    # acquire the ambient temperature
    environment_humidity = get_environment_humidity()          # acquire the ambient humidity
    data = {
        "user_position": user_position,
        "user_action": user_action,
        "user_illumination_demand": user_illumination_demand,
        "environment_illumination": environment_illumination,
        "environment_temperature": environment_temperature,
        "environment_humidity": environment_humidity,
    }
    store_data(data)  # store the data
by deploying the sensor equipment, behavior data of a user and environmental state parameters of the LED lamp array are collected in real time, and corresponding functions are used for data collection and storage. Thus, an accurate data basis can be provided for subsequent LED lamp array control.
In still another embodiment of the present invention, referring to fig. 2, a schematic structural diagram of an LED array control device provided in an embodiment of the present invention is provided, where the device includes:
the data acquisition module is used for acquiring behavior data of a user and state parameters of an environment where the LED lamp array is positioned;
the preprocessing module is used for preprocessing the acquired data by adopting a pre-trained preprocessing model;
the feature extraction module is used for analyzing and extracting features of the preprocessed data by adopting a pre-trained enhanced environment perception data driving feature extraction model;
the strategy calculation module is used for determining the control strategy of the LED lamp array by adopting an intelligent decision model obtained by pre-training based on the results of data analysis and feature extraction;
and the control module is used for controlling the LED lamp array according to the control strategy.
It should be noted that, the LED array control device provided in the embodiment of the present invention can execute the LED array control method described in any embodiment of the foregoing embodiment, and specific functions of the LED array control device are not described herein.
Referring to fig. 3, a schematic structural diagram of a terminal device according to an embodiment of the present invention is provided. The terminal device of this embodiment includes: a processor, a memory, and a computer program, such as an LED array program, stored in the memory and executable on the processor. The steps in the above embodiments of the LED array control method are implemented when the processor executes the computer program, for example, steps S1 to S5 shown in fig. 1. Alternatively, the processor may implement the functions of the modules in the above-described device embodiments when executing the computer program.
The computer program may be divided into one or more modules/units, which are stored in the memory and executed by the processor to accomplish the present invention, for example. The one or more modules/units may be a series of computer program instruction segments capable of performing the specified functions, which instruction segments are used for describing the execution of the computer program in the terminal device. For example, the computer program may be divided into modules, and specific functions of each module are not described herein.
The terminal equipment can be computing equipment such as a desktop computer, a notebook computer, a palm computer, a cloud server and the like. The terminal device may include, but is not limited to, a processor, a memory. It will be appreciated by those skilled in the art that the schematic diagram is merely an example of a terminal device and does not constitute a limitation of the terminal device, and may include more or less components than illustrated, or may combine certain components, or different components, e.g., the terminal device may further include an input-output device, a network access device, a bus, etc.
The processor may be a central processing unit (Central Processing Unit, CPU), other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. The general purpose processor may be a microprocessor or the processor may be any conventional processor or the like, which is a control center of the terminal device, and which connects various parts of the entire terminal device using various interfaces and lines.
The memory may be used to store the computer program and/or module, and the processor may implement various functions of the terminal device by running or executing the computer program and/or module stored in the memory and invoking data stored in the memory. The memory may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the handset, etc. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, memory, plug-in hard disk, smart Media Card (SMC), secure Digital (SD) Card, flash Card (Flash Card), at least one disk storage device, flash memory device, or other volatile solid-state storage device.
Wherein the terminal device integrated modules/units may be stored in a computer readable storage medium if implemented in the form of software functional units and sold or used as stand alone products. Based on such understanding, the present invention may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the computer program may implement the steps of each of the method embodiments described above. Wherein the computer program comprises computer program code, which may be in the form of code, object code, executable files, or in some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a U disk, a removable hard disk, a magnetic disk, an optical disk, a computer Memory, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
It should be noted that the above-described apparatus embodiments are merely illustrative, and the units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, in the drawings of the embodiment of the device provided by the invention, the connection relation between the modules represents that the modules have communication connection, and can be specifically implemented as one or more communication buses or signal lines. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that changes and modifications may be made without departing from the principles of the invention, such changes and modifications are also intended to be within the scope of the invention.

Claims (10)

1. The LED lamp array control method is characterized by comprising the following steps of:
collecting behavior data of a user and state parameters of an environment where the LED lamp array is located;
preprocessing the acquired data by adopting a pre-trained preprocessing model;
analyzing and extracting features of the preprocessed data by adopting a pre-trained enhanced environment perception data driven feature extraction model;
based on the results of data analysis and feature extraction, determining a control strategy of the LED lamp array by adopting an intelligent decision model obtained through pre-training;
and controlling the LED lamp array according to the control strategy.
2. The LED array control method of claim 1, wherein the preprocessing process comprises:
data cleaning is carried out on the collected data to remove abnormal values, missing values and noise;
normalizing the cleaned data;
and carrying out smoothing treatment on the normalized data.
3. The LED array control method of claim 1, wherein said enhanced context awareness data-driven feature extraction model pre-training process specifically comprises:
acquiring a training data set D = {(xᵢ, yᵢ)}, where xᵢ is an input feature vector and yᵢ is the output value corresponding to the input feature vector xᵢ;
defining a feature extraction model F(x; θ), where F is a network model, x is the input feature vector and θ is the network parameter;
constructing a multi-layer neural network and learning the network parameter θ through forward propagation and back propagation;
measuring the difference between the predicted value of the feature extraction model and the output value in the data set with a loss function;
and minimizing the loss function with a gradient descent optimization algorithm to optimize the network parameter θ, obtaining the enhanced environment perception data driven feature extraction model.
4. The LED array control method of claim 3, wherein said enhanced environment perception data driven feature extraction model comprises an environment perception extraction model and a behavior feature extraction model.
5. The LED array control method of claim 1, wherein said intelligent decision model comprises a brightness adjustment model, a color adjustment model, and a motion mode adjustment model.
6. The LED array control method of claim 5, wherein the brightness adjustment model is A = X₁·η₁ + X₂·η₂;
the color adjustment model is B = X₁·η₃ + X₂·η₄;
the motion mode adjustment model is C = X₁·η₅ + X₂·η₆;
wherein A, B and C are the brightness value, the color value and the motion mode value respectively, different motion mode values correspond to different motion modes, X₁ is the user behavior, X₂ is the environmental state, η₁, η₃ and η₅ are behavior weights, and η₂, η₄ and η₆ are state weights.
7. The method of claim 1, wherein the state parameters of the environment in which the LED array is located include ambient light intensity, temperature information, and humidity information.
8. An LED array control apparatus, the apparatus comprising:
the data acquisition module is used for acquiring behavior data of a user and state parameters of an environment where the LED lamp array is positioned;
the preprocessing module is used for preprocessing the acquired data by adopting a pre-trained preprocessing model;
the feature extraction module is used for analyzing and extracting features of the preprocessed data by adopting a pre-trained enhanced environment perception data driving feature extraction model;
the strategy calculation module is used for determining the control strategy of the LED lamp array by adopting an intelligent decision model obtained by pre-training based on the results of data analysis and feature extraction;
and the control module is used for controlling the LED lamp array according to the control strategy.
9. A computer readable storage medium, characterized in that the computer readable storage medium comprises a stored computer program, wherein the computer program, when run, controls a device in which the computer readable storage medium is located to perform the LED array control method according to any one of claims 1 to 7.
10. A terminal device comprising a processor, a memory and a computer program stored in the memory and configured to be executed by the processor, the processor implementing the LED array control method according to any one of claims 1 to 7 when executing the computer program.
CN202311679437.8A 2023-12-07 2023-12-07 LED lamp array control method, device, storage medium and equipment Pending CN117545122A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311679437.8A CN117545122A (en) 2023-12-07 2023-12-07 LED lamp array control method, device, storage medium and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311679437.8A CN117545122A (en) 2023-12-07 2023-12-07 LED lamp array control method, device, storage medium and equipment

Publications (1)

Publication Number Publication Date
CN117545122A true CN117545122A (en) 2024-02-09

Family

ID=89795787

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311679437.8A Pending CN117545122A (en) 2023-12-07 2023-12-07 LED lamp array control method, device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN117545122A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150035437A1 (en) * 2013-08-05 2015-02-05 Peter J. Panopoulos Led lighting system
CN204707304U (en) * 2015-01-29 2015-10-14 广州景晴光电科技有限公司 Can many pixels RGB controller of custom lamp optical mode arbitrarily
WO2019120028A1 (en) * 2017-12-20 2019-06-27 Oppo广东移动通信有限公司 Intelligent screen brightness adjustment method and apparatus, and storage medium and mobile terminal
CN110970003A (en) * 2019-12-24 2020-04-07 维沃移动通信有限公司 Screen brightness adjusting method and device, electronic equipment and storage medium
CN111970785A (en) * 2020-08-18 2020-11-20 蔡国凤 Emergency LED street lamp control method and system of intelligent street lamp
CN113869482A (en) * 2021-07-19 2021-12-31 北京工业大学 Intelligent street lamp self-adaptive energy-saving control method and system based on deep reinforcement learning
CN117010263A (en) * 2023-04-01 2023-11-07 西北工业大学 Residual life prediction method based on convolutional neural network and long-term and short-term memory network
CN117098270A (en) * 2023-10-17 2023-11-21 深圳市天成照明有限公司 Intelligent control method, device and equipment for LED energy-saving lamp and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118042685A (en) * 2024-04-09 2024-05-14 深圳市鑫莱特照明有限公司 Intelligent guiding control method and system for LED lamp
CN118042685B (en) * 2024-04-09 2024-07-02 深圳市鑫莱特照明有限公司 Intelligent guiding control method and system for LED lamp


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination