CN112381307B - Meteorological event prediction method and device and related equipment - Google Patents

Meteorological event prediction method and device and related equipment

Info

Publication number
CN112381307B
CN112381307B (application CN202011312818.9A)
Authority
CN
China
Prior art keywords
sample
terminal
weather
order gradient
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011312818.9A
Other languages
Chinese (zh)
Other versions
CN112381307A (en)
Inventor
王健宗 (Wang Jianzong)
李泽远 (Li Zeyuan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202011312818.9A priority Critical patent/CN112381307B/en
Publication of CN112381307A publication Critical patent/CN112381307A/en
Priority to PCT/CN2021/083026 priority patent/WO2021203980A1/en
Application granted granted Critical
Publication of CN112381307B publication Critical patent/CN112381307B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G06N20/20 Ensemble learning
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Operations Research (AREA)
  • Data Mining & Analysis (AREA)
  • Tourism & Hospitality (AREA)
  • Game Theory and Decision Science (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application provides a weather event prediction method, which comprises the following steps: a first terminal calculates a first-order gradient set and a second-order gradient set of the loss function of a model to be trained from each sample in a first sample set; it then receives, from a second terminal, an aggregate first-order gradient set and an aggregate second-order gradient set calculated from a second sample set, where the second sample set consists of the samples determined from the second terminal's sample set to be similar to the first sample set; finally, it trains the model according to the gradient values of its own samples and the aggregate gradient values of the similar samples, and predicts meteorological conditions with the trained model. Because only aggregate gradient values of similar samples, rather than raw data, are sent to the first terminal, the problem of data leakage is avoided; at the same time, using similar samples from other terminals during training makes the trained model more accurate, and the terminals can train synchronously and in parallel, which improves the computational efficiency of the model and makes reasonable use of data and resources.

Description

Meteorological event prediction method and device and related equipment
Technical Field
The present invention relates to the field of big data processing technologies, and in particular, to a method and an apparatus for predicting a meteorological event, and a related device.
Background
With the development of big data and artificial intelligence, deep learning on massive data, complex neural networks and the like have gradually come into use, and applying big data and artificial intelligence techniques to weather event prediction, such as rainfall prediction, temperature prediction and wind speed prediction, has become a hot topic.
At present, weather forecasting methods mainly comprise traditional statistical approaches, such as regression models and autoregressive moving average models, and artificial intelligence models, such as artificial neural networks, support vector machines and regression trees. However, existing research targets centralized training, in which the model is trained only after all data of all meteorological sites have been uploaded to a central server. Because meteorological sites are widely distributed, numerous, and monitored over long periods, the data volume is very large, and confidentiality issues arise between the meteorological data of different provinces. A model trained only in this centralized fashion therefore cannot meet expectations: the training process is fragile, and problems such as low operational efficiency, an overly broad model and insufficient performance inevitably arise.
Disclosure of Invention
The embodiment of the application provides a weather event prediction method, which can address the protection of data privacy between meteorological datasets as well as the problems of low operational efficiency, overly broad scope and insufficient performance of the model.
In a first aspect, the present application provides a weather event prediction method applied to a weather prediction system including a first terminal located at a first weather station and a second terminal located at a second weather station, the method comprising: the first terminal calculates a first-order gradient set and a second-order gradient set of a loss function of a model to be trained according to each sample in a first sample set, wherein one gradient value in the first-order gradient set is calculated according to one sample in the first sample set, one gradient value in the second-order gradient set is calculated according to one sample in the first sample set, and the first sample set is a set of samples collected by a first weather station; the first terminal receives an aggregate first-order gradient set and an aggregate second-order gradient set which are sent by a second terminal and are calculated according to a second sample set, wherein the second sample set comprises samples similar to each sample in the first sample set; the first terminal trains the model to be trained according to the first-order gradient set, the aggregation first-order gradient set, the second-order gradient set, the aggregation second-order gradient set and the first sample set to obtain a trained model; and the first terminal predicts the sample to be predicted based on the trained model, and determines the prediction result of the sample to be predicted.
In a second aspect, the present application provides a weather event prediction method applied to a weather prediction system including a first terminal located at a first weather station and a second terminal located at a second weather station, the method comprising: the second terminal sends a second hash table to the first terminal, wherein the second hash table comprises an identifier corresponding to each sample in the third sample set and a hash value corresponding to each sample, and the third sample set is a sample collected by the second weather station; the second terminal receives a sample identification set sent by the first terminal, and each sample identification in the sample identification set indicates one sample in the third sample set; the second terminal determines a second sample set in the third sample set according to the sample identification set, calculates an aggregate first-order gradient set and an aggregate second-order gradient set of the loss function of the model to be trained according to each sample in the second sample set, and sends the aggregate first-order gradient set and the aggregate second-order gradient set to the first terminal.
In a third aspect, the present application provides a weather event prediction apparatus, the apparatus comprising: one or more functional modules for performing the method as described in the first aspect; or for performing the method as described in the second aspect.
In a fourth aspect, the present application provides a computer device comprising a processor and a memory, the memory for storing instructions, the processor for executing the instructions, which when executed by the processor, perform the method of the first aspect; or for performing the method as described in the second aspect.
In a fifth aspect, the present application provides a computer readable storage medium storing a computer program for execution by a processor to implement the method of the first aspect; or for performing the method as described in the second aspect.
In the embodiment of the application, the first terminal of the first weather station trains the model using the gradient set calculated from its first sample set together with the gradient set calculated by the second terminal from the second sample set, where the second sample set consists of the samples of the second weather station that are similar to those of the first weather station. Local data and similar data from other weather stations thus both serve as training inputs, data and resources are reasonably utilized, and the prediction result of the model is more accurate. In addition, the first terminal receives only gradient values of the second terminal's samples: the method involves no central server, and the terminals of the weather stations need not upload data to one, so leakage of private data is avoided. The method can be applied to the weather prediction of a plurality of weather stations simultaneously; the weather stations cooperate, the model for each weather station is trained synchronously and in parallel, and the computational efficiency of the model is effectively improved.
Drawings
In order to more clearly describe the technical solutions in the embodiments or the background of the present application, the drawings required in the embodiments or the background are briefly described below.
FIG. 1 is a schematic overall flow chart of a weather event prediction method according to an embodiment of the present application;
FIG. 2 is a sample data structure of a data terminal provided in an embodiment of the present application;
FIG. 3 is a schematic flow chart of a model training process provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of a data structure obtained by encrypting sample data of a data terminal according to an embodiment of the present application;
FIG. 5 is a schematic data structure diagram of a sample identification set determined by a data terminal according to a hash table according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a weather event prediction apparatus according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the drawings. The terminology used in the description is for describing particular embodiments only and is not intended to limit the application. When model training involves multiple participants, the traditional approach is to upload the data of all participants to a central server, train the model on the central server, and then send the trained model back to the participants for prediction of related events. Although this approach can use the participants' data for training, so that the trained model suits each participant's data, it suffers from low operational efficiency, an overly broad model, and the risk of privacy leakage.
To solve the above problems, the present application provides a weather event prediction method that combines features of federated learning with the XGBOOST model. Weather stations need not upload data to a central server: each station trains a model for its local samples locally, while also using similar samples of other weather stations, i.e., samples collected by other stations that resemble the locally collected ones. The local model is trained by receiving the aggregate gradient values of these similar samples, so the problem of data privacy leakage is avoided; the stations train their respective models synchronously and in parallel, which effectively improves both the computational efficiency and the accuracy of the models.
First, the flow of the weather event prediction method according to the embodiment of the present application will be described in its entirety.
FIG. 1 shows a schematic overall flow diagram of a weather event prediction method, the overall flow of predicting a weather event comprising the steps of:
s101: early data is acquired to train a weather prediction model.
The early data refers to historical weather data. For example, if a record indicates that under conditions of 29 °C temperature, 73% humidity, 27 km/h wind speed and 1009 hPa air pressure the weather at a certain place was light rain, then temperature, humidity, wind speed and air pressure form the sample feature set of the weather event, light rain is its sample label, and the sample feature set and the sample label together form a weather event sample. Training the model on this known historical data means finding the parameters of the model, finally yielding a weather prediction model with known parameters.
In the embodiment of the application, the model used for training is the XGBOOST model. Training is the process of constructing regression trees: trees are added one by one, i.e., at step t a new function f_t is learned to fit the residual of the prediction made by the t-1 previously trained trees.
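As a hedged sketch of this additive idea (plain Python; the constant "stump" standing in for a regression tree, and all names, are illustrative rather than the patent's implementation):

```python
# Minimal sketch of additive boosting: each new "tree" f_t fits the
# residual left by the previous t-1 trees. Here f_t is simplified to
# a constant predictor (the mean residual) purely for illustration.
def boost(labels, rounds=3, learning_rate=1.0):
    predictions = [0.0] * len(labels)          # initial prediction
    trees = []                                 # learned functions f_t
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(labels, predictions)]
        f_t = sum(residuals) / len(residuals)  # "tree" = mean residual
        trees.append(f_t)
        predictions = [p + learning_rate * f_t for p in predictions]
    return trees, predictions

trees, preds = boost([0, 1, 2, 3])  # e.g. rainfall-level labels
```

After the first round the mean residual is absorbed, so later rounds contribute nothing in this degenerate example; a real regression tree would keep reducing per-sample residuals.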
S102: and predicting the meteorological event by using the trained model.
Given the feature set of a weather sample (temperature, humidity, wind speed, air pressure, etc.) whose weather condition is unknown, the trained model predicts the weather outcome of the weather event once the sample feature set is input. Specifically, when predicting the weather condition of a weather event, the sample feature set of the weather event is input into the trained model; in each tree the sample falls on one leaf node, and the sum of the weights of the leaf nodes reached in all trees is taken as the predicted value of the weather event.
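The leaf-weight summation described above can be sketched as follows; the two hand-written trees, their split thresholds, and the dictionary keys are hypothetical stand-ins for a trained XGBOOST ensemble:

```python
# Sketch: an XGBOOST-style model predicts by routing the feature vector
# through every tree and summing the leaf weights it lands on.
def tree_1(features):           # hypothetical split on temperature
    return 0.8 if features["temperature"] > 25 else 0.2

def tree_2(features):           # hypothetical split on humidity
    return 1.1 if features["humidity"] > 70 else -0.3

def predict(features, trees):
    return sum(tree(features) for tree in trees)

sample = {"temperature": 29, "humidity": 73, "wind_speed": 27, "pressure": 1009}
score = predict(sample, [tree_1, tree_2])   # 0.8 + 1.1 = 1.9
```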
In a specific embodiment of the present application, the sample feature set in the meteorological data may include temperature, humidity, wind speed, air pressure and other meteorological features; the sample feature set is not limited in the embodiment of the present application. The weather condition may be one or more of rain, wind, cloud, snow and other meteorological conditions, and the number of sample labels corresponding to a given weather condition is likewise not limited.
The weather event prediction method provided by the embodiment of the application is applied to a weather prediction system comprising the data terminals of a plurality of weather stations. The stations train synchronously and in parallel, each following the same model training procedure, and training is joint without sharing data. When each weather station trains its local XGBOOST model, it uses not only its local sample data but also the data of other stations that is similar to its local sample data. The method thus achieves joint training without revealing any station's sample data, solving the problem of data privacy leakage between stations; training on local samples plus similar samples of other stations makes the models more accurate and makes reasonable use of data and resources; and since the models of all stations are trained synchronously and in parallel, the computational efficiency of the models is effectively improved.
The model training method provided in the application embodiment is described below by taking a single weather station as an example.
Since a large amount of sample data is required in the model training process, the structure of the sample data will be described first.
The embodiment of the application relates to model training among a plurality of weather stations, each with its own meteorological data. Let P_i denote the i-th weather station, i ∈ {1, 2, 3, …, M}, where M is the number of weather stations, and let x_q^i denote the sample data of sample q of station P_i, q ∈ {1, 2, 3, …, N_i}, where N_i is the number of samples of station P_i. Each sample datum x_q^i comprises a sample feature set and a sample label y_q^i. The feature set of sample q of station P_i is (x_{q,1}^i, x_{q,2}^i, …, x_{q,T}^i), where T is the number of sample features (meteorological data such as temperature, humidity, wind speed and air pressure). The sample label y_q^i indicates the rainfall level of sample q of station P_i, where y_q^i ∈ {0, 1, 2, 3, 4}: 0 indicates no rain, 1 indicates light rain, 2 indicates moderate rain, 3 indicates heavy rain, and 4 indicates a rainstorm. The sample set I_i of the i-th weather station P_i can then be expressed as {(x_q^i, y_q^i)}, q = 1, …, N_i. Each sample datum also corresponds to a sample identifier (ID): for example, sample x_1^i has identifier 1, sample x_2^i has identifier 2, and so on.
In a specific embodiment of the present application, the naming manner of the sample identifier corresponding to the sample data is not limited, and each weather station may confirm the identifier of each sample by itself or be uniformly determined by all weather stations participating in model training.
FIG. 2 shows a sample data structure of a data terminal provided in an embodiment of the present application, taking the first weather station P_1 as an example. The sample data table I_1 shown in FIG. 2 represents the first sample set of P_1: sample datum x_1^1 has identifier 1, sample datum x_2^1 has identifier 2, sample datum x_3^1 has identifier 3, …, and x_{N_1}^1 has identifier N_1. The feature set of sample x_q^1 comprises the sample features x_{q,1}^1, …, x_{q,T}^1, and the corresponding sample label is y_q^1. In the table shown in FIG. 2, the row with identifier 1 holds the sample data of sample 1 of P_1: feature x_{1,1}^1 has the value 12, feature x_{1,2}^1 has the value 17, feature x_{1,3}^1 has the value 10, …, feature x_{1,T}^1 has the value 54, and the sample label y_1^1 is 0. The row with identifier 2 holds the sample data of sample 2 of P_1, and so on.
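The table of FIG. 2 can be represented as plain Python records. The row for sample 1 uses the feature values quoted above (12, 17, 10, …, 54, label 0); the row for sample 2 and all field names are hypothetical:

```python
# Hypothetical rows of the first weather station's sample table I_1:
# each record has a sample ID, a feature set, and a rainfall label
# (0 = no rain, 1 = light rain, ...).
samples_p1 = [
    {"id": 1, "features": {"x1": 12, "x2": 17, "x3": 10, "xT": 54}, "label": 0},
    {"id": 2, "features": {"x1": 14, "x2": 19, "x3": 8, "xT": 60}, "label": 1},
]

# Split into a feature matrix and a label vector for training.
feature_matrix = [list(s["features"].values()) for s in samples_p1]
labels = [s["label"] for s in samples_p1]
```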
Taking the first weather station P_1 as an example, the training process of the first terminal of P_1 and the second terminal of the second weather station P_2 during model training is described. FIG. 3 shows a flow diagram of a model training process provided in an embodiment of the present application. Since the model training process of every weather station is the same, FIG. 3 only shows the first weather station P_1; it should be appreciated that when M weather stations perform training of the model simultaneously, the M-1 weather stations from the second weather station P_2 to the M-th weather station P_M also perform model training locally, and their model training process is the same as that of P_1.
S201: and the first terminal converts each sample in the first sample set into a hash value to obtain a first hash table corresponding to the first sample set.
The first sample set I_1 is the set of samples collected by the first weather station P_1; it comprises N_1 sample data, each sample including a sample feature set (temperature, humidity, wind speed and air pressure) and a sample label indicating a meteorological condition. The data must be encrypted before training of the model. For a single sample datum x_q^1 of P_1, L hash values are generated using the hash function δ_{a,b}(v) = cossim(a, v) + b, where a is a d-dimensional random vector, v is the d-dimensional sample data, b is a random number in [0, 1] set jointly by the weather stations, and {δ_k}, k = 1, 2, …, L, denotes the L hash functions corresponding to different random vectors a and random numbers b. Each sample datum is thus mapped by the hash functions to strings of fixed length. FIG. 4 shows a schematic diagram of the data structure obtained by encrypting the sample set of a data terminal provided in an embodiment of the present application.
As shown in FIG. 4, in the data-structure diagram of the encrypted sample set of the first weather station P_1, sample datum x_1^1 yields L hash values after processing by the hash functions, sample datum x_2^1 likewise yields L hash values, and so on. With N_1 samples in total, the first weather station P_1 obtains a first hash table of size N_1 × L.
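A minimal sketch of this hashing step, assuming the formula δ_{a,b}(v) = cossim(a, v) + b as described above. The patent maps each value to a fixed-length string; this illustration keeps the raw numeric hash values, and all function names are hypothetical:

```python
import math
import random

def cossim(a, v):
    # Cosine similarity between vectors a and v.
    dot = sum(x * y for x, y in zip(a, v))
    na = math.sqrt(sum(x * x for x in a))
    nv = math.sqrt(sum(x * x for x in v))
    return dot / (na * nv)

def make_hash_functions(L, d, seed=0):
    # L functions delta_{a,b}(v) = cossim(a, v) + b; a shared seed
    # models the stations agreeing on the same a and b values.
    rng = random.Random(seed)
    funcs = []
    for _ in range(L):
        a = [rng.gauss(0, 1) for _ in range(d)]   # random d-dim vector
        b = rng.random()                          # random number in [0, 1]
        funcs.append(lambda v, a=a, b=b: cossim(a, v) + b)
    return funcs

def hash_table(samples, funcs):
    # One row of L hash values per sample: an N x L table.
    return [[f(v) for f in funcs] for v in samples]

funcs = make_hash_functions(L=4, d=4)
table_p1 = hash_table([[29, 73, 27, 1009], [5, 40, 10, 1013]], funcs)
```

Because cossim lies in [-1, 1] and b in [0, 1], every hash value falls in [-1, 2]; both stations must use the same `funcs` for comparisons across tables to be meaningful.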
S202: and the second terminal converts each sample in the third sample set into a hash value to obtain a second hash table corresponding to the third sample set, and sends the second hash table to the first terminal.
The third sample set I_2 consists of the samples collected by the second weather station P_2; each sample comprises a sample feature set (temperature, humidity, wind speed and air pressure) and a sample label indicating the weather condition. The second hash table comprises the sample identifier corresponding to each sample in the third sample set and the hash values corresponding to each sample. Similarly to the first terminal, for the third sample set I_2, the second terminal of P_2 generates an N_2 × L second hash table according to the hash functions and transmits it to the first weather station P_1.
It should be understood that when M terminals participate in model training, the terminal of every participating weather station performs the operation of the second terminal, generating the hash table of its own samples and sending it to the first terminal; that is, each weather station P_i generates an N_i × L hash table and sends it to the first terminal, and all weather stations use the same hash functions.
S203: the first terminal receives a second hash table sent by the second terminal, obtains a sample identification set according to the first hash table and the second hash table, and sends the sample identification set to the second terminal.
The second hash table comprises an identifier corresponding to each sample in the third sample set and a hash value corresponding to each sample, and the first terminal determines the sample identifier corresponding to the sample most similar to each sample in the first sample set in the third sample set according to the first hash table and the second hash table, so that a sample identifier set is obtained.
Specifically, for a given hash value in the first hash table, which corresponds to a certain sample in the first sample set, the first terminal searches the second hash table for the closest hash value and determines the sample identifier corresponding to that closest value. When a sample identifier has been determined for every sample represented in the first hash table, the set of all these identifiers is the sample identification set, and the sample corresponding to the closest hash value is the most similar sample to the given sample in the first sample set.
When M terminals are involved in model training, the sample identification set comprises, for each weather station participating in training, the identifiers of the samples most similar to the local samples of the first weather station P_1, one for each sample in P_1's first sample set. According to the M-1 hash tables received from the other weather stations, P_1 compares each of them with its first hash table and determines, among the samples collected by each other station, the identifiers of the samples most similar to its first sample set; that is, N_1 sample identifiers are determined from each weather station, where the sample data corresponding to each identifier is most similar to the sample data of one of P_1's samples. The sample identification set is then sent to the terminals of the other M-1 weather stations participating in model training.
Since the first weather station P_1 has N_1 samples, for the sample data x_q^1 of each sample q the identifier of the most similar sample can be found in every other weather station P_i (i = 2, 3, 4, …, M); when i = 1, the most similar sample is P_1's own sample q. The most similar sample is obtained by comparing the L hash values of the individual samples in the hash tables: since each hash value is a string of fixed length, the most similar sample is found by comparing the magnitudes of the strings, i.e., the two samples whose compared strings are closest in magnitude are the most similar.
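One plausible reading of "closest hash values" is a smallest-distance search over the L values of each sample; the distance metric below (summed absolute difference over the L values) is an assumption of this sketch, not stated by the patent:

```python
# For each row (sample) of the first hash table, find the identifier of
# the sample in the second hash table whose L hash values are closest.
def most_similar_ids(table_a, table_b, ids_b):
    result = []
    for row_a in table_a:
        best_id, best_dist = None, float("inf")
        for sid, row_b in zip(ids_b, table_b):
            # Summed absolute difference as the "closeness" measure.
            dist = sum(abs(x - y) for x, y in zip(row_a, row_b))
            if dist < best_dist:
                best_id, best_dist = sid, dist
        result.append(best_id)
    return result

# Toy tables with L = 2 hash values per sample; IDs are hypothetical.
ids = most_similar_ids([[0.1, 0.2]],
                       [[0.9, 0.9], [0.15, 0.25]],
                       ids_b=[45, 46])
```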
FIG. 5 schematically shows a data-structure diagram of the sample identification set determined by the first weather station P_1 according to the hash tables. In the table of FIG. 5, each row gives the identifiers of the samples of the other weather stations most similar to one sample of P_1, and each column gives, for one weather station, the identifiers of its samples most similar to the samples of P_1. Taking the second weather station P_2 as an example, the second column in FIG. 5 contains the identifiers of the samples of P_2 most similar to the samples of P_1; for instance, the sample of P_2 most similar to a given sample of P_1 is the sample of P_2 whose identifier is 45.
It should be noted that, since similar samples are sought for the samples of the first weather station P_1, and P_1 has N_1 samples, each weather station finds N_1 sample identifiers in one-to-one correspondence with the samples of P_1; that is, P_1 obtains an N_1 × M sample identification set S_1. For example, if P_1 has 100 samples and the second weather station P_2 has 150 samples, then for each of P_1's 100 samples the identifier of one most similar sample is determined from P_2, and the 100 identifiers so determined may be mutually distinct or may partially coincide. It follows that when similar samples are sought for the i-th weather station P_i, station P_i obtains a sample identification set S_i of size N_i × M and broadcasts it.
S204: the first terminal calculates a first-order gradient set and a second-order gradient set of a loss function of the model to be trained according to each sample in the first sample set.
One gradient value in the first-order gradient set is calculated from one sample in the first sample set, and likewise one gradient value in the second-order gradient set is calculated from one sample in the first sample set. Using the formulas g_{1q} = l′(y_q^1, ŷ_q^1) and h_{1q} = l″(y_q^1, ŷ_q^1), the first weather station P_1 calculates the first-order gradient g_{1q} and the second-order gradient h_{1q} of a sample q, where y_q^1 denotes the sample label of sample q of P_1, ŷ_q^1 denotes the predicted value of sample q, which is a given initial value when the first tree is constructed since no model has been trained at that point, l( ) denotes the loss function of the model to be trained, l′( ) denotes the first derivative of the loss function, and l″( ) denotes its second derivative. Sample q is any sample of P_1's first sample set, i.e., the first weather station P_1 calculates a first-order gradient and a second-order gradient for each of its samples.
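Taking binary logloss as one concrete choice of loss function (an assumption of this sketch; the patent leaves the loss unspecified), the per-sample first and second derivatives l′ and l″ with respect to the raw prediction reduce to the familiar p - y and p(1 - p):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gradients(labels, raw_scores):
    # Binary logloss: g_q = p_q - y_q,  h_q = p_q * (1 - p_q),
    # where p_q = sigmoid(raw prediction of sample q).
    first, second = [], []
    for y, z in zip(labels, raw_scores):
        p = sigmoid(z)
        first.append(p - y)
        second.append(p * (1 - p))
    return first, second

# Raw prediction 0.0 for every sample when building the first tree.
g_set, h_set = gradients(labels=[0, 1, 1], raw_scores=[0.0, 0.0, 0.0])
```

With a zero initial score, every sample has p = 0.5, giving g = ±0.5 depending on the label and h = 0.25 throughout.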
It will be appreciated that when M terminals are involved in model training, the terminal of each weather station involved in model training performs the above operation of the first weather station P1: calculating the first-order gradient set and the second-order gradient set of the loss function using its own sample set. The choice of loss function is not limited in this application; for example, the loss function may be logloss.
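For instance, if the loss function is logloss over a sigmoid-transformed score, the per-sample first-order and second-order gradients have the well-known closed forms g = p − y and h = p(1 − p); the sketch below assumes that choice (the function and variable names are illustrative, not from the patent):

```python
import math

def logloss_gradients(y, y_hat):
    """First- and second-order gradients of logistic loss
    l(y, s) = -[y*log(p) + (1-y)*log(1-p)], with p = sigmoid(s)."""
    p = 1.0 / (1.0 + math.exp(-y_hat))
    g = p - y          # first derivative w.r.t. the raw score s
    h = p * (1.0 - p)  # second derivative w.r.t. the raw score s
    return g, h

# Gradients for each sample in a toy first sample set:
samples = [(1, 0.0), (0, 0.5)]          # (label, current prediction score)
grads = [logloss_gradients(y, s) for y, s in samples]
```

Before the first tree is built, `y_hat` is simply the given initial score, as the text notes.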
S205: the second terminal receives the sample identification set sent by the first terminal, determines a second sample set in a third sample set according to the sample identification set, calculates an aggregate first-order gradient set and an aggregate second-order gradient set of a loss function of the model to be trained according to each sample in the second sample set, and sends the aggregate first-order gradient set and the aggregate second-order gradient set to the first terminal.
Wherein the second sample set comprises the samples determined in the third sample set to be most similar to each sample in the first sample set. The second terminal of the second weather station P2 receives the sample identification set sent by the first weather station P1, and finds the second sample set in the third sample set based on the sample identification set. The formulas g_2q = l'(y_2q, ŷ_2q) and h_2q = l''(y_2q, ŷ_2q) are likewise used to calculate the aggregate first-order gradient set and the aggregate second-order gradient set of the loss function over the second sample set, which are sent to the first weather station P1, where y_2q represents the sample label of sample q of the second weather station P2, and ŷ_2q represents the predicted value of sample q, which is a given initial value when constructing the first tree, since no model has been trained at this time; the function l() represents the loss function of the model to be trained, the function l'() represents its first derivative, and the function l''() represents its second derivative. Since the first weather station P1 has N1 samples in total, the second terminal calculates N1 aggregate first-order gradients and N1 aggregate second-order gradients. It should be understood that these N1 aggregate first-order gradients may be different from each other or may be partially the same; similarly, the N1 aggregate second-order gradients may be different or partially the same.
It should be understood that when M terminals are involved in the model training, the terminal of each weather station involved in the model training performs the operation of the second terminal, that is, finds a similar sample set according to the received sample identification set, calculates an aggregate first-order gradient set and an aggregate second-order gradient set of the similar sample set, and sends the aggregate first-order gradient set and the aggregate second-order gradient set to the first weather station.
For example, when the weather stations involved in model training further include a third weather station P3, a fourth weather station P4, …, and an M-th weather station PM, the terminal of the third weather station finds, according to the sample identification set sent by the first weather station, the samples in the third weather station that are similar to those of the first weather station, calculates the gradient values of these similar samples and sends them to the first weather station; the terminal of the fourth weather station does likewise, and so on. Thus the first weather station obtains the gradient values of the first sample set, as well as the gradient values of the samples similar to the first sample set in every other weather station participating in model training.
S206: the first terminal receives an aggregate first-order gradient set and an aggregate second-order gradient set which are sent by the second terminal and are calculated according to the second sample set, and trains the model to be trained according to the first-order gradient set, the aggregate first-order gradient set, the second-order gradient set, the aggregate second-order gradient set and the first sample set to obtain a trained model.
(1) The first terminal updates the first-order gradients of the first sample set according to the first-order gradient set and the aggregate first-order gradient set to obtain a first-order sample gradient set, and updates the second-order gradients of the first sample set according to the second-order gradient set and the aggregate second-order gradient set to obtain a second-order sample gradient set.
When the weather stations participating in model training comprise only the first weather station P1 and the second weather station P2, then for a certain sample in the first sample set, the first terminal takes the sum of the first-order gradient of the loss function of that sample and the aggregate first-order gradient corresponding to the sample of the second weather station P2 most similar to it as the first-order sample gradient of that sample, and takes the sum of the second-order gradient of the loss function of that sample and the aggregate second-order gradient corresponding to the sample of the second weather station P2 most similar to it as the second-order sample gradient of that sample. The first terminal thereby obtains the first-order sample gradient set and the second-order sample gradient set of the first sample set of the first weather station P1.
It will be appreciated that when M terminals are involved in model training, the first terminal of the first weather station P1 receives the aggregate first-order gradient sets and aggregate second-order gradient sets sent by the terminals of the other weather stations; that is, for each sample q of the first sample set, the first weather station P1 obtains aggregate first-order gradients g_2q, …, g_Mq and aggregate second-order gradients h_2q, …, h_Mq. The first weather station P1 then updates the first-order sample gradient of sample q as G_1q = g_1q + g_2q + … + g_Mq, and the second-order sample gradient of sample q as H_1q = h_1q + h_2q + … + h_Mq, and uses the updated first-order and second-order gradients for training of the model.
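A minimal sketch of this update step, assuming each peer's aggregate gradient list is ordered to match the first sample set (function and variable names are hypothetical):

```python
def aggregate_gradients(local_g, local_h, peer_g_sets, peer_h_sets):
    """Sum the local per-sample gradients with the aggregate gradients
    received from every other participating weather station."""
    G = list(local_g)   # first-order sample gradients, one per sample
    H = list(local_h)   # second-order sample gradients, one per sample
    for peer_g in peer_g_sets:
        for q, g in enumerate(peer_g):
            G[q] += g
    for peer_h in peer_h_sets:
        for q, h in enumerate(peer_h):
            H[q] += h
    return G, H
```

Note that only gradient values cross station boundaries here; the peers' raw samples do not.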
(2) The first terminal trains the XGBoost model using the first sample set, the first-order sample gradient set and the second-order sample gradient set, thereby obtaining the weather prediction model of the first weather station P1.
When training the weather prediction model of the first weather station P1, the first-order sample gradient set and the second-order sample gradient set of the first sample set are used as the first-order gradient set and the second-order gradient set for training, and the first gradient tree of the model is trained.
Because the process of training the model is a process of continuously constructing regression trees, in particular, when constructing the first tree, splitting is needed at the root node: the first sample set is divided at that node into the two sets of a left child node and a right child node, the sample gradient values of the samples are used to calculate G_L, G_R, H_L and H_R of the two sets, and the formula

Gain = 1/2 × [ G_L² / (H_L + λ) + G_R² / (H_R + λ) − (G_L + G_R)² / (H_L + H_R + λ) ] − γ

is then used to calculate the gain, where λ and γ are regularization parameters; the maximum value of the gain Gain is used as the criterion for judging the optimal splitting point.
Wherein G_L represents the sum of the first-order sample gradients of the set of sample points in the left leaf node after the split, G_R represents the sum of the first-order sample gradients of the set of sample points in the right leaf node, H_L represents the sum of the second-order sample gradients of the set of sample points in the left leaf node, and H_R represents the sum of the second-order sample gradients of the set of sample points in the right leaf node.
Specifically, a candidate division point is determined from the sample feature set of the sample data, the first sample set is divided by that point into the two sets of the left child node and the right child node, and the Gain is calculated; this is repeated for the different candidate division points.
For example, for a certain sample feature, if the sample data of the first weather station P1 takes the data values {12, 15, 20, 30, 35} on that feature, the Gain is calculated using 12, 15, 20, 30 and 35 in turn as the division point. The next sample feature is traversed in the same way, its Gain calculated, and so on.
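The division-point search over one feature can be sketched as follows, using the gain formula above; `lam` and `gamma` stand for the regularization parameters λ and γ, whose default values here are arbitrary assumptions (the patent does not fix them):

```python
def split_gain(G_L, H_L, G_R, H_R, lam=1.0, gamma=0.0):
    """Gain of splitting a node into left/right children."""
    def score(G, H):
        return G * G / (H + lam)
    return 0.5 * (score(G_L, H_L) + score(G_R, H_R)
                  - score(G_L + G_R, H_L + H_R)) - gamma

def best_split(values, g, h, lam=1.0, gamma=0.0):
    """Try every distinct feature value as a division point and return
    the (value, gain) pair with the maximum gain."""
    best = (None, float("-inf"))
    for v in sorted(set(values)):
        left = [i for i, x in enumerate(values) if x < v]
        right = [i for i, x in enumerate(values) if x >= v]
        if not left or not right:
            continue  # division point puts all samples on one side
        gain = split_gain(sum(g[i] for i in left), sum(h[i] for i in left),
                          sum(g[i] for i in right), sum(h[i] for i in right),
                          lam, gamma)
        if gain > best[1]:
            best = (v, gain)
    return best
```

For the example values {12, 15, 20, 30, 35}, the loop evaluates each value as a threshold and keeps the one maximizing the gain, matching the traversal described above.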
The splitting point with the maximum Gain is taken as the root node, the sample sets of the left and right child nodes after splitting are obtained, and whether to continue splitting is judged according to the depth of the tree. If a split child node is left with only one sample, that node does not need to split any more and becomes a leaf node, whose weight is calculated according to the formula

w_i = − G_i / (H_i + λ)

where G_i is the sum of the first-order sample gradient statistics of all samples falling into leaf i, and H_i is the sum of the second-order sample gradient statistics of all samples falling into leaf i.
If the depth of the tree has not been reached, the same splitting operation continues to be performed on the left and right child nodes; that is, each child node is treated as a root node and the above process is repeated.
If the depth of the tree is reached, the nodes of the tree can no longer split, and the weights w_i of the leaf nodes are calculated, thereby completing the training of the first gradient tree.
When constructing the t-th tree (t > 1), the training process is exactly the same as for the previous t − 1 trees, but the input parameters are no longer the initial G_1q and H_1q used by the first tree. Since the t-th tree is fitted on the basis of the previous t − 1 trees, the first-order and second-order gradients are recalculated using the predicted value of each training sample given by the model formed by the previous t − 1 trees; the Gain is then calculated at the splitting points, and finally the optimal splitting points and optimal weights required for constructing this round's gradient tree are determined.
The XGBoost model training is complete when all sample features in the sample feature set have been used in the construction of the model.
It will be appreciated that when M terminals are involved in model training, the terminal of each weather station participating in model training performs the operations of the first weather station P1 as well as the operations of the second weather station P2. It can be seen that each weather station not only contributes its sample data to the model training of the other weather stations, but also obtains a prediction model for its own weather station.
S207: the first terminal predicts the sample to be predicted based on the trained model, and determines the prediction result of the sample to be predicted.
After the first terminal has trained the weather event prediction model, the weather condition of an event can be predicted using a sample to be predicted. The sample feature set of the sample to be predicted is substituted into the trained regression trees; the sample finally falls on a leaf node of each regression tree, and the weight values of the leaf nodes obtained from all the trees are added up to obtain the weather predicted value of the event. The result value is then compared with the sample label values to find the closest one, and the weather condition corresponding to that sample label (no rain, light rain, moderate rain, heavy rain, rainstorm) is the prediction result of the sample to be predicted.
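The prediction step above can be sketched as follows; each "tree" is stood in for by a function returning its leaf weight for a given feature set, and the numeric label values are hypothetical encodings of the five weather conditions:

```python
def predict(trees, features, labels):
    """Sum the leaf weights returned by every regression tree, then map
    the score to the closest sample-label value."""
    score = sum(tree(features) for tree in trees)
    return min(labels, key=lambda v: abs(v - score))

# Toy model: two 'trees', each returning a constant leaf weight
trees = [lambda f: 0.4, lambda f: 0.7]
labels = [0, 1, 2, 3, 4]   # e.g. no rain .. rainstorm, encoding assumed
result = predict(trees, {"temperature": 21.0}, labels)
```

Here the summed score 1.1 is closest to the label value 1, which would then be read back as the corresponding weather condition.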
It can be seen that the method uses the idea of federated learning to find, in the sample data of parties other than the model training party, samples similar to those of the model training party, so as to expand the training sample set and construct a more accurate model. Meanwhile, after the similar samples are found, the sample data are not sent directly to the model training party; instead, the gradient values of the loss function of the similar samples are sent, thereby avoiding the problem of data leakage.
Fig. 6 is a schematic diagram of a weather event prediction apparatus provided in an embodiment of the present application, where the weather event prediction apparatus 100 includes a receiving unit 101, a processing unit 102, and a transmitting unit 103, and the apparatus is capable of performing the operations of the first terminal and the second terminal. Wherein, when the weather event prediction apparatus 100 performs the operation of the first terminal:
a receiving unit 101, configured to receive an aggregate first-order gradient set and an aggregate second-order gradient set, which are sent by a second terminal and are calculated according to a second sample set, where the second sample set includes samples similar to each sample in the first sample set; and receiving a second hash table sent by the second terminal, wherein the second hash table comprises a sample identifier corresponding to each sample in the third sample set and a hash value corresponding to each sample.
A processing unit 102, configured to calculate, according to each sample in a first sample set, a first-order gradient set and a second-order gradient set of a loss function of a model to be trained, where one gradient value in the first-order gradient set is calculated according to one sample in the first sample set, and one gradient value in the second-order gradient set is calculated according to one sample in the first sample set, and the first sample set is a set of samples collected by a first weather station; training the model to be trained according to the first-order gradient set, the second-order gradient set and the first sample set to obtain a trained model; converting each sample in the first sample set into a hash value to obtain a first hash table corresponding to the first sample set; and determining sample identifiers corresponding to samples most similar to each sample of the first sample set in a third sample set according to the first hash table and the second hash table, and obtaining a sample identifier set.
And a sending unit 103, configured to send the sample identifier set to the second terminal, so that the second terminal determines the second sample set according to the sample identifier in the sample identifier set.
When the weather event prediction apparatus 100 performs the operation of the second terminal:
a receiving unit 101, configured to receive a set of sample identifiers sent by the first terminal, where each sample identifier in the set of sample identifiers indicates a sample in the third set of samples.
The processing unit 102 is configured to convert each sample in the third sample set into a hash value, so as to obtain a second hash table corresponding to the third sample set; and determining a second sample set in the third sample set according to the identification set, calculating an aggregate first-order gradient set and an aggregate second-order gradient set of the loss function of the model to be trained according to each sample in the second sample set, and sending the aggregate first-order gradient set and the aggregate second-order gradient set to the first terminal.
And the sending unit 103 is configured to send a second hash table to the first terminal, where the second hash table includes a sample identifier corresponding to each sample in a third sample set and a hash value corresponding to each sample, and the third sample set is a set of samples collected by the second weather station.
In particular, the weather event prediction apparatus 100 may refer to the related operation of the first terminal in the method embodiment, which is not described herein in detail.
Fig. 7 is a schematic structural diagram of a computing device according to an embodiment of the present application, where the computing device 200 includes: processor 210, communication interface 220, and memory 230, processor 210, communication interface 220, and memory 230 being interconnected by bus 240, wherein processor 210 is configured to execute instructions stored by memory 230. The memory 230 stores program code, and the processor 210 may call the program code stored in the memory 230 to:
the weather event prediction device calculates a first-order gradient set and a second-order gradient set of a loss function of a model to be trained according to each sample in a first sample set, wherein one gradient value in the first-order gradient set is calculated according to one sample in the first sample set, one gradient value in the second-order gradient set is calculated according to one sample in the first sample set, and the first sample set is a set of samples collected by a first weather station; receiving an aggregate first-order gradient set and an aggregate second-order gradient set which are sent by a second terminal and are calculated according to a second sample set, wherein the second sample set comprises samples similar to each sample in the first sample set; training the model to be trained according to the first-order gradient set, the aggregation first-order gradient set, the second-order gradient set, the aggregation second-order gradient set and the first sample set to obtain a trained model; and predicting the sample to be predicted based on the trained model, and determining the prediction result of the sample to be predicted.
In the embodiment of the present application, the processor 210 may take various specific implementation forms. For example, the processor 210 may be any one or a combination of multiple processors, such as a central processing unit (central processing unit, CPU), a graphics processing unit (graphics processing unit, GPU), a tensor processing unit (tensor processing unit, TPU), or a neural network processing unit (neural network processing unit, NPU), and the processor 210 may be a single-core processor or a multi-core processor. The processor 210 may also be formed from a combination of a CPU (or GPU, TPU, or NPU) and a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (programmable logic device, PLD), or a combination thereof. The PLD may be a complex programmable logic device (complex programmable logic device, CPLD), a field-programmable gate array (field-programmable gate array, FPGA), a generic array logic (generic array logic, GAL), or any combination thereof. The processor 210 may also be implemented solely with a logic device incorporating processing logic, such as an FPGA or a digital signal processor (digital signal processor, DSP).
The communication interface 220 may be a wired interface or a wireless interface, for communicating with other modules or devices; the wired interface may be an ethernet interface, a controller area network (controller area network, CAN) interface, or a local interconnect network (local interconnect network, LIN) interface, and the wireless interface may be a cellular network interface, a wireless LAN interface, or the like.
The memory 230 may be a nonvolatile memory such as a read-only memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an electrically Erasable EPROM (EEPROM), or a flash memory. Memory 230 may also be volatile memory, which may be random access memory (random access memory, RAM) used as external cache memory.
Memory 230 may also be used to store instructions and data, such that the processor 210 invokes the instructions stored in the memory 230 to perform the operations performed by the processing unit 102 or the operations performed by the weather event prediction device of the method embodiment described above. Moreover, computing device 200 may contain more or fewer components than shown in FIG. 7, or may have a different configuration of components.
The bus 240 may be classified into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 7, but this does not mean that there is only one bus or only one type of bus.
Optionally, the computing device 200 may further include an input/output interface 250, where the input/output interface 250 is connected to an input/output device, for receiving input information and outputting operation results.
It should be understood that the computing device 200 of the embodiment of the present application may correspond to the weather event prediction apparatus 100 of the embodiment described above, and may perform the operations performed by the weather event prediction apparatus of the method embodiment described above, which are not described here again.
The embodiments of the present application further provide a non-transitory computer storage medium, where instructions are stored; when the instructions run on a processor, the method steps in the foregoing method embodiments may be implemented. For the specific implementation of the method steps by the processor of the computer storage medium, reference may be made to the specific operations of the foregoing method embodiments, which are not repeated here.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, the above-described embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave, etc.). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that contains one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium, or a semiconductor medium, which may be a solid state disk.
The foregoing is merely a specific embodiment of the present application. Variations and alternatives will occur to those skilled in the art from the detailed description provided herein and are intended to be included within the scope of the present application.

Claims (8)

1. A weather event prediction method, the weather event prediction method being applied to a weather prediction system including a first terminal located at a first weather station and a second terminal located at a second weather station, the method comprising:
the first terminal calculates a first-order gradient set and a second-order gradient set of a loss function of a model to be trained according to each sample in a first sample set, wherein one gradient value in the first-order gradient set is calculated according to one sample in the first sample set, one gradient value in the second-order gradient set is calculated according to one sample in the first sample set, and the first sample set is a set of samples collected by a first weather station;
the first terminal converts each sample in the first sample set into a hash value to obtain a first hash table corresponding to the first sample set;
the first terminal receives a second hash table sent by the second terminal, wherein the second hash table comprises a sample identifier corresponding to each sample in a third sample set and a hash value corresponding to each sample, and the third sample set is a set of samples collected by the second weather station;
The first terminal determines sample identifiers corresponding to samples most similar to each sample of the first sample set in the third sample set according to the first hash table and the second hash table, and a sample identifier set is obtained;
the first terminal sends the sample identification set to the second terminal so that the second terminal can determine a second sample set according to the sample identification in the sample identification set, wherein the second sample set comprises samples which are determined in a third sample set and are most similar to each sample in the first sample set;
the first terminal receives an aggregate first-order gradient set and an aggregate second-order gradient set which are sent by the second terminal and are calculated according to the second sample set;
the first terminal trains the model to be trained according to the first-order gradient set, the aggregation first-order gradient set, the second-order gradient set, the aggregation second-order gradient set and the first sample set to obtain a trained model;
and the first terminal predicts the sample to be predicted based on the trained model, and determines the prediction result of the sample to be predicted.
2. The method of claim 1, wherein each sample comprises a sample feature set and a sample tag, the sample feature set comprising temperature, humidity, wind speed, and air pressure, the sample tag indicating a meteorological condition.
3. A weather event prediction method, the weather event prediction method being applied to a weather prediction system including a first terminal located at a first weather station and a second terminal located at a second weather station, the method comprising:
the second terminal sends a second hash table to the first terminal, wherein the second hash table comprises a sample identifier corresponding to each sample in a third sample set and a hash value corresponding to each sample, and the third sample set is a set of samples collected by the second weather station;
the second terminal receives a sample identification set sent by the first terminal, each sample identification in the identification set indicates one sample in the third sample set, each sample identification in the sample identification set is a sample identification corresponding to a sample which is most similar to each sample in the first sample set in the third sample set according to a first hash table and the second hash table, the first hash table comprises a hash value corresponding to each sample in the first sample set, and the first sample set is a set of samples collected by a first weather station;
the second terminal determines a second sample set in the third sample set according to the sample identification set, calculates an aggregate first-order gradient set and an aggregate second-order gradient set of a loss function of a model to be trained according to each sample in the second sample set, and sends the aggregate first-order gradient set and the aggregate second-order gradient set to the first terminal, so that the first terminal trains the model to be trained according to the first-order gradient set of the loss function of the model to be trained, the aggregate first-order gradient set, the second-order gradient set of the loss function of the model to be trained, the aggregate second-order gradient set, and the first sample set to obtain a trained model, predicts a sample to be predicted based on the trained model, and determines a prediction result of the sample to be predicted, wherein the first-order gradient set and the second-order gradient set are calculated by the first terminal for each sample in the first sample set.
4. The method of claim 3, wherein before the second terminal sends the second hash table to the first terminal, further comprising: and the second terminal converts each sample in the third sample set into a hash value to obtain a second hash table corresponding to the third sample set.
5. The method of claim 3 or 4, wherein each sample comprises a sample feature set comprising temperature, humidity, wind speed, and air pressure, and a sample label indicating a meteorological condition.
6. A weather event prediction apparatus, the apparatus comprising: one or more functional modules for performing the method of claim 1 or 2; or for performing the method of any one of claims 3 to 5.
7. A computer device comprising a processor and a memory, the memory for storing instructions, the processor for executing the instructions, which when executed by the processor, perform the method of claim 1 or 2; or performing the method of any one of claims 3 to 5.
8. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which is executed by a processor to implement the method of claim 1 or 2; or performing the method of any one of claims 3 to 5.
CN202011312818.9A 2020-11-20 2020-11-20 Meteorological event prediction method and device and related equipment Active CN112381307B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011312818.9A CN112381307B (en) 2020-11-20 2020-11-20 Meteorological event prediction method and device and related equipment
PCT/CN2021/083026 WO2021203980A1 (en) 2020-11-20 2021-03-25 Meteorological event prediction method and apparatus, and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011312818.9A CN112381307B (en) 2020-11-20 2020-11-20 Meteorological event prediction method and device and related equipment

Publications (2)

Publication Number Publication Date
CN112381307A CN112381307A (en) 2021-02-19
CN112381307B true CN112381307B (en) 2023-12-22

Family

ID=74584503

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011312818.9A Active CN112381307B (en) 2020-11-20 2020-11-20 Meteorological event prediction method and device and related equipment

Country Status (2)

Country Link
CN (1) CN112381307B (en)
WO (1) WO2021203980A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112381307B (en) * 2020-11-20 2023-12-22 平安科技(深圳)有限公司 Meteorological event prediction method and device and related equipment
CN113722739B (en) * 2021-09-06 2024-04-09 京东科技控股股份有限公司 Gradient lifting tree model generation method and device, electronic equipment and storage medium
CN113762805A (en) * 2021-09-23 2021-12-07 国网湖南省电力有限公司 Mountain forest fire early warning method applied to power transmission line
CN114239862A (en) * 2021-12-23 2022-03-25 电子科技大学 anti-Byzantine attack federal learning method for protecting user data privacy
CN114091624B (en) * 2022-01-18 2022-04-26 蓝象智联(杭州)科技有限公司 Federal gradient lifting decision tree model training method without third party
CN114626458B (en) * 2022-03-15 2022-10-21 中科三清科技有限公司 High-voltage rear part identification method and device, storage medium and terminal
CN115794981B (en) * 2022-12-14 2023-09-26 广西电网有限责任公司 Method and system for counting meteorological data by using model

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109472283A (en) * 2018-09-13 2019-03-15 中国科学院计算机网络信息中心 A kind of hazardous weather event prediction method and apparatus based on Multiple Incremental regression tree model
CN109783682A (en) * 2019-01-19 2019-05-21 北京工业大学 It is a kind of based on putting non-to the depth of similarity loose hashing image search method
WO2020029590A1 (en) * 2018-08-10 2020-02-13 深圳前海微众银行股份有限公司 Sample prediction method and device based on federated training, and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10345485B2 (en) * 2015-10-07 2019-07-09 Forensic Weather Consultants, Llc Forensic weather system
CN111144576A (en) * 2019-12-13 2020-05-12 支付宝(杭州)信息技术有限公司 Model training method and device and electronic equipment
CN111695697B (en) * 2020-06-12 2023-09-08 深圳前海微众银行股份有限公司 Multiparty joint decision tree construction method, equipment and readable storage medium
CN112381307B (en) * 2020-11-20 2023-12-22 平安科技(深圳)有限公司 Meteorological event prediction method and device and related equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020029590A1 (en) * 2018-08-10 2020-02-13 深圳前海微众银行股份有限公司 Sample prediction method and device based on federated training, and storage medium
CN109472283A (en) * 2018-09-13 2019-03-15 中国科学院计算机网络信息中心 Hazardous weather event prediction method and apparatus based on a multiple incremental regression tree model
CN109783682A (en) * 2019-01-19 2019-05-21 北京工业大学 Deep non-relaxed hashing image retrieval method based on point-pair similarity

Also Published As

Publication number Publication date
WO2021203980A1 (en) 2021-10-14
CN112381307A (en) 2021-02-19

Similar Documents

Publication Publication Date Title
CN112381307B (en) Meteorological event prediction method and device and related equipment
CN112738034B (en) Block chain phishing node detection method based on vertical federal learning
CN111953669B (en) Tor flow tracing and application type identification method and system suitable for SDN
CN111835763B (en) DNS tunnel traffic detection method and device and electronic equipment
CN111461410A (en) Air quality prediction method and device based on transfer learning
CN109787699B (en) Wireless sensor network routing link state prediction method based on mixed depth model
WO2020211482A1 (en) Network topology information acquisition method, apparatus, device, and storage medium
CN111294812A (en) Method and system for resource capacity expansion planning
CN116827873A (en) Encryption application flow classification method and system based on local-global feature attention
Tan et al. Recognizing the content types of network traffic based on a hybrid DNN-HMM model
CN115460608A (en) Method and device for executing network security policy and electronic equipment
CN117829307A (en) Federal learning method and system for data heterogeneity
CN111460277B (en) Personalized recommendation method based on mobile social network tree-shaped transmission path
CN115002031B (en) Federal learning network flow classification model training method, model and classification method based on unbalanced data distribution
CN114979017B (en) Deep learning protocol identification method and system based on original flow of industrial control system
Strelkovskaya et al. Multimedia traffic prediction based on wavelet- and spline-extrapolation
CN116319437A (en) Network connectivity detection method and device
CN110163249B (en) Base station classification identification method and system based on user parameter characteristics
CN113114677A (en) Botnet detection method and device
Vegni et al. Analysis of small-world features in vehicular social networks
CN110020087A Distributed PageRank acceleration method based on similarity estimation
WO2024124975A1 (en) Network quality evaluation method, apparatus and device, and storage medium and program product
CN117592556B (en) Semi-federal learning system based on GNN and operation method thereof
CN116744305B (en) Communication system based on safety control of 5G data communication process
CN114938333B (en) Power distribution station room end side node access method, device, chip, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant