WO2023051455A1 - Method and apparatus for training a trust model - Google Patents

Method and apparatus for training a trust model

Info

Publication number
WO2023051455A1
WO2023051455A1 (PCT/CN2022/121297; CN2022121297W)
Authority
WO
WIPO (PCT)
Prior art keywords
network device
data
network
trust
network devices
Prior art date
Application number
PCT/CN2022/121297
Other languages
English (en)
Chinese (zh)
Inventor
康鑫
王海光
朱成康
李铁岩
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2023051455A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/23 - Clustering techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods

Definitions

  • the present application relates to the field of communication technology and the field of artificial intelligence technology, and in particular to a method and device for training a trust model.
  • a network device may be maliciously attacked by other network devices, rendering the network device unusable.
  • the private data of a network device may also be obtained by other network devices through illegal means, causing that private data to be leaked.
  • the prior art usually uses a trust model based on machine learning to evaluate the security of each network device, so as to obtain the trust level of each network device.
  • Network devices can determine whether to communicate with other network devices based on their trust levels.
  • a trust model based on machine learning needs to be trained with a large amount of labeled data, and the labeled data generally needs to be labeled manually. Therefore, obtaining the label data requires a high labor cost.
  • the present application provides a trust model training method and device, which uses a combination of threshold judgment and algorithm clustering to label network devices, thereby reducing labor costs for obtaining label data for training trust models.
  • the present application provides a method for training a trust model.
  • the method includes: acquiring communication data of a plurality of network devices; determining, by a feature model, feature data of the plurality of network devices according to the communication data of the plurality of network devices; determining, according to a threshold condition and the feature data of the plurality of network devices, label data of each first network device among the plurality of network devices, where the plurality of network devices include at least one first network device and a plurality of second network devices, and the label data indicates the trust level of a network device; dividing the plurality of second network devices into a preset number of cluster groups and obtaining the label data corresponding to each cluster group as the label data of each second network device in that cluster group, where the preset number corresponds to the number of trust levels; and updating the parameters of the trust model according to the feature data and label data of the plurality of network devices.
  • the training samples of the trust model include label data of multiple network devices.
  • Label data of network devices is generally obtained by manually labeling the network devices. When the number of network devices is large, the cost of obtaining the label data increases and the labeling takes a long time, resulting in low efficiency of training the model.
  • the threshold condition is first used to determine the label data of the first network devices among the plurality of network devices. Then, after the second network devices among the plurality of network devices are clustered to obtain cluster groups, the label manually attached to each cluster group is used as the label data of each second network device in that cluster group.
  • the workload of manual labeling can be reduced, labor costs can be saved, and the efficiency of training models can be improved.
  • determining the label data of each first network device among the plurality of network devices includes: when the feature data of a network device among the plurality of network devices satisfies the threshold condition, determining that the network device is a first network device, and acquiring the label data corresponding to the threshold condition as the label data of the first network device.
  • Using the threshold condition to determine the label of the first network device can reduce the amount of data processing in the subsequent clustering process.
  • the threshold condition includes: the feature data is smaller than a first threshold, and/or the feature data is larger than a second threshold.
  • the threshold condition shown above prevents a network device whose feature data is too large or too small from forming a cluster group on its own during clustering and thereby affecting the clustering result.
  • the communication data includes: data transmission success times and data transmission failure times
  • the feature data includes: data transmission success rate
  • the present application provides a trust evaluation method.
  • the method includes: acquiring communication data of the network device; determining, by a feature model, feature data of the network device according to the communication data of the network device; and determining, by a trust model, the trust level of the network device according to the feature data of the network device.
  • the trust model is trained by the trust model training method provided in the first aspect of the present application.
  • the trust evaluation method further includes: storing the trust level of the network device in a block chain.
  • the trust evaluation method further includes: storing a hash value corresponding to the trust level of the network device in a blockchain, and storing the trust level of the network device in a storage system.
  • the trust evaluation method further includes: broadcasting the trust level of the network device.
  • the present application also provides a training device for a trust model.
  • the device includes: an acquisition module, a feature extraction module, a first determination module, a second determination module and a training module.
  • the obtaining module is used for obtaining communication data of multiple network devices.
  • the feature extraction module is used to determine the feature data of the multiple network devices according to the communication data of the multiple network devices by using a feature model.
  • the first determination module is used to determine the label data of each first network device among the plurality of network devices according to the threshold condition and the feature data of the plurality of network devices; the plurality of network devices include at least one first network device and a plurality of second network devices; the label data indicates the trust level of the network device.
  • the second determination module is used to divide the plurality of second network devices into a preset number of cluster groups, and obtain the label data corresponding to each cluster group as the label data of each second network device in that cluster group; the preset number corresponds to the number of trust levels.
  • the training module is used to update the parameters of the trust model according to the feature data and label data of multiple network devices.
  • the first determination module is specifically configured to: determine that a network device is a first network device when the feature data of the network device among the plurality of network devices meets the threshold condition; and obtain the label data corresponding to the threshold condition as the label data of the first network device.
  • the threshold condition includes: the feature data is smaller than a first threshold, and/or the feature data is larger than a second threshold.
  • the communication data includes: data transmission success times and data transmission failure times
  • the feature data includes: data transmission success rate
  • the present application further provides a trust evaluation device.
  • the device includes: an acquisition module, a feature extraction module and an evaluation module.
  • the obtaining module is used for obtaining communication data of the network device.
  • the feature extraction module is used to determine the feature data of the network device according to the communication data of the network device by using a feature model.
  • the evaluation module is used to determine the trust level of the network device according to the characteristic data of the network device by using a trust model.
  • the evaluation module is further configured to: store the trust level of the network device in the blockchain; or store the hash value corresponding to the trust level of the network device in the blockchain and store the trust level of the network device in the storage system; or broadcast the trust level of the network device.
  • the present application further provides a computing device.
  • the computing device includes: a processor and a memory.
  • the processor is used to execute the computer program stored in the memory to execute any method in the first aspect of the present application and its possible implementations, or to execute any method in the second aspect of the application and its possible implementations.
  • the present application further provides a computer-readable storage medium.
  • the computer-readable storage medium includes instructions. When the instructions are run on a computer, the computer is caused to execute any method in the first aspect of the application and its possible implementations, or any method in the second aspect of the application and its possible implementations.
  • the present application further provides a computer program product.
  • the computer program product includes program code.
  • when a computer runs the computer program product, the computer is caused to execute any method in the first aspect of the application and its possible implementations, or any method in the second aspect of the application and its possible implementations.
  • FIG. 1 is a schematic structural diagram of a heterogeneous network provided by an embodiment of the present application.
  • Fig. 2 is a flow chart of a training method for a trust model provided by an embodiment of the present application
  • FIG. 3 is a flow chart of a method for trust assessment of network devices provided by an embodiment of the present application
  • FIG. 4 is a flowchart of a communication method for a network device provided in an embodiment of the present application.
  • FIG. 5 is a flow chart of another communication method of a network device provided in an embodiment of the present application.
  • FIG. 6 is a flowchart of another communication method of a network device provided in an embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of a training device for a trust model provided by an embodiment of the present application.
  • Fig. 8 is a schematic structural diagram of a trust evaluation device provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of a computing device provided by an embodiment of the present application.
  • words such as “exemplary”, “for example” or “for example” are used to represent examples, illustrations or illustrations. Any embodiment or design described as “exemplary”, “for example” or “for example” in the embodiments of the present application shall not be construed as being more preferred or more advantageous than other embodiments or designs. Rather, the use of words such as “exemplary”, “for example” or “for example” is intended to present related concepts in a specific manner.
  • the term "and/or" is only an association relationship describing associated objects, indicating that there may be three relationships, for example, A and/or B may indicate: A exists alone, A exists alone There is B, and there are three cases of A and B at the same time.
  • the term "plurality" means two or more. For example, multiple systems refer to two or more systems, and multiple screen terminals refer to two or more screen terminals.
  • first and second are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly specifying the indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of these features.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless specifically stated otherwise.
  • Communication network refers to the physical connection of isolated devices to realize information exchange between people, between people and computers, and between computers, so as to achieve resource sharing and communication.
  • the communication network includes network devices on the user side, servers on the cloud, and switches, routers, and base stations connecting network devices and servers.
  • the communication network may be a network using the same access technology, such as a cellular network, an Internet Protocol (IP) network, or a satellite network.
  • Network devices can be smart watches, smartphones, laptops, etc.
  • the communication network can also be a highly integrated heterogeneous network composed of two or more single networks using different access technologies, or a heterogeneous network composed of two or more single networks that use the same access technology but belong to different operators.
  • heterogeneous networks are becoming more and more important in the communication field. Compared with independent and closed communication networks such as cellular networks, IP networks, and satellite networks, heterogeneous networks can expand the coverage of the network and make the network more scalable. Heterogeneous networks can also make full use of existing network resources, reduce operating costs, and enhance competitiveness. Heterogeneous networks can also provide various services to different users, which can better meet the diverse needs of network users. Heterogeneous networks can also improve network reliability and anti-attack capabilities.
  • FIG. 1 is a schematic structural diagram of a heterogeneous network provided by an embodiment of the present application.
  • the heterogeneous network includes: a network 1 and a network 2 .
  • network 1 may be a cellular network
  • network 2 may be an IP network. It can be understood that the structure shown in FIG. 1 is only an example of a heterogeneous network in this embodiment of the present application.
  • Network 1 and network 2 include multiple network devices, for example, devices 1-3 in network 1 and devices 4-6 in network 2.
  • Devices 1-3 in network 1 can communicate through the base station therein.
  • the devices 4-6 in the network 2 can establish communication through the routers therein.
  • Devices on network 1 can also communicate with devices on network 2.
  • devices 1-6 may be one of devices such as smart phones, tablet computers, and notebook computers.
  • the network device can communicate with other network devices after confirming that other network devices are trusted.
  • the network device may determine whether other network devices are trustworthy according to the trust level of other network devices.
  • the trust level of the network device may be obtained by evaluating the network device using a trust model.
  • the trust model can be established based on mathematical theory and machine learning.
  • the method of establishing a trust model based on mathematical theory is to use mathematical theory to establish a trust model that characterizes the trust relationship between network devices.
  • There are two problems with this modeling approach.
  • the trust relationship between network devices exists in a specific scenario. Therefore, the trust model has a strong dependence on the application scenario and poor transferability.
  • when modeling trust relationships, mathematical theory often involves many parameters, such as weight factors, which are generally determined through experience. This introduces uncertainty into the accuracy of the trust evaluation, resulting in low robustness of the trust model.
  • the method of building a trust model based on machine learning is to use a large number of training samples to train the machine learning model. This method does not need to rely on experience to determine the parameters in the model, and can use different training samples to train the model for different application scenarios. Therefore, this method can not only adapt to different application scenarios, but also the robustness of the obtained trust model is high. Although this method can solve the problem of using mathematical theory to establish a trust model, it also faces new problems. In practical applications, the label data in the training samples often needs to be obtained by manually labeling the feature data. When the amount of data is large, the labor cost of labeling the data increases, resulting in an increase in the cost of obtaining labeled data. It should be understood that the tag data of the network device indicates the trust level of the network device.
  • an embodiment of the present application provides a trust model training method, which is applied to a model training device. This method combines threshold judgment and algorithm clustering to determine the label data of network equipment, and then trains a machine learning model based on the label data of network equipment to obtain a trust model for trust evaluation of network equipment.
  • the model training device may be a device located in a different environment.
  • the model training device may be a server located in the cloud, or a local network device.
  • when the model training device is a local network device, it may specifically be any one of the network devices shown in FIG. 1 .
  • the following describes in detail the trust model training method provided by the embodiment of the present application with reference to FIG. 2 .
  • FIG. 2 is a flow chart of a trust model training method provided by an embodiment of the present application.
  • the training method includes the following steps S201-S205.
  • Step S201 acquiring communication data of multiple network devices.
  • the model training device can obtain the communication data from the multiple network devices. The multiple network devices can also send their communication data to the model training device after recording the communication data each time.
  • the network device may be a network device in the heterogeneous network shown in FIG. 1 , or may be a network device in another communication network.
  • the communication data may include one or more of: the number of successes and failures of data transmission, the number of successes and failures of direct data transmission, and the number of successes and failures of indirect data transmission.
  • the number of successful data transmissions may be the sum of the number of successful direct data transmissions and the number of successful indirect data transmissions.
  • Step S202 the characteristic model determines the characteristic data of the plurality of network devices according to the communication data of the plurality of network devices.
  • the model training device can input the communication data of multiple network devices into the characteristic model to obtain the characteristic data of the multiple network devices.
  • the feature model may include a mathematical model. Specifically, the characteristic model is determined according to the communication data.
  • the feature model may include a mathematical model for calculating a success rate of data transmission and/or a mathematical model for calculating a failure rate of data transmission. It should be understood that the feature data may include a success rate of data transmission and a failure rate of data transmission.
  • the feature model may further include a mathematical model for calculating the success rate of direct data transmission and/or a mathematical model for calculating the failure rate of direct data transmission. It should be understood that the feature data may also include a success rate of direct data transmission and a failure rate of direct data transmission.
  • the feature model may further include a mathematical model for calculating the success rate of the indirect data transmission and/or a mathematical model for calculating the failure rate of the indirect data transmission. It should be understood that the feature data may also include a success rate of indirect data transmission and a failure rate of indirect data transmission.
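  • as an illustration of how such a feature model can map communication data to feature data, a minimal sketch in Python follows; the field names and the choice of Python are illustrative assumptions and not part of the described embodiments.

```python
# Minimal sketch of a feature model (step S202), assuming per-device counts of
# successful/failed overall, direct and indirect data transmissions.
def feature_model(comm_data: dict) -> dict:
    """Map raw communication counts to feature data (success/failure rates)."""
    features = {}
    for kind in ("overall", "direct", "indirect"):
        success = comm_data.get(f"{kind}_success", 0)
        failure = comm_data.get(f"{kind}_failure", 0)
        total = success + failure
        # Guard against devices with no recorded transmissions.
        features[f"{kind}_success_rate"] = success / total if total else 0.0
        features[f"{kind}_failure_rate"] = failure / total if total else 0.0
    return features

# Example: a device with 80 successful and 20 failed transmissions overall.
print(feature_model({"overall_success": 80, "overall_failure": 20}))
```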
  • Step S203 Determine label data of at least one first network device among the plurality of network devices according to the threshold condition and the feature data of the plurality of network devices.
  • the threshold condition and the trust level corresponding to the threshold condition can be preset in the model training device.
  • the threshold condition can be set according to the feature data. Taking trust levels of credible and untrustworthy as an example, when the feature data includes the data transmission success rate, the threshold condition can be set as two conditions: the data transmission success rate is greater than 0.8, and the data transmission success rate is less than 0.3. The trust level corresponding to a success rate greater than 0.8 may be set as credible, and the trust level corresponding to a success rate less than 0.3 may be set as untrustworthy.
  • the model training device can match the characteristic data of each network device among the plurality of network devices with the threshold condition, and judge whether the characteristic data of each network device satisfies the threshold condition.
  • if the feature data of a network device satisfies the threshold condition, the network device is a first network device, and the trust level corresponding to the threshold condition is used as the label data of the first network device.
  • if the feature data of a network device is less than the minimum value, the label data of the network device is the trust level corresponding to the threshold condition of being less than the minimum value, that is, untrustworthy; if the feature data of the network device is greater than the maximum value, the label data of the network device is the trust level corresponding to the threshold condition of being greater than the maximum value, that is, credible.
  • the model training device thus divides the plurality of network devices into two categories: first network devices, whose label data is obtained through the threshold condition, and second network devices.
  • the label data of the second network devices cannot be determined through the threshold condition.
  • the label data of the second network device is determined through the clustering method in step S204.
  • determining the label data of the first network devices by the threshold condition prevents a first network device, whose feature data differs greatly from that of the second network devices, from independently forming a cluster group and thereby affecting the final clustering result.
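  • a minimal sketch of this threshold labelling, using the example thresholds from above (success rate greater than 0.8 is credible, less than 0.3 is untrustworthy), is shown below; the Python code and the numeric label values are illustrative assumptions.

```python
# Sketch of step S203: label first network devices by threshold; devices that
# satisfy neither condition become second network devices for clustering.
CREDIBLE, UNTRUSTWORTHY = 1, 0

def label_by_threshold(success_rates: dict):
    """success_rates maps device id -> data transmission success rate."""
    first_devices = {}    # label data determined by the threshold condition
    second_devices = {}   # to be labelled by clustering in step S204
    for dev_id, rate in success_rates.items():
        if rate > 0.8:
            first_devices[dev_id] = CREDIBLE
        elif rate < 0.3:
            first_devices[dev_id] = UNTRUSTWORTHY
        else:
            second_devices[dev_id] = rate
    return first_devices, second_devices

first, second = label_by_threshold({"d1": 0.95, "d2": 0.10, "d3": 0.55})
print(first)   # {'d1': 1, 'd2': 0}
print(second)  # {'d3': 0.55}
```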
  • Step S204 divide the multiple second network devices into a preset number of cluster groups, and obtain label data corresponding to each cluster group as label data of the second network devices in each cluster group.
  • the model training device may use a clustering algorithm to divide the plurality of second network devices into a preset number of cluster groups.
  • Each clustering group includes at least one second network device.
  • the preset number may be determined according to the number of predetermined trust levels. For example, when the trust level is determined to be credible and untrustworthy, the clustering algorithm may be configured to divide the historical trust determination values of multiple network devices into two clustering groups.
  • after the model training device obtains the cluster groups, it can send each cluster group to a user for manual labeling and obtain the label data corresponding to each cluster group returned by the user. Specifically, the user can examine the feature data of the network devices in a cluster group and determine the label data of that cluster group.
  • the model training device can also use an algorithm to automatically label each cluster group, and determine the label data corresponding to each cluster group. Specifically, the present application does not specifically limit the method for automatically labeling each cluster group. After determining the label data corresponding to the cluster group, the model training device may use the label data corresponding to the cluster group as the label data of each second network device in the cluster group.
  • the clustering algorithm may be any one of the K-means clustering algorithm, the mean-shift clustering algorithm, or the expectation-maximization (EM) clustering algorithm.
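  • as a sketch of this clustering step, the following uses the K-means implementation from scikit-learn, assuming the second network devices are described by their success rates and that the label for each resulting group is supplied afterwards (manually or automatically); the library choice and the group-to-label mapping are illustrative assumptions.

```python
# Sketch of step S204: cluster second network devices into a preset number of
# groups (the number of trust levels) and propagate a per-group label.
import numpy as np
from sklearn.cluster import KMeans

def cluster_and_label(second_devices: dict, n_trust_levels: int, group_labels: dict):
    dev_ids = list(second_devices)
    features = np.array([[second_devices[d]] for d in dev_ids])
    groups = KMeans(n_clusters=n_trust_levels, n_init=10, random_state=0).fit_predict(features)
    # Each device inherits the label attached to its cluster group.
    return {d: group_labels[g] for d, g in zip(dev_ids, groups)}

# group_labels would normally be assigned after inspecting each group.
labels = cluster_and_label({"d3": 0.55, "d4": 0.40, "d5": 0.75},
                           n_trust_levels=2,
                           group_labels={0: "untrustworthy", 1: "credible"})
print(labels)
```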
  • Step S205 updating the parameters of the trust model according to the feature data and label data of multiple network devices.
  • the model training device can obtain the label data of each network device among the above-mentioned multiple network devices.
  • the model training device can use the feature data and label data of multiple network devices to train the trust model, so as to update the parameters of the trust model.
  • the model training device may use feature data of multiple network devices as input data of the trust model to obtain output data of the trust model.
  • the model training device can use the error function to calculate the error value between the output data of the trust model and the label data of multiple network devices, and use the gradient descent method to update the parameters of the trust model according to the error value.
  • when the model training device judges that the error value or the number of training iterations meets the preset requirement, the training of the trust model ends, and the trust model at the end of training is used as the final trust model.
  • the trust model may be one of machine learning models such as convolutional neural network, BP neural network, and deep neural network, or one of other networks used for machine learning.
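  • the following is a simplified sketch of this training loop with a logistic-regression stand-in for the trust model (any of the machine learning models mentioned above could be used instead); the error function, learning rate and stopping rule shown here are illustrative assumptions.

```python
# Sketch of step S205: update trust model parameters by gradient descent on the
# error between the model output and the label data, until a preset requirement
# (error value or iteration count) is met.
import numpy as np

def train_trust_model(features, labels, lr=0.1, max_iter=1000, tol=1e-4):
    X = np.asarray(features, dtype=float)
    y = np.asarray(labels, dtype=float)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(max_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))          # predicted trust score
        error = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
        if error < tol:                                  # preset requirement met
            break
        w -= lr * (X.T @ (p - y) / len(y))               # gradient descent update
        b -= lr * np.mean(p - y)
    return w, b

# Toy data: [success_rate] feature vectors with threshold/cluster-derived labels.
w, b = train_trust_model([[0.95], [0.10], [0.85], [0.20]], [1, 0, 1, 0])
```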
  • the work of each layer in the deep neural network can be described with a mathematical expression.
  • for the input data x of a layer, the output data of the layer can be expressed as y = a(Wx + b), where the input data and output data are expressed in vector form. It can be understood that the input of the first layer in the deep neural network is the feature data of the network device, and the output of the last layer is the deep neural network's prediction of the trust level of the network device.
  • the work of each layer in the deep neural network can be understood as completing the transformation from the input space to the output space (that is, from the row space to the column space of the matrix) through five operations on the input space (a collection of input vectors).
  • these five operations include: 1. dimension increase/reduction; 2. enlargement/reduction; 3. rotation; 4. translation; 5. "bending".
  • the operations 1, 2 and 3 are performed by Wx, the operation 4 is completed by +b, and the operation 5 is realized by a().
  • space refers to the collection of all individuals of this kind of thing.
  • W is a weight vector, and each value in the vector represents the weight value of a neuron in this layer of neural network.
  • the vector W determines the space transformation from the input space to the output space described above, that is, the weight W of each layer controls how to transform the space.
  • the purpose of training the deep neural network is to finally obtain the weight matrix of all layers of the trained neural network (the weight matrix formed by the vector W of many layers). Therefore, the training process of the neural network is essentially to learn the way to control the spatial transformation, and more specifically, to learn the weight matrix.
  • to make the output of the deep neural network as close as possible to the value that is really to be predicted, the predicted value of the network is compared with the desired target value, and the weight vectors of the network are updated according to the difference between them (of course, there is usually an initialization process before the first update, in which parameters are pre-configured for each layer of the deep neural network). For example, if the network's prediction is too high, the weight vectors are adjusted to make the prediction lower, and the adjustment continues until the neural network can predict the label data that is really wanted.
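  • written out in code, the per-layer transformation described above is simply y = a(Wx + b); the sketch below uses NumPy and chooses a ReLU activation purely for illustration.

```python
# One layer of a deep neural network: Wx performs dimension change/scaling/
# rotation, +b performs translation, and a() performs the non-linear "bending".
import numpy as np

def layer(x, W, b):
    return np.maximum(0.0, W @ x + b)     # a() chosen here as ReLU

x = np.array([0.8, 0.2])                  # feature data of a network device
W = np.array([[0.5, -0.3],
              [0.1,  0.9]])               # weight matrix of the layer
b = np.array([0.05, -0.02])               # bias vector
print(layer(x, W, b))
```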
  • the model training device may also use the feature model to convert the feature data into data that the trust model can recognize, and then input the data output by the feature model into the trust model.
  • for the introduction of the feature model, reference may be made to the description in the foregoing method embodiment shown in FIG. 2 , which will not be repeated here.
  • when a network device is in a cold start mode, the network device has not yet generated feature data, and the trust evaluation device can conduct a comprehensive evaluation based on objective factors such as the manufacturer, scope of use, importance, and deployment location of the network device to determine the trust level of the network device.
  • this embodiment of the present application also provides a trust evaluation method for network devices.
  • the method is applied to a trust evaluation device.
  • the trust evaluation device can input the characteristic data of the network device into the trust model, so as to determine the trust level of the network device according to the output of the trust model.
  • the trust evaluation device may use the trust model training method shown in FIG. 2 to obtain the trust model.
  • the trust model in the trust evaluation device may also be obtained from the model training device, that is, the model training device sends the trust model to the trust evaluation device after obtaining it.
  • the trust evaluation device may be a device located in a different environment.
  • the trust evaluation device may be a server located in the cloud, or a local network device.
  • the trust evaluation device is a local network device, it may specifically be any one of the network devices shown in FIG. 1 .
  • the following describes in detail how the trust assessment device uses the trust model to perform trust assessment on the network equipment in the heterogeneous network with reference to FIG. 3 .
  • Fig. 3 is a flow chart of a method for trust evaluation of a network device provided by an embodiment of the present application.
  • the trust evaluation method for network equipment specifically includes the following steps S301-S303.
  • Step S301 acquiring communication data of a network device.
  • the trust evaluation device may send the first instruction to the network device.
  • when the network device receives the first instruction, it sends its own communication data to the trust evaluation device.
  • the network device may also be configured to record current communication data after each communication ends.
  • the network device may be a network device in the heterogeneous network shown in FIG. 1 , or may be a network device in another communication network.
  • the communication data reference may be made to the description in step S201 in the method embodiment shown in FIG. 2 above, and details are not repeated here.
  • Step S302 the characteristic model determines the characteristic data of the network device according to the communication data of the network device.
  • the trust evaluation device can input the communication data of the network device into the pre-established feature model, and obtain the feature data of the network device output by the feature model.
  • for the introduction of the feature data and the feature model, refer to the description in step S202 in the method embodiment shown in FIG. 2 above, and details are not repeated here.
  • Step S303 the trust model determines the trust level of the network device according to the feature data of the network device.
  • the trust evaluation device can input the feature data of the network device into a pre-established trust model, and determine the trust level of the network device according to the output of the trust model. Among them, the process of obtaining the trust model will be described in detail later, and will not be repeated here.
  • after the trust evaluation device obtains the trust level of the network device, it can store the identifier of the network device and the trust level in association in a blockchain.
  • the trust evaluation device uses the blockchain to store the trust level, which can fully disclose the trust level and ensure that the trust level cannot be tampered with.
  • the trust evaluation device may first determine the hash value corresponding to the trust level. Then, the trust evaluation device associates the identifier of the network device with the hash value corresponding to the trust level and stores it in the block chain, and stores the trust level of the network device and the hash value corresponding to the trust level in the storage system.
  • the storage system may be an interplanetary file system.
  • the trust evaluation device stores the hash value corresponding to the trust level on the blockchain, which can reduce the data storage pressure on the blockchain.
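  • a minimal sketch of this hash-based storage is shown below; the in-memory dictionaries stand in for the blockchain and the storage system, and the SHA-256 choice is an illustrative assumption rather than part of the described embodiments.

```python
# Sketch: store only the hash of a device's trust level "on chain" and keep the
# trust level itself in an off-chain storage system keyed by that hash.
import hashlib
import json

def store_trust_level(device_id: str, trust_level: str, chain: dict, storage: dict) -> str:
    record = json.dumps({"device": device_id, "trust_level": trust_level}, sort_keys=True)
    digest = hashlib.sha256(record.encode()).hexdigest()
    chain[device_id] = digest        # blockchain: device identifier -> hash only
    storage[digest] = trust_level    # storage system: hash -> trust level
    return digest

chain, storage = {}, {}
store_trust_level("device-4", "credible", chain, storage)
assert storage[chain["device-4"]] == "credible"
```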
  • the trust evaluation device may also broadcast the trust level of the network device to the outside.
  • the trust evaluation device sends the trust level in the form of broadcast, which can save the cost and time for the network device to obtain the trust level of other network devices, thereby improving communication efficiency.
  • because the trust model evaluates each network device based on the same kind of communication data, trust evaluation of network devices in a heterogeneous network can be realized.
  • when a subject network device needs to communicate with an object network device, it judges the trust level of the object network device.
  • when the subject network device confirms that the object network device can be trusted, it communicates with it, which can improve the security of the subject network device.
  • this embodiment of the present application further provides a communication method for a network device.
  • the communication method is applied to the first network device. For example, when the first network device needs to communicate with the second network device, it may obtain the trust level of the second network device, and determine whether to communicate with the second network device according to the trust level of the network device.
  • FIG. 4 is a communication method of a network device provided by an embodiment of the present application.
  • the method includes the following steps S401-S402.
  • Step S401 acquire the trust level of the second network device from the block chain.
  • when the first network device determines that it needs to communicate with the second network device, it can obtain the trust level of the second network device from the blockchain according to the identifier of the second network device.
  • the first network device and the second network device may be any device in any sub-network of the heterogeneous network shown in FIG. 1 , or may be network devices in other communication networks.
  • after the trust evaluation device obtains the trust level of each network device, it can store the identifier of the network device and the trust level in association on the blockchain, as shown in step S400 in FIG. 4 .
  • the identifier of the network device may be one of a name, an IP address, or an identity number (identity, ID) of the network device.
  • Step S402 determine to communicate with the second network device according to the trust level of the second network device.
  • the first network device may determine whether to communicate with the second network device according to the trust level of the second network device and preset communication conditions.
  • the first network device may establish a communication connection with the second network device when determining that the trust level of the second network device satisfies the communication condition.
  • the communication condition can be set according to the category of the trust level.
  • the communication condition may be that the trust level of the network device to establish the communication connection is trusted.
  • the communication condition may be that the trust level of the network device to establish the communication connection is trustworthy or very trustworthy.
  • the communication condition may be that the trust level of the network device to establish the communication connection is not lower than level three.
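  • the communication decision itself can be sketched as a simple comparison of the obtained trust level against the preset communication condition; the trust-level names and their ordering below are illustrative assumptions.

```python
# Sketch of step S402: establish a connection only if the second device's trust
# level satisfies the preset communication condition (here, a minimum level).
TRUST_ORDER = {"untrustworthy": 0, "ordinary": 1, "credible": 2, "very credible": 3}

def may_communicate(trust_level: str, minimum: str = "credible") -> bool:
    return TRUST_ORDER.get(trust_level, 0) >= TRUST_ORDER[minimum]

print(may_communicate("very credible"))   # True
print(may_communicate("untrustworthy"))   # False
```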
  • FIG. 5 is another communication method of a network device provided by an embodiment of the present application.
  • the method includes the following steps S502-S504.
  • Step S502 acquiring a hash value corresponding to the trust level of the second network device from the blockchain.
  • when the first network device determines that it needs to communicate with the second network device, it can obtain the hash value corresponding to the trust level of the second network device from the blockchain. Specifically, the first network device can obtain the hash value corresponding to the trust level of the second network device from the blockchain according to the identifier of the second network device.
  • after the trust evaluation device obtains the trust level of each network device, it can determine the hash value corresponding to the trust level of the network device, and then store the identifier of the network device in association with the hash value corresponding to its trust level on the blockchain, as shown in step S500 in FIG. 5 . Then, the trust evaluation device stores the hash value corresponding to the trust level of the network device in association with the trust level of the network device in the storage system, as shown in step S501 in FIG. 5 .
  • the storage system may be an interplanetary file system (InterPlanetary File System, IPFS).
  • IPFS is a media protocol based on blockchain technology. It uses distributed storage and content addressing technology to change point-to-point single-point transmission into multi-point-to-multipoint P2P transmission. Storing the hash value corresponding to the trust level of the network device in IPFS can reduce the pressure of storing data on the blockchain.
  • Step S503 Determine the trust level of the second network device according to the hash value corresponding to the trust level of the second network device.
  • the first network device may obtain the trust level of the second network device from the storage system according to the hash value corresponding to the trust level of the second network device.
  • Step S504 determine to communicate with the second network device according to the trust level of the second network device.
  • for details, refer to step S402 in the aforementioned method embodiment shown in FIG. 4 , which will not be repeated here.
  • FIG. 6 is another communication method of a network device provided by an embodiment of the present application.
  • the method includes the following steps S601-S603.
  • Step S601 receiving a broadcast signal.
  • the first network device may be configured to receive broadcast signals in real time. Wherein, the first network device may discard the received broadcast signal when there is no need for communication.
  • after obtaining the trust level of each network device, the trust evaluation device sends out the trust level of each network device through a broadcast signal, as shown in step S600 in FIG. 6 .
  • the trust evaluation device may also send the trust level of each network device to the broadcast device after obtaining the trust level of each network device.
  • when the broadcast device receives the trust level of each network device, it sends a broadcast signal.
  • Step S602. Determine the trust level of the second network device according to the broadcast signal.
  • when the first network device receives the broadcast signal, it parses the broadcast signal to obtain the identifiers and trust levels of the network devices included in the broadcast signal. Then, the first network device obtains the trust level of the second network device from the parsed trust levels according to the identifier of the second network device.
  • Step S603 determine to communicate with the second network device according to the trust level of the second network device.
  • for details, refer to step S402 in the aforementioned method embodiment shown in FIG. 4 , which will not be repeated here.
  • an embodiment of the present application further provides a trust model training device.
  • FIG. 7 is a schematic structural diagram of a trust model training device 700 provided by an embodiment of the present application.
  • the training device 700 is used to realize step S201-step S205 in FIG. 2 .
  • the training device 700 includes: an acquisition module 701 , a feature extraction module 702 , a first determination module 703 , a second determination module 704 and a training module 705 .
  • the acquiring module 701 is configured to acquire communication data of multiple network devices.
  • the feature extraction module 702 is used to determine the feature data of multiple network devices according to the communication data of multiple network devices by using a feature model.
  • the first determination module 703 is used to determine the label data of at least one first network device in the plurality of network devices according to the threshold condition and the characteristic data of the plurality of network devices; the plurality of network devices include at least one first network device and multiple a second network device; the tag data indicates the trust level of the network device.
  • the second determination module 704 is used to divide multiple second network devices into a preset number of cluster groups, and obtain label data corresponding to each cluster group as the second network device in each cluster group tag data; the preset number corresponds to the number of trust levels.
  • the training module 705 is used to update the parameters of the trust model according to the feature data and label data of multiple network devices.
  • when the training device 700 provided in the embodiment shown in FIG. 7 executes the trust model training method, the division into the above-mentioned functional modules is used only as an example for illustration. In practical applications, the functions performed by the various modules in the training device 700 can be assigned to different functional modules as needed, that is, the internal structure of the training device 700 can be divided into different functional modules to complete all or some of the functions described above.
  • the training device 700 provided in the above embodiment is based on the same idea as the embodiment of the trust model training method shown in FIG. 2 , and its specific implementation process is detailed in the method embodiment, and will not be repeated here.
  • an embodiment of the present application further provides a trust assessment device.
  • FIG. 8 is a schematic structural diagram of a trust evaluation device 800 provided by an embodiment of the present application.
  • the trust evaluation apparatus 800 is used to realize step S301-step S303 in FIG. 3 .
  • the trust evaluation apparatus 800 includes: an acquisition module 801 , a feature extraction module 802 and an evaluation module 803 .
  • the obtaining module 801 is used for obtaining communication data of network devices.
  • the feature extraction module 802 is used to determine the feature data of the network device according to the communication data of the network device by using the feature model.
  • the evaluation module 803 is used to determine the trust level of the network device according to the characteristic data of the network device by using the trust model.
  • when the trust assessment apparatus 800 provided in the embodiment shown in FIG. 8 executes the trust assessment method, the division into the above functional modules is used only as an example for illustration. In practical applications, the functions performed by the various modules in the trust evaluation device 800 can be assigned to different functional modules as needed, that is, the internal structure of the trust evaluation device 800 can be divided into different functional modules to complete all or some of the functions described above.
  • the trust evaluation device 800 provided in the above embodiment is based on the same idea as the trust evaluation method embodiment shown in FIG. 3 , and its specific implementation process is detailed in the method embodiment, and will not be repeated here.
  • FIG. 9 is a schematic diagram of a hardware structure of a computing device 900 provided by an embodiment of the present application.
  • the computing device 900 may be a network device in the aforementioned communication network, the aforementioned trust evaluation device, or a model training device.
  • the computing device 900 includes a processor 910 , a memory 920 , a communication interface 930 and a bus 940 , and the processor 910 , the memory 920 and the communication interface 930 are connected to each other through the bus 940 .
  • the processor 910 , the memory 920 and the communication interface 930 may also be connected by other connection methods than the bus 940 .
  • the memory 920 can be various types of storage media, such as random access memory (random access memory, RAM), read-only memory (read-only memory, ROM), non-volatile RAM (non-volatile RAM, NVRAM ), programmable ROM (programmable ROM, PROM), erasable PROM (erasable PROM, EPROM), electrically erasable PROM (electrically erasable PROM, EEPROM), flash memory, optical memory, hard disk, etc.
  • the processor 910 may be a general-purpose processor, and the general-purpose processor may be a processor that performs specific steps and/or operations by reading and executing contents stored in a memory (such as the memory 920 ).
  • the general processor may be a central processing unit (CPU).
  • the processor 910 may include at least one circuit to execute all or part of the steps of the method provided by the embodiments shown in FIGS. 2-6 .
  • the communication interface 930 includes an input/output (I/O) interface, a physical interface, a logical interface, and the like, which are used to realize the interconnection of components inside the computing device 900 and the interconnection between the computing device 900 and other devices (such as other network devices or user equipment).
  • the physical interface can be Ethernet interface, optical fiber interface, ATM interface, etc.
  • the bus 940 may be any type of communication bus for interconnecting the processor 910, the memory 920 and the communication interface 930, such as a system bus.
  • the above-mentioned devices may be respectively arranged on independent chips, or at least partly or all of them may be arranged on the same chip. Whether each device is independently arranged on different chips or integrated and arranged on one or more chips often depends on the needs of product design.
  • the embodiments of the present application do not limit the specific implementation forms of the foregoing devices.
  • the computing device 900 shown in FIG. 9 is only exemplary. During implementation, the computing device 900 may also include other components, which will not be listed here.
  • in the above embodiments, all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • when implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present invention will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wired (for example, coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (for example, infrared, radio, microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, DVD), or a semiconductor medium (for example, a solid state disk (solid state disk, SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

Provided are a trust model training method and apparatus, belonging to the technical field of communications and the technical field of artificial intelligence. The method comprises: determining, according to feature data of a plurality of network devices and a threshold condition, label data of each first network device among the plurality of network devices, the plurality of network devices comprising at least one first network device and a plurality of second network devices; dividing the plurality of second network devices into a preset number of cluster groups and obtaining label data corresponding to each cluster group as the label data of each second network device in that cluster group; and updating parameters of a trust model according to the feature data of the plurality of network devices and the label data. The method combines threshold and clustering methods to determine the label data of the plurality of network devices, which can reduce the labor cost of labeling network devices.
PCT/CN2022/121297 2021-09-28 2022-09-26 Procédé et appareil d'entraînement de modèle de confiance WO2023051455A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111141934.3A CN115878991A (zh) 2021-09-28 2021-09-28 一种信任模型的训练方法及装置
CN202111141934.3 2021-09-28

Publications (1)

Publication Number Publication Date
WO2023051455A1 (fr)

Family

ID=85763340

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/121297 WO2023051455A1 (fr) 2021-09-28 2022-09-26 Procédé et appareil d'entraînement de modèle de confiance

Country Status (2)

Country Link
CN (1) CN115878991A (fr)
WO (1) WO2023051455A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116668095A (zh) * 2023-05-16 2023-08-29 江苏信创网安数据科技有限公司 一种网络安全智能评估方法及***

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130042298A1 (en) * 2009-12-15 2013-02-14 Telefonica S.A. System and method for generating trust among data network users
CN103118379A (zh) * 2013-02-06 2013-05-22 西北工业大学 面向移动自组网的节点合作度评估方法
CN109919771A (zh) * 2019-03-18 2019-06-21 徐雪松 一种应用于工业互联网的分层区块链网络及交易方法
CN110972231A (zh) * 2019-11-14 2020-04-07 深圳前海达闼云端智能科技有限公司 配置***信息的方法、装置、存储介质及网络设备和终端
CN112367355A (zh) * 2020-10-12 2021-02-12 新华三技术有限公司 信任等级的发布方法及装置
CN112884159A (zh) * 2019-11-30 2021-06-01 华为技术有限公司 模型更新***、模型更新方法及相关设备

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130042298A1 (en) * 2009-12-15 2013-02-14 Telefonica S.A. System and method for generating trust among data network users
CN103118379A (zh) * 2013-02-06 2013-05-22 西北工业大学 面向移动自组网的节点合作度评估方法
CN109919771A (zh) * 2019-03-18 2019-06-21 徐雪松 一种应用于工业互联网的分层区块链网络及交易方法
CN110972231A (zh) * 2019-11-14 2020-04-07 深圳前海达闼云端智能科技有限公司 配置***信息的方法、装置、存储介质及网络设备和终端
CN112884159A (zh) * 2019-11-30 2021-06-01 华为技术有限公司 模型更新***、模型更新方法及相关设备
CN112367355A (zh) * 2020-10-12 2021-02-12 新华三技术有限公司 信任等级的发布方法及装置

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116668095A (zh) * 2023-05-16 2023-08-29 江苏信创网安数据科技有限公司 一种网络安全智能评估方法及***
CN116668095B (zh) * 2023-05-16 2024-03-29 江苏信创网安数据科技有限公司 一种网络安全智能评估方法及***

Also Published As

Publication number Publication date
CN115878991A (zh) 2023-03-31


Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE