CN114186263B - Data regression method based on longitudinal federal learning and electronic device - Google Patents

Data regression method based on longitudinal federal learning and electronic device

Info

Publication number
CN114186263B
Authority
CN
China
Prior art keywords
client
data
gradient
regression
server
Prior art date
Legal status
Active
Application number
CN202111555002.3A
Other languages
Chinese (zh)
Other versions
CN114186263A (en)
Inventor
吴铭侃
王湾湾
王波
黄一珉
付海燕
何浩
Current Assignee
Shenzhen Dongjian Intelligent Technology Co ltd
Dalian University of Technology
Original Assignee
Shenzhen Dongjian Intelligent Technology Co ltd
Dalian University of Technology
Priority date
Filing date
Publication date
Application filed by Shenzhen Dongjian Intelligent Technology Co ltd and Dalian University of Technology
Priority to CN202111555002.3A
Publication of CN114186263A
Application granted
Publication of CN114186263B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 - Protecting data
    • G06F 21/602 - Providing cryptographic facilities or services
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 - Protecting data
    • G06F 21/62 - Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 - Protecting access to data via a platform, e.g. using keys or access control rules, to a system of files or objects, e.g. local or distributed file system or database
    • G06F 21/6245 - Protecting personal data, e.g. for financial or medical purposes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 - Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Storage Device Security (AREA)

Abstract

The invention provides a data regression method based on longitudinal federal learning, and an electronic device. The method comprises the following steps: the first client and the second client each read training data from a local database and initialize their local model parameters; the first client and the second client each compute their respective index data, and the second client encrypts its index data with the public key and sends it to the first client; the first client computes a partial expression of the feature gradient over the encrypted data; the first client and the second client each obtain the gradient of their local regression model, add a noise mask to the gradient, and send it to the server for decryption; the server decrypts the gradient data sent by the first client and the second client and sends it back, the two clients remove the noise to recover the original gradients, and the model parameters are updated with the new gradients to obtain the model parameters for the next round.

Description

Data regression method based on longitudinal federal learning and electronic device
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a data regression method based on longitudinal federal learning and an electronic device.
Background
In most industries, data exists as isolated islands. Owing to industry competition, privacy and security concerns, and complex administrative procedures, integrating data even between different departments of the same company faces significant resistance; in practice, integrating data scattered across locations and institutions is nearly impossible, or the cost is enormous. Against this background, the demand for data sharing and fusion is growing stronger, but data sharing mainly faces the following problems:
1. Data islands: in practice, apart from a few giant companies, most enterprises hold data that is small in quantity and poor in quality, which is insufficient to support a machine learning model.
2. Privacy and security: as large companies become increasingly aware of the need to protect data security and user privacy, data privacy and security have become a significant concern worldwide.
Designing a machine learning framework that, while meeting the requirements of data privacy, security and regulation, allows artificial intelligence systems to use the parties' respective data jointly, efficiently and accurately is therefore an important topic in the current development of artificial intelligence.
Disclosure of Invention
In order to obtain a regression prediction model over the full samples and full features without any party sharing its own data, the invention provides a data regression method based on longitudinal federal learning, and an electronic device. In the invention, a trusted third party that holds no data and two data providers perform longitudinal federal training on the data providers' own data, without sharing the raw data, so as to obtain a joint prediction result.
The invention adopts the following technical means:
The data regression method based on longitudinal federal learning is applied to a distributed network system, wherein the distributed network system comprises a first client, at least one second client and a server; the first client holds a data tag, the second client holds no data tag, and the server holds no data;
the method comprises the following steps:
the server generates a public key and a private key, and sends the same homomorphic-encryption public key to the first client and the second client through a network;
the first client and the second client each read training data from a local database, initialize their local model parameters, and divide the data into training batches;
for each batch, the first client and the second client each compute their respective index data, and the second client encrypts its index data with the public key and sends the encrypted index data to the first client;
the first client computes the partial expression gradient_part of the feature gradient over the encrypted data according to its own data and tag and the encrypted index data sent by the second client, and sends this partial expression of the feature gradient, encrypted under the public key, to the second client;
the first client and the second client each multiply their own data by gradient_part to obtain the gradient of their respective local regression model, add a noise mask to the gradient, and send it to the server for decryption;
the server decrypts the gradient data sent by the first client and the second client with the private key corresponding to the public key, obtains the noise-masked gradients, and sends them back to the first client and the second client;
the first client and the second client each remove the noise from the gradients to obtain the original gradients, and update the model parameters with the new gradients to obtain the model parameters for the new round.
Further, the method further comprises:
The first client calculates an encrypted loss value according to the loss-function formula under federated encryption, sends the encrypted loss value to the server for decryption, and decides whether to stop training according to the loss value and the number of training rounds.
Further, the local models of the first client and the second client are Tweedie regression models.
Further, the loss function is obtained from the following calculation:

$$[L] = \sum_{i}\left(-\frac{y_i}{1-p}\,u_{a,i}\,[u_{b,i}] + \frac{1}{2-p}\,v_{a,i}\,[v_{b,i}]\right)$$

where $L$ is the loss function, $p$ is the regression index parameter, $y$ is the data tag, $u_a = e^{(1-p)w_a x_a}$ is the first index parameter calculated by the first client, $u_b = e^{(1-p)w_b x_b}$ is the first index parameter calculated by the second client, $v_a = e^{(2-p)w_a x_a}$ is the second index parameter calculated by the first client, $v_b = e^{(2-p)w_b x_b}$ is the second index parameter calculated by the second client, and $[\cdot]$ denotes homomorphically encrypted data.
Further, the gradient data is obtained according to the following calculation:
Let the first client hold data $x_a$ and $y$, and the second client hold data $x_b$; then

$$\frac{\partial L}{\partial w_a} = \sum_i x_{a,i}\left(-y_i e^{(1-p)(w_a x_{a,i}+w_b x_{b,i})} + e^{(2-p)(w_a x_{a,i}+w_b x_{b,i})}\right),\qquad \frac{\partial L}{\partial w_b} = \sum_i x_{b,i}\left(-y_i e^{(1-p)(w_a x_{a,i}+w_b x_{b,i})} + e^{(2-p)(w_a x_{a,i}+w_b x_{b,i})}\right).$$

Extracting the common part, let

$$\mathrm{gradient\_part}_i = -y_i\,u_{a,i}\,u_{b,i} + v_{a,i}\,v_{b,i};$$

then:

$$\frac{\partial L}{\partial w_a} = \sum_i x_{a,i}\,\mathrm{gradient\_part}_i,\qquad \frac{\partial L}{\partial w_b} = \sum_i x_{b,i}\,\mathrm{gradient\_part}_i.$$
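As a sanity check on this factorization, the following plaintext sketch (an illustration in Python with numpy; the toy data and variable names are assumptions, not part of the patent) verifies numerically that multiplying each party's own features by the shared factor gradient_part reproduces the gradient of the joint Tweedie loss.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 1.5, 50
x_a, w_a = rng.normal(size=(n, 3)), rng.normal(size=3)
x_b, w_b = rng.normal(size=(n, 2)), rng.normal(size=2)
y = rng.gamma(2.0, 1.0, size=n)

def loss(w_a, w_b):
    eta = x_a @ w_a + x_b @ w_b            # log-link linear predictor
    mu = np.exp(eta)
    return np.sum(-y * mu**(1 - p) / (1 - p) + mu**(2 - p) / (2 - p))

u_a, v_a = np.exp((1 - p) * (x_a @ w_a)), np.exp((2 - p) * (x_a @ w_a))
u_b, v_b = np.exp((1 - p) * (x_b @ w_b)), np.exp((2 - p) * (x_b @ w_b))
grad_part = -y * u_a * u_b + v_a * v_b     # the shared per-sample factor

grad_a, grad_b = x_a.T @ grad_part, x_b.T @ grad_part

# Compare A's analytic gradient with a numerical gradient of the joint loss.
eps = 1e-6
num_grad_a = np.array([(loss(w_a + eps * e, w_b) - loss(w_a - eps * e, w_b)) / (2 * eps)
                       for e in np.eye(3)])
assert np.allclose(grad_a, num_grad_a, rtol=1e-4, atol=1e-6)
```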
The invention also provides an electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor performs any of the methods described above by executing the computer program.
Compared with the prior art, the invention has the following advantages:
The method, which is obtained through calculation and derivation, can be used to train joint prediction models on privacy-sensitive data (for example in the financial and insurance industries). Without sharing the raw data, a trusted third party that holds no data and two data providers whose data exhibit the longitudinal federal characteristic obtain a joint prediction result. The method can be extended step by step through a tree structure to achieve N-party federal learning.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of a data regression method based on longitudinal federal learning according to the present invention.
Fig. 2 is a schematic diagram of the feature distribution in an embodiment.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Apparently, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Federal learning is a distributed machine learning and deep learning training framework that aims to ensure the privacy and security of training data. In federal learning, the original data is not transmitted directly; only model parameters and some encrypted data or labels are transmitted, and it is ensured that the original data cannot be recovered by the other parties. According to how the samples and features overlap, federal learning is divided into transverse (horizontal) federal learning and longitudinal (vertical) federal learning. Transverse federal learning refers to the case in which the features largely overlap but the sample points do not; longitudinal federal learning refers to the case in which the sample points overlap but the features differ greatly.
Tweedie regression (compound Poisson-gamma regression) is a member of the generalized linear regression family and is widely used in insurance, weather prediction and other fields. Generalized linear regression is a set of regression functions whose variance has the following form:

$$\mathrm{var}(y) = \varphi\,\mu^{p}$$

where $p$ is the index (power) parameter of the regression model, and $\mu$ and $\varphi$ are the expectation and the dispersion parameter of the data, respectively. Depending on the parameter values, the family covers linear regression, logistic regression, Poisson regression, compound Poisson-gamma regression, inverse Gaussian regression and so on. Table 1 shows several of these regressions together with the corresponding ranges of the hyper-parameter; Tweedie regression corresponds to values of p between 1 and 2.
TABLE 1

p value      Regression model
p = 0        Linear regression (normal distribution)
p = 1        Poisson regression
1 < p < 2    Tweedie regression (compound Poisson-gamma)
p = 2        Gamma regression
p = 3        Inverse Gaussian regression
The regression models in Table 1 all share a unified standard probability density function:

$$f_Y(y\mid\theta,\varphi) = a(y,\varphi)\,\exp\!\left(\frac{y\theta-\kappa(\theta)}{\varphi}\right)$$

and the Tweedie regression model satisfies the following equations:

$$\theta = \frac{\mu^{1-p}}{1-p},\qquad \kappa(\theta) = \frac{\mu^{2-p}}{2-p}.$$
Based on this background, Tweedie regression model estimation with a trusted third party is performed under federated multi-party cooperation. Specifically, in the three-party model there are the trusted third party C, i.e. the server, and the data providers A and B, i.e. the first client and the second client; the total number of samples is n, the number of features is m, and the label is y. Each client, as a data-provider node, holds some of the features of some of the samples, and data provider A, i.e. the first client, holds the label. On the premise that no node shares its original data with any other party, the data of the parties are combined to train a regression prediction model over the full samples and full features. Data that must be transmitted is homomorphically encrypted (encrypted data is denoted by [·]; its four arithmetic operations have certain particularities). Given the Tweedie regression index parameter p, the data transmission, computation and training process is derived below.
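For concreteness, the following minimal sketch (Python/numpy; the shapes and values are illustrative assumptions, not data from the patent) shows the vertical data partition described above: A and B hold disjoint feature blocks of the same n samples, and only A holds the label y.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m_a, m_b = 8, 3, 2              # n aligned samples; A holds m_a features, B holds m_b

x_a = rng.normal(size=(n, m_a))    # feature block held by the first client (A)
y   = rng.gamma(2.0, 1.0, size=n)  # non-negative label, held only by A
x_b = rng.normal(size=(n, m_b))    # feature block held by the second client (B)

# The rows of x_a and x_b are assumed to be aligned sample-by-sample;
# x_a, x_b and y never leave their owners in the clear.
print(x_a.shape, x_b.shape, y.shape)
```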
Specifically, as shown in fig. 1, the data regression method based on longitudinal federal learning disclosed by the invention specifically comprises the following steps:
S1, a server side generates a public key and a private key, and sends the same homomorphic encryption public key to a first client side and a second client side through a network.
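A minimal sketch of step S1, assuming the python-paillier package (`phe`) as the additively homomorphic encryption backend; the patent itself does not name a specific homomorphic scheme.

```python
from phe import paillier

# Server: generate the Paillier key pair once at the start of training.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# The server sends `public_key` to both clients over the network
# (serialization and transport are omitted); `private_key` stays on the server.
ciphertext = public_key.encrypt(3.14)                      # clients can only encrypt
assert abs(private_key.decrypt(ciphertext) - 3.14) < 1e-9  # only the server can decrypt
```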
S2, the first client and the second client each read training data from a local database, initialize their local model parameters, and divide the data into training batches. The batch size is typically an integer power of 2, such as 128, 256 or 512, in order to improve computing performance.
In this embodiment, the local regression model of the first client is preferably $w_a x_a$, and the local regression model of the second client is preferably $w_b x_b$. The server sends the same homomorphic-encryption public key to the first client and the second client. The first client and the second client each read training data from a local database, initialize their local model parameters, and divide the data into training batches.
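A sketch of step S2 from the first client's point of view (Python/numpy; the batch size of 256 and the zero initialization are assumptions consistent with the text, not mandated by it).

```python
import numpy as np

rng = np.random.default_rng(0)
n, m_a = 1000, 3
x_a = rng.normal(size=(n, m_a))     # first client's local training data
w_a = np.zeros(m_a)                 # local parameters w_a (the second client does the same for w_b)

batch_size = 256                    # an integer power of 2, as suggested above
batches = [np.arange(n)[i:i + batch_size] for i in range(0, n, batch_size)]
print([len(b) for b in batches])    # e.g. [256, 256, 256, 232]
```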
S3, for each batch, the first client and the second client respectively calculate respective index data, and the second client encrypts the index data calculated by the second client based on the public key and sends the index data to the first client.
Specifically, for each batch, the first client computes $u_a = e^{(1-p)w_a x_a}$ and $v_a = e^{(2-p)w_a x_a}$, and the second client computes $u_b = e^{(1-p)w_b x_b}$ and $v_b = e^{(2-p)w_b x_b}$. The second client then encrypts its results with the public key and sends the ciphertexts $[u_b]$ and $[v_b]$ to the first client.
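A sketch of step S3 under the same assumptions (python-paillier, illustrative toy data): each client computes its two index vectors from its local linear predictor, and the second client encrypts its vectors element-wise before sending them to the first client.

```python
import numpy as np
from phe import paillier

rng = np.random.default_rng(0)
p = 1.5                                         # Tweedie index parameter, 1 < p < 2
n = 4
x_a, w_a = rng.normal(size=(n, 3)), rng.normal(size=3)
x_b, w_b = rng.normal(size=(n, 2)), rng.normal(size=2)
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# First client (A), in plaintext:
u_a = np.exp((1 - p) * (x_a @ w_a))             # first index parameter of A
v_a = np.exp((2 - p) * (x_a @ w_a))             # second index parameter of A

# Second client (B), then encrypted element-wise with the shared public key:
u_b = np.exp((1 - p) * (x_b @ w_b))
v_b = np.exp((2 - p) * (x_b @ w_b))
enc_u_b = [public_key.encrypt(float(t)) for t in u_b]   # [u_b], sent to A
enc_v_b = [public_key.encrypt(float(t)) for t in v_b]   # [v_b], sent to A
```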
S4, the first client computes the partial expression gradient_part of the feature gradient over the encrypted data according to its own data and label and the encrypted index data sent by the second client, and sends this partial expression of the feature gradient, encrypted under the public key, to the second client.
Specifically, to obtain the feature gradient of the data, a loss function is first constructed. In this embodiment, the general expression of the Tweedie regression loss function is used:

$$L = \sum_i\left(-\frac{y_i\,\mu_i^{1-p}}{1-p} + \frac{\mu_i^{2-p}}{2-p}\right)$$

where the link function is $\log\mu = wx$, hence $\mu = e^{wx}$.
Substituting into the loss function gives:

$$[L] = \sum_i\left(-\frac{y_i}{1-p}\,e^{(1-p)w_a x_{a,i}}\,[e^{(1-p)w_b x_{b,i}}] + \frac{1}{2-p}\,e^{(2-p)w_a x_{a,i}}\,[e^{(2-p)w_b x_{b,i}}]\right) = \sum_i\left(-\frac{y_i}{1-p}\,u_{a,i}\,[u_{b,i}] + \frac{1}{2-p}\,v_{a,i}\,[v_{b,i}]\right).$$
This equation is evaluated at node A (the first client); it requires the data encrypted with the homomorphic-encryption public key by node B (the second client), i.e. the [·] parts, and the result is decrypted with the private key of the trusted third party C.
The gradient is the derivative of the loss function. Let node A hold data $x_a$ and $y$, and node B hold data $x_b$; then

$$\frac{\partial L}{\partial w_a} = \sum_i x_{a,i}\left(-y_i e^{(1-p)(w_a x_{a,i}+w_b x_{b,i})} + e^{(2-p)(w_a x_{a,i}+w_b x_{b,i})}\right),\qquad \frac{\partial L}{\partial w_b} = \sum_i x_{b,i}\left(-y_i e^{(1-p)(w_a x_{a,i}+w_b x_{b,i})} + e^{(2-p)(w_a x_{a,i}+w_b x_{b,i})}\right).$$

Extracting the common part, let

$$\mathrm{gradient\_part}_i = -y_i\,u_{a,i}\,u_{b,i} + v_{a,i}\,v_{b,i};$$

then:

$$\frac{\partial L}{\partial w_a} = \sum_i x_{a,i}\,\mathrm{gradient\_part}_i,\qquad \frac{\partial L}{\partial w_b} = \sum_i x_{b,i}\,\mathrm{gradient\_part}_i.$$
S5, the first client and the second client obtain the gradients of their respective local regression models by multiplying their own data by the encrypted $[\mathrm{gradient\_part}]$, i.e. $[\partial L/\partial w_a] = \sum_i x_{a,i}\,[\mathrm{gradient\_part}_i]$ and $[\partial L/\partial w_b] = \sum_i x_{b,i}\,[\mathrm{gradient\_part}_i]$, add noise masks to the gradients, and send them to the server for decryption.
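A sketch of steps S4 and S5 under the same assumptions: the first client combines its labels, its own index data and B's ciphertexts into the encrypted common factor [gradient_part] (only ciphertext additions and plaintext-by-ciphertext products, which Paillier supports), then each party multiplies in its own features and masks the result with random noise before sending it to the server. B's side is analogous and is indicated in the comments.

```python
import numpy as np
from phe import paillier

rng = np.random.default_rng(1)
p, n = 1.5, 4
x_a, w_a = rng.normal(size=(n, 3)), rng.normal(size=3)
x_b, w_b = rng.normal(size=(n, 2)), rng.normal(size=2)
y = rng.gamma(2.0, 1.0, size=n)
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

u_a, v_a = np.exp((1 - p) * (x_a @ w_a)), np.exp((2 - p) * (x_a @ w_a))
u_b, v_b = np.exp((1 - p) * (x_b @ w_b)), np.exp((2 - p) * (x_b @ w_b))
enc_u_b = [public_key.encrypt(float(t)) for t in u_b]   # received from B
enc_v_b = [public_key.encrypt(float(t)) for t in v_b]   # received from B

# A: encrypted per-sample common factor, also forwarded to B.
enc_grad_part = [(-float(y[i]) * float(u_a[i])) * enc_u_b[i] + float(v_a[i]) * enc_v_b[i]
                 for i in range(n)]

# A: encrypted gradient of its local model, plus a random noise mask.
mask_a = rng.normal(size=x_a.shape[1])
enc_masked_grad_a = [sum(float(x_a[i, j]) * enc_grad_part[i] for i in range(n)) + float(mask_a[j])
                     for j in range(x_a.shape[1])]       # sent to the server

# B would compute enc_masked_grad_b the same way from x_b, enc_grad_part and its own mask.
```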
S6, the server decrypts the gradient data sent by the first client and the second client with the private key corresponding to the public key, obtains the noise-masked gradients in plaintext, and sends them back to the first client and the second client.
S7, the first client and the second client each remove the noise from the returned gradients to obtain the original gradients, and update the model parameters with the new gradients to obtain the model parameters for the new round. The original gradient here refers to the new gradient produced by this round of training.
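A sketch of steps S6 and S7 under the same assumptions, shown for the first client: the server decrypts only noise-masked values, the client subtracts its own mask to recover the true gradient, and then takes a gradient step on its local parameters (the learning rate is an assumed hyper-parameter).

```python
import numpy as np
from phe import paillier

rng = np.random.default_rng(2)
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

grad_a = np.array([0.3, -1.2, 0.05])       # stands in for A's true (still secret) gradient
mask_a = rng.normal(size=grad_a.shape)     # the noise mask A added in step S5

# What the server receives from A: encryptions of gradient + mask.
enc_masked = [public_key.encrypt(float(g + m)) for g, m in zip(grad_a, mask_a)]

# S6, server: decrypt (it only ever sees the noised values) and send back.
masked_plain = np.array([private_key.decrypt(c) for c in enc_masked])

# S7, first client: remove its own mask and update its local parameters.
recovered_grad_a = masked_plain - mask_a
learning_rate = 0.1
w_a = np.zeros(3)
w_a = w_a - learning_rate * recovered_grad_a   # model parameters for the new round
assert np.allclose(recovered_grad_a, grad_a)
```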
Further, the method further comprises:
S8, the first client calculates an encrypted loss value according to the loss-function formula under federated encryption, sends the encrypted loss value to the server for decryption, and decides whether to stop training according to the loss value and the number of training rounds.
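A sketch of step S8 under the same assumptions: the first client assembles the encrypted batch loss from its own index data and B's ciphertexts, the server decrypts it, and training stops once the loss change falls below a tolerance or a maximum number of rounds is reached. The index values here are toy numbers standing in for the quantities computed in step S3.

```python
import numpy as np
from phe import paillier

rng = np.random.default_rng(3)
p, n = 1.5, 4
y = rng.gamma(2.0, 1.0, size=n)
u_a, v_a = rng.uniform(0.5, 1.5, size=n), rng.uniform(0.5, 1.5, size=n)   # A's index data
u_b, v_b = rng.uniform(0.5, 1.5, size=n), rng.uniform(0.5, 1.5, size=n)   # B's, before encryption
public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)
enc_u_b = [public_key.encrypt(float(t)) for t in u_b]
enc_v_b = [public_key.encrypt(float(t)) for t in v_b]

# First client: encrypted Tweedie loss for the batch.
enc_loss = sum((-float(y[i]) / (1 - p)) * (float(u_a[i]) * enc_u_b[i]) +
               (1.0 / (2 - p)) * (float(v_a[i]) * enc_v_b[i]) for i in range(n))

# Server: decrypt the loss and return it; the client then checks the stop criteria.
loss = private_key.decrypt(enc_loss)
max_rounds, tol = 100, 1e-6
prev_loss, round_no = float("inf"), 1
stop_training = (round_no >= max_rounds) or (abs(prev_loss - loss) < tol)
print(loss, stop_training)
```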
The process according to the invention is further illustrated by the following specific examples of application.
Suppose two institutions each hold some information about the same customers. Institution A (a bank) stores the customers' identity, employer, education, gender, age, occupation, deposits and so on. Institution B (an insurance company) also stores part of the customers' information: identity, employer, occupation, property value, car value and insurance purchase history. The label is the financial product that institution A provides to the customer.
In order to recommend suitable financial products to new customers, institution A wants to train a regression model that predicts, from this information, the financial products that meet a customer's needs. However, institution A holds only part of the features of the training data (known customers); it wants to use the other part of the features stored at B, which it cannot access because of data sensitivity and legal restrictions. Under the coordination of a regulatory agency (the trusted third party), it can carry out the longitudinal federal learning model training of the above process and obtain a Tweedie regression estimate over the full feature space.
Tweedie regression is suitable for cases in which the training data contain many zero values, and fits scenarios such as insurance claims and weather prediction. The characteristic distribution is shown in Fig. 2.
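As an illustration of the kind of data this targets (a sketch with assumed parameters, not data from the patent), the following snippet draws samples from a compound Poisson-gamma process; the resulting values are non-negative with a large spike at exactly zero, which is the shape Tweedie regression is designed for.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
claim_counts = rng.poisson(lam=0.3, size=n)                      # most customers file no claim
amounts = np.array([rng.gamma(2.0, 500.0, size=k).sum() for k in claim_counts])

print(f"share of exact zeros: {np.mean(amounts == 0):.1%}")      # roughly exp(-0.3), about 74%
print(f"mean positive amount: {amounts[amounts > 0].mean():.1f}")
```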
The invention also provides an electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor performs any of the methods described above by executing the computer program.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (6)

1. A data regression method based on longitudinal federal learning, characterized in that the method is applied to a distributed network system, wherein the distributed network system comprises a first client, at least one second client and a server; the first client holds a data tag, the second client holds no data tag, and the server holds no data;
the method comprises the following steps:
the server generates a public key and a private key, and sends the same homomorphic-encryption public key to the first client and the second client through a network;
the first client and the second client each read training data from a local database, initialize their local model parameters, and divide the data into training batches;
for each batch, the first client and the second client each compute their respective index data, and the second client encrypts its index data with the public key and sends the encrypted index data to the first client;
the first client computes the partial expression gradient_part of the feature gradient over the encrypted data according to its own data and tag and the encrypted index data sent by the second client, and sends this partial expression of the feature gradient, encrypted under the public key, to the second client;
the first client and the second client each multiply their own data by gradient_part to obtain the gradient of their respective local regression model, add a noise mask to the gradient, and send it to the server for decryption;
the server decrypts the gradient data sent by the first client and the second client with the private key corresponding to the public key, obtains the noise-masked gradients, and sends them back to the first client and the second client;
the first client and the second client each remove the noise from the gradients to obtain the original gradients, and update the model parameters with the original gradients to obtain new model parameters.
2. The longitudinal federal learning-based data regression method of claim 1, further comprising:
The first client calculates an encrypted loss value according to the loss-function formula under federated encryption, sends the encrypted loss value to the server for decryption, and decides whether to stop training according to the loss value and the number of training rounds.
3. The data regression method based on longitudinal federal learning according to claim 1 or 2, wherein the local models of the first client and the second client are Tweedie regression models.
4. The data regression method based on longitudinal federal learning according to claim 3, wherein the loss function is obtained from the following calculation:

$$[L] = \sum_{i}\left(-\frac{y_i}{1-p}\,u_{a,i}\,[u_{b,i}] + \frac{1}{2-p}\,v_{a,i}\,[v_{b,i}]\right)$$

where $L$ is the loss function, $p$ is the regression index parameter, $y$ is the data tag, $u_a = e^{(1-p)w_a x_a}$ is the first index parameter calculated by the first client, $u_b = e^{(1-p)w_b x_b}$ is the first index parameter calculated by the second client, $v_a = e^{(2-p)w_a x_a}$ is the second index parameter calculated by the first client, $v_b = e^{(2-p)w_b x_b}$ is the second index parameter calculated by the second client, and $[\cdot]$ denotes homomorphically encrypted data.
5. The data regression method based on longitudinal federal learning according to claim 4, wherein the gradient data is obtained from the following calculation:
Let the first client hold data $x_a$ and $y$, and the second client hold data $x_b$; then

$$\frac{\partial L}{\partial w_a} = \sum_i x_{a,i}\left(-y_i e^{(1-p)(w_a x_{a,i}+w_b x_{b,i})} + e^{(2-p)(w_a x_{a,i}+w_b x_{b,i})}\right),\qquad \frac{\partial L}{\partial w_b} = \sum_i x_{b,i}\left(-y_i e^{(1-p)(w_a x_{a,i}+w_b x_{b,i})} + e^{(2-p)(w_a x_{a,i}+w_b x_{b,i})}\right).$$

Extracting the common part, let

$$\mathrm{gradient\_part}_i = -y_i\,u_{a,i}\,u_{b,i} + v_{a,i}\,v_{b,i};$$

then:

$$\frac{\partial L}{\partial w_a} = \sum_i x_{a,i}\,\mathrm{gradient\_part}_i,\qquad \frac{\partial L}{\partial w_b} = \sum_i x_{b,i}\,\mathrm{gradient\_part}_i.$$
6. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor performs the method of any one of claims 1 to 5 by executing the computer program.
CN202111555002.3A 2021-12-17 2021-12-17 Data regression method based on longitudinal federal learning and electronic device Active CN114186263B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111555002.3A CN114186263B (en) 2021-12-17 2021-12-17 Data regression method based on longitudinal federal learning and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111555002.3A CN114186263B (en) 2021-12-17 2021-12-17 Data regression method based on longitudinal federal learning and electronic device

Publications (2)

Publication Number Publication Date
CN114186263A CN114186263A (en) 2022-03-15
CN114186263B (en) 2024-05-03

Family

ID=80544383

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111555002.3A Active CN114186263B (en) 2021-12-17 2021-12-17 Data regression method based on longitudinal federal learning and electronic device

Country Status (1)

Country Link
CN (1) CN114186263B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114912136B (en) * 2022-07-14 2022-10-28 之江实验室 Competition mechanism based cooperative analysis method and system for medical data on block chain
CN115329369B (en) * 2022-07-28 2023-04-14 上海光之树科技有限公司 Model joint construction method based on multi-party longitudinal privacy protection and logistic regression
CN117077816B (en) * 2023-10-13 2024-03-29 杭州金智塔科技有限公司 Training method and system of federal model

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110572253A (en) * 2019-09-16 2019-12-13 济南大学 Method and system for enhancing privacy of federated learning training data
CN111967910A (en) * 2020-08-18 2020-11-20 中国银行股份有限公司 User passenger group classification method and device
WO2021004551A1 (en) * 2019-09-26 2021-01-14 深圳前海微众银行股份有限公司 Method, apparatus, and device for optimization of vertically federated learning system, and a readable storage medium
CN112613618A (en) * 2021-01-04 2021-04-06 神谱科技(上海)有限公司 Safe federal learning logistic regression algorithm
CN112906912A (en) * 2021-04-01 2021-06-04 深圳市洞见智慧科技有限公司 Method and system for training regression model without trusted third party in longitudinal federal learning
WO2021120888A1 (en) * 2019-12-20 2021-06-24 支付宝(杭州)信息技术有限公司 Method and system for performing model training on the basis of private data
WO2021121106A1 (en) * 2019-12-20 2021-06-24 深圳前海微众银行股份有限公司 Federated learning-based personalized recommendation method, apparatus and device, and medium
CN113240461A (en) * 2021-05-07 2021-08-10 广州银行股份有限公司 Method, system and medium for identifying potential customers based on longitudinal federal learning

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110572253A (en) * 2019-09-16 2019-12-13 济南大学 Method and system for enhancing privacy of federated learning training data
WO2021004551A1 (en) * 2019-09-26 2021-01-14 深圳前海微众银行股份有限公司 Method, apparatus, and device for optimization of vertically federated learning system, and a readable storage medium
WO2021120888A1 (en) * 2019-12-20 2021-06-24 支付宝(杭州)信息技术有限公司 Method and system for performing model training on the basis of private data
WO2021121106A1 (en) * 2019-12-20 2021-06-24 深圳前海微众银行股份有限公司 Federated learning-based personalized recommendation method, apparatus and device, and medium
CN111967910A (en) * 2020-08-18 2020-11-20 中国银行股份有限公司 User passenger group classification method and device
CN112613618A (en) * 2021-01-04 2021-04-06 神谱科技(上海)有限公司 Safe federal learning logistic regression algorithm
CN112906912A (en) * 2021-04-01 2021-06-04 深圳市洞见智慧科技有限公司 Method and system for training regression model without trusted third party in longitudinal federal learning
CN113240461A (en) * 2021-05-07 2021-08-10 广州银行股份有限公司 Method, system and medium for identifying potential customers based on longitudinal federal learning

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A weighted communication model for Anycast in IPv6; 王晓喃, 钱焕延; Computer Science (计算机科学); 2006-05-25; (05); full text *
Implementation of a telecom fraud identification model based on federated learning; 陈国润, 母美荣, 张蕊, 孙丹, 钱栋军; Telecommunications Science (电信科学); 2020-04-30 (S1); full text *
邓介一. Research and application of multi-class logistic regression algorithms in a big data environment. 2018, full text. *

Also Published As

Publication number Publication date
CN114186263A (en) 2022-03-15

Similar Documents

Publication Publication Date Title
CN114186263B (en) Data regression method based on longitudinal federal learning and electronic device
CN110189192B (en) Information recommendation model generation method and device
US8130947B2 (en) Privacy preserving social network analysis
Khan et al. Cloud log forensics: Foundations, state of the art, and future directions
CN112132198B (en) Data processing method, device and system and server
Garrido et al. Revealing the landscape of privacy-enhancing technologies in the context of data markets for the IoT: A systematic literature review
CN113516256B (en) Third-party-free federal learning method and system based on secret sharing and homomorphic encryption
CN110245510A (en) Method and apparatus for predictive information
US11790094B2 (en) Evaluation of a monitoring function
US11853461B2 (en) Differential privacy security for benchmarking
EP3966988B1 (en) Generating sequences of network data while preventing acquisition or manipulation of time data
CN112000979B (en) Database operation method, system and storage medium for private data
CN110363025A (en) A kind of user data privacy management method, apparatus and electronic equipment
WO2019191579A1 (en) System and methods for recording codes in a distributed environment
He et al. PrivC—A framework for efficient secure two-party computation
EP3306489B1 (en) Interaction record query processing method and device
Nguyen et al. The benefits and challenges of applying Blockchain technology into Big Data: A literature review
Guo et al. Privacy-Preserving Multi-Label Propagation Based on Federated Learning
Lohmöller et al. Poster: bridging trust gaps: data usage transparency in federated data ecosystems
Shah et al. Secure featurization and applications to secure phishing detection
Alfuhaid et al. A Mapping Review on Cyber-Physical Smart Contracts: Architectures, Platforms, and Challenges
Alarabi et al. Two Level Based Privacy Protection Approach for Internet of Things Users in Cloud Computing
CN114004456B (en) Data tag calculation method, device, computer equipment and storage medium
CN116028965B (en) Data protection method, server and storage medium in distributed LVC training environment
Shi et al. AUDITEM: toward an automated and efficient data integrity verification model using blockchain

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant