CN116720594A - Decentralized hierarchical federal learning method - Google Patents
- Publication number
- CN116720594A (application CN202310998646.2A)
- Authority
- CN
- China
- Prior art keywords
- model
- mec server
- aggregation
- federal learning
- global
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/20—Ensemble learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The application discloses a decentralized hierarchical federated learning method, comprising: initializing the federated learning system through a trusted authority; dividing the initialized system's training round into an edge aggregation stage led by MEC servers and a consensus-based global aggregation stage; in the edge aggregation stage, each MEC server acts as a leader node that collects updated models from the clients in its group and aggregates them into an intermediate model; in the global aggregation stage, the security of the global aggregation result is ensured through consensus among the MEC servers with the assistance of a blockchain. By splitting federated learning into MEC-led edge aggregation and consensus-based global aggregation, the method eliminates the threat of a malicious server, improves communication security, and reduces system communication overhead.
Description
Technical Field
The application relates to the technical field of federated learning, and in particular to a decentralized hierarchical federated learning method.
Background
With the vigorous development of big-data-driven artificial intelligence (AI), AI algorithms have shown far higher accuracy and efficiency than conventional methods in fields such as image processing and speech recognition, and are therefore widely used across industries. However, the effective acquisition and maintenance of data remain the bottleneck restricting AI's development: as the importance attached to data privacy and security grows across industries, data largely exist as isolated islands. Federated learning is a distributed machine learning architecture involving multiple clients and an aggregation server. The clients may be personal terminal devices (such as mobile phones) or may represent different departments or enterprises; they store users' personal data or organizations' private data, train models locally, and send the trained model parameters to the aggregation server. The aggregation server aggregates the model parameters of some or all clients and synchronizes the aggregated model back to the clients to start a new training round. This collaborative training paradigm avoids leaking personal data while preserving model performance, effectively solving the data-island problem.
In a conventional federated learning system, a central server performs the core operations of update aggregation, client selection, and global model maintenance. The server must collect updates from many clients for aggregation and broadcast the new global model back to them, which places high demands on network bandwidth. In addition, cloud-based servers are affected by the stability of their providers; a central server can skew the global model by favoring certain clients; and a malicious central server can corrupt the model, or even extract clients' private information from their updates. Thus, although participants holding training data can delegate the server to receive and aggregate shared gradients, malicious users and servers may manipulate the global model by uploading malicious update gradients, compromising the system's communication security.
Disclosure of Invention
The application aims to provide a decentralized hierarchical federated learning method that divides federated learning into an edge aggregation stage led by MEC servers and a consensus-based global aggregation stage, thereby eliminating the threat of malicious servers, improving communication security, and reducing system communication overhead.
The aim of the application is achieved by the following technical scheme:
a decentralized hierarchical federated learning method, the method comprising:
step 1, initializing a federated learning system through a trusted authority;
step 2, dividing the initialized federated learning system's training round into an edge aggregation stage led by mobile edge computing (MEC) servers and a consensus-based global aggregation stage;
step 3, in the edge aggregation stage, the MEC server, acting as leader node, collects updated models from the clients in its group and aggregates them into an intermediate model;
step 4, in the global aggregation stage, ensuring the security of the global aggregation result through consensus among the MEC servers with the assistance of a blockchain.
According to the technical scheme provided by the application, federated learning is divided into MEC-led edge aggregation and consensus-based global aggregation, eliminating the threat of malicious servers, improving communication security, and reducing system communication overhead.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; other drawings may be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic flow chart of a decentralized hierarchical federal learning method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an architecture of decentralized hierarchical federal learning according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only some, not all, embodiments of the present application, and this is not limiting to the present application. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of the application.
Fig. 1 is a schematic flow chart of a decentralized hierarchical federal learning method according to an embodiment of the present application, where the method includes:
step 1, initializing a federated learning system through a trusted authority (TA);
in this step, before participating in model training, both the client n and the MEC server m joining the federal learning system need to pass identity authentication by the authority TA (e.g., government department), and become legal entities having proprietary registration information, specifically:
client n generates a key pair using public parameters published by an authorityAnd obtain from the authority binding his registration information +.>Certificate of->As a unique identity identifier; then use certificate->Adding the learning system into a federal learning system;
similar to the registration flow of the client, the MEC server m generates a key pairAnd obtain bindingFixed registration information->Certificate of->The method comprises the steps of carrying out a first treatment on the surface of the Then use certificate->Adding the learning system into a federal learning system;
the authority is only used for parameter initialization, identity authorization and certificate issuance of the participant are provided before the blockchain is operated, and the authority is kept offline for the rest of time, so that the authority and the decentralization do not conflict.
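The registration flow above can be sketched as follows. This is a minimal illustrative sketch, not the patent's actual scheme: the `TrustedAuthority` class, the HMAC-based certificate, and the hash-derived public key are hypothetical stand-ins for a real digital signature scheme and PKI.

```python
import hashlib
import hmac
import secrets

class TrustedAuthority:
    """Offline TA: issues certificates binding a participant's public key
    to its registration information, then stays offline."""
    def __init__(self):
        self.signing_key = secrets.token_bytes(32)  # TA's secret signing key

    def issue_certificate(self, public_key: bytes, reg_info: str) -> bytes:
        # Certificate = MAC over (public key || registration info); a
        # stand-in for a real signature scheme such as ECDSA.
        return hmac.new(self.signing_key, public_key + reg_info.encode(),
                        hashlib.sha256).digest()

    def verify_certificate(self, public_key: bytes, reg_info: str, cert: bytes) -> bool:
        expected = self.issue_certificate(public_key, reg_info)
        return hmac.compare_digest(expected, cert)

def register_participant(ta: TrustedAuthority, reg_info: str):
    # Participant generates a key pair from the TA's public parameters;
    # here the key pair is simulated with random bytes and a hash.
    sk = secrets.token_bytes(32)
    pk = hashlib.sha256(sk).digest()
    cert = ta.issue_certificate(pk, reg_info)
    return pk, sk, cert

ta = TrustedAuthority()
pk, sk, cert = register_participant(ta, "client-1")
assert ta.verify_certificate(pk, "client-1", cert)
```

Both clients and MEC servers would run the same flow with their own registration information; a forged or mismatched certificate fails verification.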
Step 2, dividing the initialized federated learning system's training round into an edge aggregation stage led by mobile edge computing (MEC) servers and a consensus-based global aggregation stage;
step 3, in the edge aggregation stage, the MEC server, acting as leader node, collects updated models from the clients in its group and aggregates them into an intermediate model;
In this step, at the start of each management period, each client selects a corresponding MEC server as its intermediate aggregation node, and the MEC server, as leader node, publishes its group membership information to the blockchain.
In the edge aggregation stage, the MEC server transmits the global model to the clients in its group, and the clients verify the model's correctness against the global model hash published on the blockchain.
The clients in the group then train the model with their local data and submit the updated models to the MEC server acting as leader node.
The MEC server collects the updated models uploaded by the clients in the group and obtains an intermediate model by performing secure gradient aggregation.
For example, as shown in fig. 2, which is a schematic diagram of the architecture of decentralized hierarchical federated learning according to an embodiment of the present application, in the edge aggregation stage an MEC server manages N clients, and client n obtains updated model parameters w_n by training with its local data. A strong protection mechanism based on secret sharing can achieve secure aggregation without exposing any parameters, but incurs high communication overhead; this embodiment therefore divides client n's updated parameters w_n into key parameters w_n^key and other parameters w_n^other. The strong protection mechanism is used only to aggregate the key parameters w_n^key, while the other parameters w_n^other are aggregated by an anonymity-based weak protection mechanism. Finally, the MEC server splices the aggregation result of the key parameters with that of the other parameters to obtain the parameters w_mid of the intermediate model.
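The split-aggregate-splice step can be sketched as below. This is a simplified illustration under assumptions not stated in the patent: parameters are flat lists, aggregation is an unweighted mean, and both protection channels are replaced by plain averaging (the actual scheme would route key parameters through secret sharing and the rest through an anonymous channel).

```python
def aggregate_mean(updates):
    """Element-wise average of a list of parameter vectors (equal-weight FedAvg)."""
    n = len(updates)
    return [sum(vals) / n for vals in zip(*updates)]

def edge_aggregate(client_updates, key_idx):
    """Split each client's update into key and other parameters, aggregate each
    group separately, then splice the results into one intermediate model."""
    key_set = set(key_idx)
    key_parts = [[w[i] for i in key_idx] for w in client_updates]
    other_parts = [[w[i] for i in range(len(w)) if i not in key_set]
                   for w in client_updates]

    # Stand-ins for the two channels: strongly protected aggregation of key
    # parameters and anonymity-based aggregation of the rest.
    agg_key = aggregate_mean(key_parts)
    agg_other = aggregate_mean(other_parts)

    # Splice the aggregated values back into their original positions.
    model = [0.0] * len(client_updates[0])
    for pos, val in zip(key_idx, agg_key):
        model[pos] = val
    other_positions = [i for i in range(len(model)) if i not in key_set]
    for pos, val in zip(other_positions, agg_other):
        model[pos] = val
    return model

updates = [[1.0, 2.0, 3.0, 4.0], [3.0, 4.0, 5.0, 6.0]]
mid = edge_aggregate(updates, key_idx=[0, 2])  # -> [2.0, 3.0, 4.0, 5.0]
```

Because both channels reduce to the same mean, the spliced intermediate model equals the mean of the full updates; the split only changes how each coordinate is protected in transit.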
And 4, in the global aggregation stage, ensuring the security of a global aggregation result through the consensus of each MEC server under the assistance of a blockchain.
In this step, as shown in fig. 2, the global aggregation stage uses the consensus protocol PoA (Proof of Authority) to set up a lightweight blockchain platform. PoA is a consensus solution for permissioned blockchain networks in which a set of servers is selected as authority nodes responsible for checking and verifying all transactions. Compared with the traditional federated learning architecture, PoA provides higher reliability and avoids the single-point-of-failure problem; compared with public blockchains (PoW, PoS), it offers higher performance with lower computation and communication resource usage, and can therefore support large-scale federated learning systems.
After executing the aggregation of step 3, each MEC server uploads its aggregated intermediate model to an authority node of the PoA protocol.
The authority nodes then aggregate the intermediate models using a secure aggregation mechanism, upload the hashes of the aggregated models to the blockchain, and conduct majority voting via smart contract; the model receiving the most votes becomes the global model.
To maintain blockchain performance, the authority nodes upload only the hash and the storage address of the global model to the blockchain; the verified global model is sent off-chain to each MEC server, which then forwards it to its participant clients.
After each participant client verifies the validity of the global model using the hash on the blockchain, it begins a new round of model training with its local data.
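The hash-and-vote step can be sketched as below. This is a minimal sketch, not the patent's smart contract: the hash encoding, the simple majority rule, and the in-process "nodes" are assumptions made for illustration.

```python
import hashlib
from collections import Counter

def model_hash(model):
    # Digest of the model parameters; only this hash (plus a storage
    # address) would go on-chain, keeping the chain lightweight.
    return hashlib.sha256(",".join(f"{p:.10f}" for p in model).encode()).hexdigest()

def poa_vote(submitted_hashes):
    # Smart-contract-style majority vote among authority nodes: the hash
    # with the most votes becomes the global model hash, provided it
    # actually holds a strict majority.
    winner, count = Counter(submitted_hashes).most_common(1)[0]
    if count <= len(submitted_hashes) // 2:
        raise RuntimeError("no majority: consensus failed")
    return winner

# Three honest authority nodes aggregate the same intermediate models and
# submit identical hashes; one faulty node submits a bogus hash.
intermediates = [[2.0, 3.0], [4.0, 5.0]]
honest = [sum(v) / len(intermediates) for v in zip(*intermediates)]  # [3.0, 4.0]
votes = [model_hash(honest)] * 3 + [model_hash([9.9, 9.9])]
assert poa_vote(votes) == model_hash(honest)
```

Clients later recompute `model_hash` on the model they receive off-chain and compare it against the winning on-chain hash before training.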
In a concrete implementation, a blockchain-supported hierarchical federated learning method can guarantee the security of the aggregation result but cannot by itself protect user privacy: in the edge aggregation stage, a curious MEC server could extract users' private information from the collected update gradients. To address this privacy problem in edge aggregation, the application adopts Shamir secret sharing to hide key gradients and combines it with an anonymity mechanism to hide user identities, thereby protecting model security and user privacy simultaneously without sacrificing model accuracy. Specifically:
the secret-sharing-based strong protection method achieves secure aggregation without exposing the true values of a user's update parameters, but each client must send shares of its update parameters to all other clients in the group during aggregation, which is communication-intensive; therefore, the strong protection method is used only to aggregate the key parameters. Weak protection of the remaining model parameters is achieved with a linkable ring signature: their true values are transmitted to the MEC server, but the user's identity is hidden, so the MEC server cannot associate a received gradient with a specific group member's identity, making it difficult to extract a target user's privacy.
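The additive property that makes Shamir secret sharing usable for secure aggregation can be sketched as follows. This is an illustrative sketch only: the field prime, the share layout, and the integer encoding of gradients are assumptions, not details from the patent.

```python
import random

PRIME = 2**61 - 1  # prime field for share arithmetic (an assumed choice)

def make_shares(secret, k, n):
    # Split `secret` into n shares using a random degree-(k-1) polynomial
    # with constant term `secret`; any k shares reconstruct it.
    coeffs = [secret % PRIME] + [random.randrange(PRIME) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, j, PRIME) for j, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the polynomial's constant term.
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

# Additive homomorphism: summing two clients' shares pointwise yields
# shares of the summed gradients, so the aggregate is recovered without
# any party seeing either client's individual key gradient.
a = make_shares(10, 2, 3)   # client 1's integer-encoded key gradient
b = make_shares(32, 2, 3)   # client 2's
summed = [(x, (ya + yb) % PRIME) for (x, ya), (_, yb) in zip(a, b)]
assert reconstruct(summed[:2]) == 42
```

The quadratic share-exchange cost among group members is exactly why the scheme restricts this mechanism to the key parameters.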
It is noted that what is not described in detail in the embodiments of the present application belongs to the prior art known to those skilled in the art.
In summary, the method of the embodiments of the application has the following advantages:
1. the application uses a blockchain in place of a central server to decentralize federated learning, and divides federated learning into MEC-led edge aggregation and consensus-based global aggregation, eliminating the threat of malicious servers and reducing system communication overhead;
2. the application adopts Shamir secret sharing to hide key gradients and combines it with an anonymity mechanism to hide user identities, protecting model security and user privacy without sacrificing model accuracy.
In addition, it will be understood by those skilled in the art that all or part of the steps in implementing the methods of the above embodiments may be implemented by a program to instruct related hardware, and the corresponding program may be stored in a computer readable storage medium, where the storage medium may be a read only memory, a magnetic disk or an optical disk, etc.
The foregoing is only a preferred embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions easily contemplated by those skilled in the art within the scope of the present application should be included in the scope of the present application. Therefore, the protection scope of the present application should be subject to the protection scope of the claims. The information disclosed in the background section herein is only for enhancement of understanding of the general background of the application and is not to be taken as an admission or any form of suggestion that this information forms the prior art already known to those of ordinary skill in the art.
Claims (5)
1. A decentralized hierarchical federated learning method, the method comprising:
step 1, initializing a federated learning system through a trusted authority;
step 2, dividing the initialized federated learning system's training round into an edge aggregation stage led by mobile edge computing (MEC) servers and a consensus-based global aggregation stage;
step 3, in the edge aggregation stage, the MEC server, acting as leader node, collects updated models from the clients in its group and aggregates them into an intermediate model;
step 4, in the global aggregation stage, ensuring the security of the global aggregation result through consensus among the MEC servers with the assistance of a blockchain.
2. The decentralized hierarchical federated learning method according to claim 1, wherein the process of step 1 is specifically:
client n generates a key pair (pk_n, sk_n) using public parameters published by the authority, and obtains from the authority a certificate Cert_n binding its registration information ID_n as its unique identity identifier; it then uses Cert_n to join the federated learning system;
similarly to the client's registration flow, the MEC server m generates a key pair (pk_m, sk_m) and obtains a certificate Cert_m binding its registration information ID_m; it then uses Cert_m to join the federated learning system;
wherein the authority is used only for parameter initialization, provides identity authorization and certificate issuance for participants before the blockchain starts operating, and remains offline the rest of the time.
3. The decentralized hierarchical federated learning method according to claim 1, wherein in step 3:
at the start of each management period, each client selects a corresponding MEC server as its intermediate aggregation node, and the MEC server, as leader node, publishes its group membership information to the blockchain;
in the edge aggregation stage, the MEC server transmits the global model to the clients in its group, and the clients verify the model's correctness against the global model hash published on the blockchain;
the clients in the group then train the model with their local data and submit the updated models to the MEC server acting as leader node;
the MEC server collects the updated models uploaded by the clients in the group and obtains an intermediate model by performing secure gradient aggregation.
4. The decentralized hierarchical federated learning method according to claim 1, wherein the process of step 4 is specifically:
in the global aggregation stage, a lightweight blockchain platform is set up using the consensus protocol PoA;
after executing the aggregation of step 3, each MEC server uploads its aggregated intermediate model to an authority node of the PoA protocol;
the authority nodes then aggregate the intermediate models using a secure aggregation mechanism, upload the hashes of the aggregated models to the blockchain, and conduct majority voting via smart contract, the model receiving the most votes becoming the global model;
the authority nodes upload only the hash and the storage address of the global model to the blockchain; the verified global model is sent off-chain to each MEC server and then forwarded to each participant client;
after each participant client verifies the validity of the global model using the hash on the blockchain, it begins a new round of model training with its local data.
5. The decentralized hierarchical federated learning method according to claim 1, wherein the method further comprises:
in the edge aggregation stage, aggregating key parameters with a secret-sharing-based strong protection method, and applying weak protection to the remaining model parameters with a linkable ring signature;
the true values of the remaining model parameters are transmitted to the MEC server, with the user identity hidden by an anonymity mechanism.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310998646.2A CN116720594B (en) | 2023-08-09 | 2023-08-09 | Decentralized hierarchical federal learning method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116720594A true CN116720594A (en) | 2023-09-08 |
CN116720594B CN116720594B (en) | 2023-11-28 |
Family
ID=87870071
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310998646.2A Active CN116720594B (en) | 2023-08-09 | 2023-08-09 | Decentralized hierarchical federal learning method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116720594B (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112232527A (en) * | 2020-09-21 | 2021-01-15 | 北京邮电大学 | Safe distributed federal deep learning method |
CN114363043A (en) * | 2021-12-30 | 2022-04-15 | 华东师范大学 | Asynchronous federated learning method based on verifiable aggregation and differential privacy in peer-to-peer network |
EP4002231A1 (en) * | 2020-11-18 | 2022-05-25 | Telefonica Digital España, S.L.U. | Federated machine learning as a service |
CN114579957A (en) * | 2022-01-20 | 2022-06-03 | 北京邮电大学 | Credible sandbox-based federated learning model training method and device and electronic equipment |
CN114610813A (en) * | 2022-03-14 | 2022-06-10 | 广东工业大学 | Distributed storage method, device, equipment and medium for federal learning |
CN114912622A (en) * | 2022-01-30 | 2022-08-16 | 中图科信数智技术(北京)有限公司 | Decentralized safe federal learning method and system |
WO2022197650A1 (en) * | 2021-03-15 | 2022-09-22 | Interdigital Patent Holdings, Inc. | Methods, architectures, apparatuses and systems directed to blockchain-enabled model storage, sharing and deployment for supporting distributed learning |
CN115438322A (en) * | 2022-08-29 | 2022-12-06 | 成都安恒信息技术有限公司 | Federal learning method, system, equipment and medium supporting identity certification |
WO2023030730A1 (en) * | 2021-09-03 | 2023-03-09 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods and apparatuses for performing federated learning |
CN116094993A (en) * | 2022-12-22 | 2023-05-09 | 电子科技大学 | Federal learning security aggregation method suitable for edge computing scene |
US20230177349A1 (en) * | 2020-06-01 | 2023-06-08 | Intel Corporation | Federated learning optimizations |
CN116523034A (en) * | 2023-04-18 | 2023-08-01 | 中山大学 | Federal learning method and related device based on blockchain |
Non-Patent Citations (2)
Title |
---|
DONG JIN et al., "Federated Incremental Learning based Evolvable Intrusion Detection System for Zero-Day Attacks", IEEE Network, pp. 125-132 *
WU Qi; LU Jianzhen; WU Peiran; WANG Shuai; CHEN Li; XIA Minghua, "Edge Learning: Key Technologies, Applications and Challenges", Radio Communications Technology, no. 01, pp. 10-29 *
Also Published As
Publication number | Publication date |
---|---|
CN116720594B (en) | 2023-11-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||