CN113821828B - Data privacy protection method, device, equipment and storage medium - Google Patents


Info

Publication number
CN113821828B
CN113821828B (Application CN202111383643.5A)
Authority
CN
China
Prior art keywords
data
verification
verification result
agent
seller
Prior art date
Legal status
Active
Application number
CN202111383643.5A
Other languages
Chinese (zh)
Other versions
CN113821828A (en)
Inventor
蔡天琪 (Cai Tianqi)
蔡恒进 (Cai Hengjin)
Current Assignee
Wuhan Longjin Science And Technology Inc
Wuhan University WHU
Original Assignee
Wuhan Longjin Science And Technology Inc
Priority date
Filing date
Publication date
Application filed by Wuhan Longjin Science And Technology Inc filed Critical Wuhan Longjin Science And Technology Inc
Priority to CN202111383643.5A
Publication of CN113821828A
Application granted
Publication of CN113821828B
Status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60: Protecting data
    • G06F 21/62: Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218: Protecting access to data to a system of files or objects, e.g. local or distributed file system or database
    • G06F 21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60: Protecting data
    • G06F 21/602: Providing cryptographic facilities or services
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 20/00: Payment architectures, schemes or protocols
    • G06Q 20/38: Payment protocols; Details thereof
    • G06Q 20/382: Payment protocols insuring higher security of transaction
    • G06Q 20/3829: Payment protocols insuring higher security of transaction involving key management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • Bioethics (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Accounting & Taxation (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Storage Device Security (AREA)

Abstract

The invention discloses a data privacy protection method, apparatus, device, and storage medium. The method is applied to a trusted artificial intelligence (AI) system; the trusted AI system comprises at least a buyer AI agent and a seller AI agent. The method comprises the following steps: obtaining authorization data of a seller, and generating private data according to the authorization data and a preset encryption dictionary; if the private data is verified successfully, generating copyrighted proxy data for the private data; if the proxy data is verified successfully, determining a data description of the seller according to the proxy data; matching the buyer's requirement description against the data description, and determining a target data description related to the requirement description; and generating a smart contract based on the target data description, and executing the transaction according to the smart contract.

Description

Data privacy protection method, device, equipment and storage medium
Technical Field
The present invention relates to the field of internet and blockchain technologies, and in particular to a data privacy protection method, apparatus, device, and storage medium.
Background
Data trading faces the following technical problems: ensuring consistency of the data inside and outside the data kettle; keeping the data description consistent with the data so as to win the buyer's trust; confirming that trusted data is not counterfeit while protecting data privacy; and destroying the data after the transaction. The data description is the core of the solution. The technical problem the data description must solve is how to describe a large number of data packets/blocks in both machine language and human language, how to overcome the inadequacy of words, reject false descriptions, and attract buyers. No effective solution to this problem is currently available.
Disclosure of Invention
In order to solve the above technical problems, the main purpose of the present invention is to provide a data privacy protection method, apparatus, device, and storage medium.
To achieve this purpose, the technical solution of the invention is realized as follows:
In a first aspect, the invention provides a data privacy protection method, applied to a trusted artificial intelligence (AI) system; the trusted AI system comprises at least a buyer AI agent and a seller AI agent; the method comprises the following steps:
obtaining authorization data of a seller, and generating private data according to the authorization data and a preset encryption dictionary;
if the private data is verified successfully, generating copyrighted proxy data for the private data;
if the proxy data is verified successfully, determining a data description of the seller according to the proxy data;
matching the buyer's requirement description against the data description, and determining a target data description related to the requirement description;
and generating a smart contract based on the target data description, and executing the transaction according to the smart contract.
In the foregoing scheme, generating private data according to the authorization data and a preset encryption dictionary includes:
determining the type of the authorization data, and obtaining a personalized field related to the type from the preset encryption dictionary;
and encrypting the authorization data according to the personalized field to generate the private data.
In the above scheme, the method further includes:
verifying the private data to obtain a first verification result;
and freezing the seller AI agent corresponding to the seller if the first verification result indicates that the private data verification is unsuccessful.
In the foregoing scheme, verifying the private data to obtain a first verification result includes:
identifying an initial personalized field in the private data;
obtaining the complete personalized field from the preset encryption dictionary;
judging whether the initial personalized field forms an interception relation with the complete personalized field, i.e. whether it is a segment intercepted from the complete field;
determining that the first verification result is that the private data is verified successfully if the initial personalized field forms an interception relation with the complete personalized field;
and determining that the first verification result is that the private data verification is unsuccessful if it does not.
In the foregoing solution, generating copyrighted proxy data for the private data includes:
watermarking the private data to obtain watermarked private data;
and encrypting the watermarked private data with the public key of the seller AI agent to generate the proxy data.
In the above scheme, the method further includes:
judging whether the proxy data is verified successfully, and obtaining a second verification result;
and refusing to open the proxy data if the second verification result indicates that the proxy data is not verified successfully.
In the foregoing scheme, judging whether the proxy data is verified successfully to obtain a second verification result includes:
performing private-key decryption verification of the seller AI agent on the proxy data to obtain a third verification result;
performing watermark validity verification on the proxy data to obtain a fourth verification result;
and determining the second verification result according to the third verification result and the fourth verification result.
In the foregoing solution, determining the second verification result according to the third and fourth verification results includes:
determining that the second verification result is that the proxy data is verified successfully if the third verification result indicates that the private-key decryption verification succeeded and the fourth verification result indicates that the watermark validity verification succeeded;
and determining that the second verification result is that the proxy data verification is unsuccessful if the third verification result indicates that the private-key decryption verification failed and/or the fourth verification result indicates that the watermark validity verification failed.
In the above scheme, the authorization data includes: the authorized-use data, the number of times the data is authorized for use, and the time at which the data is authorized for use; the method further includes:
reducing the number of authorized uses of the data by one if the transaction is successful.
In a second aspect, the invention further provides a data privacy protection apparatus, applied to a trusted artificial intelligence (AI) system; the trusted AI system comprises at least a buyer AI agent and a seller AI agent; the apparatus comprises an obtaining unit, a generating unit, a determining unit, and a transaction unit; wherein
the obtaining unit is configured to obtain the authorization data of the seller, and generate private data according to the authorization data and a preset encryption dictionary;
the generating unit is configured to generate copyrighted proxy data for the private data if the private data is verified successfully;
the determining unit is configured to determine the data description of the seller according to the proxy data if the proxy data is verified successfully, match the buyer's requirement description against the data description, and determine a target data description related to the requirement description;
and the transaction unit is configured to generate a smart contract based on the target data description and execute the transaction according to the smart contract.
In the above scheme, the obtaining unit is further configured to determine the type of the authorization data, obtain a personalized field related to the type from the preset encryption dictionary, and encrypt the authorization data according to the personalized field to generate the private data.
In the above scheme, the apparatus further includes a verification unit configured to verify the private data to obtain a first verification result, and to freeze the seller AI agent corresponding to the seller if the first verification result indicates that the private data verification is unsuccessful.
In the above scheme, the verification unit is further configured to identify an initial personalized field in the private data; obtain the complete personalized field from the preset encryption dictionary; judge whether the initial personalized field forms an interception relation with the complete personalized field; determine that the first verification result is that the private data is verified successfully if it does; and determine that the first verification result is that the private data verification is unsuccessful if it does not.
In the above scheme, the generating unit is further configured to watermark the private data to obtain watermarked private data, and to encrypt the watermarked private data with the public key of the seller AI agent to generate the proxy data.
In the above scheme, the verification unit is further configured to judge whether the proxy data is verified successfully and obtain a second verification result, and to refuse to open the proxy data if the second verification result indicates that the proxy data is not verified successfully.
In the above scheme, the verification unit is further configured to perform private-key decryption verification of the seller AI agent on the proxy data to obtain a third verification result; perform watermark validity verification on the proxy data to obtain a fourth verification result; and determine the second verification result according to the third and fourth verification results.
In the foregoing solution, the verification unit is further configured to determine that the second verification result is that the proxy data is verified successfully if the third verification result indicates that the private-key decryption verification succeeded and the fourth verification result indicates that the watermark validity verification succeeded; and to determine that the second verification result is that the proxy data verification is unsuccessful if the third verification result indicates that the private-key decryption verification failed and/or the fourth verification result indicates that the watermark validity verification failed.
In the above scheme, the authorization data includes: the authorized-use data, the number of times the data is authorized for use, and the time at which the data is authorized for use; the transaction unit is further configured to reduce the number of authorized uses of the data by one if the transaction is successful.
In a third aspect, an embodiment of the present invention provides a storage medium, where a computer program is stored on the storage medium; the computer program, when executed by a processor, implements the steps of any of the methods described above.
In a fourth aspect, an embodiment of the present invention provides a data privacy protecting apparatus, where the data privacy protecting apparatus includes: a processor and a memory for storing a computer program operable on the processor, wherein the processor is operable to perform the steps of any of the above methods when executing the computer program.
The embodiment of the invention provides a data privacy protection method, apparatus, device, and storage medium. The method is applied to a trusted artificial intelligence (AI) system; the trusted AI system comprises at least a buyer AI agent and a seller AI agent. The method comprises the following steps: obtaining authorization data of a seller, and generating private data according to the authorization data and a preset encryption dictionary; if the private data is verified successfully, generating copyrighted proxy data for the private data; if the proxy data is verified successfully, determining a data description of the seller according to the proxy data; matching the buyer's requirement description against the data description, and determining a target data description related to the requirement description; and generating a smart contract based on the target data description, and executing the transaction according to the smart contract. By adopting the technical scheme of the embodiment of the invention, data privacy can be protected while confirming that trusted data is not counterfeit.
Drawings
Fig. 1 is a schematic flowchart of a data privacy protection method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an application scenario in the data privacy protection method according to the embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a data privacy protection apparatus according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a hardware entity structure of the data privacy protecting apparatus in the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the specific technical solutions of the present invention are described in further detail below with reference to the accompanying drawings. The following examples are intended to illustrate the invention but not to limit its scope.
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a schematic flowchart of a data privacy protection method according to an embodiment of the present invention. As shown in Fig. 1, the method is applied to a trusted artificial intelligence (AI) system; the trusted AI system comprises at least a buyer AI agent and a seller AI agent; the method comprises the following steps:
s101: and obtaining authorization data of the seller, and generating privacy data according to the authorization data and a preset encryption dictionary.
It should be noted that the trusted AI system comprises at least a buyer AI agent and a seller AI agent. The buyer AI agent can be understood as the agent of the data demander, called the buyer's agent for short; the seller AI agent can be understood as the agent of the data operator, called the seller's agent for short.
The authorization data of the seller may be obtained through a trusted machine-agent node selected by the seller. The authorization data may be determined according to the actual situation and is not limited here; as an example, it may include the authorized-use data, the number of times the data is authorized for use, and the time at which the data is authorized for use.
Generating private data according to the authorization data and a preset encryption dictionary may mean encrypting the authorization data with the preset encryption dictionary to generate the private data. The preset encryption dictionary may be determined according to the actual situation and is not limited here; it may be a personalized encryption dictionary associated with a personalized field provided by the agent. As an example, the encryption may determine the type of the authorization data, obtain a personalized field related to that type from the preset encryption dictionary, and encrypt the authorization data according to the personalized field to generate the private data.
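The encryption step above can be sketched as follows. This is a minimal illustration only: the dictionary layout, the field values, and the XOR keystream cipher are hypothetical stand-ins, since the patent does not specify a concrete cipher.

```python
import hashlib

# Hypothetical preset encryption dictionary: maps a data type to the
# personalized field used to derive the encryption keystream.
PRESET_ENCRYPTION_DICTIONARY = {
    "medical": "11110001",
    "shopping": "10101100",
}

def generate_private_data(authorization_data: bytes, data_type: str) -> bytes:
    """Encrypt authorization data with the type-specific personalized field."""
    field = PRESET_ENCRYPTION_DICTIONARY[data_type]      # field lookup by type
    keystream = hashlib.sha256(field.encode()).digest()  # keystream derived from field
    return bytes(b ^ keystream[i % len(keystream)]       # simple XOR stand-in cipher
                 for i, b in enumerate(authorization_data))

private = generate_private_data(b"authorized-use:3", "medical")
# Applying the same keystream again restores the original authorization data.
restored = generate_private_data(private, "medical")
```

Because XOR is its own inverse, decryption here is just a second application of the same function; a production system would use an authenticated cipher instead.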
S102: if the private data is verified successfully, generate copyrighted proxy data for the private data.
It should be noted that "the private data is verified successfully" means that verifying the private data yields a successful result; the verification may be a personalized-field verification of the private data. As an example, the private data is verified successfully when the personalized field of the private data is decrypted successfully.
Generating copyrighted proxy data for the private data can be understood as generating proxy data with a unique identity. As an example, this may be done by watermarking the private data to obtain watermarked private data, and then encrypting the watermarked private data with the public key of the seller AI agent to generate the proxy data.
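A sketch of the watermark-then-encrypt step, under stated assumptions: the watermark format and the XOR "encryption" are illustrative placeholders (a real system would use an actual public-key scheme such as RSA or an ECIES hybrid), and `seller_id`/`public_key` are hypothetical parameters not named in the patent.

```python
import hashlib

def watermark(private_data: bytes, seller_id: str) -> bytes:
    """Append a seller-bound watermark tag to the private data (illustrative)."""
    tag = hashlib.sha256(seller_id.encode() + private_data).hexdigest()[:16]
    return private_data + b"|WM:" + tag.encode()

def encrypt_with_public_key(data: bytes, public_key: bytes) -> bytes:
    """Stand-in for the seller AI agent's public-key encryption."""
    keystream = hashlib.sha256(public_key).digest()
    return bytes(b ^ keystream[i % 32] for i, b in enumerate(data))

def generate_proxy_data(private_data: bytes, seller_id: str,
                        public_key: bytes) -> bytes:
    """Watermark first, then encrypt: the copyrighted proxy data."""
    return encrypt_with_public_key(watermark(private_data, seller_id), public_key)

proxy = generate_proxy_data(b"private-bytes", "seller-42", b"pubkey")
# With the XOR stand-in, applying the function again recovers the
# watermarked plaintext, which a verifier could then check.
recovered = encrypt_with_public_key(proxy, b"pubkey")
```

The ordering matters: the watermark is embedded before encryption, so it can only be checked by a party able to decrypt, which is what ties the later watermark validity verification to the seller AI agent's key.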
S103: if the proxy data is verified successfully, determine the data description of the seller according to the proxy data.
It should be noted that "the proxy data is verified successfully" means that judging whether the proxy data is verified yields a successful result; the judgment may consist of private-key decryption by the seller AI agent and watermark validity verification of the proxy data. As an example, the proxy data is verified successfully when both the private-key decryption of the seller AI agent and the watermark validity verification succeed.
Determining the data description of the seller from the proxy data may mean determining it from the attributes of the proxy data. The attributes may be determined according to the actual situation and are not limited here; as an example, they may include the scenario in which the data was generated, the data format, the number of data items, a supplementary description, and the like. The generation scenario may be medical, educational, shopping, transportation, lodging, dining, entertainment, or custom; the data format may be numbers, text, pictures, audio, video, or custom. Determining the data description from the attributes may mean summarizing the seller's data description from the attributes of the proxy data.
S104: match the buyer's requirement description against the data description, and determine a target data description related to the requirement description.
It should be noted that this matching may compute the similarity between the buyer's requirement description and each data description, and then determine the target data description according to the similarity. In practical applications, the data description with the highest similarity may be taken as the target data description.
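The highest-similarity selection can be illustrated with a token-set (Jaccard) similarity; this is one plausible measure chosen for the sketch, as the patent does not specify which similarity is used.

```python
def jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two descriptions."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def match_target_description(requirement: str, data_descriptions: list) -> str:
    """Return the data description most similar to the buyer's requirement."""
    return max(data_descriptions, key=lambda d: jaccard(requirement, d))

target = match_target_description(
    "medical image data 10000 records",
    ["shopping logs 500 records", "medical image data 8000 records"],
)
```

Here the second description shares four of six distinct tokens with the requirement and is selected as the target data description.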
S105: generate a smart contract based on the target data description, and execute the transaction according to the smart contract.
It should be noted that generating a smart contract based on the target data description may mean generating an initial smart contract and feeding it back to the buyer AI agent and the seller AI agent for verification and supplementation, respectively; the verified and supplemented initial smart contract is then taken as the smart contract.
Executing the transaction according to the smart contract may mean completing the transaction between buyer and seller according to the smart contract.
In practical applications, the trusted AI system further comprises a data kettle. In the data kettle, each trusted AI combines with the underlying blockchain to realize data right confirmation and data value evaluation. The seller's agent must obtain authorization for the personal data, be able to provide a verifiable data description, have the data verified by the other agents in the data kettle, sell the data to a suitable buyer, agree on the price of the data (data pricing), and share the income with the owners of the relevant data. The buyer's agent must obtain the buyer's purchase requirements and agency authorization, understand the requirements, find potentially suitable data targets through the data descriptions published by sellers in the data kettle, bargain with the seller's agent, and feed information back to the buyer to complete the transaction. For ease of understanding, refer to Fig. 2, which is a schematic diagram of an application scenario in the data privacy protection method according to the embodiment of the present invention.
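The contract-execution step, including the rule elsewhere in the scheme that a successful transaction reduces the number of authorized uses by one, might be sketched as follows; the `SmartContract` class and its field names are hypothetical, not the patent's data model.

```python
from dataclasses import dataclass

@dataclass
class SmartContract:
    """Illustrative contract generated from the target data description."""
    target_description: str
    price: float
    authorized_uses_left: int

    def execute(self) -> bool:
        """Complete the transaction and consume one authorized use."""
        if self.authorized_uses_left <= 0:
            return False                   # authorization exhausted, no trade
        self.authorized_uses_left -= 1     # reduce authorized uses by one
        return True

contract = SmartContract("medical image data 8000 records", 100.0, 1)
first = contract.execute()    # succeeds, consumes the single authorization
second = contract.execute()   # fails: the use count is already zero
```

Tying the use counter to contract execution is what lets the scheme bound how many times authorized data can be traded.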
According to the data privacy protection method provided by the embodiment of the invention, private data is generated according to the authorization data and a preset encryption dictionary; copyrighted proxy data is generated for the private data if it is verified successfully; the seller's data description is determined from the proxy data if the proxy data is verified successfully; the buyer's requirement description is matched against the data description to determine a related target data description; and a smart contract is generated based on the target data description and the transaction executed accordingly. Data privacy can thus be protected while confirming that trusted data is not counterfeit.
In an optional embodiment of the present invention, generating the private data according to the authorization data and a preset encryption dictionary includes: determining the type of the authorization data, obtaining a personalized field related to the type from the preset encryption dictionary, and encrypting the authorization data according to the personalized field to generate the private data.
It should be noted that determining the type of the authorization data may mean classifying the authorization data according to its characteristics, obtaining types that represent those different characteristics.
The personalized field related to the type is obtained from the preset encryption dictionary; the preset encryption dictionary may be a personalized encryption dictionary associated with the personalized field provided by the agent.
Encrypting the authorization data according to the personalized field yields the encrypted authorization data, i.e. the private data. In practical applications, each trusted AI (agent) has a set of personalized encryption dictionaries for encrypting and decrypting data. The personalized encryption dictionary is associated with the personalized field provided by the agent, and the data encrypted by the dictionary is likewise related to that field.
In an optional embodiment of the invention, the method further comprises: verifying the private data to obtain a first verification result; and freezing a seller AI agent corresponding to the seller if the first verification result indicates that the privacy data verification is unsuccessful.
It should be noted that, the private data is verified, and the first verification result obtained may be personalized field verification performed on the private data to obtain a first verification result; the personalized field verification of the private data can be personalized field decryption of the private data. As an example, personalized field verification is performed on the private data, and obtaining a first verification result may be identifying an initial personalized field in the private data; obtaining a complete personalized field from the preset encryption dictionary; judging whether the initial personalized field can form an interception relation in the complete personalized field; determining that the first verification result is that the privacy data is successfully verified under the condition that the initial personalized field can form an interception relation in the complete personalized field; and under the condition that the initial personalized field cannot form an interception relation in the complete personalized field, determining that the first verification result is that the private data verification is unsuccessful.
In a case that the first verification result indicates that the privacy data verification is unsuccessful, freezing the seller-corresponding seller AI agent may be understood as prohibiting use of the seller-corresponding seller AI agent in a case that the first verification result indicates that the privacy data decryption is unsuccessful.
In practical applications, each trusted AI agent has a set of personalized encryption dictionaries for encrypting and decrypting data. The personalized encryption dictionary is associated with the personalized field provided by the agent, and the data encrypted with the dictionary is likewise related to that field. If another party, say Alex, steals the encryption dictionary of a certain agent, it can be determined that the dictionary does not belong to Alex by comparing Alex's current encryption result with the historical encryption results of the dictionary. The method specifically comprises the following steps: the node provides the personalized field and generates an encryption dictionary or private key. When a node performs an encryption operation, in addition to verifying the dictionary or private key, it must provide the complete personalized field, or an intercepted part of it, for confirming the version of the current encrypted result. As an example of such a subfield, if the complete personalized field is 11110001, its subfield may be 10001 or 1111. The version needs to be confirmed because, when encoding with a Fibonacci number sequence for example, the same character may have multiple representations; in combination with the personalized field or subfield provided by the node, the representation in which that field appears most often is selected. When a node encrypts with a dictionary, if the system recognizes that the node's original personalized field cannot form an interception relation with the personalized fields in the dictionary's historical encrypted versions, the node is automatically frozen for further examination.
In an optional embodiment of the present invention, the verifying the privacy data to obtain a first verification result includes: identifying an initial personalized field in the private data; obtaining a complete personalized field from the preset encryption dictionary; judging whether the initial personalized field can form an interception relation in the complete personalized field; determining that the first verification result is that the privacy data is successfully verified under the condition that the initial personalized field can form an interception relation in the complete personalized field; and under the condition that the initial personalized field cannot form an interception relation in the complete personalized field, determining that the first verification result is that the private data verification is unsuccessful.
It should be noted that identifying the initial personalized field in the privacy data may be intelligently identifying the initial personalized field in the privacy data through the AI system.
Obtaining the complete personalization field from the preset encryption dictionary may be obtaining the complete personalization field related to the private data type from the preset encryption dictionary.
Judging whether the initial personalized field can form an interception relation in the complete personalized field may be judging whether the initial personalized field is a contiguous part of the complete personalized field. For ease of understanding, suppose the complete personalized field is 11110001: an initial personalized field of 10001 or 1111 can form an interception relation in 11110001, and in practical applications such an intercepted part may also be referred to as a subfield. An initial personalized field of 1010, by contrast, cannot form an interception relation in the complete personalized field 11110001.
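The interception relation described above can be sketched as a simple substring test. This is one illustrative reading of the embodiment, with hypothetical function names:

```python
def forms_interception_relation(initial_field, complete_field):
    # The initial field "intercepts" the complete field when it is a
    # contiguous part (substring) of it, e.g. "1111" or "10001" in "11110001".
    return initial_field != "" and initial_field in complete_field

def first_verification(initial_field, complete_field):
    # First verification result: success iff the interception relation holds.
    if forms_interception_relation(initial_field, complete_field):
        return "privacy data verified successfully"
    return "privacy data verification unsuccessful"
```

Under this reading, 10001 and 1111 pass against 11110001, while 1010 fails and would trigger freezing of the corresponding seller AI agent.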
Determining that the first verification result is that the private data is successfully verified in a case where the initial personalized field can form an interception relation in the complete personalized field may be understood as determining success when the initial personalized field forms a subfield of the complete personalized field.
Likewise, determining that the first verification result is that the private data verification is unsuccessful in a case where the initial personalized field cannot form an interception relation in the complete personalized field may be understood as determining failure when the initial personalized field cannot form a subfield of the complete personalized field.
In practical applications, when a node performs an encryption operation, it must provide the complete personalized field, or an intercepted part of it, for confirming the version of the current encrypted result. As an example of such a subfield, if the complete personalized field is 11110001, its subfield may be 10001 or 1111. The version needs to be confirmed because, when encoding with a Fibonacci number sequence for example, the same character may have multiple representations; in combination with the personalized field or subfield provided by the node, the representation in which that field appears most often is selected.
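A toy sketch of this version-confirmation idea follows, under the assumption that "Fibonacci encoding" means writing a value as a sum of distinct Fibonacci numbers (which indeed admits multiple bit-string representations) and that the representation containing the node's subfield most often is selected. All names are hypothetical:

```python
from itertools import combinations

def fib_numbers_upto(n):
    # distinct Fibonacci numbers 1, 2, 3, 5, 8, ... used as bit weights
    fs = [1, 2]
    while fs[-1] + fs[-2] <= n:
        fs.append(fs[-1] + fs[-2])
    return fs

def representations(n):
    # every way to write n as a sum of distinct Fibonacci numbers,
    # rendered as a bit string (largest Fibonacci number first)
    fs = fib_numbers_upto(n)
    reps = []
    for r in range(1, len(fs) + 1):
        for combo in combinations(fs, r):
            if sum(combo) == n:
                reps.append(''.join('1' if f in combo else '0'
                                    for f in reversed(fs)))
    return reps

def pick_version(n, subfield):
    # confirm the version: choose the representation in which the node's
    # personalized subfield appears most often
    return max(representations(n), key=lambda bits: bits.count(subfield))
```

For example, 11 has the representations 10100 (8+3), 10011 (8+2+1), and 01111 (5+3+2+1); given the subfield "11", the version 01111 is selected because "11" appears most often in it.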
In an optional embodiment of the present invention, the generating of copyrighted proxy data for the private data includes: carrying out watermarking processing on the private data to obtain watermarked private data; and carrying out public key encryption of the seller AI agent on the private data after the watermarking processing to generate the agent data.
It should be noted that watermarking the private data to obtain watermarked private data means embedding a watermark into the private data. The watermarking processing may be determined according to the actual situation and is not limited herein; as an example, it may be digital watermarking, such as picture watermarking or text watermarking.
Performing public key encryption of the seller AI agent on the watermarked private data to generate the agent data may be understood as encrypting the watermarked private data with the public key of the seller AI agent, so as to generate private data that is both watermarked and encrypted, namely the agent data.
In practical applications, a corresponding digital watermarking technology (picture watermarking, text watermarking, and the like) can be adopted according to the type of the authorized data, and the public key of the node can be fused in, so that proxy data is generated and the uniqueness of the same data given to different proxy nodes is guaranteed.
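A highly simplified sketch of generating proxy data may help illustrate the flow. This is only an illustration: the XOR keystream below stands in for real public-key encryption (a real system would use e.g. RSA-OAEP), and the appended hash tag stands in for a robust picture/text watermark.

```python
import hashlib

def _keystream(key, length):
    # counter-mode keystream derived from the key; a stand-in for real
    # public-key encryption (XOR like this is NOT asymmetric crypto)
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def make_proxy_data(auth_data, agent_public_key):
    # fuse the agent's key into a watermark tag, then "encrypt";
    # a real deployment would embed a robust digital watermark instead
    wm = hashlib.sha256(agent_public_key + auth_data).digest()[:8]
    watermarked = auth_data + b"|WM|" + wm
    ks = _keystream(agent_public_key, len(watermarked))
    return bytes(a ^ b for a, b in zip(watermarked, ks))

def open_proxy_data(proxy, agent_key):
    # returns (data, ok); ok is False when decryption or the watermark fails
    ks = _keystream(agent_key, len(proxy))
    plain = bytes(a ^ b for a, b in zip(proxy, ks))
    data, sep, wm = plain.rpartition(b"|WM|")
    ok = (sep == b"|WM|"
          and wm == hashlib.sha256(agent_key + data).digest()[:8])
    return data, ok
```

Because the agent's key is fused into both the watermark and the keystream, the same authorization data handed to two different agents yields distinct proxy data, which mirrors the uniqueness property claimed above.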
In an optional embodiment of the invention, the method further comprises: judging whether the proxy data is successfully verified or not, and obtaining a second verification result; and in the case that the second verification result shows that the proxy data is not verified successfully, the proxy data is not opened.
In this embodiment, determining whether the proxy data is successfully verified to obtain the second verification result may be performing private key decryption of the seller AI agent and watermark validity verification on the proxy data to obtain the second verification result.
In a case where the second verification result indicates that the proxy data is not successfully verified, not opening the proxy data may be understood as making the proxy data unusable in that case.
In practice, each agent may write the agent's validity, either by number of uses (number of successful transactions) or by time limit, into the authorization file. Each trusted machine agent node has a pair of public and private keys, and the private key is stored in the TEE module of the data kettle, so that the agent node can only open authorized data inside the data kettle. When data is opened, the validity of the agent is verified in addition to the agent's private key.
In an optional embodiment of the present invention, the determining whether the proxy data is successfully verified to obtain a second verification result includes: performing private key decryption verification of the seller AI agent on the agent data to obtain a third verification result; performing watermark validity verification on the agent data to obtain a fourth verification result; and determining the second verification result according to the third verification result and the fourth verification result.
In this embodiment, each trusted machine agent node has a pair of public and private keys, and the private key is stored in the TEE module of the data kettle.
Performing private key decryption verification of the seller AI agent on the agent data to obtain the third verification result may be understood as decrypting the agent data with the private key of the seller AI agent; the third verification result indicates either that the decryption is successful or that it is unsuccessful.
Performing watermark validity verification on the proxy data to obtain the fourth verification result may be understood as checking the validity of the watermark carried by the proxy data; the fourth verification result indicates either that the watermark validity verification is successful or that it is unsuccessful.
Determining the second verification result according to the third verification result and the fourth verification result may be understood as combining the third verification result (whether the private key decryption of the seller AI agent succeeded) with the fourth verification result (whether the watermark validity verification succeeded) to obtain the second verification result.
In an optional embodiment of the invention, the determining the second verification result according to the third verification result and the fourth verification result comprises: determining that the second verification result is the proxy data verification success under the condition that the third verification result shows that the proxy data is successfully subjected to the private key decryption verification of the seller AI agent and the fourth verification result shows that the proxy data is successfully subjected to the watermark validity verification; and determining that the second verification result is the proxy data verification unsuccessfully under the condition that the third verification result indicates that the proxy data is unsuccessfully subjected to the private key decryption verification of the seller AI proxy and/or the fourth verification result indicates that the proxy data is unsuccessfully subjected to the watermark validity verification.
In this embodiment, when the third verification result indicates that the private key decryption verification of the seller AI agent on the agent data is successful, and the fourth verification result indicates that the watermark validity verification on the agent data is successful, it may be determined that the second verification result is that the agent data is successfully verified; that is, the agent data is verified successfully only if both the private key decryption and the watermark validity verification succeed.
Conversely, when the third verification result indicates that the private key decryption verification of the seller AI agent on the agent data is unsuccessful, and/or the fourth verification result indicates that the watermark validity verification on the agent data is unsuccessful, it is determined that the second verification result is that the agent data verification is unsuccessful; that is, the agent data fails verification as soon as either the private key decryption or the watermark validity verification fails.
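The combination of the third and fourth verification results into the second verification result is a plain logical AND, which can be sketched as follows (function names are hypothetical):

```python
def second_verification(third_ok, fourth_ok):
    # success only if BOTH the private-key decryption (third result) and
    # the watermark validity check (fourth result) succeeded
    return third_ok and fourth_ok

def maybe_open(proxy_data, third_ok, fourth_ok):
    # the proxy data is not opened unless the second verification succeeds
    return proxy_data if second_verification(third_ok, fourth_ok) else None
```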
In an optional embodiment of the invention, the authorisation data comprises: authorized use data, a number of times the data is authorized for use, and a time at which the data is authorized for use; the method further comprises the following steps: in the event that the transaction is successful, reducing the number of authorized uses of the data by one.
In this embodiment, the authorized use data, the number of times of the authorized use data, and the time of the authorized use data may all be determined according to actual situations, and are not limited herein.
In the event that the transaction is successful, the number of times the data is authorized for use is reduced by one, mainly considering that the data may be authorized by number of uses: the number of available uses of the agent's data use right is reduced by 1.
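The count-based and time-based agent validity described above might be modeled as follows. This is an illustrative sketch only; the field and method names are hypothetical:

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class Authorization:
    remaining_uses: Optional[int] = None   # None: not limited by count
    expires_at: Optional[float] = None     # None: not limited by time

    def is_valid(self, now=None):
        # the agent's authorization is valid while uses remain and
        # (if time-limited) the time limit has not passed
        now = time.time() if now is None else now
        if self.remaining_uses is not None and self.remaining_uses <= 0:
            return False
        if self.expires_at is not None and now >= self.expires_at:
            return False
        return True

    def record_successful_transaction(self):
        # after a successful transaction the available use count drops by 1
        if self.remaining_uses is not None:
            self.remaining_uses -= 1
```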
According to the data privacy protection method provided by the embodiment of the invention, the data is made unique by adding a watermark to it; on the basis of this uniqueness, a plurality of copies are generated, and the data platform carries out description and transaction. In certain cases it is even possible, owing to such data watermarking technology, to allow a data trial of only once or twice, after which the user can no longer use the data.
The embodiment of the invention considers the consistency of the data inside and outside the data kettle; keeping the data description consistent so as to gain the buyer's trust; how to confirm that trusted data is not counterfeit; and destroying the data after the transaction. The data description is the core of the solution. The technical problem to be solved by the data description is how to describe a large number of data packets/blocks in both machine language and human language, how to overcome the inadequacy of words, reject false descriptions, and attract buyers. On the basis of an accurate data description, data consistency and data value can serve as the core technical means for solving the corresponding technical problems. Data destruction and data rights may also be an important direction. One solution to data copyright is data watermarking: the data is made unique by watermarking it; on the basis of this uniqueness, a plurality of copies are generated, and the data platform carries out description and transaction. In certain cases it is even possible, owing to such data watermarking technology, to allow a data trial of only once or twice, after which the user can no longer use the data.
Digital watermarking technology:
(1) technologies based on common text file format information, including technologies that utilize character or word shifting, line shifting technologies that utilize text line spacing, and technologies that utilize character characteristics (font, color, height, width, stroke width, underlining, italics, character topology, and the like);
(2) technologies based on invisible coding, including substitution-, addition-, and redundant-coding-based technologies, coding technologies in which character graphics are mutually independent, and the like; technologies based on text content include synonym replacement, syntax-based text digital watermarking, semantic-based text digital watermarking, and the like;
(3) the Chinese character structure-based technology comprises the technology of utilizing combinable characteristics of the Chinese character radicals and the distance between the Chinese character radicals;
(4) technologies based on image watermarking technology include watermark embedding technology based on various airspaces and transform domains;
(5) techniques based on special format files, such as watermark embedding techniques based on special file formats like HTML, PDF, etc.
The idea and characteristics of the scheme are as follows: a trusted machine (AI) agent is introduced, which is responsible for collecting user data (obtaining authorization for the data use right), describing and packaging the collected data, and finding the agents of potential buyers in a secure environment (the data kettle); the matching of data requirements and the execution of data are completed inside the data kettle, and only the result of the data execution is taken out of the kettle. The method comprises the following specific steps:
In a first step, a user node (the data owner) selects a trusted machine proxy node, forming proxy data carrying a watermark. According to the type of the authorized data, a corresponding digital watermarking technology (picture watermarking, text watermarking, and the like) is adopted, and the public key of the node can be fused in, guaranteeing the uniqueness of the same data given to different agent nodes. Each agent may write the agent's validity, either by number of uses (number of successful transactions) or by time limit, into the authorization file. Each trusted machine agent node has a pair of public and private keys, and the private key is stored in the TEE module of the data kettle, so that the agent node can only open authorized data inside the data kettle. When data is opened, the validity of the agent is verified in addition to the agent's private key.
In a second step, the data enters the data kettle, and each agent node starts to comb, pack, and describe the data (refer to LJSQ 2110).
In a third step, the nodes are matched in the data kettle through the data description (seller) and the execution requirement description (buyer). After the buying and selling agents verify each other and confirm the data connection, the verification and transaction of the data are completed inside the data kettle by generating an intelligent contract (if authorization is by number of uses, the number of available uses of the agent's data use right is reduced by 1).
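The matching of buyer requirement descriptions against seller data descriptions could, in the simplest reading, be sketched as a token-overlap search. This is purely illustrative; the embodiment does not prescribe a particular matching algorithm, and the function names are hypothetical:

```python
def overlap_score(requirement, description):
    # naive relevance measure: number of shared lowercase tokens
    return len(set(requirement.lower().split())
               & set(description.lower().split()))

def find_target_description(requirement, descriptions):
    # pick the seller data description best matching the buyer's
    # requirement description; None when nothing overlaps at all
    if not descriptions:
        return None
    best = max(descriptions, key=lambda d: overlap_score(requirement, d))
    return best if overlap_score(requirement, best) > 0 else None
```

A real data kettle would presumably use a richer semantic matching over the packaged data descriptions, but the flow is the same: the target data description found here is what the intelligent contract is then generated from.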
Each trusted AI agent has a set of personalized encryption dictionaries for encrypting and decrypting data. The personalized encryption dictionary is associated with the personalized field provided by the agent, and the data encrypted with the dictionary is likewise related to that field. If another party, say Alex, steals the encryption dictionary of a certain agent, it can be determined that the dictionary does not belong to Alex by comparing Alex's current encryption result with the historical encryption results of the dictionary. The method comprises the following specific steps:
In a first step, the node provides a personalized field and generates an encryption dictionary or private key.
In a second step, when a node performs an encryption operation, in addition to verifying the dictionary or private key, it provides the complete personalized field, or an intercepted part of it, for confirming the version of the current encrypted result. As an example of such a subfield, if the complete personalized field is 11110001, its subfield may be 10001 or 1111. The version needs to be confirmed because, when encoding with a Fibonacci number sequence for example, the same character may have multiple representations; in combination with the personalized field or subfield provided by the node, the representation in which that field appears most often is selected.
In a third step, when a node encrypts with a dictionary, if the system identifies that the node's initial personalized field cannot form an interception relation with the personalized fields in the dictionary's historical encrypted versions, the node is automatically frozen for further examination.
It should be noted that the terms appearing herein have been described in detail above, and are not repeated herein.
Fig. 3 is a schematic structural diagram of a data privacy protection apparatus according to an embodiment of the present invention. As shown in fig. 3, the apparatus 200 is applied to a trusted artificial intelligence (AI) system; the trusted AI system comprises at least a buyer AI agent and a seller AI agent. The apparatus comprises: an obtaining unit 201, a generating unit 202, a determining unit 203 and a transaction unit 204, wherein the obtaining unit 201 is used for obtaining authorization data of a seller and generating private data according to the authorization data and a preset encryption dictionary;
the generating unit 202 is configured to generate proxy data with copyright for the private data if the private data is successfully verified;
the determining unit 203 is configured to determine a data description of the seller according to the proxy data if the proxy data is successfully verified; matching the requirement description of the buyer with the data description, and determining a target data description related to the requirement description in the data description;
the trading unit 204 is configured to generate an intelligent contract based on the target data description, and trade according to the intelligent contract.
In other embodiments, the obtaining unit 201 is further configured to determine a type of the authorization data, and obtain a personalized field related to the type from the preset encrypted dictionary based on the type; and encrypting the authorization data according to the personalized field to generate the privacy data.
In other embodiments, the apparatus 200 further includes a verification unit, configured to verify the private data to obtain a first verification result; and freezing a seller AI agent corresponding to the seller if the first verification result indicates that the privacy data verification is unsuccessful.
In other embodiments, the verification unit is further configured to identify an initial personalized field in the privacy data; obtaining a complete personalized field from the preset encryption dictionary; judging whether the initial personalized field can form an interception relation in the complete personalized field; determining that the first verification result is that the privacy data is successfully verified under the condition that the initial personalized field can form an interception relation in the complete personalized field; and under the condition that the initial personalized field cannot form an interception relation in the complete personalized field, determining that the first verification result is that the private data verification is unsuccessful.
In other embodiments, the generating unit 202 is further configured to perform watermarking on the private data to obtain watermarked private data; and carrying out public key encryption of the seller AI agent on the private data after the watermarking processing to generate the agent data.
In other embodiments, the verifying unit is further configured to determine whether the proxy data is successfully verified, and obtain a second verification result; and in the case that the second verification result shows that the proxy data is not verified successfully, the proxy data is not opened.
In other embodiments, the verification unit is further configured to perform decryption verification on a private key of the seller AI agent on the agent data to obtain a third verification result; carrying out watermark validity verification on the proxy data to obtain a fourth verification result; and determining the second verification result according to the third verification result and the fourth verification result.
In other embodiments, the verification unit is further configured to determine that the second verification result is that the proxy data is successfully verified, if the third verification result indicates that the private key decryption verification of the vendor AI agent by the proxy data is successfully performed, and the fourth verification result indicates that the watermark validity verification by the proxy data is successfully performed; and determining that the second verification result is the proxy data verification unsuccessfully under the condition that the third verification result indicates that the proxy data is unsuccessfully subjected to the private key decryption verification of the seller AI proxy and/or the fourth verification result indicates that the proxy data is unsuccessfully subjected to the watermark validity verification.
In other embodiments, the authorization data includes: authorized use data, a number of times the data is authorized for use, and a time at which the data is authorized for use; the transaction unit is further configured to reduce the number of times of the authorized use data by one if the transaction is successful.
The above description of the apparatus embodiments, similar to the above description of the method embodiments, has similar beneficial effects as the method embodiments. For technical details not disclosed in the embodiments of the apparatus according to the invention, reference is made to the description of the embodiments of the method according to the invention for understanding.
It should be noted that, in the embodiment of the present invention, if the data privacy protection method is implemented in the form of a software functional module and is sold or used as a standalone product, the data privacy protection method may also be stored in a computer-readable storage medium. With this understanding, technical embodiments of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a control server (which may be a personal computer, a server, or a network server) to perform all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the invention are not limited to any specific combination of hardware and software.
Correspondingly, an embodiment of the present invention provides a data privacy protecting apparatus, including a memory and a processor, where the memory stores a computer program that is executable on the processor, and the processor executes the computer program to implement the steps in the control method provided in the foregoing embodiment.
Correspondingly, the embodiment of the invention provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps in the control method provided by the above-mentioned embodiment.
Here, it should be noted that: the above description of the storage medium and device embodiments is similar to the description of the method embodiments above, with similar advantageous effects as the method embodiments. For technical details not disclosed in the embodiments of the storage medium and the apparatus according to the invention, reference is made to the description of the embodiments of the method according to the invention.
It should be noted that fig. 4 is a schematic structural diagram of a hardware entity of a data privacy protecting apparatus in an embodiment of the present invention, and as shown in fig. 4, the hardware entity of the data privacy protecting apparatus 300 includes: a processor 301 and a memory 303, optionally, the data privacy protecting apparatus 300 may further comprise a communication interface 302.
It will be appreciated that the memory 303 can be either volatile memory or nonvolatile memory, and can include both volatile and nonvolatile memory. Among them, the nonvolatile Memory may be a Read Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disk, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface storage may be disk storage or tape storage. Volatile Memory can be Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory 303 described in connection with the embodiments of the invention is intended to comprise, without being limited to, these and any other suitable types of memory.
The method disclosed in the above embodiments of the present invention may be applied to the processor 301, or implemented by the processor 301. The processor 301 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or instructions in the form of software in the processor 301. The Processor 301 may be a general purpose Processor, a Digital Signal Processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. Processor 301 may implement or perform the methods, steps, and logic blocks disclosed in embodiments of the present invention. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the method disclosed by the embodiment of the invention can be directly implemented by a hardware decoding processor, or can be implemented by combining hardware and software modules in the decoding processor. The software modules may be located in a storage medium located in the memory 303, and the processor 301 reads the information in the memory 303 and performs the steps of the aforementioned methods in conjunction with its hardware.
In an exemplary embodiment, the Device may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field-Programmable Gate arrays (FPGAs), general purpose processors, controllers, Micro Controllers (MCUs), microprocessors (microprocessors), or other electronic components for performing the foregoing methods.
In the embodiments provided in the present invention, it should be understood that the disclosed method and apparatus can be implemented in other ways. The above-described device embodiments are merely illustrative; for example, the division of the units is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted or not performed. In addition, the communication connections between the components shown or discussed may be through interfaces, indirect couplings, or communication connections of devices or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the above method embodiments may be performed by hardware related to program instructions; the program may be stored in a computer-readable storage medium, and when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a removable storage device, a Read-Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated unit of the embodiments of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a standalone product. Based on this understanding, the technical solutions of the embodiments of the present invention may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a removable storage device, a ROM, a magnetic disk, or an optical disk.
The data privacy protection method, apparatus, device, and storage medium described in the embodiments of the present invention are merely examples, and the embodiments of the present invention are not limited thereto; any data privacy protection method, apparatus, device, or storage medium consistent with the embodiments falls within the protection scope of the present invention.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in various embodiments of the present invention, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention. The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The methods disclosed in the several method embodiments provided by the present invention can be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in several of the product embodiments provided by the invention may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the several method or apparatus embodiments provided by the present invention may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above description is only an embodiment of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present invention, and all such changes or substitutions are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (11)

1. A data privacy protection method, characterized in that the method is applied to a trusted artificial intelligence (AI) system; the trusted AI system comprises at least a buyer AI agent and a seller AI agent; the method comprises the following steps:
obtaining authorization data of a seller, and generating privacy data according to the authorization data and a preset encryption dictionary;
under the condition that the private data is successfully verified, generating proxy data with copyright for the private data;
determining a data description of the seller according to the proxy data under the condition that the proxy data is successfully verified;
matching the requirement description of the buyer with the data description, and determining a target data description related to the requirement description in the data description;
generating an intelligent contract based on the target data description, and conducting transaction according to the intelligent contract;
wherein the method further comprises:
identifying an initial personalized field in the private data;
obtaining a complete personalized field from the preset encryption dictionary;
judging whether the initial personalized field can form an interception relation in the complete personalized field;
determining a first verification result indicating that the privacy data is successfully verified under the condition that the initial personalized field can form an interception relation in the complete personalized field;
determining a first verification result indicating that the privacy data verification is unsuccessful in the case that the initial personalized field fails to form an interception relation in the complete personalized field.
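The first verification in claim 1 hinges on whether the initial personalized field "forms an interception relation" in a complete field from the preset encryption dictionary. Reading the interception relation as a contiguous-substring test, a minimal Python sketch might look as follows (the function name, field names, and sample values are illustrative, not from the patent):

```python
def first_verification(initial_field: str, complete_fields: list[str]) -> bool:
    """Return the first verification result: True (privacy data verified
    successfully) if the initial personalized field appears as a
    contiguous segment of any complete personalized field obtained from
    the preset encryption dictionary; otherwise False."""
    return any(initial_field in complete for complete in complete_fields)
```

Under this reading, an unsuccessful first verification would trigger the freezing of the corresponding seller AI agent described in claim 3.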
2. The method according to claim 1, wherein generating privacy data from the authorization data and a preset encryption dictionary comprises:
determining the type of the authorization data, and obtaining personalized fields related to the type from the preset encryption dictionary based on the type;
and encrypting the authorization data according to the personalized field to generate the privacy data.
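Claim 2 derives the privacy data by first selecting the personalized fields that match the type of the authorization data and then encrypting with them. The patent does not name a cipher, so the sketch below stands in a hash-derived keyed XOR for the unspecified encryption step; the dictionary contents and all names are hypothetical:

```python
import hashlib

# Hypothetical preset encryption dictionary: data type -> personalized fields
PRESET_ENCRYPTION_DICTIONARY = {
    "medical": ["patient_id", "diagnosis_code"],
    "retail": ["order_no", "customer_name"],
}

def generate_privacy_data(authorization_data: bytes, data_type: str) -> bytes:
    """Obtain the personalized fields related to the data type, derive a
    key from them, and encrypt the authorization data (keyed XOR is a
    placeholder for the patent's unspecified encryption scheme)."""
    fields = PRESET_ENCRYPTION_DICTIONARY[data_type]
    key = hashlib.sha256("|".join(fields).encode()).digest()
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(authorization_data))
```

Because keyed XOR is an involution, applying the function twice with the same data type recovers the original bytes, which makes the placeholder easy to sanity-check.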
3. The method of claim 2, further comprising:
and freezing a seller AI agent corresponding to the seller if the first verification result indicates that the privacy data verification is unsuccessful.
4. The method of claim 1, wherein generating copyrighted proxy data for the private data comprises:
carrying out watermarking processing on the private data to obtain watermarked private data;
and carrying out public key encryption of the seller AI agent on the private data after the watermarking processing to generate the agent data.
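Claim 4 produces the copyrighted proxy data in two steps: watermark the private data, then encrypt the result with the seller AI agent's public key. A hedged sketch, appending a marker-delimited watermark and again using a hash-derived keyed XOR as a stand-in for a real asymmetric scheme such as RSA-OAEP (the `::WM::` delimiter and all names are illustrative):

```python
import hashlib

WATERMARK_DELIMITER = b"::WM::"  # illustrative delimiter, not from the patent

def generate_proxy_data(private_data: bytes, watermark: bytes,
                        seller_public_key: bytes) -> bytes:
    """Watermark the private data, then encrypt it for the seller AI
    agent (keyed-XOR placeholder for real public-key encryption)."""
    watermarked = private_data + WATERMARK_DELIMITER + watermark
    key = hashlib.sha256(seller_public_key).digest()
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(watermarked))

def watermark_is_valid(decrypted: bytes, watermark: bytes) -> bool:
    """Fourth verification (claim 6): check that the expected watermark
    is still present in the decrypted proxy data."""
    return decrypted.endswith(WATERMARK_DELIMITER + watermark)
```

In this toy scheme decryption is the same XOR with the same key; a production system would instead use the seller AI agent's private key, as claim 6's third verification requires.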
5. The method of claim 4, further comprising:
judging whether the proxy data is successfully verified or not, and obtaining a second verification result;
and in the case that the second verification result shows that the proxy data is not verified successfully, the proxy data is not opened.
6. The method of claim 5, wherein the determining whether the proxy data is successfully verified to obtain a second verification result comprises:
carrying out private key decryption verification of the seller AI agent on the proxy data to obtain a third verification result;
carrying out watermark validity verification on the proxy data to obtain a fourth verification result;
and determining the second verification result according to the third verification result and the fourth verification result.
7. The method of claim 6, wherein determining the second verification result from the third verification result and the fourth verification result comprises:
determining that the second verification result indicates that the proxy data verification is successful in the case that the third verification result indicates that the private key decryption verification of the seller AI agent on the proxy data is successful and the fourth verification result indicates that the watermark validity verification on the proxy data is successful;
and determining that the second verification result indicates that the proxy data verification is unsuccessful in the case that the third verification result indicates that the private key decryption verification of the seller AI agent on the proxy data is unsuccessful and/or the fourth verification result indicates that the watermark validity verification on the proxy data is unsuccessful.
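Claims 6 and 7 combine the third result (private-key decryption verification) and the fourth result (watermark validity verification) into the second result by plain conjunction: the proxy data passes only if both sub-verifications pass. As a sketch:

```python
def second_verification(third_result: bool, fourth_result: bool) -> bool:
    """Claim 7: the second verification result indicates success only
    when both the private-key decryption verification (third result)
    and the watermark validity verification (fourth result) succeed;
    any single failure makes the proxy-data verification unsuccessful."""
    return third_result and fourth_result
```

Per claim 5, a failing second result means the proxy data is not opened, so it never reaches the data-description matching step.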
8. The method of claim 1, wherein the authorization data comprises: authorized use data, a number of times the data is authorized for use, and a time at which the data is authorized for use; the method further comprises the following steps:
in the event that the transaction is successful, reducing the number of authorized uses of the data by one.
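Claim 8 models the authorization data as three components, with the use count decremented after each successful transaction. A minimal sketch (the class layout and field names are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class AuthorizationData:
    """Claim 8's three components: the authorized-use data itself, the
    number of times it may be used, and the time it may be used."""
    authorized_use_data: bytes
    authorized_use_count: int
    authorized_use_time: str  # e.g. an ISO-8601 deadline

def settle_transaction(auth: AuthorizationData, transaction_succeeded: bool) -> None:
    """On a successful smart-contract transaction, reduce the number of
    authorized uses of the data by one; failed transactions leave the
    count unchanged."""
    if transaction_succeeded and auth.authorized_use_count > 0:
        auth.authorized_use_count -= 1
```

Exhausting the counter would naturally gate further matching of the seller's data description against buyer demand.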
9. A data privacy protection device, characterized by being applied to a trusted artificial intelligence (AI) system; the trusted AI system comprises at least a buyer AI agent and a seller AI agent; the device comprises: an obtaining unit, a generating unit, a determining unit and a transaction unit, wherein,
the obtaining unit is used for obtaining the authorization data of the seller; generating private data according to the authorization data and a preset encryption dictionary;
the generating unit is used for generating proxy data with copyright for the private data under the condition that the private data is successfully verified;
the determining unit is used for determining the data description of the seller according to the proxy data under the condition that the proxy data is successfully verified; matching the requirement description of the buyer with the data description, and determining a target data description related to the requirement description in the data description;
the transaction unit is used for generating an intelligent contract based on the target data description and conducting transaction according to the intelligent contract;
wherein the apparatus further comprises a verification unit for identifying an initial personalized field in the private data; obtaining a complete personalized field from the preset encryption dictionary; judging whether the initial personalized field can form an interception relation in the complete personalized field; determining a first verification result indicating that the privacy data is successfully verified in the case that the initial personalized field can form an interception relation in the complete personalized field; and determining a first verification result indicating that the privacy data verification is unsuccessful in the case that the initial personalized field fails to form an interception relation in the complete personalized field.
10. A storage medium, characterized in that the storage medium has stored thereon a computer program; the computer program when executed by a processor implements the steps of the method of any one of claims 1 to 8.
11. A data privacy protecting apparatus characterized by comprising: a processor and a memory for storing a computer program operable on the processor, wherein the processor is operable to perform the steps of the method of any of claims 1 to 8 when the computer program is executed.
CN202111383643.5A 2021-11-22 2021-11-22 Data privacy protection method, device, equipment and storage medium Active CN113821828B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111383643.5A CN113821828B (en) 2021-11-22 2021-11-22 Data privacy protection method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113821828A (en) 2021-12-21
CN113821828B (en) 2022-02-08

Family

ID=78918020

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111383643.5A Active CN113821828B (en) 2021-11-22 2021-11-22 Data privacy protection method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113821828B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200733686A (en) * 2005-12-06 2007-09-01 Boncle Inc Asynchronous encryption for secured electronic communications
CN105959115A (en) * 2016-07-19 2016-09-21 贵州大学 Multi-party fault-tolerant authorization oriented public verifiable big data transaction method
CN109816323A (en) * 2018-12-12 2019-05-28 上海点融信息科技有限责任公司 Transaction auditing method, calculating equipment, storage medium for block chain network
CN110569675A (en) * 2019-09-18 2019-12-13 上海海事大学 Multi-Agent transaction information protection method based on block chain technology
CN111125756A (en) * 2019-12-13 2020-05-08 江苏通付盾数字化技术有限公司 Data right-confirming system and method for zero trust and protecting data privacy
CN111178880A (en) * 2019-12-07 2020-05-19 江苏通付盾数字化技术有限公司 Secure data circulation method for zero trust and protecting data privacy
CN111357026A (en) * 2020-02-03 2020-06-30 支付宝(杭州)信息技术有限公司 Credible insurance letter based on block chain
CN111414434A (en) * 2020-05-20 2020-07-14 毕红伟 Block chain-based data transaction management network, transaction device and storage medium
CN112085484A (en) * 2020-07-20 2020-12-15 西安电子科技大学 Digital content distributed transaction method, system, storage medium and computer equipment
CN112801827A (en) * 2020-10-29 2021-05-14 西安纸贵互联网科技有限公司 Intellectual property management system based on block chain
CN113409144A (en) * 2021-06-18 2021-09-17 东北大学 Block chain data transaction method with privacy protection

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11488158B2 (en) * 2018-09-05 2022-11-01 Atrium Separate Ip Holdings Number 4, Llc Blockchain architecture, system, method and device for automated cybersecurity and data privacy law compliance with proprietary off-chain storage mechanism
WO2020076234A1 (en) * 2018-10-12 2020-04-16 Aioz Pte Ltd Apparatus and method for controlling data access

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on a digital intellectual property protection scheme based on blockchain technology; Ning Mengyue et al.; Information Studies: Theory & Application; 2017-12-31 (No. 07); full text *
Construction of a cross-border e-commerce platform system based on blockchain technology; Jiao Liang; Journal of Commercial Economics; 2020-09-10 (No. 17); full text *

Also Published As

Publication number Publication date
CN113821828A (en) 2021-12-21

Similar Documents

Publication Publication Date Title
US9830600B2 (en) Systems, methods and devices for trusted transactions
CN111226249B (en) Trusted platform based on blockchain
CN111213139B (en) Blockchain-based paperless document processing
CN111108522B (en) Block chain based citation delivery
US7177845B2 (en) Copy detection for digitally-formatted works
US20020095579A1 (en) Digital data authentication method
KR20130129478A (en) Method for securely drawing up a virtual multiparty contract capable of being physically represented
KR20210037274A (en) Apparatus and method for managing contents
CN111460398A (en) Watermark adding method, device, equipment and storage medium
KR102319006B1 (en) First copyright holder authentication system using blockchain and its method
WO2001043026A1 (en) Systems, methods and devices for trusted transactions
CN113821828B (en) Data privacy protection method, device, equipment and storage medium
KR20090001575A (en) Digital contents maker identification system and thereof
JP3184869B2 (en) Web page authenticity confirmation system
CN114331730A (en) Information processing method, device and storage medium
WO2016120627A1 (en) Monitoring goods offered for sale online
US20230368186A1 (en) Process for Creation storage retrieval of immutable NFT Non-fungible token based electronic book publishing on a decentralized proof ofstake blockchain
KR20200139365A (en) Activation server and method
KR102550994B1 (en) Method and system for temporal leasing digital contents by use of NFT
EP4366228A1 (en) Generation and validation of trusted non-fungible tokens
Gao et al. Blockchain-based PDF File Copyright Protection and Tracing
KR20240050748A (en) Payment authentication method through double encryption and payment server performing the same method
KR20240012014A (en) System for protection of trade content by verifying transaction sender
KR20040027649A (en) The electronic management system of ledger based on the biometrics data for issuing the documents
WO2024059758A1 (en) Systems and methods for token-based asset ownership

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231215

Address after: 430014 Building 2, Guannan Industrial Park, Donghu New Technology Development Zone, Wuhan City, Hubei Province

Patentee after: WUHAN LONGJIN SCIENCE AND TECHNOLOGY Inc.

Patentee after: WUHAN University

Address before: 430014 Building 2, Guannan Industrial Park, Donghu New Technology Development Zone, Wuhan City, Hubei Province

Patentee before: WUHAN LONGJIN SCIENCE AND TECHNOLOGY Inc.