CN113742770A - Cross-modal privacy protection oriented AI (Artificial Intelligence) management method and device - Google Patents


Info

Publication number: CN113742770A (application CN202110908765.5A; granted as CN113742770B)
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 段玉聪 (Duan Yucong), 雷羽潇 (Lei Yuxiao)
Assignee (original and current): Hainan University
Legal status: Active (granted)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/901: Indexing; Data structures therefor; Storage structures
    • G06F16/9024: Graphs; Linked lists
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60: Protecting data
    • G06F21/62: Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218: Protecting access to data to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G06F21/6254: Protecting personal data by anonymising data, e.g. decorrelating personal data from the owner's identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Bioethics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Data Mining & Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application discloses an AI governance method and device oriented to cross-modal privacy protection. Privacy resources are extracted from the interactive behavior information of a user and modeled to obtain a DIKW map. According to the identity and intention of a visitor, the informed privacy resource group of the visitor is queried from the DIKW map. An anonymity protection mechanism is started to encrypt the target resources; a risk evaluation mechanism is started to calculate the risk value of each privacy resource in the privacy resource group; and privacy resources whose risk values are greater than a preset risk threshold are deleted from the group. A supervision mechanism is then started to detect whether the privacy resources contain logic errors, and if no logic error exists, the privacy resource group is transmitted to the visitor. Compared with the prior art, the scheme of the application uses an AI system in place of manual work, avoiding human participation in the encryption protection of privacy resources, and can therefore effectively improve the processing efficiency of privacy resource protection.

Description

Cross-modal privacy protection oriented AI (Artificial Intelligence) management method and device
Technical Field
The application relates to the field of artificial intelligence, and in particular to an AI governance method and device oriented to cross-modal privacy protection.
Background
With the popularization of various virtual communities on the Internet, Internet users can communicate through virtual communities. However, user interaction in a virtual community produces privacy resources of various digital types, including virtual traces (T_virtual) and user-generated content (UGC) arising from interactive behavior. Virtual traces can reflect a user's character and behavioral habits and can be used to observe, record and analyze the user's behavior in the virtual community; once virtual traces accumulate to a certain order of magnitude, they can even produce an agenda-setting effect and steer online public opinion. In addition, part of the user-generated content also has a high utilization value. To ensure that privacy resources are not stolen or used maliciously, they need to be protected.
At present, privacy resources are mostly protected by manual decision-making: for example, an administrator of the virtual community manually performs encryption protection operations on the privacy resources. This is inefficient and prone to careless mistakes.
Disclosure of Invention
The application provides an AI management method and device for cross-modal privacy protection, and aims to improve the processing efficiency of privacy resource protection.
In order to achieve the above object, the present application provides the following technical solutions:
An AI governance method oriented to cross-modal privacy protection, applied to an AI system, comprising the following steps (an illustrative sketch follows the list):
extracting privacy resources from the interactive behavior information of the user in the virtual community;
modeling the privacy resources to obtain a DIKW map;
analyzing an access request of a visitor to obtain the identity and intention of the visitor and a privacy resource group required by the visitor;
querying the informed privacy resource group of the visitor from the DIKW map according to the identity and the intention of the visitor;
starting a preset anonymity protection mechanism to encrypt the target resource under the condition that the informed privacy resource group covers the privacy resource group and the privacy resource group contains a target resource; the target resource is a privacy resource containing preset sensitive content;
starting a preset risk evaluation mechanism, and calculating the risk value of each privacy resource in the privacy resource group;
deleting the privacy resources with the risk values larger than a preset risk threshold value from the privacy resource groups;
starting a preset supervision mechanism, and detecting whether a logic error exists in the privacy resources in the privacy resource group;
transmitting the set of privacy resources to the visitor in the absence of a logical error for a privacy resource in the set of privacy resources.
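To make the flow above concrete, here is a minimal, self-contained Python sketch of the pipeline; every name (PrivacyResource, govern, the 0.7 risk threshold) is an illustrative assumption rather than the disclosed implementation:

```python
from dataclasses import dataclass

RISK_THRESHOLD = 0.7  # assumed preset risk threshold (illustrative)

@dataclass(eq=False)  # identity-based equality keeps instances hashable
class PrivacyResource:
    content: str
    sensitive: bool = False    # contains preset sensitive content
    risk: float = 0.0          # risk value from the risk evaluation mechanism
    logic_error: bool = False  # would be flagged by the supervision mechanism
    encrypted: bool = False

def govern(requested: list, informed: list):
    """Anonymity protection -> risk evaluation -> supervision -> transmission."""
    # The visitor's informed privacy resource group must cover the request.
    if not all(r in informed for r in requested):
        return None
    for r in requested:                        # anonymity protection mechanism
        if r.sensitive:
            r.encrypted = True                 # stand-in for real encryption
    survivors = [r for r in requested if r.risk <= RISK_THRESHOLD]
    if any(r.logic_error for r in survivors):  # supervision mechanism
        return None
    return survivors                           # transmitted to the visitor

# Example: the high-risk resource is deleted before transmission.
a = PrivacyResource("birth date", sensitive=True, risk=0.2)
b = PrivacyResource("home address", risk=0.9)
print([r.content for r in govern([a, b], [a, b])])  # ['birth date']
```

The ordering mirrors the steps above: coverage check, anonymity protection, risk-based deletion, then supervision before transmission.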
Optionally, the method further includes:
presetting the privacy rights of all participants in the circulation process of the privacy resources;
wherein the circulation process comprises a perception process, a storage process, a transmission process and a processing process;
the perception process is as follows: the AI system extracts the privacy resources from the interactive behavior information of the user;
the storage process is as follows: the AI system converts the extracted privacy resources, models the converted privacy resources to obtain the DIKW map, and stores the DIKW map;
the transmission process comprises the following steps: a process in which the AI system transmits the set of privacy resources to the visitor;
the processing process is: a process in which the visitor uses the privacy resource group;
the participants include the user, the visitor, and the AI system;
the privacy right comprises an informed right, a participation right, a forgetting right and a supervision right.
Optionally, the modeling of the privacy resources to obtain a DIKW map includes:
calculating the user's degree of retention of the privacy resources based on the interactive behavior information;
filtering out the privacy resources whose retention degree value is greater than a preset retention threshold;
converting the remaining privacy resources to obtain new privacy resources;
and modeling the new privacy resources to obtain a DIKW map, and storing the DIKW map.
Optionally, the converting of the remaining privacy resources to obtain new privacy resources includes:
deriving a new privacy resource set from the remaining privacy resources, the new privacy resource set comprising a plurality of new privacy resources;
deriving the new privacy resource set from a privacy resource set consisting of the remaining privacy resources;
regarding the remaining privacy resources as entities, and acquiring the technology and resources of the entities;
calculating the conversion difficulty of the entities according to the technology and the resources;
and converting the remaining privacy resources into the new privacy resources by using the technology and the resources under the condition that the value of the conversion difficulty is less than a preset conversion capability threshold.
Optionally, the analyzing the access request of the visitor to obtain the identity and intention of the visitor and the privacy resource group required by the visitor includes:
judging, upon receiving an access request sent by a visitor, whether the visitor has access rights;
under the condition that the visitor has the access right, analyzing the access request to obtain the identity and intention of the visitor and a privacy resource group required by the visitor;
in the case that the visitor does not have the access right, prohibiting the visitor from acquiring the private resources in the DIKW map.
Optionally, the anonymity protection mechanism includes data anonymity protection, information anonymity protection, knowledge anonymity protection and group anonymity protection; the privacy resources comprise data resources, information resources, knowledge resources and group privacy resources;
the starting of the preset anonymity protection mechanism to encrypt the target resource comprises the following steps (see the illustrative sketch after this list):
identifying data resources containing preset sensitive content in the privacy resource group as the target resources, and performing data anonymity protection on the target resources;
determining, according to the intention of the visitor and the content sensitivity of each information resource in the privacy resource group, the information resources that need anonymization, and performing information anonymity protection on them;
determining, according to the intention of the visitor and the accuracy of each knowledge resource in the privacy resource group, the knowledge resources that need anonymization, and performing knowledge anonymity protection on them;
and performing group anonymity protection on the group privacy resources contained in the privacy resource group.
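A minimal Python sketch of this dispatch is given below; the modality tags, thresholds and placeholder outputs are assumptions, not the disclosed mechanisms:

```python
# Hedged sketch: each resource modality gets its own anonymization routine;
# the selection rules and replacement values are illustrative assumptions.
def protect(resource: dict) -> dict:
    kind = resource["kind"]
    if kind == "data" and resource.get("sensitive"):
        resource["value"] = "***"                 # data anonymity protection
    elif kind == "information" and resource.get("sensitivity", 0.0) > 0.5:
        resource["value"] = "[withheld]"          # information anonymity protection
    elif kind == "knowledge" and resource.get("accuracy", 1.0) < 0.5:
        resource["value"] = "[generalized]"       # knowledge anonymity protection
    elif kind == "group":
        resource["value"] = "[aggregated]"        # group anonymity protection
    return resource

print(protect({"kind": "data", "sensitive": True, "value": "id-1234"}))
```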
Optionally, the preset supervision mechanism includes logic supervision, value supervision and authority supervision;
the starting of the preset supervision mechanism to detect whether a logic error exists in the privacy resource group includes:
carrying out logic supervision on the decision rule and the decision result of the AI system; the decision result is used for indicating a circulation process of the privacy resource group;
determining that the value of the logic supervision is true under the condition that the decision result conforms to the decision rule;
calculating the privacy value of the privacy resources in the privacy resource group, and judging whether the privacy value meets a preset privacy value standard;
determining that the value of the value supervision is true when the privacy value meets the preset privacy value standard;
acquiring the AI system's supervision right over the informed right, over the participation right and over the forgetting right;
determining that the value of the authority supervision is true under the condition that the values of the AI system's supervision rights over the informed right, the participation right and the forgetting right are all true;
and under the conditions that the value of the logic supervision is true, the value of the value supervision is true, and the value of the authority supervision is true, determining that the privacy resources in the privacy resource group have no logic errors.
An AI governance device oriented to cross-modal privacy protection, comprising:
the extraction unit is used for extracting privacy resources from the interactive behavior information of the user in the virtual community;
the modeling unit is used for modeling the privacy resources to obtain a DIKW map;
the analysis unit is used for analyzing the access request of the visitor to obtain the identity and intention of the visitor and the privacy resource group required by the visitor;
the query unit is used for querying the informed privacy resource group of the visitor from the DIKW map according to the identity and the intention of the visitor;
the protection unit is used for starting a preset anonymity protection mechanism and encrypting the target resource under the condition that the informed privacy resource group covers the privacy resource group and the privacy resource group contains the target resource; the target resource is a privacy resource containing preset sensitive content;
the evaluation unit is used for starting a preset risk evaluation mechanism and calculating the risk value of each privacy resource in the privacy resource group;
the deleting unit is used for deleting the privacy resources with the risk values larger than a preset risk threshold value in the privacy resource group;
the detection unit is used for starting a preset supervision mechanism and detecting whether the privacy resources in the privacy resource group have logic errors or not;
a transmission unit, configured to transmit the privacy resource group to the visitor when there is no logic error in the privacy resource group.
A computer-readable storage medium comprising a stored program, wherein the program performs the cross-modal privacy protection oriented AI governance method.
An AI governance device oriented to cross-modal privacy protection, comprising: a processor, a memory, and a bus; the processor and the memory are connected through the bus;
the memory is used for storing a program, and the processor is used for executing the program, wherein the cross-modal privacy protection oriented AI governance method is executed when the program is executed.
According to the technical scheme, privacy resources are extracted from the interactive behavior information of the user in the virtual community, and the privacy resources are modeled to obtain a DIKW map. The access request of a visitor is parsed to obtain the visitor's identity and intention and the privacy resource group the visitor requires. According to the identity and intention of the visitor, the informed privacy resource group of the visitor is queried from the DIKW map. When the informed privacy resource group covers the required privacy resource group and the privacy resource group contains a target resource, a preset anonymity protection mechanism is started to encrypt the target resource; the target resource is a privacy resource containing preset sensitive content. A preset risk evaluation mechanism is started to calculate the risk value of each privacy resource in the privacy resource group, and privacy resources whose risk values exceed a preset risk threshold are deleted from the group. A preset supervision mechanism is started to detect whether the privacy resources in the group contain logic errors; if not, the privacy resource group is transmitted to the visitor. Compared with the prior art, the scheme uses an AI system in place of manual work, avoiding human participation in the encryption protection of privacy resources and thus effectively improving the processing efficiency of privacy resource protection.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1a is a schematic diagram of an AI governance method for cross-modal privacy protection according to an embodiment of the present application;
fig. 1b is a schematic diagram illustrating a circulation process of a private resource according to an embodiment of the present application;
fig. 1c is a schematic diagram of a conversion process of a privacy resource according to an embodiment of the present application;
fig. 1d is a schematic diagram of a DIKW map provided in the embodiment of the present application;
fig. 2 is a schematic diagram of another cross-modal privacy protection-oriented AI governance method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an AI management apparatus for cross-modal privacy protection according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1a is a schematic diagram of an AI governance method oriented to cross-modal privacy protection according to an embodiment of the present application; the method is applied to an AI system and includes the following steps:
s101: and presetting the privacy rights of all participants in the circulation process of the privacy resources.
The circulation process comprises a perception process, a storage process, a transmission process and a processing process, and the participants comprise the User, the Visitor and the AI system. The user is the producer of the privacy resources, the visitor is the user of the privacy resources, and the AI system is the storer and transmission hub of the privacy resources. In the embodiment of the application, the privacy rights include the informed right (Know), the participation right (Participate), the forgetting right (Forget) and the supervision right (Supervise).
The so-called perception process is: the AI system extracts the privacy resources from the interactive behavior information of the user, specifically from the user's interactive behavior information in the virtual community. The participants in the perception process include the user and the AI system, and the privacy rights involved in the perception process include:
user's right of awareness (Know)U): the system is used for collecting the perceived informed rights of all the privacy resources of the AI system;
knowledge of AI systems (Know)AI): the AI system is allowed to perceive the collected privacy resources from the user, the privacy resources only comprise the virtual trace and the user generated content agreed by the user and do not comprise the privacy resources obtained by the AI system from other illegal channels;
participation right of AI system (participant)AI): in the sensing process, the AI system extracts a privacy resource mode, and the participation right of the AI system comprises the regulation and the limitation of the sensing time, the sensing content and the sensing mode of the AI system;
supervision of the user (Supervise)(U→AI)): monitoring whether the behavior of the AI system in the sensing process meets the rules or not by the user;
supervision of AI systems (Supervise)AI): the AI system self-supervises its own behavior in the sensing process.
The so-called storage process is: the AI system converts the extracted privacy resources, models the converted privacy resources to obtain a DIKW map, and stores the DIKW map. The storage process comprises a conversion process and a loading process of the privacy resources. The conversion process is: the AI system converts the privacy resources acquired in the perception process. The loading process is: the AI system models the converted privacy resources to obtain the DIKW map (including individual DIKW maps and group DIKW maps) and stores it in a medium that can be accessed and restored. The participants of the storage process include the AI system, and the privacy rights involved in the storage process include:
The AI system's participation right (Participate_AI): the AI system's right to participate in processing the privacy resources, covering the conversion process and the loading process;
The AI system's forgetting right (Forget_AI): the AI system's right to clear and forget, from the DIKW map, privacy resources that have lost storage value;
The user's forgetting right (Forget_U): the user's right to require the AI system to clear and forget privacy resources on the DIKW map that are outdated or wrongly expressed;
The AI system's supervision right (Supervise_AI): the AI system's self-supervision in the storage process, including supervision of its own participation right and forgetting right;
The user's supervision right (Supervise_(U→AI)): the user supervises whether the AI system stores the privacy resources in an appropriate manner during the storage process and whether outdated privacy resources are systematically cleared and forgotten.
The so-called transmission process is: on the basis of the visitor's access intention and on the premise that privacy value is satisfied, the AI system transmits the privacy resource group required by the visitor to the visitor in the form of transmission privacy resources. The participants in the transmission process include the visitor and the AI system, and the privacy rights involved in the transmission process include:
The visitor's informed right (Know_V): the content of the privacy resources that the visitor is allowed to know, obtained through calculation and analysis by the AI system according to the visitor's identity and intention;
The user's informed right (Know_U): the user's informed right over the process in which the AI system transmits the user's privacy resources to the visitor;
The visitor's participation right (Participate_V): in the transmission process, the visitor's right to participate in processing the transmitted privacy resources, as determined by the AI system;
The AI system's supervision right (Supervise_(AI→V)): the AI system supervises whether the visitor has access rights and whether the access intention is reasonable;
The visitor's supervision right (Supervise_(V→AI)): the visitor supervises whether the transmission privacy resources transmitted by the AI system are authentic and credible and whether they meet the visitor's own access requirements;
The user's supervision right (Supervise_U): the user's supervision of the transmission process of the privacy resources, comprising a first supervision right (Supervise_(U→AI)) and a second supervision right (Supervise_(U→V));
First supervision right (Supervise_(U→AI)): the user supervises the transmission privacy resources transmitted by the AI system;
Second supervision right (Supervise_(U→V)): the user supervises the visitor's identity and access intention.
The so-called processing process is: the process in which the visitor uses the privacy resource group, specifically the transmission privacy resources obtained in the transmission process. The participants in the processing process include the visitor, and the privacy rights involved in the processing process include:
The visitor's participation right (Participate_V): the visitor's right to process the privacy resources; its content is identical to that of the participation right during the transmission process.
The user's supervision right (Supervise_(U→V)): the user supervises the way the visitor uses the user's privacy resources.
The visitor's supervision right (Supervise_V): the visitor supervises its own handling and utilization of the privacy resources.
The privacy rights of the participants in the above-mentioned perception, storage, transmission and processing processes can be seen in fig. 1b.
It should be noted that privacy resources are generated by the user's various interactive behaviors in the virtual community; when a visitor submits an access application to the AI system (which acts as the administrator of the virtual community) for its own intention, the circulation process of the privacy resources starts. Each participant holds different privacy rights in the different circulation links (i.e., the perception, storage, transmission and processing processes), and these privacy rights are necessary measures to guarantee the security of the privacy resources. In the circulation process, the user's privacy rights are inherent rights, are not bound to privacy obligations, and are retained even if the user does not exercise them in time.
Specifically, the informed right means the participant's freedom and right to know and acquire the privacy resources; the informed rights of different participants are differentiated in the circulation process of the privacy resources.
The informed right includes the process informed right and the content informed right. The process informed right is the informed right over the circulation process of the privacy resources; it is a binary Boolean attribute and an exclusive right of the user as the owner of the privacy resources. As shown in formula (1), the user's process informed right covers the entire circulation process of the privacy resources, and the user's process informed right being true is a necessary condition for a decision behavior of the AI system to be legal.
Know_U(course) = {Sensing_K, Storage_K, Transfer_K, Processing_K}    (1)
In formula (1), Know_U(course) represents the user's process informed right, Sensing_K the user's informed right over the perception process, Storage_K the informed right over the storage process, Transfer_K the informed right over the transmission process, and Processing_K the informed right over the processing process.
The content informed right refers to the privacy resources that a participating entity is allowed to know. As shown in formula (2), the informed privacy resource group of an entity is calculated from the identity and intention of the participating entity and the process it participates in.
P_DIK(G)^Know = Know(E, process)    (2)
In formula (2), P_DIK(G)^Know represents the informed privacy resource group, Know is a pre-constructed function, E represents the participating entity, and process represents the participation process (any one or more of the perception, storage, transmission and processing processes).
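As a hedged illustration of formulas (1) and (2), the informed rights could be checked as below; the rule table, function names and flag dictionary are assumptions made for the sketch:

```python
PROCESSES = ("perception", "storage", "transmission", "processing")

def process_informed_right(user_flags: dict) -> bool:
    """Formula (1): true only if the user is informed of every stage."""
    return all(user_flags.get(p, False) for p in PROCESSES)

def know(entity: str, process: str, rules: dict) -> set:
    """Formula (2): the informed privacy resource group of an entity."""
    return rules.get((entity, process), set())

rules = {("visitor", "transmission"): {"age", "city"}}  # assumed access rules
print(process_informed_right({p: True for p in PROCESSES}))       # True
print(know("visitor", "transmission", rules) == {"age", "city"})  # True
```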
Specifically, the participation right is the right of a participating entity to make decisions on and use the privacy resources according to preset specifications. The participation right is a broad concept; its specific content can be characterized by a participation right group. As shown in formula (3), the participation right group includes, but is not limited to, the entity's participation form, participation times and participation deadline; in other words, the participation right group is a rule-based limit on the behavior of the participating entity. As shown in formula (4), the participation right group is calculated by associating the participating entity with the participation process.
Participation = {Time, Form, Deadline, ...}    (3)
Participation = Participate(E, process)    (4)
In formulas (3) and (4), Participation represents the participation right group, Time the entity's participation times, Form the entity's participation form, Deadline the entity's participation deadline, and Participate is a pre-constructed function.
In addition, the user's participation right includes a one-vote veto right: the user is entitled to exercise the veto over any participation process or any privacy resource and thereby block the circulation of the privacy resource. After the user exercises the one-vote veto, the user's participation right becomes false, and the decision behavior of the AI system is judged illegal and terminated.
The participation right of the AI system includes the collection of the privacy resources in the perception process, the conversion of the privacy resources in the storage process, the construction and updating of the DIKW map in the storage process, the decision on the transmission of the privacy resources in the transmission process, and so on.
The participation right of the visitor includes access to the AI system in the transmission process, the collection of the privacy resources in the transmission process, and the development and utilization of the privacy resources in the processing process.
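The participation right group of formulas (3) and (4) and the one-vote veto can be pictured with the following minimal sketch; the field names and the timestamp convention are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Participation:            # formula (3): the participation right group
    times: int                  # permitted number of participations (Time)
    forms: set                  # permitted participation forms (Form)
    deadline: float             # participation deadline as a timestamp (Deadline)

def legal(actual_times: int, actual_form: str, now: float,
          grant: Participation, user_veto: bool = False) -> bool:
    """A decision behavior is legal only within the grant and without a veto."""
    if user_veto:               # the one-vote veto blocks circulation outright
        return False
    return (actual_times <= grant.times
            and actual_form in grant.forms
            and now <= grant.deadline)

grant = Participation(times=3, forms={"read"}, deadline=1_700_000_000.0)
print(legal(2, "read", 1_690_000_000.0, grant))        # True
print(legal(2, "read", 1_690_000_000.0, grant, True))  # False: veto exercised
```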
Specifically, the forgetting right is the participant's right to systematically clear and forget outdated privacy resources and non-valuable privacy resources on the preset DIKW map.
Outdated privacy resources are privacy resources that, with the passage of time, have been superseded by other privacy resources with different semantic expressions; the user holds the forgetting right over outdated privacy resources. Taking the decision event "prize credit evaluation" as an example, suppose that in the perception process the timestamp of the second information resource is later than that of the first information resource; the decision logic Decision(Event) is as follows:
I_DIK1, I_DIK2 from I_Graph(U_a);
I_DIK1 = "User U_a failed the English level exam";
I_DIK2 = "User U_a passed the English level exam";
V_fairness = F_fairness(Event, I_DIK1, U_price) = False;
V_fairness = F_fairness(Event, I_DIK2, U_price) = True.
In the decision logic Decision(Event), I_DIK1 represents the first information resource, I_DIK2 the second information resource, I_Graph(U_a) the information map of the privacy resources generated by user U_a, V_fairness the value of privacy fairness, F_fairness a pre-constructed function for calculating privacy value, and U_price the utilization value of the user.
For the decision logic Decision(Event), the AI system is obliged to periodically clear and forget the outdated privacy resources in user U_a's information map, so as to prevent the first and second information resources from coexisting in user U_a's information map and causing the AI system to violate privacy value in the event decision, i.e., causing V_fairness to be false. In the embodiment of the application, the information resource is one modality (i.e., concrete expression form) of the privacy resource, and the information map is one map expression form contained in the DIKW map.
In addition to the user, the AI system also holds a forgetting right over non-valuable privacy resources. As time passes, the number of privacy resources stored on the DIKW map grows geometrically, so the AI system bears an ever-increasing storage cost; meanwhile, excessive privacy resources on the DIKW map also raise the search cost of automated decision-making. As time changes, the value of some privacy resources decreases and they become non-valuable privacy resources. Forgetting non-valuable privacy resources is therefore a necessary step for the AI system to reduce operating cost and guarantee decision efficiency. As shown in formula (5), when the cost to be paid for storing a privacy resource is higher than a preset cost threshold, the resource is regarded as a non-valuable privacy resource.
P_DIK^unvalue = {P_DIK | P_DIK(cost) > P_DIK(cost)_W}    (5)
In formula (5), P_DIK^unvalue represents a non-valuable privacy resource, P_DIK a privacy resource, P_DIK(cost) the cost to be paid for storing the privacy resource, and P_DIK(cost)_W the preset cost threshold.
To prevent outdated privacy resources and non-valuable privacy resources from influencing the AI system's decisions, the AI system sets a forgetting period; each time a forgetting period elapses, the DIKW map must undergo one round of systematic forgetting. As shown in formula (6), the pre-constructed function Forget classifies the privacy resources on the DIKW map.
{P_DIK(G)^F, P_DIK(G)^R} = Forget(DIKW_Graph)    (6)
In formula (6), P_DIK(G)^F represents the forgotten privacy resource group and P_DIK(G)^R the retained privacy resource group.
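A minimal sketch of the systematic forgetting of formulas (5) and (6), assuming a simple cost threshold and an explicit outdated flag (both illustrative):

```python
COST_THRESHOLD = 10.0  # assumed P_DIK(cost)_W

def forget(dikw_map):
    """Formula (6): split map resources into (forgotten, retained) groups."""
    forgotten, retained = [], []
    for r in dikw_map:
        # Formula (5): non-valuable if storage cost exceeds the threshold;
        # outdated resources are likewise forgotten.
        if r["cost"] > COST_THRESHOLD or r["outdated"]:
            forgotten.append(r)
        else:
            retained.append(r)
    return forgotten, retained

dikw_map = [{"id": 1, "cost": 12.0, "outdated": False},
            {"id": 2, "cost": 3.0, "outdated": True},
            {"id": 3, "cost": 3.0, "outdated": False}]
f, r = forget(dikw_map)
print([x["id"] for x in f], [x["id"] for x in r])  # [1, 2] [3]
```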
Specifically, the supervision right includes the participant's supervision right over the informed right, over the participation right, and over the forgetting right. In the embodiment of the application, the initial values of these three supervision rights are all true; when a participant raises a reasonable objection, the corresponding value is set to false, and at that point the decision behavior of the AI system is determined to be illegal.
The participant's supervision right over the informed right corresponds to the content of the informed right and comprises the supervision right over the informed process and the supervision right over the informed content. As shown in formula (7), the participant's supervision right over the informed right is true only when both of these are true.
S_know = S_know(course) && S_know(content)    (7)
In formula (7), S_know represents the participant's supervision right over the informed right, S_know(course) the supervision right over the informed process, and S_know(content) the supervision right over the informed content. Supervision of the informed process is part of privacy value protection. Taking the decision event "registration and examination of human-experiment volunteers for Drug1" as an example, the decision rule Event1(rule) includes a first rule and a second rule, as follows:
Event1(rule_1): "Volunteers who participate in the human experiment can obtain a subsidy of XX yuan";
Event1(rule_2): "Participants must enroll voluntarily".
Accordingly, the decision logic Decision(Event1) is as follows:
K_DIK1 from K_Graph(U_a);
K_DIK2 from K_Graph(U_b);
K_DIK1 = K(U_a, Event1) = "User U_a has certain medical knowledge and can understand the risk behind Event1";
K_DIK2 = K(U_b, Event1) = "User U_b needs money and lacks the knowledge to understand the risk of Event1";
S_know(course) = Supervise(Event1(rule), K_DIK1, K_DIK2) = False;
U_price(U_a) < U_price(U_b);
V_fairness = F_fairness(Event1, P_DIK(G), U_price) = False.
In the decision logic Decision(Event1), Event1(rule_1) represents the first rule, Event1(rule_2) the second rule, K_DIK1 the first knowledge resource, K_Graph(U_a) the knowledge map of the privacy resources generated by user U_a, K_DIK2 the second knowledge resource, K_Graph(U_b) the knowledge map of the privacy resources generated by user U_b, V_fairness the privacy value, F_fairness the constructor of the privacy value, U_price(U_a) the utilization value of user U_a, U_price(U_b) the utilization value of user U_b, and P_DIK(G) the set of privacy resource groups (which may specifically include the first and second knowledge resources). It should be noted that the knowledge resource is one modality (i.e., concrete expression form) of the privacy resource, and the knowledge map is one map expression form contained in the DIKW map.
For the above decision logic Decision(Event1), in order to guarantee fairness in the AI decision-making process, the AI system places a risk notice on the volunteer registration page to guarantee every user's right to know the risk of the decision event.
Further, as shown in formula (8), the transmitted privacy resource group is compared with the informed privacy resource group calculated by formula (2). When the two are consistent, the supervision right over the informed content is true. When they are inconsistent, some participant has either exceeded its informed right or had its informed right unsatisfied, and the supervision right over the informed content is false.
S_know(content) = Supervise(P_DIK(G)^T, P_DIK(G)^Know)    (8)
In formula (8), S_know(content) represents the supervision right over the informed content, P_DIK(G)^T the transmitted privacy resource group, P_DIK(G)^Know the informed privacy resource group, and Supervise a pre-constructed function.
The participant's supervision right over the participation right is used to supervise whether an entity collects, extracts and uses the privacy resources according to the rules indicated by its participation right group. In the embodiment of the application, the supervised content includes whether the entity's number of participations exceeds the standard, whether the entity's participation form falls within the allowed range, whether the entity's participation deadline belongs to the preset time period, and so on.
In addition, as shown in formula (9), the pre-constructed function Supervise compares the actual participation content of an entity in the decision process with its participation right; if the actual participation content does not exceed the range specified by the participation right, the participant's supervision right over the participation right is true.
S_participate = Supervise(Participation_Real, Participation)    (9)
In formula (9), S_participate represents the participant's supervision right over the participation right, Participation_Real the actual participation content, and Participation the participation right.
The participant's supervision right over the forgetting right comprises the supervision right over the use of the forgetting right and the supervision right over the forgotten content, as shown in formula (10).
S_forget = S_forget(course) && S_forget(content)    (10)
In formula (10), S_forget represents the participant's supervision right over the forgetting right, S_forget(course) the supervision right over the use of the forgetting right, and S_forget(content) the supervision right over the forgotten content.
When a forgetting period ends, if the AI system has executed the forgetting clearance, the supervision right over the use of the forgetting right is set to true.
Further, as shown in formula (11), the sum of the outdated privacy resources and the non-valuable privacy resources is calculated in advance and compared with the forgotten privacy resource group calculated by formula (6); when the two coincide, the supervision right over the forgotten content is true.
S_forget(content) = Supervise(P_DIK(G)^F, P_DIK(G)^old + P_DIK(G)^unvalue)    (11)
In formula (11), S_forget(content) represents the supervision right over the forgotten content, P_DIK(G)^F the forgotten privacy resource group, P_DIK(G)^old the outdated privacy resources, P_DIK(G)^unvalue the non-valuable privacy resources, and Supervise a pre-constructed function.
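Taken together, formulas (7) through (11) amount to three Boolean checks. The following sketch assumes resources are represented by simple identifier sets; it is an illustration, not the patented procedure:

```python
def supervise_know(course_ok: bool, transmitted: set, informed: set) -> bool:
    """Formulas (7) and (8): S_know."""
    return course_ok and transmitted == informed

def supervise_participate(actual: set, granted: set) -> bool:
    """Formula (9): actual participation must not exceed the grant."""
    return actual <= granted

def supervise_forget(cleared: bool, forgotten: set,
                     outdated: set, unvalued: set) -> bool:
    """Formulas (10) and (11): S_forget."""
    return cleared and forgotten == (outdated | unvalued)

print(supervise_know(True, {"age"}, {"age"}))             # True
print(supervise_participate({"read"}, {"read", "copy"}))  # True
print(supervise_forget(True, {1, 2}, {2}, {1}))           # True
```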
S102: and acquiring the interactive behavior information of the user in the virtual community, and extracting the privacy resources from the interactive behavior information.
Here, S102 shows that the substance is the above-mentioned sensing process.
S103: and calculating to obtain the retention degree of the user on the privacy resources based on the interactive behavior information.
The extraction of the privacy resources is a process of acquiring PDIK from homogeneous or heterogeneous sources. The privacy is self-subjective, and the collection standard of the privacy resources is determined according to the retention degree of the privacy resources by the user. In addition, the content shown in S103 is also part of the above-mentioned sensing process.
The user's degree of retention of a privacy resource is a basic attribute of the privacy resource: the larger the value of the retention degree, the more the user wishes to retain the privacy resource. As shown in formula (12), the retention degree depends on the privacy resource's own attributes and on the group of related behavior records between the user and the privacy resource. The resource's own attributes determine the base value of the retention degree, and the related behavior records between the user and the privacy resource determine its dynamic value.
P_DIK(D_Res) = Reserve(P_DIK, Inter(P_DIK))    (12)
In formula (12), P_DIK(D_Res) represents the degree of retention, P_DIK the privacy resource, Inter(P_DIK) the group of related behavior records between the user and the privacy resource, and Reserve a pre-constructed function for calculating the degree of retention.
Specifically, the sources of privacy resources include virtual traces and user-generated content: virtual traces are passive resources, while user-generated content is an active resource. In the calculation of the retention degree, the user's degree of retention of virtual-trace privacy resources is greater than that of user-generated-content privacy resources.
Obviously, the user's behavior in the virtual community reflects, at the psychological level, the user's degree of retention of a privacy resource. The group of related behavior records between the user and the privacy resource comprises privacy-protection positively correlated behaviors and privacy-protection negatively correlated behaviors, for example:
Inter(P_DIK1)_pos = "turn off the virtual community's permission to read the mobile device's microphone"    (13)
Inter(P_DIK2)_neg = "publish one's own annual income"    (14)
In formulas (13) and (14), Inter(P_DIK1)_pos represents a privacy-protection positively correlated behavior and Inter(P_DIK2)_neg a privacy-protection negatively correlated behavior. The more records of positively correlated behaviors, the higher the user's degree of retention of the privacy resource and the higher its value; the more records of negatively correlated behaviors, the lower the value of the retention degree. Therefore, privacy-protection positively correlated behaviors are positively correlated with the value of the retention degree, and privacy-protection negatively correlated behaviors are negatively correlated with it.
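A hedged sketch of the retention-degree calculation of formula (12), with an assumed base value per resource type and assumed per-behavior weights:

```python
def reserve(base: float, behaviors) -> float:
    """Formula (12): base value from the resource's own attributes, adjusted
    by privacy-protection positively/negatively correlated behavior records.
    The 0.25 weight is invented for the example."""
    score = base
    for kind, count in behaviors:
        score += (0.25 if kind == "pos" else -0.25) * count
    return score

# Virtual traces are assumed to start from a higher base than UGC.
print(reserve(0.75, [("pos", 1)]))  # 1.0  (protective user, virtual trace)
print(reserve(0.50, [("neg", 1)]))  # 0.25 (openly shared UGC)
```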
S104: and filtering the privacy resources with the value of the retention degree larger than a preset retention threshold value.
And if the value of the retention degree is larger than the privacy resource of the preset retention threshold value, the privacy resource is regarded as a secret type resource, namely a resource range which cannot be extracted and is specified by the privacy resource acquisition standard. And after the privacy resources with the value of the retention degree larger than the preset retention threshold are filtered, only the privacy resources with the value of the retention degree not larger than the preset retention threshold are left.
It should be noted that the content shown in S104 is also part of the above-mentioned sensing process.
S105: and converting the rest privacy resources to obtain new privacy resources.
The privacy resources with incomplete and inconsistent contents in the virtual community have uncertainty, so that people lack confidence in the resource analysis process, and AI decision results lack persuasion for people. The uncertainty of the privacy resources can be compensated by resource transformation, as shown in fig. 1c, the privacy resources can be transformed into new privacy resources, and the specific transformation manner includes basic transformation, combined transformation, and technical transformation.
In the embodiment of the application, the privacy resource has two basic attributes, namely, the conversion degree PDIK(in) and the conversion out degree PDIK(out)。PDIK(out) denotes privacy resource convertible Generation PDIK(out) new privacy resources, PDIK(in) indicates that when the privacy resource itself is a new privacy resource, it can be managed by other PDIK(in) privacy resource conversion generation. In addition, the substance of S105 is the above-mentioned storage process, specifically, the privacy resource conversion process mentioned in the storage process.
Specifically, basic conversion means: deriving a new privacy resource set from the remaining privacy resources, where the new privacy resource set comprises a plurality of new privacy resources and the types and numbers of privacy resources it contains are not limited. Basic conversion includes same-type and cross-type conversion among data resources, information resources and knowledge resources; each of these is one modality (i.e., concrete expression form, also called type) of privacy resource and belongs to existing common industry knowledge. The specific implementation logic of basic conversion is as follows:
I_DIK1 from I_Graph(U_a);
I_DIK1 = "User U_a was born on xx-xx-xx";
I_DIK2^new = Transformation(I_DIK1) = "User U_a is xx years old this year".
In the above implementation logic of basic conversion, I_DIK1 represents the privacy resource (specifically an information-type resource), I_Graph(U_a) the information map composed of user U_a's privacy resources, I_DIK2^new the new privacy resource, and Transformation a pre-constructed conversion function.
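As an illustration of the basic conversion above (birth date to age), a minimal sketch follows; the date values are invented for the example:

```python
from datetime import date

def transformation(birth: date, today: date) -> int:
    """Basic conversion: derive the new resource 'age' from 'birth date'."""
    had_birthday = (today.month, today.day) >= (birth.month, birth.day)
    return today.year - birth.year - (not had_birthday)

i_dik1 = date(1995, 6, 1)                        # "User U_a was born on ..."
print(transformation(i_dik1, date(2021, 8, 9)))  # 26 -> "U_a is 26 this year"
```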
Specifically, combined conversion means: deriving a new privacy resource set from a privacy resource set composed of the remaining privacy resources, where the types and numbers of privacy resources contained in both sets are not limited. The specific implementation logic of combined conversion is as follows:
I_DIK1, I_DIK2 from I_Graph(A);
I_DIK1 = "Sales volume of product A in the first quarter";
I_DIK2 = "Sales volume of product A in the second quarter";
I_DIK3^new = Transformation(I_DIK1, I_DIK2) = "Sales growth rate of product A in the first half of the year".
In the above implementation logic of combined conversion, I_DIK1 represents the first privacy resource set, I_DIK2 the second privacy resource set, I_Graph(A) the information map of product A, I_DIK3^new the new privacy resource set, and Transformation a pre-constructed conversion function.
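The combined conversion above could look like the following sketch, where two quarterly sales figures (illustrative numbers) are combined into a growth rate:

```python
def transformation(q1_sales: float, q2_sales: float) -> float:
    """Combined conversion: two quarterly figures yield a growth rate."""
    return (q2_sales - q1_sales) / q1_sales      # I_DIK3^new

print(f"{transformation(100.0, 125.0):.0%}")     # 25%
```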
Specifically, technical conversion means: the conversion difficulty of an entity is calculated according to its technology and resources, and the remaining privacy resources are converted into new privacy resources by using the technology and resources when the value of the conversion difficulty is less than a preset conversion capability threshold. Technical conversion can be understood as requiring technical means and the assistance of other resource content to complete the generation of the new privacy resource.
Unlike basic conversion and combined conversion, technical conversion carries a conversion difficulty, whose value depends on the entity participating in the conversion. In the calculation of the conversion difficulty, as shown in formula (15), a conversion difficulty value smaller than the preset conversion capability threshold indicates that the entity has the capability to complete the technical conversion.
T_Difficulty(P_DIK, P_DIK^new) = Difficulty(P_DIK, P_DIK^new, E)    (15)
In formula (15), T_Difficulty represents the conversion difficulty, P_DIK the privacy resource, P_DIK^new the new privacy resource, Difficulty a pre-constructed function, and E the entity, which comprises technology and resources. When the technology and resources cannot meet the requirements of the technical conversion and only the entity itself participates in the conversion, the value of the conversion difficulty is determined to be greater than the preset conversion capability threshold, so the privacy resource cannot be converted into the new privacy resource. The specific implementation logic of technical conversion is as follows:
I_DIK1 = "Photo1 was shot by U_a";
I_DIK2 = "The content of Photo1 includes Building1";
I_DIK3^new = Transformation(I_DIK1, I_DIK2) = "Photo1 was shot by U_a at Place1";
I_DIK4 = "Building1 is located at Place1";
If I_DIK4 ∈ E(Resource):
    T_Difficulty((I_DIK1, I_DIK2), I_DIK3^new)
    = Difficulty((I_DIK1, I_DIK2), I_DIK3^new, E)
    < T_Difficulty^W
Else:
    T_Difficulty((I_DIK1, I_DIK2), I_DIK3^new) > T_Difficulty^W
In the above implementation logic of technical conversion, I_DIK1 represents the first privacy resource, I_DIK2 the second privacy resource, I_DIK3^new the new privacy resource, Transformation a pre-constructed conversion function, I_DIK4 another resource, E(Resource) the entity's resources, T_Difficulty the conversion difficulty, and T_Difficulty^W the preset conversion capability threshold.
Specifically, the execution logic of the above-mentioned conversion process of the privacy resources can be seen in fig. 1c.
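A hedged sketch of the feasibility test of formula (15): conversion difficulty falls below the threshold only when the entity holds the auxiliary resource I_DIK4; the numeric difficulty values and threshold are assumptions:

```python
T_DIFFICULTY_W = 1.0  # assumed conversion capability threshold

def difficulty(needed: set, entity_resources: set) -> float:
    """Formula (15): difficulty is low only if the entity holds the needed
    auxiliary resources (here: whether I_DIK4 is in E(Resource))."""
    return 0.5 if needed <= entity_resources else 2.0

i_dik4 = {"Building1 is located at Place1"}
print(difficulty(i_dik4, i_dik4) < T_DIFFICULTY_W)   # True:  convertible
print(difficulty(i_dik4, set()) < T_DIFFICULTY_W)    # False: not convertible
```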
S106: and modeling the new privacy resources to obtain a DIKW map, and storing the DIKW map.
The substance of the content shown in S106 is the above-mentioned storage process, specifically, the loading process of the privacy resource mentioned in the storage process, and the so-called loading process refers to a process of inserting a new privacy resource into the storage medium. In the embodiment of the application, the new privacy resources are stored in a medium which can be accessed, deleted and modified in a DIKW map mode.
And classifying and modeling the private resources of the users in the virtual community by utilizing a DIKW map technology. The DIKW map is a UML meta-model which is constructed by taking data, information, knowledge and wisdom with two concepts of 'Human' and 'Existence' as cores as a framework, and is divided into three different types of data, information and knowledge according to different expressions of privacy resources on an 'entity-relation' structure to form the DIKW map which is the privacy resources of a user and is formed by the data, the information and the knowledge map with the user as the center, and the DIKW map can be used for optimizing the storage, transmission and processing efficiency of the privacy resources.
It should be noted that the DIKW maps include individual DIKW maps and group DIKW maps.
Specifically, an individual DIKW map corresponds to an entity, and a group DIKW map corresponds to a close-relation group. A group DIKW map is associated with the individual DIKW map of each entity contained in the close-relation group. Meanwhile, the same entity can belong to a plurality of close-relation groups according to different attributes, so a single individual DIKW map can be associated with the DIKW maps of multiple groups.
DIKW stands for Data-Information-Knowledge-Wisdom; a DIKW map (DIKW_Graph) comprises a data map (D_Graph), an information map (I_Graph) and a knowledge map (K_Graph).
The data map comprises data resources (D_DIK). A data resource is a directly observable discrete element that represents a single entity E or an attribute of an entity and has no practical meaning without context. In the DIKW map shown in fig. 1d, the constructed data map is an undirected graph with data resources as nodes and relationships R (i.e., the information resources mentioned below) as edges; it is known that a data resource is related to an entity, but the specific relationship between them is unknown.
The information map comprises information resources (I_DIK). An information resource expresses an objectively existing interaction relationship R between two entities and is expressed in the form R(E_1, E_2).
The knowledge map comprises knowledge resources (K_DIK). A knowledge resource is a generalized summary and derivation of the relationships between entities and is expressed in the form K(E_1, E_2); it serves the further derivation and summarization of the relationship edges R between nodes. A knowledge resource may suffer from incomplete or uncertain content and from inconsistency with the content expressed by other privacy resources on the DIKW map; this can be solved through modeling and preference reasoning and is not repeated here.
It should be noted that the semantics of the key elements expressed by privacy resources can be formalized and classified into three types, namely data resources, information resources and knowledge resources; the expression of a privacy resource is shown in formula (16). The three types express three different kinds of relationships among entities. In practice, all things can be defined through relationships: taking entities as nodes and relationships as edges, a user-centered DIKW map can be constructed, expressed as shown in formula (17).
P_DIK = {D_DIK, I_DIK, K_DIK} (16)
DIKW_Graph = {D_Graph, I_Graph, K_Graph} (17)
In formulas (16) and (17), P_DIK represents the new privacy resource, DIKW_Graph represents the individual DIKW map, D_Graph represents the individual data map, I_Graph represents the individual information map, and K_Graph represents the individual knowledge map.
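As an illustration of formulas (16) and (17), the following minimal Python sketch models a user-centered DIKW map as three maps over shared entities; all class and field names here are illustrative assumptions, not part of the patented method:

from dataclasses import dataclass, field

@dataclass
class DataResource:            # D_DIK: an observable element tied to one entity
    entity: str
    value: str

@dataclass
class InformationResource:     # I_DIK: a relationship R(E_1, E_2) between two entities
    e1: str
    e2: str
    relation: str

@dataclass
class KnowledgeResource:       # K_DIK: a derived summary K(E_1, E_2) with an accuracy value
    e1: str
    e2: str
    statement: str
    accuracy: float = 0.0

@dataclass
class DIKWGraph:               # DIKW_Graph = {D_Graph, I_Graph, K_Graph}, formula (17)
    user: str
    d_graph: list = field(default_factory=list)
    i_graph: list = field(default_factory=list)
    k_graph: list = field(default_factory=list)

# A privacy resource P_DIK is any element of the union of the three maps, formula (16).
ua = DIKWGraph(user="U_a")
ua.d_graph.append(DataResource("U_a", "photo_1"))
ua.i_graph.append(InformationResource("U_a", "photo_1", "shot_by"))
ua.k_graph.append(KnowledgeResource("U_a", "Photography", "U_a likes photography", 0.75))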
Compared with traditional data mining, the DIKW map collects and uses resource relationships more precisely. Traditional data mining focuses on data-type resources and ignores the role that information-type and knowledge-type resources play in analyzing user behaviors and habits, even though the amount of resource content carried by a relationship R is no less than that of the entity E itself. With the DIKW map, all kinds of privacy resources in the virtual community can be mined in depth, avoiding the omission of relevant privacy resources.
In the DIKW map, data resources, information resources and knowledge resources each express a different type of relationship between resource entities. A data resource expresses whether a relationship exists between two entities, with a value range of True/False. An information resource expresses the nature of the relationship between two entities, with a value range of positive/negative.
A knowledge resource is an inductive summary of relationships and carries two computed attributes: accuracy and precision. Accuracy indicates how reliably the knowledge resource expresses the relationship; precision indicates how fine-grained the expressed content is.
As shown in formula (18), the accuracy can be calculated from the basic attributes of the knowledge resource, which include but are not limited to its sources; the more authoritative the source of a knowledge resource, the higher its accuracy value.
K_DIK(Val) = Validity(K_DIK, P_DIK_associated) (18)
In formula (18), K_DIK(Val) stands for the accuracy, Validity is a pre-constructed function, K_DIK represents the knowledge resource, and P_DIK_associated represents the associated privacy resources (i.e., the sources of the knowledge resource).
As can be seen from formula (18), the accuracy of a knowledge resource is also affected by its associated privacy resources: associated privacy resources with similar semantics increase the accuracy value, while associated privacy resources with mutually exclusive semantics decrease it. The relationships between knowledge resources and their associated privacy resources are stored on the information map.
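Formula (18) leaves Validity as a pre-constructed function; the following sketch is one assumed realization of the reinforcement just described, with the per-source weight of 0.05 chosen purely for illustration:

def validity(base_accuracy: float, associated: list) -> float:
    """Approximate K_DIK(Val): each semantically similar associated privacy
    resource (supports=True) raises the value; each mutually exclusive one
    (supports=False) lowers it. The result is clamped to [0, 1]."""
    val = base_accuracy
    for _, supports in associated:
        val += 0.05 if supports else -0.05
    return max(0.0, min(1.0, val))

# Two supporting sources and one conflicting source:
print(validity(0.7, [("profile", True), ("post", True), ("rumor", False)]))  # ≈ 0.75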
In the embodiment of the application, the precision indicates how fine-grained the content is when different knowledge resources express the same subject. Specifically, assume the first knowledge resource and the second knowledge resource both belong to the photography subject; for the photography content, the precision of the second knowledge resource is greater than that of the first, and the comparison logic is as follows:
K_DIK1, K_DIK2 from K_Graph(U_a);
{K_DIK1, K_DIK2} ∈ Topic(Photography);
K_DIK1 = K(U_a, Photography) = "U_a likes photography";
K_DIK2 = K(U_a, Photography) = "U_a likes naturalistic photography";
K_DIK2(Pre) > K_DIK1(Pre).
In the above comparison logic, K_DIK1 represents the first knowledge resource, K_DIK2 represents the second knowledge resource, K_Graph(U_a) represents the individual knowledge map of user U_a, Topic(Photography) represents the photography subject, Photography represents the photography content, K_DIK1(Pre) represents the precision of the first knowledge resource, and K_DIK2(Pre) represents the precision of the second knowledge resource.
In a DIKW map, some privacy resources logically conflict with one another; a set of such conflicting privacy resources is regarded as a mutually exclusive privacy resource group. Specifically, assume the first information resource derives from the personal data that user U_a filled in during registration (i.e., the first data resource), while the second information resource derives from the second data resource, namely the text content (user-generated content) that U_a published in the virtual community. The set formed by the first information resource and the second information resource is then regarded as a mutually exclusive privacy resource group, with the following judgment logic:
I_DIK1, I_DIK2 from I_Graph(U_a);
D_DIK1 = "virtual community registration data";
D_DIK2 = UGC(text) = "I am a boy";
I_DIK1 = Derive(D_DIK1) = R(U_a, Gender) = "U_a is a female";
I_DIK2 = Derive(D_DIK2) = R(U_a, Gender) = "U_a is a male";
{I_DIK1, I_DIK2} ∈ P_DIK(G)_in.
In the above judgment logic, I_DIK1 represents the first information resource, I_DIK2 represents the second information resource, I_Graph(U_a) represents the information map of user U_a, D_DIK1 represents the first data resource, D_DIK2 represents the second data resource, UGC(text) represents user-generated content, Derive is a pre-constructed function, and P_DIK(G)_in represents the mutually exclusive privacy resource group.
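The mutual-exclusion judgment can be sketched as a consistency check over the information map; grouping resources by an (entity, attribute) key is an assumption made for illustration:

from collections import defaultdict

def mutually_exclusive_groups(i_graph):
    """Group information resources that assert different values for the same
    (entity, attribute) pair, e.g. R(U_a, Gender) = female versus male."""
    by_key = defaultdict(set)
    for entity, attribute, value in i_graph:
        by_key[(entity, attribute)].add(value)
    return {key: values for key, values in by_key.items() if len(values) > 1}

i_graph = [("U_a", "Gender", "female"),   # I_DIK1, derived from registration data
           ("U_a", "Gender", "male")]     # I_DIK2, derived from the UGC "I am a boy"
print(mutually_exclusive_groups(i_graph))  # flags the ('U_a', 'Gender') conflict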
The individual privacy resources contained in a DIKW map may affect a group through interactions between entities. Taking a human entity as an example, if privacy resources about the entity's related groups (family, friends, neighbors, etc.) can be mined from one of the entity's privacy resources, that privacy resource is determined to belong to group privacy.
Group privacy exists among multiple entities and, according to the nature of the privacy resources, is divided into two categories: group relationship privacy and group content privacy. The number of entities involved in a group privacy is at least two, and the entities sharing the same group privacy jointly form a close relation group. A close relation group is not limited to a set of entities with explicit associations; it may also represent a group classified by a specific tag such as race, gender, age or occupation.
Group relationship privacy expresses the relationships between two or more entities, and each entity indicated in a group relationship privacy is stored on the information map. The privacy resources involved in group content privacy affect the entities indicated in the close relation group, yet are generally stored only on the personal DIKW map of one entity in that group.
Owing to individual differences, the entities in a close relation group keep the group content privacy to different degrees; since the entities are interconnected, the behavior of any one entity affects the privacy protection of the whole group, so group privacy is harder to protect than individual privacy.
In the embodiment of the application, a group DIKW map for the whole close relation group may be constructed from the individual DIKW maps of the entities in the group. The group DIKW map comprises a group data map, a group information map and a group knowledge map, as shown in formula (19).
DIKW_Graph^G = {DataGraph^G, InformationGraph^G, KnowledgeGraph^G} (19)
In formula (19), DIKW_Graph^G represents the group DIKW map, DataGraph^G represents the group data map, InformationGraph^G represents the group information map, and KnowledgeGraph^G represents the group knowledge map.
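Formula (19) can be illustrated by merging the individual maps of all entities in a close relation group; the dictionary-of-sets representation below is an assumption:

def build_group_map(individual_maps):
    """DIKW_Graph^G: the union of the individual data, information and
    knowledge maps of every entity in the close relation group."""
    group = {"D": set(), "I": set(), "K": set()}
    for m in individual_maps:
        for layer in group:
            group[layer] |= set(m.get(layer, ()))
    return group

ua = {"D": {("U_a", "age=20")}, "I": {("U_a", "U_b", "friend")}, "K": set()}
ub = {"D": {("U_b", "age=21")}, "I": {("U_a", "U_b", "friend")}, "K": set()}
print(build_group_map([ua, ub]))  # the shared edge appears once in the group map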
S107: when an access request transmitted by an accessor is received, whether the accessor has an access right is judged.
If the visitor has the access right, S108 is performed, otherwise S109 is performed.
The substance of S107 is the above-mentioned transmission process, in which the AI system exercises its own supervision right.
S108: and analyzing the access request to obtain the identity and intention of the visitor and the privacy resource group required by the visitor.
After execution of S108, execution continues with S110.
S109: the visitor is prohibited from obtaining the private resources in the DIKW map.
S110: and querying the informed privacy resource group of the visitor from the DIKW map according to the identity and the intention of the visitor.
The content shown in S110 corresponds to the above-mentioned informed right, specifically shown in formula (2); the entity mentioned in formula (2) is the visitor, and the process it participates in is the above-mentioned transmission process. A so-called privacy resource group is in essence a collection of privacy resources.
S111: the set of privacy resources required by the visitor is compared to the set of informed privacy resources of the visitor.
The content shown in S111 is the above-mentioned supervision of the informed right, specifically as shown in formula (8); the transmitted privacy resource group mentioned in formula (8) represents the privacy resource group required by the visitor.
S112: and under the condition that the informed privacy resource group of the visitor covers the privacy resource group required by the visitor, judging whether the privacy resource group required by the visitor contains the target resource.
If the privacy resource group required by the visitor contains the target resource, S113 is executed, otherwise S114 is executed.
The target resource is a privacy resource containing preset sensitive content, and the preset sensitive content includes, but is not limited to, content likely to cause adverse effects.
It should be noted that, in the case that the set of informed privacy resources of the visitor does not completely cover the set of privacy resources required by the visitor, it is determined that the access behavior of the visitor is unauthorized, and transmission of privacy resources other than the set of informed privacy resources to the visitor is prohibited.
S113: and starting a preset anonymous protection mechanism to encrypt the target resource.
After execution of S113, execution of S114 is continued.
Anonymity is a form of non-identifiability: it hides attributes of a resource and is not limited to changing names. The anonymity protection mechanism hides part of the content (i.e., the target resource) of the privacy resources required by the visitor, based on the nature of the privacy resource, so as to achieve privacy protection. In the embodiment of the application, the anonymity protection mechanism comprises four types: data anonymity protection, information anonymity protection, knowledge anonymity protection and group anonymity protection.
In the embodiment of the present application, a preset anonymous protection mechanism is enabled, and the process of encrypting the target resource includes:
1. and identifying the data resources containing preset sensitive contents in the privacy resource group as target resources, and carrying out data anonymity protection on the target resources.
2. And according to the intention of the visitor and the content sensitivity of each information resource in the privacy resource group, determining the information resources which need to be anonymized in the privacy resource group, and performing information anonymity protection on the information resources which need to be anonymized.
3. And determining knowledge resources needing anonymous processing in the privacy resource group according to the intention of the visitor and the accuracy of each knowledge resource in the privacy resource group, and performing knowledge anonymity protection on the knowledge resources needing anonymous processing.
4. And group anonymity protection is carried out on group privacy resources contained in the privacy resource group.
Specifically, data anonymity protection means: for a data resource (i.e., a target resource) containing preset sensitive content, a generation-parameter encryption method is adopted to reduce the risk of privacy disclosure. Before transmission starts, the AI system encrypts the data resources in the privacy resource group required by the visitor according to a fixed rule; after transmission finishes, the visitor decrypts them, which reduces the risk of privacy disclosure during transmission.
For example, a data resource whose content is a sensitive HIV detection result is anonymously encrypted with a generation parameter, and the anonymized result is represented by a new data resource: "a = 0" represents a negative HIV detection result and "a = 1" a positive one. The encryption logic is as follows:
D_DIK1 = "Test_HIV = Negative";
D_DIK2 = "Test_HIV = Positive";
D_DIK1^A = "a = 0";
D_DIK2^A = "a = 1".
In the above encryption logic, D_DIK1 and D_DIK2 are privacy resources containing preset sensitive content, Test_HIV represents the preset sensitive content (i.e., the HIV detection result), D_DIK1^A and D_DIK2^A represent the new data resources (i.e., the anonymously encrypted results), and a represents the generation parameter. During transmission, the original target resources in the privacy resource group required by the visitor are replaced by the new privacy resources. Professional visitors with access qualification are able to restore the new privacy resources back to the target resources, whereas a third party, whether it evades the AI system's identity examination through illegitimate means or obtains the new privacy resources through an accident during transmission, has no ability to restore the anonymized data; privacy protection is thus achieved to a certain extent.
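The generation-parameter scheme can be sketched as a reversible substitution whose codebook the AI system shares only with qualified professional visitors; the concrete mapping below is an assumption for illustration:

# Codebook known to the AI system and to qualified professional visitors only.
ENCODE = {"Test_HIV = Negative": "a = 0", "Test_HIV = Positive": "a = 1"}
DECODE = {v: k for k, v in ENCODE.items()}

def anonymize_data(d_dik: str) -> str:
    """Replace a sensitive data resource D_DIK with its anonymized form D_DIK^A."""
    return ENCODE.get(d_dik, d_dik)   # non-sensitive resources pass through unchanged

def restore_data(d_dik_a: str) -> str:
    """A qualified visitor restores D_DIK^A back to the original target resource."""
    return DECODE.get(d_dik_a, d_dik_a)

assert restore_data(anonymize_data("Test_HIV = Positive")) == "Test_HIV = Positive"

A third party without the codebook sees only "a = 1", which carries no recoverable semantics.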
Specifically, information anonymity protection means: privacy protection achieved by hiding some entity or relationship in an information resource (a concrete form of privacy resource). Which content is hidden depends on the visitor's intention and the content sensitivity of the information resource. For example, the first information resource expresses whether the patient entity has AIDS; for different visitor intentions, the target information resource obtained by anonymizing the first information resource differs, and the privacy protection logic is as follows:
I_DIK1 = "user U_a's HIV detection result is positive";
Purpose_1 = "study of AIDS cases";
I_DIK1^A = "user XX's HIV detection result is positive";
Purpose_2 = "statistics of detection results";
I_DIK1^A = "user U_a's XX detection result is positive";
Purpose_3 = "statistics of HIV-tested users";
I_DIK1^A = "user U_a's HIV detection result is XX".
In the above privacy protection logic, I_DIK1 represents the first information resource, Purpose_1, Purpose_2 and Purpose_3 represent visitor intentions, and I_DIK1^A represents the target information resource. For the three different intentions, the AI system hides, respectively, one of the two entities or the relationship in the first information resource, reducing the leakage of irrelevant privacy content in the privacy resource group required by the visitor.
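The purpose-dependent hiding admits a simple sketch; representing I_DIK1 as an (entity, relation, value) triple and matching on the three purpose strings are assumptions for illustration:

def anonymize_information(entity: str, relation: str, value: str, purpose: str):
    """Hide the entity, the relation, or the value of I_DIK1 depending on the
    visitor's intention, yielding I_DIK1^A."""
    if purpose == "study of AIDS cases":               # the case matters, the user does not
        entity = "XX"
    elif purpose == "statistics of detection results": # the result matters, the test type does not
        relation = "XX"
    elif purpose == "statistics of HIV-tested users":  # the user matters, the result does not
        value = "XX"
    return (entity, relation, value)

print(anonymize_information("U_a", "Test_HIV", "Positive", "study of AIDS cases"))
# ('XX', 'Test_HIV', 'Positive')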
Specifically, knowledge anonymity protection means: since knowledge resources generalize an entity and predict its future behavior, several knowledge resources (concrete forms of privacy resources) with different accuracies may exist under the same subject. For example, the first knowledge resource and the second knowledge resource both belong to the medicine-demand subject under the knowledge map, and the knowledge anonymity protection logic is as follows:
K_DIK1, K_DIK2 from K_Graph(U_a);
{K_DIK1, K_DIK2} ∈ Topic(Medicine Demand);
K_DIK1 = K(U_a, Medicine) = "user U_a needs to purchase the medicine (M_1: zidovudine)";
K_DIK1(Val) = 75%;
K_DIK2 = K(U_a, Medicine) = "user U_a needs to purchase the medicine (M_2: dehydroxyglycoside)";
K_DIK2(Val) = 25%.
In the knowledge anonymity protection logic, K_DIK1 represents the first knowledge resource, K_DIK2 represents the second knowledge resource, K_Graph(U_a) represents the personal knowledge map of user U_a, Topic stands for the subject, K_DIK1(Val) represents the accuracy of the first knowledge resource, and K_DIK2(Val) represents the accuracy of the second knowledge resource. M_1 and M_2 are both conventional medicines for treating AIDS; a patient's suitability for and demand of a medicine depend on physical condition and medical history, which is why the accuracy values of the two knowledge resources differ. From the accuracies, user U_a's demand to purchase medicine M_1 is greater than the demand for medicine M_2. Anonymizing the first knowledge resource yields a new first knowledge resource, and anonymizing the second yields a new second knowledge resource, as follows:
K_DIK1^A = K(U_a, Medicine), K_DIK1^A(Val) = XX;
K_DIK2^A = K(U_a, Medicine), K_DIK2^A(Val) = XX.
K_DIK1^A represents the new first knowledge resource and K_DIK2^A the new second knowledge resource. When a visitor provides a recommendation service to user U_a after anonymization, it relies on K_DIK1^A and K_DIK2^A, whose accuracies are hidden, and therefore recommends medicine M_1 and medicine M_2 equally; user U_a can then decide which medicine to purchase according to his or her actual condition, preserving the user's autonomy to purchase according to his or her own needs.
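Hiding the accuracy attribute can be sketched as follows; the record layout is an assumption:

def anonymize_knowledge(k_graph):
    """Produce K_DIK^A: keep each knowledge resource's entity and topic but
    replace its accuracy K_DIK(Val) with an opaque marker, so a downstream
    recommender must treat all candidates under the topic equally."""
    return [{**k, "Val": "XX"} for k in k_graph]

k_graph = [{"entity": "U_a", "topic": "Medicine", "item": "M_1", "Val": 0.75},
           {"entity": "U_a", "topic": "Medicine", "item": "M_2", "Val": 0.25}]
for k in anonymize_knowledge(k_graph):
    print(k["item"], k["Val"])   # M_1 XX / M_2 XX: equal recommendation weight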
It should be emphasized that the anonymity protection mechanism works not only on individual privacy resources but also on group privacy resources. For example, in a selection-type decision, the privacy resource group needs to include the identity resources of the relevant participants so that the subsequent selection result can be notified or publicized; for fairness, however, those identity resources must be excluded from the privacy resources consulted by the selection decision in order to prevent fraud. At this point the anonymity protection mechanism is started, with the following logic:
Event_1(rule_1) = "select the top n candidates by achievement ranking";
Event_1(rule_2) = "rank different specialties separately";
Event_1(rule_3) = "evaluate according to the principle of fairness and justice";
Decision(Event_1):
If P_DIK(G)^T = {Name, Age, Gender, Address, Major, Grade, ...}:
V_fairness = F_fairness(Event_1, P_DIK(G)^T, U_price) = False;
P_DIK(G)^T(A) = Anonymity(P_DIK(G)^T) = {XX, XX, XX, XX, Major, Grade, ...};
If P_DIK(G)^T == P_DIK(G)^T(A):
V_fairness = F_fairness(Event_1, P_DIK(G)^T, U_price) = True.
In the above logic, Event_1 represents a selection-type decision event, V_fairness represents the fairness value, P_DIK(G)^T(A) represents the new privacy resource group, U_price represents the user's usage value, F_fairness is a pre-constructed function, and P_DIK(G)^T represents the privacy resource group required by the visitor, comprising Name, Age, Gender, Address, Major, Grade and so on, among which irrelevant privacy resources may hinder the fairness of the decision. The new privacy resource group obtained by anonymization eliminates the irrelevant privacy resources to the greatest extent, so that, whether the decision of the selection event is completed by the AI system or manually by staff, their influence on selection fairness is reduced to a minimum.
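The fairness-preserving masking can be sketched by whitelisting only the attributes that Event_1's rules actually use; the whitelist itself is an assumption drawn from rule_1 and rule_2:

DECISION_RELEVANT = {"Major", "Grade"}   # the attributes the selection rules need

def anonymize_group(p_dik_g_t: dict) -> dict:
    """P_DIK(G)^T(A): mask every attribute irrelevant to the decision rules."""
    return {k: (v if k in DECISION_RELEVANT else "XX") for k, v in p_dik_g_t.items()}

candidate = {"Name": "Li", "Age": 21, "Gender": "F",
             "Address": "Haikou", "Major": "CS", "Grade": 92}
print(anonymize_group(candidate))
# {'Name': 'XX', 'Age': 'XX', 'Gender': 'XX', 'Address': 'XX', 'Major': 'CS', 'Grade': 92}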
S114: and starting a risk evaluation mechanism, and calculating the risk value of each privacy resource in the privacy resource group required by the visitor.
The role of the risk assessment mechanism is to detect whether the privacy resource group required by the visitor contains privacy resources that could be leaked through the privacy resource conversion described in the storage process. The risk value of a privacy resource is calculated from the conversion difficulty of the privacy resources in the required group and from the resource's own attributes. When a calculated risk value is larger than the preset risk threshold, a privacy disclosure risk is determined to exist, and the AI system must delete and replace the privacy resources causing the increased risk value in the privacy resource group required by the visitor, so as to guarantee privacy safety.
In the embodiment of the application, the risk assessment mechanism mainly performs its evaluation on the privacy resources' own attributes, which comprise the conversion difficulty, the conversion in-degree, the conversion out-degree, and the accuracy of knowledge resources.
Specifically, the risk assessment of the conversion difficulty evaluates whether the visitor has the ability to convert the privacy resource into other new privacy resources that are unrelated to the access intention and may infringe the user's privacy. The conversion difficulty can be calculated according to the above-mentioned formula (15). If the value of the conversion difficulty is infinite, the visitor does not have the capability of converting the privacy resource into a new privacy resource, i.e., the probability of the privacy resource being leaked is low; if the value of the conversion difficulty is not smaller than the preset conversion capability threshold, the privacy resource group required by the visitor is allowed to be transmitted. If the visitor does have the ability to convert the privacy resource into a new privacy resource, there is a probability that an unrelated privacy resource will be revealed, and that privacy resource should be replaced or deleted.
The risk assessment logic for the conversion difficulty is as follows:
D_DIK1, I_DIK1 from DIKW_Graph(U_a);
D_DIK1 = "a photo in the virtual community, Photo_1";
I_DIK1 = "Photo_1 was shot by U_a";
I_DIK2^new = "the shooting location of Photo_1";
If E == Visitor_pro:
T_Difficulty(I_DIK1, I_DIK2^new) = Difficulty(I_DIK1, I_DIK2^new, E) < T_Difficulty^W;
V_Risk > V_Risk^W;
If E == Visitor_common:
T_Difficulty(I_DIK1, I_DIK2^new) = Difficulty(I_DIK1, I_DIK2^new, E) > T_Difficulty^W;
V_Risk < V_Risk^W.
In the above risk assessment logic for the conversion difficulty, D_DIK1 represents the first data resource, I_DIK1 represents the first information resource, DIKW_Graph(U_a) represents user U_a's DIKW map, I_DIK2^new represents a new information resource converted from the first information resource, E represents the entity (the visitor), Visitor_pro represents a professional visitor, Visitor_common represents a common visitor, T_Difficulty represents the conversion difficulty, Difficulty is a pre-constructed function, T_Difficulty^W represents the preset conversion capability threshold, V_Risk represents the risk value of the privacy resource, and V_Risk^W represents the preset risk threshold.
It should be noted that risk assessment of the conversion difficulty places high demands on the computing conditions: in an ordinary decision, a complete understanding of the visitor's abilities and an exact calculation of the conversion difficulty cannot be achieved, so risk assessment based on the conversion difficulty alone can hardly eliminate the risk of privacy resource leakage entirely.
The conversion in-degree and conversion out-degree of a privacy resource characterize its connectivity in the DIKW map. When several privacy resources all satisfy the visitor's intention, the one with the smallest combined value of the two attributes, i.e., the worst connectivity, can be selected into the privacy resource group required by the visitor, thereby reducing the number of times the privacy resource can be converted and the risk of its disclosure.
In addition, on the premise of satisfying the visitor's intention, a knowledge resource with a smaller accuracy value carries less resource content and therefore poses a smaller risk of privacy disclosure.
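The three self-attributes can be combined into a single score; the weights below and the way capability enters the score are assumptions for illustration, since formula (15) defines the difficulty itself:

INF = float("inf")

def risk_value(difficulty: float, in_degree: int, out_degree: int,
               difficulty_threshold: float) -> float:
    """V_Risk: a low conversion difficulty (the visitor can derive new, unrelated
    privacy resources) and high connectivity in the DIKW map both raise the risk."""
    if difficulty == INF:                # the visitor cannot convert at all
        return 0.0
    capability = 1.0 if difficulty < difficulty_threshold else 0.2
    connectivity = (in_degree + out_degree) / 10.0
    return capability + connectivity

V_RISK_W = 1.0   # preset risk threshold
v = risk_value(difficulty=0.4, in_degree=3, out_degree=4, difficulty_threshold=1.0)
print(v, "-> delete/replace" if v > V_RISK_W else "-> transmit")   # 1.7 -> delete/replace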
S115: and deleting the privacy resources with the risk value larger than a preset risk threshold value in the privacy resource groups required by the visitors.
And replacing the privacy resources with the risk values larger than the preset risk threshold value by using other privacy resources meeting the conditions.
S116: and enabling a supervision mechanism to detect whether the privacy resources in the privacy resource group required by the visitor have logical errors.
Wherein, the supervision mechanism comprises logic supervision, value supervision and authority supervision.
In an embodiment of the present application, a supervision mechanism is enabled, and a process of detecting whether a logic error exists in a privacy resource group required by a visitor includes:
1. and carrying out logic supervision on the decision rule and the decision result of the AI system, wherein the decision result is used for indicating the circulation process of the privacy resource group.
2. And under the condition that the decision result accords with the decision rule, determining the value of logic supervision as true.
3. And calculating the privacy value of the privacy resources in the privacy resource group, and judging whether the privacy value meets a preset privacy value standard.
4. And under the condition that the privacy value meets the preset privacy value standard, determining the value of the value supervision as true.
5. And acquiring the supervision right of the AI system on the informed right, the supervision right of the AI system on the participation right and the supervision right of the AI system on the forgetting right.
6. And determining that the value supervised by the authority is true under the conditions that the value of the supervision right of the AI system to the informed right is true, the value of the supervision right of the AI system to the participation right is true, and the value of the supervision right of the AI system to the forgetting right is true.
7. And under the conditions that the value of the logic supervision is true, the value of the value supervision is true, and the value of the power supervision is true, determining that the privacy resources in the privacy resource group have no logic errors.
Specifically, as shown in formula (20), it is determined that the privacy resources in the privacy resource group have no logic errors when the value of the logic supervision is true, the value of the value supervision is true, and the value of the authority supervision is true.
Supervise = S_logic && S_value && S_right (20)
In formula (20), Supervise is a pre-constructed function, S_logic represents the value of logic supervision, S_value represents the value of value supervision, and S_right represents the value of authority supervision.
Specifically, the logical supervision means: the AI system supervises whether a logical error has occurred in the decision process. As shown in formula (21), the value of the logic supervision is the comparison result of the decision rule and the decision result.
S_logic = Supervise(Event(rule), Event(result)) (21)
In formula (21), S_logic represents the value of logic supervision, Supervise is a pre-constructed function, Event(rule) represents the decision rule, and Event(result) represents the decision result. In the embodiment of the application, a value of true indicates that no logic error occurred in the AI decision process, and a value of false indicates that a logic error occurred.
Taking the decision event of workload distribution as an example, the logic of logic supervision is as follows:
Event_1(rule_1): "the total workload is 100 parts";
Event_1(rule_2): "there are three staff members";
Event_1(rule_3): "workload is distributed according to each worker's processing efficiency";
Decision(Event_1):
Event_1(result) = {U_a: 20, U_b: 40, U_c: 50};
S_logic = Supervise(Event_1(rule), Event_1(result)) = False.
In the above logic, Event_1(rule_1), Event_1(rule_2) and Event_1(rule_3) are the decision rules, Event_1(result) is the decision result, and S_logic represents the value of logic supervision; the allocated workload totals 110 rather than 100 parts, violating rule_1, so the supervision value is False.
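For the workload example, logic supervision reduces to checking the decision result against each machine-checkable rule; the encoding below is an assumption (rule_3, the efficiency-proportional split, needs extra data and is omitted):

def supervise_logic(result: dict, total: int, headcount: int) -> bool:
    """S_logic for Event_1: rule_1 fixes the total workload and rule_2 the
    number of workers; both must hold for the supervision value to be true."""
    return sum(result.values()) == total and len(result) == headcount

event1_result = {"U_a": 20, "U_b": 40, "U_c": 50}
print(supervise_logic(event1_result, total=100, headcount=3))  # False: sums to 110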
Specifically, value supervision means: the AI system supervises whether the decision behavior violates the privacy value, i.e., as shown in formula (22), whether the privacy value of the privacy resource meets the preset privacy value standard. When it does, the value of the value supervision is true, i.e., the value supervision result is qualified.
S_value = Supervise(P_DIK(value), P_DIK(value)^W) (22)
In formula (22), S_value represents the value of value supervision, Supervise is a pre-constructed function, P_DIK(value) represents the privacy value of the privacy resource, and P_DIK(value)^W represents the preset privacy value standard.
Specifically, authority supervision means: the AI system supervises the privacy rights, comprising the AI system's supervision of the informed right, of the participation right, and of the forgetting right. As shown in formula (23), when the values of all three supervision rights are true, the value of the authority supervision is true, meaning the authority supervision result is qualified.
S_right = S_know && S_participate && S_forget (23)
In formula (23), S_right represents the value of authority supervision, S_know represents the value of the AI system's supervision of the informed right, S_participate represents the value of the AI system's supervision of the participation right, and S_forget represents the value of the AI system's supervision of the forgetting right.
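Formulas (20) to (23) compose into one conjunction; the sketch below assumes each sub-check has already been computed:

def supervise(s_logic: bool, s_value: bool,
              s_know: bool, s_participate: bool, s_forget: bool) -> bool:
    """Formula (20) with S_right expanded per formula (23): transmission is
    cleared only when the logic, value, and all three rights checks pass."""
    s_right = s_know and s_participate and s_forget
    return s_logic and s_value and s_right

print(supervise(True, True, True, True, False))  # False: the forgetting right check failed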
S117: in the absence of a logical error for a privacy resource in the set of privacy resources, transmitting the set of privacy resources to the visitor.
In summary, compared with the prior art, the scheme of this embodiment uses the AI system in place of manual work and avoids manual participation in the encryption and protection of privacy resources, thereby effectively improving the processing efficiency of privacy resource protection.
It should be noted that S101 mentioned in the foregoing embodiment is an optional implementation of the cross-modal privacy protection oriented AI governance method described in this application, and so is S107 mentioned in the foregoing embodiment. The above embodiments can therefore be summarized as the method shown in fig. 2.
Fig. 2 is a schematic diagram of an AI governance method for cross-modal privacy protection according to an embodiment of the present application; the method is applied to an AI system and comprises the following steps:
S201: extracting privacy resources from the interaction behavior information of users in the virtual community.
S202: modeling the privacy resources to obtain a DIKW map.
S203: parsing the access request of the visitor to obtain the identity and intention of the visitor and the privacy resource group required by the visitor.
S204: querying the informed privacy resource group of the visitor from the DIKW map according to the identity and the intention of the visitor.
S205: in the case that the informed privacy resource group covers the required privacy resource group and the required privacy resource group contains a target resource, starting the preset anonymity protection mechanism to encrypt the target resource.
The target resource is a privacy resource containing preset sensitive content.
S206: starting the preset risk assessment mechanism and calculating the risk value of each privacy resource in the privacy resource group.
S207: deleting from the privacy resource group the privacy resources whose risk value is larger than the preset risk threshold.
S208: starting the preset supervision mechanism and detecting whether the privacy resources in the privacy resource group have logic errors.
S209: in the absence of logic errors in the privacy resources of the privacy resource group, transmitting the privacy resource group to the visitor.
In summary, compared with the prior art, the scheme of this embodiment uses the AI system in place of manual work and avoids manual participation in the encryption and protection of privacy resources, thereby effectively improving the processing efficiency of privacy resource protection.
Corresponding to the above-mentioned AI governance method for cross-modal privacy protection, an embodiment of the present application further provides an AI governance device for cross-modal privacy protection.
As shown in fig. 3, an architecture diagram of an AI governance device for cross-modal privacy protection according to an embodiment of the present application includes:
the setting unit 100 is configured to preset privacy rights of each participant in a circulation process of the privacy resources. The circulation process comprises a sensing process, a storage process, a transmission process and a processing process; the perception process is as follows: the AI system extracts the privacy resources from the interactive behavior information of the user; the storage process is as follows: the AI system converts the extracted privacy resources, models the converted privacy resources to obtain a DIKW map, and stores the DIKW map; the transmission process is as follows: the AI system transmits the privacy resource group to the visitor; the treatment process comprises the following steps: a process in which the visitor uses a set of privacy resources; the participants comprise users, visitors and AI systems; privacy rights include an informed right, a participation right, a forgetting right, and a supervision right.
The extracting unit 200 is configured to extract the privacy resource from the interaction behavior information of the user in the virtual community.
And the modeling unit 300 is used for modeling the privacy resources to obtain a DIKW map.
Wherein, the modeling unit 300 is specifically configured to: based on the interactive behavior information, calculating to obtain the retention degree of the user on the private resources; filtering the privacy resources with the value of the retention degree larger than a preset retention threshold value; converting the rest privacy resources to obtain new privacy resources; and modeling the new privacy resources to obtain a DIKW map, and storing the DIKW map.
The modeling unit 300 is configured to perform a process of converting the remaining privacy resources to obtain new privacy resources, and includes: deriving a new privacy resource set from the remaining privacy resources, wherein the new privacy resource set comprises a plurality of new privacy resources; deriving a new privacy resource set from a privacy resource set consisting of the remaining privacy resources; regarding the rest privacy resources as entities, and acquiring the technology and resources of the entities; calculating the conversion difficulty of the entity according to the technology and the resource; and under the condition that the value of the conversion difficulty is not less than a preset conversion capacity threshold value, converting the residual privacy resources into new privacy resources by using the technology and the resources.
The parsing unit 400 is configured to parse the access request of the visitor to obtain the identity and intention of the visitor and the set of privacy resources required by the visitor.
And the query unit 500 is configured to query the informed privacy resource group of the visitor from the DIKW map according to the identity and intention of the visitor.
The query unit 500 is specifically configured to: judge, upon receiving an access request sent by a visitor, whether the visitor has the access right; parse the access request to obtain the identity and intention of the visitor and the privacy resource group required by the visitor in the case that the visitor has the access right; and prohibit the visitor from acquiring the privacy resources in the DIKW map in the case that the visitor does not have the access right.
The protection unit 600 is configured to, when the informed privacy resource group covers the privacy resource group and the privacy resource group includes the target resource, start a preset anonymous protection mechanism to encrypt the target resource; the target resource is a privacy resource containing preset sensitive content.
The anonymous protection mechanism comprises data anonymous protection, information anonymous protection, knowledge anonymous protection and group anonymous protection; the privacy resources include data resources, information resources, knowledge resources, and group privacy resources.
The protection unit 600 is specifically configured to: identifying data resources containing preset sensitive contents in the privacy resource group as target resources, and carrying out data anonymity protection on the target resources; according to the intention of a visitor and the content sensitivity of each information resource in the privacy resource group, determining the information resources needing anonymous processing in the privacy resource group, and performing information anonymity protection on the information resources needing anonymous processing; determining knowledge resources needing anonymous processing in the privacy resource group according to the intention of the visitor and the accuracy of each knowledge resource in the privacy resource group, and performing knowledge anonymity protection on the knowledge resources needing anonymous processing; and group anonymity protection is carried out on group privacy resources contained in the privacy resource group.
The evaluation unit 700 is configured to enable a preset risk evaluation mechanism, and calculate a risk value of each privacy resource in the privacy resource group.
A deleting unit 800, configured to delete the privacy resource of which the risk value is greater than the preset risk threshold in the privacy resource group.
The detecting unit 900 is configured to enable a preset supervision mechanism and detect whether the privacy resources in the privacy resource group have logic errors.
The preset supervision mechanism comprises logic supervision, value supervision and authority supervision.
The detecting unit 900 is specifically configured to: carry out logic supervision on the decision rule and the decision result of the AI system, the decision result being used for indicating the circulation process of the privacy resource group; determine that the value of the logic supervision is true in the case that the decision result conforms to the decision rule; calculate the privacy value of the privacy resources in the privacy resource group and judge whether the privacy value meets the preset privacy value standard; determine that the value of the value supervision is true in the case that the privacy value meets the preset privacy value standard; acquire the AI system's supervision right over the informed right, over the participation right, and over the forgetting right; determine that the value of the authority supervision is true in the case that the values of the AI system's supervision rights over the informed right, the participation right and the forgetting right are all true; and determine that the privacy resources in the privacy resource group have no logic errors in the case that the values of the logic supervision, the value supervision and the authority supervision are all true.
A transmitting unit 1000, configured to transmit the privacy resource group to the visitor if the privacy resource in the privacy resource group does not have a logical error.
In summary, compared with the prior art, the scheme of this embodiment uses the AI system in place of manual work and avoids manual participation in the encryption and protection of privacy resources, thereby effectively improving the processing efficiency of privacy resource protection.
The application also provides a computer-readable storage medium, which comprises a stored program, wherein the program executes the cross-modal privacy protection oriented AI governance method provided by the application.
The application further provides an AI governance device for cross-modal privacy protection, comprising: a processor, a memory, and a bus. The processor is connected with the memory through the bus; the memory is used for storing a program; and the processor is used for running the program, wherein the program, when run, executes the cross-modal privacy protection oriented AI governance method provided by the application.
The functions described in the method of the embodiment of the present application, if implemented in the form of software functional units and sold or used as independent products, may be stored in a storage medium readable by a computing device. Based on such understanding, part of the contribution to the prior art of the embodiments of the present application or part of the technical solution may be embodied in the form of a software product stored in a storage medium and including several instructions for causing a computing device (which may be a personal computer, a server, a mobile computing device or a network device) to execute all or part of the steps of the method described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An AI governance method for cross-modal privacy protection, characterized in that the method is applied to an AI system and comprises the following steps:
extracting privacy resources from the interactive behavior information of the user in the virtual community;
modeling the privacy resources to obtain a DIKW map;
analyzing an access request of a visitor to obtain the identity and intention of the visitor and a privacy resource group required by the visitor;
inquiring and obtaining the informed privacy resource group of the visitor from the DIKW map according to the identity and the intention of the visitor;
starting a preset anonymous protection mechanism to encrypt the target resource under the condition that the informed privacy resource group covers the privacy resource group and the privacy resource group contains the target resource; the target resource is a privacy resource containing preset sensitive content;
starting a preset risk evaluation mechanism, and calculating the risk value of each privacy resource in the privacy resource group;
deleting the privacy resources with the risk values larger than a preset risk threshold value from the privacy resource groups;
starting a preset supervision mechanism, and detecting whether a logic error exists in the privacy resources in the privacy resource group;
transmitting the set of privacy resources to the visitor in the absence of a logical error for a privacy resource in the set of privacy resources.
2. The method of claim 1, further comprising:
presetting the privacy rights of all participants in the circulation process of the privacy resources;
wherein the circulation process comprises a sensing process, a storage process, a transmission process and a processing process;
the perception process is as follows: the AI system extracts the privacy resources from the interactive behavior information of the user;
the storage process is as follows: the AI system converts the extracted privacy resources, models the converted privacy resources to obtain the DIKW map, and stores the DIKW map;
the transmission process comprises the following steps: a process in which the AI system transmits the set of privacy resources to the visitor;
the treatment process comprises the following steps: a process for the visitor to use the set of privacy resources;
the participants include the user, the visitor, and the AI system;
the privacy right comprises an informed right, a participation right, a forgetting right and a supervision right.
3. The method of claim 1, wherein modeling the privacy resources to obtain a DIKW map comprises:
calculating to obtain the retention degree of the user on the privacy resources based on the interaction behavior information;
filtering the privacy resources with the value of the retention degree larger than a preset retention threshold value;
converting the rest privacy resources to obtain new privacy resources;
and modeling the new privacy resources to obtain a DIKW map, and storing the DIKW map.
4. The method of claim 3, wherein the converting the remaining privacy resources to obtain new privacy resources comprises:
deriving a new privacy resource set from the remaining privacy resources, the new privacy resource set comprising a plurality of new privacy resources;
deriving the new set of privacy resources from a set of privacy resources consisting of the remaining privacy resources;
regarding the rest privacy resources as entities, and acquiring the technology and resources of the entities;
calculating a conversion difficulty of the entity according to the technology and the resource;
and under the condition that the value of the conversion difficulty is not less than a preset conversion capacity threshold value, converting the rest privacy resources into the new privacy resources by using the technology and the resources.
5. The method of claim 1, wherein the parsing the visitor's access request to obtain the visitor's identity and intent and the set of privacy resources required by the visitor comprises:
under the condition of receiving an access request sent by a visitor, judging whether the visitor has an access right;
under the condition that the visitor has the access right, analyzing the access request to obtain the identity and intention of the visitor and a privacy resource group required by the visitor;
in the case that the visitor does not have the access right, prohibiting the visitor from acquiring the private resources in the DIKW map.
6. The method of claim 1, wherein the anonymity protection mechanisms include data anonymity protection, information anonymity protection, knowledge anonymity protection, and group anonymity protection; the privacy resources comprise data resources, information resources, knowledge resources and group privacy resources;
the enabling of the preset anonymous protection mechanism encrypts the target resource, and comprises the following steps:
identifying data resources containing preset sensitive contents in the privacy resource group as the target resources, and performing data anonymity protection on the target resources;
according to the intention of the visitor and the content sensitivity of each information resource in the privacy resource group, determining the information resources which need to be anonymized in the privacy resource group, and carrying out information anonymization protection on the information resources which need to be anonymized;
according to the intention of the visitor and the accuracy of each knowledge resource in the privacy resource group, determining the knowledge resources needing anonymous processing in the privacy resource group, and performing knowledge anonymity protection on the knowledge resources needing anonymous processing;
and group anonymity protection is carried out on group privacy resources contained in the privacy resource group.
7. The method of claim 1, wherein the predetermined supervision mechanism comprises logic supervision, value supervision, and authority supervision;
the enabling a preset supervision mechanism, detecting whether a logic error exists in the privacy resource group, includes:
carrying out logic supervision on the decision rule and the decision result of the AI system; the decision result is used for indicating a circulation process of the privacy resource group;
determining that the value of the logic supervision is true under the condition that the decision result conforms to the decision rule;
calculating the privacy value of the privacy resources in the privacy resource group, and judging whether the privacy value meets a preset privacy value standard;
determining that the value of the value supervision is true when the privacy value meets the preset privacy value standard;
acquiring the supervision right of the AI system on the informed right, the supervision right of the AI system on the participation right and the supervision right of the AI system on the forgetting right;
determining that the value of the authority supervision is true under the conditions that the value of the supervision right of the AI system to the informed right is true, the value of the supervision right of the AI system to the participation right is true, and the value of the supervision right of the AI system to the forgetting right is true;
and under the conditions that the value of the logic supervision is true, the value of the value supervision is true, and the value of the authority supervision is true, determining that the privacy resources in the privacy resource group have no logic errors.
8. An AI governance device for cross-modal privacy protection, characterized by comprising:
the extraction unit is used for extracting privacy resources from the interactive behavior information of the user in the virtual community;
the modeling unit is used for modeling the privacy resources to obtain a DIKW map;
the analysis unit is used for analyzing the access request of the visitor to obtain the identity and intention of the visitor and the privacy resource group required by the visitor;
the query unit is used for inquiring and obtaining the informed privacy resource group of the visitor from the DIKW map according to the identity and the intention of the visitor;
the protection unit is used for starting a preset anonymous protection mechanism and encrypting the target resource under the condition that the informed privacy resource group covers the privacy resource group and the privacy resource group contains the target resource; the target resource is a privacy resource containing preset sensitive content;
the evaluation unit is used for starting a preset risk evaluation mechanism and calculating the risk value of each privacy resource in the privacy resource group;
the deleting unit is used for deleting the privacy resources with the risk values larger than a preset risk threshold value in the privacy resource group;
the detection unit is used for starting a preset supervision mechanism and detecting whether the privacy resources in the privacy resource group have logic errors or not;
a transmission unit, configured to transmit the privacy resource group to the visitor when there is no logic error in the privacy resource group.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program, wherein the program executes the cross-modal privacy protection-oriented AI governance method according to any one of claims 1 to 7.
10. An AI administration device for cross-modal privacy protection, comprising: a processor, a memory, and a bus; the processor and the memory are connected through the bus;
the memory is used for storing a program, and the processor is used for executing the program, wherein the program executes the cross-modal privacy protection oriented AI governance method according to any one of claims 1 to 7.