CN113742770B - AI governance method and device for cross-modal privacy protection - Google Patents

AI governance method and device for cross-modal privacy protection

Info

Publication number
CN113742770B
CN113742770B (application CN202110908765.5A)
Authority
CN
China
Prior art keywords
privacy
resource
resources
supervision
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110908765.5A
Other languages
Chinese (zh)
Other versions
CN113742770A (en)
Inventor
段玉聪 (Duan Yucong)
雷羽潇 (Lei Yuxiao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hainan University
Original Assignee
Hainan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hainan University
Priority to CN202110908765.5A
Publication of CN113742770A
Application granted
Publication of CN113742770B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60: Protecting data
    • G06F21/62: Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218: Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/901: Indexing; Data structures therefor; Storage structures
    • G06F16/9024: Graphs; Linked lists
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60: Protecting data
    • G06F21/62: Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218: Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G06F21/6254: Protecting personal data, e.g. for financial or medical purposes by anonymising data, e.g. decorrelating personal data from the owner's identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Bioethics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • Data Mining & Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The application discloses an AI governance method and device for cross-modal privacy protection. Privacy resources are extracted from the interaction behavior information of users and modeled to obtain a DIKW map. According to the identity and intention of a visitor, the DIKW map is queried to obtain the visitor's informed privacy resource group. An anonymity protection mechanism is enabled to encrypt the target resources, and a risk assessment mechanism is enabled to calculate the risk value of each privacy resource in the privacy resource group; privacy resources whose risk value is greater than a preset risk threshold are deleted from the group. A supervision mechanism then detects whether the privacy resources contain logic errors, and the privacy resource group is transmitted to the visitor only when no logic error exists. Compared with the prior art, the scheme uses the AI system in place of manual work, keeping humans out of the encryption protection of privacy resources and thereby effectively improving the processing efficiency of privacy-resource protection.

Description

AI governance method and device for cross-modal privacy protection
Technical Field
The application relates to the field of artificial intelligence, and in particular to an AI governance method and device for cross-modal privacy protection.
Background
With the popularization of virtual communities on the Internet, Internet users can communicate through them. However, user interaction in a virtual community produces privacy resources of various digital types, including the virtual traces (T_virtual) and user-generated content (UGC) generated by the interaction. To ensure that these privacy resources are not stolen or maliciously used, they must be protected.
At present, privacy resources are mostly protected through manual decision-making; for example, an administrator of the virtual community manually performs encryption protection on the privacy resources. This is inefficient and prone to careless mistakes.
Disclosure of Invention
The application provides an AI governance method and device for cross-modal privacy protection, aiming to improve the processing efficiency of privacy-resource protection.
In order to achieve the above object, the present application provides the following technical solutions:
An AI governance method for cross-modal privacy protection, applied to an AI system, comprises the following steps:
extracting privacy resources from the interaction behavior information of users in a virtual community;
modeling the privacy resources to obtain a DIKW map;
parsing an access request of a visitor to obtain the identity and intention of the visitor and the privacy resource group required by the visitor;
querying the DIKW map according to the identity and intention of the visitor to obtain the visitor's informed privacy resource group;
when the informed privacy resource group covers the required privacy resource group and the required group contains target resources, enabling a preset anonymity protection mechanism to encrypt the target resources, the target resources being privacy resources that contain preset sensitive content;
enabling a preset risk assessment mechanism and calculating the risk value of each privacy resource in the privacy resource group;
deleting from the privacy resource group the privacy resources whose risk value is greater than a preset risk threshold;
enabling a preset supervision mechanism and detecting whether the privacy resources in the privacy resource group contain logic errors;
and transmitting the privacy resource group to the visitor when the privacy resources in the privacy resource group contain no logic errors.
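The claimed steps read as a linear filter-and-protect pipeline. The following Python sketch restates them under assumed names; `govern`, the injected policy callables, and the threshold value are illustrative assumptions, not the patent's implementation.

```python
# Sketch of the claimed pipeline. All names, thresholds and data shapes
# are illustrative assumptions, not the patent's implementation.

RISK_THRESHOLD = 0.7  # preset risk threshold (assumed value)

def govern(informed_group, required_group, risk_of, is_sensitive,
           encrypt, has_logic_error):
    """Protect and filter the required privacy-resource group.

    informed_group / required_group: sets of resource identifiers.
    risk_of, is_sensitive, encrypt, has_logic_error: injected policies.
    Returns the transmissible group, or None when transmission is refused.
    """
    # Transmit only when the informed group covers the required group.
    if not required_group <= informed_group:
        return None
    # Anonymity protection: encrypt the target (sensitive) resources.
    group = {encrypt(r) if is_sensitive(r) else r for r in required_group}
    # Risk assessment: delete resources above the preset risk threshold.
    group = {r for r in group if risk_of(r) <= RISK_THRESHOLD}
    # Supervision: refuse transmission if any resource has a logic error.
    if any(has_logic_error(r) for r in group):
        return None
    return group
```

Each stage maps to one claimed step: coverage check, anonymity protection, risk assessment, supervision, transmission.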
Optionally, the method further comprises:
presetting privacy rights of all participants in the circulation process of the privacy resource;
Wherein the circulation process comprises a sensing process, a storage process, a transmission process and a processing process;
The sensing process is as follows: the AI system extracts the process of privacy resources from the interactive behavior information of the user;
The storage process is as follows: the AI system converts the extracted privacy resources, models the privacy resources obtained by conversion, obtains the DIKW map and stores the DIKW map;
the transmission process is as follows: a process in which the AI system transmits the set of private resources to the visitor;
the processing process comprises the following steps: a process in which the visitor uses the set of privacy resources;
the participants include the user, the visitor, and the AI system;
the privacy rights include the right to know, the right to participate, the right to forget, and the right to supervise.
Optionally, modeling the privacy resources to obtain a DIKW map includes:
calculating the user's reservation degree for each privacy resource based on the interaction behavior information;
filtering out the privacy resources whose reservation degree is greater than a preset reservation threshold;
converting the remaining privacy resources to obtain new privacy resources;
and modeling the new privacy resources to obtain the DIKW map, and storing the DIKW map.
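The optional modeling step above (reservation-degree filtering, conversion, loading) can be sketched as follows. The reservation heuristic, the threshold value, and the dict-based map representation are all assumptions for illustration.

```python
# Hypothetical sketch of the modeling step; the scoring heuristic and
# the graph representation are assumptions, not the patent's design.

RESERVE_THRESHOLD = 0.5  # preset reservation threshold (assumed value)

def reservation_degree(resource, interactions):
    """Assumed heuristic: fraction of interactions in which the user
    withheld this resource."""
    if not interactions:
        return 0.0
    withheld = sum(1 for act, res in interactions
                   if act == "withhold" and res == resource)
    return withheld / len(interactions)

def build_dikw_map(resources, interactions, convert):
    # Filter out resources the user wants to reserve.
    kept = [r for r in resources
            if reservation_degree(r, interactions) <= RESERVE_THRESHOLD]
    # Convert the rest and load them into a layered map.
    graph = {}
    for layer, value in (convert(r) for r in kept):
        graph.setdefault(layer, []).append(value)
    return graph
```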
Optionally, converting the remaining privacy resources to obtain new privacy resources includes:
deriving a new privacy resource set from the privacy resource set consisting of the remaining privacy resources, the new set comprising a plurality of new privacy resources;
regarding the remaining privacy resources as entities, and acquiring the technologies and resources of each entity;
calculating the conversion difficulty of the entity according to those technologies and resources;
and converting the remaining privacy resources into the new privacy resources by using the technologies and resources when the value of the conversion difficulty is not smaller than a preset conversion capability threshold.
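The conversion-capability gate above can be sketched as follows; the difficulty scoring and the threshold value are assumptions, kept deliberately trivial.

```python
# Hypothetical sketch of the conversion-difficulty check: an entity is
# converted only when the computed value clears a preset capability
# threshold. The scoring function is an assumption.

CONVERSION_THRESHOLD = 3  # preset conversion capability threshold (assumed)

def conversion_difficulty(technologies, resources):
    # Assumed scoring: capability grows with the technologies and
    # resources available to the entity.
    return len(technologies) + len(resources)

def try_convert(entity, technologies, resources, convert):
    """Return the converted entity, or None if conversion is not possible."""
    if conversion_difficulty(technologies, resources) >= CONVERSION_THRESHOLD:
        return convert(entity)
    return None
```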
Optionally, parsing the access request of the visitor to obtain the identity and intention of the visitor and the privacy resource group required by the visitor includes:
judging whether the visitor has the access right when an access request sent by the visitor is received;
parsing the access request to obtain the identity and intention of the visitor and the required privacy resource group when the visitor has the access right;
and prohibiting the visitor from acquiring any privacy resource in the DIKW map when the visitor does not have the access right.
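The access-right gate above can be sketched as follows. The request fields and the role whitelist are assumptions; the patent does not specify how access rights are stored.

```python
# Hypothetical sketch of access-request handling: verify the visitor's
# access right before parsing identity, intention and the required group.

PERMITTED_ROLES = {"doctor", "moderator"}  # assumed permission store

def handle_request(request):
    """request: dict with 'role', 'identity', 'intent', 'resources'.

    Returns (identity, intent, required_group), or None when the visitor
    is prohibited from acquiring anything from the DIKW map.
    """
    if request.get("role") not in PERMITTED_ROLES:
        return None
    return (request["identity"], request["intent"],
            set(request["resources"]))
```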
Optionally, the anonymity protection mechanism comprises data anonymity protection, information anonymity protection, knowledge anonymity protection and group anonymity protection; the privacy resources comprise data resources, information resources, knowledge resources and group privacy resources;
enabling the preset anonymity protection mechanism to encrypt the target resources includes:
identifying, as the target resources, the data resources in the privacy resource group that contain preset sensitive content, and performing data anonymity protection on the target resources;
determining, according to the intention of the visitor and the content sensitivity of each information resource in the privacy resource group, the information resources that need anonymization, and performing information anonymity protection on them;
determining, according to the intention of the visitor and the accuracy of each knowledge resource in the privacy resource group, the knowledge resources that need anonymization, and performing knowledge anonymity protection on them;
and performing group anonymity protection on the group privacy resources contained in the privacy resource group.
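The four anonymity branches can be sketched as a dispatch over the resource type. Each branch body is a placeholder, and the sensitivity/accuracy cutoffs are assumed values, not the patent's anonymization algorithms.

```python
# Hypothetical dispatch over the four claimed anonymity-protection
# branches (data, information, knowledge, group). Branch conditions
# and cutoffs are assumptions.

def anonymize(resource):
    """resource: dict with 'kind', 'value' and branch-specific fields."""
    kind = resource["kind"]
    out = dict(resource)
    if kind == "data" and resource.get("sensitive"):
        out["value"] = "<data-anonymized>"          # preset sensitive content
    elif kind == "information" and resource.get("sensitivity", 0.0) > 0.5:
        out["value"] = "<information-anonymized>"   # high content sensitivity
    elif kind == "knowledge" and resource.get("accuracy", 0.0) > 0.9:
        out["value"] = "<knowledge-anonymized>"     # highly accurate knowledge
    elif kind == "group":
        out["value"] = "<group-anonymized>"         # all group privacy resources
    return out
```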
Optionally, the preset supervision mechanism includes logic supervision, value supervision, and right supervision;
enabling the preset supervision mechanism and detecting whether the privacy resources in the privacy resource group contain logic errors includes:
performing logic supervision on the decision rule and the decision result of the AI system, the decision result indicating the circulation process of the privacy resource group;
determining that the value of the logic supervision is true when the decision result conforms to the decision rule;
calculating the privacy value of the privacy resources in the privacy resource group, and judging whether the privacy value meets a preset privacy-value standard;
determining that the value of the value supervision is true when the privacy value meets the preset privacy-value standard;
acquiring the AI system's supervision of the right to know, its supervision of the right to participate, and its supervision of the right to forget;
determining that the value of the right supervision is true when the values of the supervision of the right to know, of the right to participate, and of the right to forget are all true;
and determining that the privacy resources in the privacy resource group contain no logic errors when the values of the logic supervision, the value supervision, and the right supervision are all true.
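The supervision outcome is a conjunction of three Boolean supervision values, the third itself a conjunction over the three rights. A minimal sketch, with all predicates injected as assumptions:

```python
# Sketch of the claimed supervision check. The predicate internals are
# assumptions; only the Boolean structure comes from the claim text.

def supervise(decision_ok, privacy_value_ok, know_ok, participate_ok,
              forget_ok):
    logic_supervision = decision_ok        # decision result conforms to rule
    value_supervision = privacy_value_ok   # privacy-value standard is met
    right_supervision = know_ok and participate_ok and forget_ok
    # "No logic error" holds only when all three values are true.
    return logic_supervision and value_supervision and right_supervision
```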
An AI governance device for cross-modal privacy protection comprises:
an extraction unit, configured to extract privacy resources from the interaction behavior information of users in a virtual community;
a modeling unit, configured to model the privacy resources to obtain a DIKW map;
a parsing unit, configured to parse an access request of a visitor to obtain the identity and intention of the visitor and the privacy resource group required by the visitor;
a query unit, configured to query the DIKW map according to the identity and intention of the visitor to obtain the visitor's informed privacy resource group;
a protection unit, configured to enable a preset anonymity protection mechanism to encrypt the target resources when the informed privacy resource group covers the required privacy resource group and the required group contains the target resources, the target resources being privacy resources containing preset sensitive content;
an evaluation unit, configured to enable a preset risk assessment mechanism and calculate the risk value of each privacy resource in the privacy resource group;
a deletion unit, configured to delete from the privacy resource group the privacy resources whose risk value is greater than a preset risk threshold;
a detection unit, configured to enable a preset supervision mechanism and detect whether the privacy resources in the privacy resource group contain logic errors;
and a transmission unit, configured to transmit the privacy resource group to the visitor when the privacy resources in the privacy resource group contain no logic errors.
A computer-readable storage medium comprises a stored program, wherein the program executes the AI governance method for cross-modal privacy protection.
An AI governance device for cross-modal privacy protection, comprising: a processor, a memory, and a bus; the processor is connected with the memory through the bus;
the memory is used to store a program, and the processor is used to run the program, the AI governance method for cross-modal privacy protection being executed when the program runs.
According to the technical solution provided by the application, privacy resources are extracted from the interaction behavior information of users in a virtual community and modeled to obtain a DIKW map. The access request of a visitor is parsed to obtain the identity and intention of the visitor and the privacy resource group the visitor requires, and the DIKW map is queried according to that identity and intention to obtain the visitor's informed privacy resource group. When the informed privacy resource group covers the required privacy resource group and the required group contains target resources, a preset anonymity protection mechanism is enabled to encrypt the target resources, the target resources being privacy resources containing preset sensitive content. A preset risk assessment mechanism is enabled to calculate the risk value of each privacy resource in the group, and privacy resources whose risk value is greater than a preset risk threshold are deleted. A preset supervision mechanism then detects whether the privacy resources in the group contain logic errors, and the group is transmitted to the visitor only when no logic error exists. Compared with the prior art, the scheme uses the AI system in place of manual work, keeping humans out of the encryption protection of privacy resources and thereby effectively improving the processing efficiency of privacy-resource protection.
Drawings
To more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings required by the embodiments or by the description of the prior art are briefly introduced below. It is apparent that the following drawings show only some embodiments of the application, and that a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1a is a schematic diagram of an AI governance method for cross-modal privacy protection according to an embodiment of the present application;
FIG. 1b is a schematic diagram of the circulation process of a privacy resource according to an embodiment of the present application;
FIG. 1c is a schematic diagram of the conversion process of a privacy resource according to an embodiment of the present application;
FIG. 1d is a schematic diagram of a DIKW map according to an embodiment of the present application;
FIG. 2 is a schematic diagram of another AI governance method for cross-modal privacy protection according to an embodiment of the present application;
FIG. 3 is a schematic architecture diagram of an AI governance device for cross-modal privacy protection according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on the embodiments of the application without inventive effort fall within the scope of the application.
As shown in FIG. 1a, an AI governance method for cross-modal privacy protection according to an embodiment of the present application is applied to an AI system and includes the following steps:
s101: privacy rights of all participants in the circulation process of the privacy resources are preset.
The circulation process comprises the sensing process, the storage process, the transmission process, and the processing process, and the participants comprise the user (User), the visitor (Visitor), and the AI system. The user is the producer of the privacy resources, the visitor is the consumer of the privacy resources, and the AI system is the storage and transmission hub for the privacy resources. In the embodiment of the application, the privacy rights include the right to know (Know), the right to participate (Participate), the right to forget (Forget), and the right to supervise (Supervise).
The sensing process is the process in which the AI system extracts privacy resources from the interaction behavior information of the user, specifically from the interaction behavior information of users of the virtual community. The participants in the sensing process are the user and the AI system, and the privacy rights involved include:
the user's right to know (Know_U): the user's awareness of all of the user's privacy resources collected and sensed by the AI system;
the AI system's right to know (Know_AI): the privacy resources the AI system is permitted to sense and collect from the user, which comprise only the virtual traces and user-generated content the user has consented to, not privacy resources obtained by the AI system through other illegitimate channels;
the AI system's right to participate (Participate_AI): the manner in which the AI system extracts privacy resources during sensing, including the definition and limitation of the AI system's sensing time, sensing content, and sensing method;
the user's right to supervise (Supervise_(U→AI)): the user's supervision of whether the AI system's behavior during the sensing process conforms to the rules;
the AI system's right to supervise (Supervise_AI): the AI system's self-supervision of its own behavior during the sensing process.
The storage process is the process in which the AI system converts the extracted privacy resources, models the converted privacy resources, obtains the DIKW map, and stores it. The storage process comprises the conversion of privacy resources and the loading of privacy resources. The conversion process is the process in which the AI system converts the privacy resources acquired during sensing; the loading process is the process in which the AI system models the converted privacy resources, obtains the DIKW map (including individual DIKW maps and group DIKW maps), and stores it on a medium that can be accessed and restored. The participant in the storage process is the AI system, and the privacy rights involved include:
the AI system's right to participate (Participate_AI): the AI system's right to take part in processing the privacy resources, covering both the conversion and the loading of privacy resources;
the AI system's right to forget (Forget_AI): the AI system's right to clear from the DIKW map privacy resources that have lost their storage value;
the user's right to forget (Forget_U): the user's right to request that the AI system clear outdated or wrongly expressed privacy resources from the DIKW map;
the AI system's right to supervise (Supervise_AI): the AI system's self-supervision during the storage process, including supervision of its own rights to participate and to forget;
the user's right to supervise (Supervise_(U→AI)): the user's supervision of whether the AI system stores the privacy resources appropriately and systematically clears obsolete ones.
The transmission process is the process in which the AI system transmits the privacy resource group to the visitor, specifically the process in which, based on the visitor's access intention and on condition that the privacy value is satisfied, the AI system transmits the privacy resource group the visitor requires in the form of transmission privacy resources. The participants in the transmission process are the visitor and the AI system, and the privacy rights involved include:
the visitor's right to know (Know_V): the privacy-resource content the visitor may learn, calculated and analyzed by the AI system according to the visitor's identity and intention;
the user's right to know (Know_U): the user's awareness of the process by which the user's own privacy resources are transmitted to the visitor through the AI system;
the visitor's right to participate (Participate_V): during transmission, the right to process the transmitted privacy resources as explicitly granted by the AI system;
the AI system's right to supervise (Supervise_(AI→V)): the AI system's supervision of whether the visitor has the access right and whether the access intention is reasonable;
the visitor's right to supervise (Supervise_(V→AI)): the visitor's supervision of whether the transmitted privacy resources from the AI system are authentic and credible and whether they meet the visitor's access requirements;
the user's right to supervise (Supervise_U): the user's supervision of the transmission process, comprising a first supervision (Supervise_(U→AI)) and a second supervision (Supervise_(U→V));
the first supervision (Supervise_(U→AI)): the user's supervision of the transmission privacy resources transmitted by the AI system;
the second supervision (Supervise_(U→V)): the user's supervision of the visitor's identity and access intention.
The processing process is the process in which the visitor uses the privacy resource group, specifically the privacy resources obtained during transmission. The participant in the processing process is the visitor, and the privacy rights involved include:
the visitor's right to participate (Participate_V): the visitor's right to process the privacy resources, the content of which is consistent with the right to participate in the transmission process;
the user's right to supervise (Supervise_(U→V)): the user's supervision of how the visitor uses the user's privacy resources;
the visitor's right to supervise (Supervise_V): the visitor's self-supervision of how it handles and exploits the privacy resources.
The privacy rights of the participants in the sensing, storage, transmission, and processing processes described above are shown in FIG. 1b.
It should be noted that the privacy resources are generated by the user's various interactions in the virtual community, and the circulation process starts when a visitor sends an access request carrying its intention to the AI system (which acts as the administrator of the virtual community). Each participant holds different privacy rights in the different circulation links (the sensing, storage, transmission, and processing processes); these privacy rights are necessary measures for ensuring the security of the privacy resources. During circulation, the user's privacy rights are inherent and are not bound to privacy obligations, so even if the user does not exercise them in time, the user's privacy rights are still retained.
Specifically, the right to know refers to the participants' freedom and right to learn of and acquire the privacy resources; in the circulation process, the rights to know of different participants are distinguished.
The right to know includes process awareness and content awareness. Process awareness is awareness of the circulation process of the privacy resources; it is a binary Boolean attribute and an exclusive right of the user as the owner of the privacy resources. As shown in formula (1), the user's process awareness covers the whole circulation process, and its being true is a necessary condition for the legality of the AI system's decision behavior.
Know_U(course) = {Sensing_K, Storage_K, Transfer_K, Processing_K} (1)
In formula (1), Know_U(course) denotes the user's process awareness, Sensing_K the user's awareness of the sensing process, Storage_K the awareness of the storage process, Transfer_K the awareness of the transmission process, and Processing_K the awareness of the processing process.
Content awareness refers to the privacy resources of which a participating entity may be informed. As shown in formula (2), the entity's informed privacy resource group is calculated from the identity and intention of the participating entity and the process in which it participates.
P_DIK(G)^Know = Know(E, process) (2)
In formula (2), P_DIK(G)^Know denotes the informed privacy resource group, Know is a pre-constructed function, E denotes the participating entity, and process denotes the participation process (the process in which the participant takes part, specifically any one or more of the sensing, storage, transmission, and processing processes).
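Formulas (1) and (2) can be sketched in Python as follows; the lookup table standing in for the Know function is an assumption introduced purely for illustration.

```python
# Hypothetical sketch of formulas (1) and (2). The awareness table is
# an assumed stand-in for the pre-constructed Know function.

def know_user_course(sensing, storage, transfer, processing):
    # Formula (1): the user's process awareness over the whole cycle.
    return {"Sensing_K": sensing, "Storage_K": storage,
            "Transfer_K": transfer, "Processing_K": processing}

# Formula (2): Know(E, process) -> informed privacy-resource group.
AWARENESS_TABLE = {  # assumed policy table
    ("visitor", "transmission"): {"city"},
    ("user", "transmission"): {"city", "age", "address"},
}

def know(entity, process):
    return AWARENESS_TABLE.get((entity, process), set())
```

The user's process awareness being true in every link models the necessary condition for a legal AI decision.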
Specifically, the right to participate refers to the right of a participating entity to make decisions about and use the privacy resources according to preset rules. The right to participate is a broad conceptual category; its specific content can be characterized by a participation weight group, which includes but is not limited to the entity's participation form, participation time, and participation deadline. In other words, the participation weight group is a rule constraint on the behavior of the participating entity, as shown in formula (3). Associating the participating entity with the participation process, the participation weight group is obtained by calculation as shown in formula (4).
Participation = {Time, Form, Deadline, ...} (3)
Participation = Participate(E, process) (4)
In formulas (3) and (4), Participation denotes the participation weight group, Time denotes the participation time obtained by the entity, Form denotes the entity's participation form, Deadline denotes the entity's participation deadline, and Participate is a pre-constructed function.
In addition, the user's right to participate includes a one-vote veto: the user is entitled to exercise the veto over any participation process or any privacy resource, thereby preventing the circulation of that privacy resource. After the user exercises the veto, the user's Participation value becomes false, and the AI system's decision behavior is judged illegal and is stopped.
The AI system's right to participate includes the collection of privacy resources in the sensing process, the conversion of privacy resources and the establishment and updating of the DIKW map in the storage process, the decision on the transmitted privacy resource group in the transmission process, and so on.
The visitor's right to participate includes accessing the AI system and collecting privacy resources in the transmission process, and developing and utilizing the privacy resources in the processing process.
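The one-vote veto can be sketched as a latch on the circulation state: once the user vetoes, the Participation value of formula (4) collapses to false and no further decision is legal. The class and method names are illustrative assumptions.

```python
# Hypothetical sketch of the user's one-vote veto over circulation.

class Circulation:
    def __init__(self):
        self.user_participation = True  # Participate(U, process) value
        self.halted = False

    def user_veto(self):
        # The user's Participation value becomes false; the AI system's
        # decision behaviour is judged illegal and stopped.
        self.user_participation = False
        self.halted = True

    def may_decide(self):
        return self.user_participation and not self.halted
```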
Specifically, the right to forget refers to the participants' right to systematically clear outdated and valueless privacy resources from the preset DIKW map.
An outdated privacy resource is one that, over time, has been replaced by other privacy resources with different semantic expressions; the user holds the right to forget such resources. Taking the decision event of "prize evaluation" as an example, suppose that during the sensing process the timestamp of the second information resource is later than that of the first information resource; the decision logic Decision(Event) is as follows:
I DIK1, I DIK2 from I Graph(U a);
I DIK1 = "user U a did not pass the English level examination";
I DIK2 = "user U a passed the English level examination";
V fairness = F fairness(Event, I DIK1, U price) = False;
V fairness = F fairness(Event, I DIK2, U price) = True.
In the decision logic Decision(Event), I DIK1 represents the first information resource, I DIK2 represents the second information resource, I Graph(U a) represents the information map of privacy resources generated by user U a, V fairness represents the privacy fairness value, F fairness represents a pre-constructed function for calculating the privacy value, and U price represents the user utility value.
For the decision logic Decision(Event), the AI system has the obligation to periodically clear and forget the outdated privacy resources in the information map of user U a, so as to prevent the first information resource and the second information resource from existing on the information map of user U a at the same time, which would cause the AI system to violate the privacy value in event decision-making, i.e. V fairness would be false. In the embodiment of the application, the information resource is one mode (i.e. concrete expression form) of the privacy resource, and the information map is one of the map expression forms contained in the DIKW map.
In addition to the user, the AI system also holds forgetting rights over valueless privacy resources. Over time, the number of privacy resources stored on the DIKW map grows geometrically, so the AI system bears an ever-increasing storage cost; at the same time, an excess of privacy resources on the DIKW map also increases the search cost of automated decision-making. As time passes, the value of some privacy resources declines until they become valueless privacy resources. Forgetting valueless privacy resources is therefore an essential choice for the AI system to reduce operating cost and guarantee decision efficiency; as shown in formula (5), when the cost required to store a privacy resource exceeds a preset cost threshold, the privacy resource is regarded as a valueless privacy resource.
P DIK unvalue = {P DIK | P DIK(cost) > P DIK(cost) W} (5)
In formula (5), P DIK unvalue represents a valueless privacy resource, P DIK represents a privacy resource, P DIK(cost) represents the cost to be paid for storing the privacy resource, and P DIK(cost) W represents the preset cost threshold.
To prevent outdated and valueless privacy resources from influencing the decisions of the AI system, the AI system sets a forgetting period: each time a forgetting period elapses, the DIKW map must be systematically forgotten once. As shown in formula (6), the pre-constructed function Forget classifies the privacy resources on the DIKW map.
{PDIK(G) F,PDIK(G) R}=Forget(DIKWGraph) (6)
In formula (6), P DIK(G) F represents the forgotten privacy resource set and P DIK(G) R represents the retained privacy resource set.
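The systematic forgetting of formulas (5) and (6) can be sketched as follows. This is an illustrative reading only, not the patent's implementation: the cost threshold, the resource records, and the outdated flag are assumed values.

```python
# Sketch of formulas (5)-(6): partition the privacy resources on a DIKW map
# into a forgotten set and a retained set. All concrete values are assumed.
COST_THRESHOLD = 10.0  # P_DIK(cost)_W

def forget(dikw_graph):
    """Return (forgotten_set, retained_set) per formula (6).

    A resource is forgotten if it is outdated or if its storage cost
    exceeds the threshold (formula (5): valueless resources)."""
    forgotten, retained = [], []
    for res in dikw_graph:
        if res["outdated"] or res["cost"] > COST_THRESHOLD:
            forgotten.append(res["id"])
        else:
            retained.append(res["id"])
    return forgotten, retained

graph = [
    {"id": "P1", "outdated": True,  "cost": 2.0},
    {"id": "P2", "outdated": False, "cost": 12.5},  # valueless: cost > threshold
    {"id": "P3", "outdated": False, "cost": 1.0},
]
f, r = forget(graph)
```

Running the sketch on the assumed graph forgets P1 (outdated) and P2 (valueless) while retaining P3.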
Specifically, the supervision rights include the participants' supervision of the informed rights, of the participation rights, and of the forgetting rights. In the embodiment of the application, the initial values of these three supervision rights are all true; when a participant raises a reasonable objection, the corresponding supervision value is set to false, and the decision-making behavior of the AI system is then determined to be illegal.
The participants' supervision of the informed rights corresponds to the content of the informed rights, and includes supervision of the informed process and supervision of the informed content. As shown in formula (7), the participants' supervision of the informed rights is true only when both the supervision of the informed process and the supervision of the informed content are true.
Sknow=Sknow(course)&&Sknow(content) (7)
In formula (7), S know represents the participants' supervision of the informed rights, S know(course) represents the supervision of the informed process, and S know(content) represents the supervision of the informed content. Supervision of the informed process is part of the protection of the privacy value. Taking the decision event of "volunteer registration audit of Drug 1" as an example, the known decision rule Event 1(rule) includes a first rule and a second rule, as follows:
Event 1(rule 1): "participating volunteers can receive a subsidy of XX yuan";
Event 1(rule 2): "participants are required to sign a volunteer consent form".
Accordingly, decision logic Decision (Event 1) is as follows:
K DIK1 from K Graph(U a);
K DIK2 from K Graph(U b);
K DIK1 = K(U a, Event 1) = "user U a has certain medical knowledge and can understand the risk behind Event 1";
K DIK2 = K(U b, Event 1) = "user U b needs the money and is in no position to understand the risk of Event 1";
S know(course) = Supervise(Event 1(rule), K DIK1, K DIK2) = False;
U price(U a) < U price(U b);
V fairness = F fairness(Event 1, P DIK(G), U price) = False.
In the decision logic Decision(Event 1), Event 1(rule 1) represents the first rule, Event 1(rule 2) represents the second rule, K DIK1 represents the first knowledge resource, K Graph(U a) represents the knowledge map of privacy resources generated by user U a, K DIK2 represents the second knowledge resource, K Graph(U b) represents the knowledge map of privacy resources generated by user U b, V fairness represents the privacy value, F fairness represents the constructor of the privacy value, U price(U a) represents the utility value of user U a, U price(U b) represents the utility value of user U b, and P DIK(G) represents the privacy resource group (which may specifically include the first knowledge resource and the second knowledge resource). It should be noted that the knowledge resource is one mode (i.e. concrete expression form) of the privacy resource, and the knowledge map is one of the map expression forms contained in the DIKW map.
For the decision logic Decision(Event 1), in order to ensure fairness in the AI decision process, the AI system should place a risk-reminder notice on the volunteer registration page to ensure that all users are aware of the risk of the decision event.
In addition, as shown in formula (8), the transmission privacy resource group is compared with the informed privacy resource group calculated by formula (2). The supervision of the informed content is true when the transmission privacy resource group and the informed privacy resource group agree; when they are inconsistent, this indicates that a participant has been informed beyond its authority or that an informed right has not been satisfied, and the supervision of the informed content is false.
Sknow(content) =Supervise(PDIK(G) T,PDIK(G) Know) (8)
In formula (8), S know (content) represents a supervision of the informed content, P DIK(G) T represents a transmission privacy resource group, P DIK(G) Know represents an informed privacy resource group, and Supervise is a function constructed in advance.
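As an illustration only, formula (8) can be read as a set comparison between the transmitted and informed privacy resource groups; the resource identifiers below are hypothetical.

```python
# Minimal sketch of formula (8), assuming Supervise reduces to set equality:
# the supervision of the informed content is true only when the transmitted
# privacy resource group matches the informed privacy resource group.
def supervise_known_content(transmitted, known):
    """S_know(content) = Supervise(P_DIK(G)_T, P_DIK(G)_Know)."""
    return set(transmitted) == set(known)

known_group = {"name", "age"}
consistent = supervise_known_content({"age", "name"}, known_group)         # True
# Transmitting an extra, un-notified resource voids the supervision value.
overreach = supervise_known_content({"name", "age", "city"}, known_group)  # False
```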
The supervision of the participation rights is used by the supervising entity to check that privacy resources are collected, extracted and used according to the rules indicated by the participation tuple. In the embodiment of the application, the supervised content includes whether the number of times an entity participates in processing exceeds the limit, whether the entity's participation form lies within the permitted range, whether the entity's participation falls within the preset time period, and the like.
In addition, as shown in formula (9), the pre-constructed function Supervise compares the actual participation content of an entity in the decision process with its participation tuple; if the actual participation content does not exceed the range specified by the participation tuple, the participants' supervision of the participation rights is true.
S participate = Supervise(Participation Real, Participation) (9)
In formula (9), S participate represents the participants' supervision of the participation rights, Participation Real represents the actual participation content, and Participation represents the participation tuple.
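Formula (9) can be sketched as a range check of the actual participation content against the participation tuple; the field names and the date/limit values below are assumptions for illustration.

```python
# Hedged sketch of formula (9): compare the entity's actual participation
# against the participation tuple. The three checks mirror the supervised
# content listed above (times, form, deadline); field names are assumed.
from datetime import date

def supervise_participation(actual, allowed):
    """S_participate = Supervise(Participation_Real, Participation)."""
    return (actual["times"] <= allowed["times"]
            and actual["form"] in allowed["forms"]
            and actual["date"] <= allowed["deadline"])

allowed = {"times": 3, "forms": {"collect", "convert"}, "deadline": date(2031, 12, 31)}
ok = supervise_participation({"times": 2, "form": "collect", "date": date(2025, 1, 1)}, allowed)
bad = supervise_participation({"times": 5, "form": "collect", "date": date(2025, 1, 1)}, allowed)
```

In the first call all three checks pass, so the supervision value is true; in the second, the participation count exceeds the limit, so it is false.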
The participant's supervision of the forgetting rights, as shown in equation (10), includes supervision of the use of the forgetting rights, and supervision of the forgetting content.
Sforget=Sforget(course)&&Sforget(content) (10)
In formula (10), S forget represents the participants' supervision of the forgetting rights, S forget(course) represents the supervision of the use of the forgetting rights, and S forget(content) represents the supervision of the forgotten content.
The supervision of the use of the forgetting rights is used to check whether the AI system updates and optimizes the DIKW map on schedule according to the forgetting period. At the starting point of each forgetting period this supervision value is false; when the forgetting period ends, if the AI system has carried out the forgetting clearance, the value is set to true.
In addition, as shown in formula (11), the union of the outdated privacy resources and the valueless privacy resources is calculated in advance and compared with the forgotten privacy resource set obtained by formula (6); when the two are consistent, the supervision of the forgotten content is true.
Sforget(content) =Supervise(PDIK(G) F,PDIK(G) old+PDIK(G) unvalue) (11)
In formula (11), S forget(content) represents the supervision of the forgotten content, P DIK(G) F represents the forgotten privacy resource set, P DIK(G) old represents the outdated privacy resources, P DIK(G) unvalue represents the valueless privacy resources, and Supervise is a pre-constructed function.
S102: and acquiring interaction behavior information of the user in the virtual community, and extracting privacy resources from the interaction behavior information.
The content shown in S102 is the above-mentioned sensing process.
S103: and calculating to obtain the reservation degree of the user on the privacy resources based on the interaction behavior information.
The extraction of privacy resources is the process of collecting P DIK from homogeneous or heterogeneous sources. Privacy is subjective, and the collection standard for privacy resources is determined according to the user's degree of reservation toward each privacy resource. In addition, the content shown in S103 is also part of the above-mentioned sensing process.
The user's degree of reservation toward a privacy resource is a basic attribute of the privacy resource: the larger its value, the more the user wishes to withhold the resource. As shown in formula (12), the reservation degree depends on the resource's own attributes and on the record group of interaction behaviors between the user and the privacy resource: the resource's own attributes determine the base value of the reservation degree, while the behavior record group determines its dynamic value.
PDIK(DRes)=Reserve(PDIK,Inter(PDIK)) (12)
In formula (12), P DIK(D Res) represents the reservation degree, P DIK represents the privacy resource, Inter(P DIK) represents the record group of interaction behaviors between the user and the privacy resource, and Reserve is a pre-constructed function for calculating the reservation degree.
Specifically, the sources of privacy resources include virtual traces and user generated content: virtual traces are passive resources, while user generated content is an active resource. In the calculation of the reservation degree, the user's reservation degree toward privacy resources from virtual traces is greater than that toward privacy resources from user generated content.
Obviously, the user's behavior in the virtual community reflects, at the psychological level, the degree to which the user wishes to withhold a privacy resource. The record group of interaction behaviors between the user and the privacy resource includes privacy-protection positive correlated behaviors and privacy-protection negative correlated behaviors, for example:
Inter(P DIK1) pos = "revoking the virtual community's permission to read the mobile device's microphone" (13)
Inter(P DIK2) neg = "publishing one's own annual income" (14)
In formulas (13) and (14), Inter(P DIK1) pos represents a privacy-protection positive correlated behavior, and Inter(P DIK2) neg represents a privacy-protection negative correlated behavior. The more behaviors are recorded among the positive correlated behaviors, the more the user wishes to withhold the privacy resource, and the higher the value of the reservation degree.
S104: and filtering the privacy resources with the reserved degree of which the value is larger than a preset reserved threshold value.
A privacy resource whose reservation degree exceeds the preset reservation threshold is regarded as a secret-type resource, i.e. it falls within the range that the privacy resource collection standard specifies must not be extracted. After filtering out the privacy resources whose reservation degree exceeds the preset reservation threshold, only the privacy resources whose reservation degree does not exceed the threshold remain.
It should be noted that the content shown in S104 is also part of the above-mentioned sensing process.
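Steps S103 and S104 together can be sketched as follows, under the assumption of simple additive weights: a base value from the resource's source (virtual trace vs. user generated content) plus a dynamic value from the positive and negative behavior counts, followed by filtering against the reservation threshold. All weights, thresholds, and resource names are illustrative, not from the patent.

```python
# Illustrative sketch of formula (12) and step S104. A reservation degree is
# computed from the resource's own attributes (base value) plus its behavior
# record group (dynamic value); resources above the threshold are filtered
# out as secret-type. Weights and threshold values are assumptions.
RESERVE_THRESHOLD = 0.6

def reserve(resource):
    """P_DIK(D_Res) = Reserve(P_DIK, Inter(P_DIK))."""
    # Virtual traces (passive resources) carry a higher base reservation.
    base = 0.5 if resource["source"] == "virtual_trace" else 0.3
    # Each positive (protective) behavior raises the degree; each negative lowers it.
    dynamic = 0.1 * resource["pos_behaviors"] - 0.1 * resource["neg_behaviors"]
    return base + dynamic

def filter_collectable(resources):
    """Keep only resources whose reservation degree does not exceed the threshold."""
    return [r["id"] for r in resources if reserve(r) <= RESERVE_THRESHOLD]

resources = [
    {"id": "mic_permission", "source": "virtual_trace", "pos_behaviors": 2, "neg_behaviors": 0},
    {"id": "annual_income",  "source": "ugc",           "pos_behaviors": 0, "neg_behaviors": 1},
]
collectable = filter_collectable(resources)
```

With these assumed weights, the microphone-related trace exceeds the threshold and is treated as secret-type, while the self-published income remains collectable.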
S105: and converting the rest privacy resources to obtain new privacy resources.
The uncertainty caused by incomplete and inconsistent privacy resources in the virtual community undermines confidence in the resource analysis process and makes AI decision results unconvincing. This uncertainty can be compensated by resource conversion: as shown in fig. 1c, a privacy resource can be converted into a new privacy resource, and the specific conversion modes include basic conversion, combination conversion and technical conversion.
In the embodiment of the application, a privacy resource has two basic conversion attributes, P DIK(in) and P DIK(out): P DIK(out) indicates that the privacy resource can be converted to generate P DIK(out) new privacy resources, and P DIK(in) indicates that, when the privacy resource is regarded as a new privacy resource, P DIK(in) other privacy resources can be converted to generate it. In addition, the content shown in S105 is part of the above-mentioned storage process, specifically the conversion process of privacy resources mentioned in the storage process.
Specifically, the basic conversion means deriving a new privacy resource set from the remaining privacy resources; the new privacy resource set includes a plurality of new privacy resources, and the types and number of privacy resources it contains are not limited. Basic conversion includes same-type conversion and cross-type conversion among data resources, information resources and knowledge resources; data resources, information resources and knowledge resources are each a mode (i.e. concrete expression form, also called a type) of the privacy resource, which belongs to common general knowledge in the industry. The specific implementation logic of the basic conversion is as follows:
I DIK1 from I Graph(U a);
I DIK1 = "user U a was born on xx year xx month xx day";
I DIK2 new = Transformation(I DIK1) = "user U a is xx years old this year".
In the above implementation logic of the basic conversion, I DIK1 represents a privacy resource (of the specific type information resource), I Graph(U a) represents the information map composed of the privacy resources of user U a, I DIK2 new represents the new privacy resource, and Transformation represents a pre-constructed conversion function.
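A minimal sketch of the basic conversion above: the pre-constructed Transformation function is stood in for by a hypothetical derive_age helper that maps a birth date to an age. The concrete date replaces the elided "xx" values for illustration only.

```python
# Sketch of a basic same-type conversion: deriving a new information resource
# ("age") from an existing one ("date of birth"). Names and dates are assumed.
from datetime import date

def derive_age(birth: date, today: date) -> int:
    """I_DIK2_new = Transformation(I_DIK1), read as a date-to-age conversion."""
    years = today.year - birth.year
    # Subtract one year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (birth.month, birth.day):
        years -= 1
    return years

i_dik1 = date(2000, 6, 15)                          # "user U_a was born on ..."
i_dik2_new = derive_age(i_dik1, date(2025, 6, 14))  # "user U_a is 24 this year"
```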
Specifically, the combination conversion means deriving a new privacy resource set from a privacy resource set composed of the remaining privacy resources; the types and number of privacy resources contained in the privacy resource set and the new privacy resource set are not limited. The specific implementation logic of the combination conversion is as follows:
I DIK1, I DIK2 from I Graph(A);
I DIK1 = "sales of product A in the first quarter";
I DIK2 = "sales of product A in the second quarter";
I DIK3 new = Transformation(I DIK1, I DIK2) = "growth rate of sales of product A over the half year".
In the above implementation logic of the combination conversion, I DIK1 represents the first privacy resource, I DIK2 represents the second privacy resource, I Graph(A) represents the information map of product A, I DIK3 new represents the new privacy resource, and Transformation represents a pre-constructed conversion function.
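The combination conversion can be sketched the same way: two existing information resources (quarterly sales) combine into a new resource (a growth rate). The function name growth_rate is a hypothetical stand-in for the pre-constructed Transformation function, and the sales figures are assumed.

```python
# Sketch of a combination conversion: I_DIK3_new = Transformation(I_DIK1, I_DIK2).
def growth_rate(q1_sales: float, q2_sales: float) -> float:
    """Combine two quarterly figures into a quarter-over-quarter growth rate."""
    return (q2_sales - q1_sales) / q1_sales

i_dik1 = 1000.0   # "sales of product A in the first quarter"
i_dik2 = 1250.0   # "sales of product A in the second quarter"
i_dik3_new = growth_rate(i_dik1, i_dik2)  # 0.25, i.e. a 25% increase
```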
Specifically, the technical conversion means: the conversion difficulty is calculated for an entity according to its technology and resources, and the remaining privacy resources are converted into new privacy resources by using that technology and those resources when the value of the conversion difficulty is smaller than a preset conversion capability threshold. Technical conversion can be understood as requiring technical means and the assistance of other resource content to complete the generation of the new privacy resource.
Unlike basic conversion and combination conversion, technical conversion has a conversion difficulty whose value depends on the entity participating in the conversion. In the calculation of the conversion difficulty, as shown in formula (15), when the value of the conversion difficulty is smaller than the preset conversion capability threshold, the entity has the capability to complete the technical conversion.
T Difficulty(P DIK, P DIK new) = Difficulty(P DIK, P DIK new, E) (15)
In formula (15), T Difficulty represents the conversion difficulty, P DIK represents the privacy resource, P DIK new represents the new privacy resource, Difficulty represents a pre-constructed function, and E represents the entity, which comprises technology and resources. When the technology and resources cannot meet the requirements of the technical conversion and the entity alone participates in it, the value of the conversion difficulty is determined to be greater than the preset conversion capability threshold, so the conversion of the privacy resource into the new privacy resource cannot be realized. The specific implementation logic of the technical conversion is as follows:
I DIK1 = "Photo 1 was photographed by U a";
I DIK2 = "the content in Photo 1 includes Building 1";
I DIK3 new = Transformation(I DIK1, I DIK2) = "Photo 1 was photographed by U a at Place 1";
I DIK4 = "Building 1 is located in Place 1";
If I DIK4 ∈ E(Resource):
    T Difficulty((I DIK1, I DIK2), I DIK3 new) = Difficulty((I DIK1, I DIK2), I DIK3 new, E) < T Difficulty W
Else:
    T Difficulty((I DIK1, I DIK2), I DIK3 new) > T Difficulty W
In the above implementation logic of the technical conversion, I DIK1 represents the first privacy resource, I DIK2 represents the second privacy resource, I DIK3 new represents the new privacy resource, Transformation represents a pre-constructed conversion function, I DIK4 represents another resource, E(Resource) represents the entity's resources, T Difficulty represents the conversion difficulty, and T Difficulty W represents the preset conversion capability threshold.
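The technical conversion logic above can be sketched as a threshold test: the difficulty is low when the auxiliary resource I DIK4 is among the entity's resources, and above the capability threshold otherwise. The numeric difficulty values and the threshold are illustrative assumptions.

```python
# Hedged sketch of formula (15) and the branch logic above. The conversion
# difficulty depends on whether the entity's resources contain the auxiliary
# resource; the concrete numbers are assumed for demonstration.
DIFFICULTY_THRESHOLD = 0.5  # T_Difficulty_W

def difficulty(entity_resources, auxiliary_resource):
    """T_Difficulty = Difficulty(P_DIK, P_DIK_new, E)."""
    # With the auxiliary resource available, conversion is easy; without it,
    # the difficulty exceeds the entity's capability threshold.
    return 0.2 if auxiliary_resource in entity_resources else 0.9

def can_convert(entity_resources, auxiliary_resource):
    return difficulty(entity_resources, auxiliary_resource) < DIFFICULTY_THRESHOLD

i_dik4 = "Building1 is located in Place1"
with_aux = can_convert({i_dik4, "other"}, i_dik4)   # True: difficulty below threshold
without_aux = can_convert({"other"}, i_dik4)        # False: threshold exceeded
```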
In particular, the execution logic of the above-mentioned conversion process of privacy resources can be seen in fig. 1c.
S106: modeling a new privacy resource to obtain DIKW maps and storing DIKW maps.
The essence of the content shown in S106 is the above-mentioned storage process, specifically the loading process of privacy resources mentioned in the storage process, where loading refers to inserting the new privacy resources into a storage medium. In the embodiment of the application, the new privacy resources are stored, in the form of a DIKW map, in a medium that supports access, addition, deletion and modification.
The privacy resources of users in the virtual community are classified and modeled using DIKW map technology. The DIKW map is a UML meta-model constructed around the data, information, knowledge and wisdom of the two concepts "Human" and "Existence"; according to the different expressions of privacy resources over the "entity-relationship" structure, it divides them into the three types of data, information and knowledge, forming user-centered data, information and knowledge maps. The three combine into the user's privacy-resource DIKW map, which can be used to optimize the efficiency of storage, transmission and processing of privacy resources.
The DIKW map includes individual DIKW maps and group DIKW maps.
Specifically, an individual DIKW map corresponds to one entity, and a group DIKW map corresponds to one intimate group. The group DIKW map is associated with the individual DIKW map of each entity contained in the intimate group. Meanwhile, the same entity can be divided into multiple intimate groups according to different attributes, so an individual DIKW map can be associated with multiple group DIKW maps.
DIKW stands for Data-Information-Knowledge-Wisdom; correspondingly, the DIKW map (DIKW Graph) includes a data map (D Graph), an information map (I Graph) and a knowledge map (K Graph).
The data map includes data resources (D DIK): discrete, directly observable elements that represent a single entity E or an attribute of an entity and have no practical meaning without context. As shown in the DIKW diagram of fig. 1d, the constructed data map is an undirected graph with data resources as nodes and relationships R (i.e. the information resources mentioned below) as edges; it is known that a data resource is related to some entity, but the specific relationship between them is unknown.
The information map includes information resources (I DIK), each of which expresses an objectively existing interaction relationship R between two entities and is represented in the form R(E 1, E 2).
The knowledge map includes knowledge resources (K DIK), which are summaries and derivations of the relationships between entities, represented in the form K(E 1, E 2). Knowledge resources are further deductions and summaries of the relationship edges R between nodes. A knowledge resource may suffer from incomplete content, uncertainty, or inconsistency with the content expressed by other privacy resources on the DIKW map; these problems can be solved by modeling and preference reasoning, which is not described further here.
It should be noted that, according to the semantic form of the key elements it expresses, a privacy resource can be classified into one of the three types of data resource, information resource and knowledge resource; the expression of the privacy resource is shown in formula (16). Data resources, information resources and knowledge resources express three different types of relationships between entities. In practical applications, relationships can define everything; specifically, by taking entities as nodes and relationships as edges, a DIKW map centered on the user can be constructed, whose expression is shown in formula (17).
PDIK={DDIK,IDIK,KDIK} (16)
DIKWGraph={DGraph,IGraph,KGraph} (17)
In formulas (16) and (17), P DIK represents a new privacy resource, DIKW Graph represents an individual DIKW profile, D Graph represents an individual data profile, I Graph represents an individual information profile, and K Graph represents an individual knowledge profile.
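Formulas (16) and (17) suggest a simple container structure. The following sketch uses Python dataclasses and tuple records as an assumed storage layout, which the patent does not prescribe; field names and sample records are illustrative.

```python
# Minimal sketch of formulas (16)-(17): a privacy resource takes one of three
# modes (data, information, knowledge), and an individual DIKW map bundles
# the three corresponding sub-maps. The tuple layout is an assumption.
from dataclasses import dataclass, field

@dataclass
class DIKWGraph:
    data_graph: list = field(default_factory=list)       # D_Graph: entity/attribute nodes
    info_graph: list = field(default_factory=list)       # I_Graph: R(E1, E2) relation edges
    knowledge_graph: list = field(default_factory=list)  # K_Graph: K(E1, E2) summaries

g = DIKWGraph()
g.data_graph.append(("U_a", "age", 25))                         # D_DIK: observed attribute
g.info_graph.append(("R", "U_a", "U_b", "friend"))              # I_DIK: interaction relation
g.knowledge_graph.append(("K", "U_a", "Photography", "likes"))  # K_DIK: derived summary
```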
Compared with traditional data mining methods, the DIKW map is finer-grained in its collection and utilization of resource relationships. Traditional data mining focuses on data-type resources themselves, ignoring the role that information-type and knowledge-type resources play in analyzing user behavior and habits. The amount of resources contained in a relationship R is no less than that of the entity E itself. By using the DIKW map, all kinds of privacy resources in the virtual community can be deeply mined, avoiding the omission of related privacy resources.
In the DIKW map, data resources, information resources and knowledge resources express different types of relationships between resource entities. The relationship expressed by a data resource is whether a relationship exists between two entities, with a value range of True/False. The relationship expressed by an information resource is the nature of the relationship between two entities, with a value range of positive and negative.
The knowledge resource is a summary of relationships and has two calculated attributes: accuracy and precision. Accuracy indicates how accurately the knowledge resource expresses the relationship, and precision indicates how fine-grained the content expressed by the knowledge resource is.
As shown in formula (18), the accuracy can be calculated from the basic attributes of the knowledge resource, which include but are not limited to its source: the more authoritative the source of the knowledge resource, the higher the value of the accuracy.
KDIK(Val)=Validity(KDIK,PDIK associated) (18)
In formula (18), K DIK(Val) represents the accuracy, Validity is a pre-constructed function, K DIK represents the knowledge resource, and P DIK associated represents the related privacy resources (i.e. the sources of the knowledge resource).
As can be seen from formula (18), the accuracy of a knowledge resource is also affected by its related privacy resources: related privacy resources with similar semantics increase the value of the accuracy, while those with opposite semantics decrease it. The relationships between knowledge resources and their related privacy resources are stored on the information map.
In the embodiment of the application, the precision is used to indicate how fine-grained the content is when different knowledge resources express the same content. Specifically, assume that the first knowledge resource and the second knowledge resource both belong to the content shown under the photography topic; regarding the photography content, the precision of the second knowledge resource is greater than that of the first. The comparison logic is as follows:
K DIK1, K DIK2 from K Graph(U a);
{K DIK1, K DIK2} ∈ Topic(Photography);
K DIK1 = K(U a, Photography) = "U a likes photography";
K DIK2 = K(U a, Photography) = "U a likes natural-scenery photography";
K DIK2(Pre) > K DIK1(Pre).
In the above comparison logic, K DIK1 represents the first knowledge resource, K DIK2 represents the second knowledge resource, K Graph(U a) represents the individual knowledge map of user U a, Topic(Photography) represents the content shown under the photography topic, K DIK1(Pre) represents the precision of the first knowledge resource, and K DIK2(Pre) represents the precision of the second knowledge resource.
In the DIKW map, some privacy resources are in logical conflict with each other; a set of multiple privacy resources with such a logical conflict is regarded as a mutually exclusive privacy resource group. Specifically, assume that the first information resource originates from the personal data filled in when user U a registered, i.e. the first data resource, while the second information resource originates from the second data resource, the textual content (i.e. user generated content) that user U a published in the virtual community. The set formed by the first information resource and the second information resource is then regarded as a mutually exclusive privacy resource group, and the judgment logic is as follows:
I DIK1, I DIK2 from I Graph(U a);
D DIK1 = "virtual community registration material";
D DIK2 = UGC(text) = "I am a boy";
I DIK1 = Derive(D DIK1) = R(U a, Gender) = "U a is female";
I DIK2 = Derive(D DIK2) = R(U a, Gender) = "U a is male";
{I DIK1, I DIK2} ∈ P DIK(G) in.
In the above-described decision logic, I DIK1 represents the first information resource, I DIK2 represents the second information resource, I Graph(Ua) represents the information profile of the user U a, D DIK1 represents the first data resource, D DIK2 represents the second data resource, UGC (text) represents the user-generated content, derive represents the pre-constructed function, and P DIK(G) in represents the set of mutually exclusive privacy resources.
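The mutual-exclusion judgment above can be sketched as deriving two R(U a, Gender) information resources from different sources and comparing their values; the Derive stand-in and the data values are illustrative assumptions.

```python
# Sketch of the mutual-exclusion judgment: two information resources derived
# about the same attribute ("Gender") with conflicting values form a mutually
# exclusive privacy resource group P_DIK(G)_in. All values are assumed.
def derive_gender(data_resource):
    """Stand-in for Derive: map a data resource to an R(U_a, Gender) triple."""
    return ("U_a", "Gender", data_resource["gender"])

def mutually_exclusive(i1, i2):
    """Same entity and attribute but different values -> logical conflict."""
    return i1[:2] == i2[:2] and i1[2] != i2[2]

d_dik1 = {"source": "registration", "gender": "female"}  # registration material
d_dik2 = {"source": "ugc", "gender": "male"}             # "I am a boy"
i_dik1, i_dik2 = derive_gender(d_dik1), derive_gender(d_dik2)
conflict = mutually_exclusive(i_dik1, i_dik2)  # the pair belongs to P_DIK(G)_in
```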
The personal privacy resources contained in DIKW's map may affect a certain group through the interaction between entities. Taking a human entity as an example, if a privacy resource of the entity is started from a certain privacy resource of the entity, privacy resources of related groups such as family, friends, neighbors and the like of the entity can be mined, and the privacy resource is determined to belong to group privacy.
The group privacy exists among a plurality of entities and can be divided into two types according to the property of the privacy resource, namely group relation privacy and group content privacy. The number of the entities involved in the group privacy is greater than or equal to two, and the entities belonging to the same group privacy pair together form a close relationship group sharing the group privacy. The affinity group is not limited to a plurality of entity sets having an association, and may be a group classified according to a specific tag such as gender, age, occupation, or the like.
The group relationship privacy is used for expressing the relationship between two or more entities, and each entity indicated in the group relationship privacy is stored on the information map. The privacy resources related to the privacy of the group content affect the entities indicated in the affinity group, and in general, the privacy resources related to the privacy of the group content are only stored on the personal DIKW map of one entity in the affinity group.
Because of individual differences, different entities in the affinity group may preserve the group content privacy to different degrees. Since the entities are connected with each other, the privacy protection of the whole group is affected by the behavior of any single entity, so group privacy protection is more difficult than individual privacy protection.
In the embodiment of the application, the personal DIKW maps of the entities in the affinity group can be taken as the basis for constructing the group DIKW map of the whole affinity group, where the group DIKW map comprises a group data map, a group information map and a group knowledge map, as shown in formula (19).
DIKWGraph_G = {DataGraph_G, InformationGraph_G, KnowledgeGraph_G} (19)
In formula (19), DIKWGraph_G represents the group DIKW map, DataGraph_G the group data map, InformationGraph_G the group information map, and KnowledgeGraph_G the group knowledge map.
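As an illustrative sketch only (not part of the claimed method), the group DIKW map of formula (19) can be modeled as a merge of the members' personal maps; the `DIKWGraph` class and its dict-of-resources layout are assumptions made for this example:

```python
from dataclasses import dataclass, field

@dataclass
class DIKWGraph:
    """A DIKW map as in formula (19): a data map, an information map and a
    knowledge map. Keying each sub-map by a resource id is an assumption."""
    data_graph: dict = field(default_factory=dict)
    information_graph: dict = field(default_factory=dict)
    knowledge_graph: dict = field(default_factory=dict)

def build_group_graph(member_graphs):
    """Merge the personal DIKW maps of the entities in a close-relationship
    group into one group DIKW map (DIKWGraph_G)."""
    group = DIKWGraph()
    for personal in member_graphs:
        group.data_graph.update(personal.data_graph)
        group.information_graph.update(personal.information_graph)
        group.knowledge_graph.update(personal.knowledge_graph)
    return group
```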
S107: and judging whether the visitor has the access right or not under the condition that the access request sent by the visitor is received.
If the visitor has access rights, S108 is performed, otherwise S109 is performed.
The content shown in S107 is actually the transmission process mentioned above, in which the AI system exercises its own supervision right.
S108: and analyzing the access request to obtain the identity and the intention of the visitor and the privacy resource group required by the visitor.
After S108 is performed, S110 is continued.
S109: the visitor is prohibited from obtaining DIKW private resources in the map.
S110: and inquiring from the DIKW map according to the identity and intention of the visitor to obtain the informed privacy resource group of the visitor.
The content shown in S110 is the informing of content (the right to know); specifically, as shown in formula (2), the entity mentioned in formula (2) is the visitor, and the participation process is the transmission process mentioned above. A so-called privacy resource group is in fact a collection of privacy resources.
S111: the set of privacy resources required by the visitor is compared with the set of privacy resources available to the visitor.
The content shown in S111 is actually the above-mentioned supervision of the right to know; specifically, as shown in formula (8), the transmitted privacy resource group mentioned in formula (8) represents the privacy resource group required by the visitor.
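The comparison in S111 can be read as a subset test. The sketch below assumes privacy resources are hashable identifiers, which the specification does not mandate:

```python
def is_covered(required, informed):
    """Return True when every privacy resource the visitor requires lies
    inside the visitor's informed privacy resource group (one reading of
    the comparison in formula (8))."""
    return set(required) <= set(informed)
```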
S112: and judging whether the privacy resource group required by the visitor contains the target resource or not under the condition that the privacy resource group required by the visitor is covered by the visitor's informed privacy resource group.
If the privacy resource group required by the visitor contains the target resource, S113 is executed, otherwise S114 is executed.
The target resource is a privacy resource containing preset sensitive content, where the preset sensitive content includes, but is not limited to, content whose disclosure is relatively likely to cause adverse effects.
When the privacy resource group required by the visitor is not covered by the visitor's informed privacy resource group, the visitor's access behavior is determined to exceed its authority, and the transmission of privacy resources outside the informed privacy resource group is prohibited.
S113: and starting a preset anonymous protection mechanism, and encrypting the target resource.
After S113 is performed, S114 is continued.
Anonymity is a form of unrecognizability: a hiding of resource attributes rather than merely a change of name. The anonymity protection mechanism hides, for the purpose of privacy protection and according to the nature of each privacy resource, part of the content of the privacy resources (i.e., the target resources) required by the visitor. In the embodiment of the application, the anonymity protection mechanism comprises four types: data anonymity protection, information anonymity protection, knowledge anonymity protection and group anonymity protection.
In the embodiment of the application, a preset anonymous protection mechanism is started, and the process of encrypting the target resource comprises the following steps:
1. Identify the data resources containing preset sensitive content in the privacy resource group as target resources, and apply data anonymity protection to them.
2. Determine, according to the visitor's intention and the content sensitivity of each information resource in the privacy resource group, the information resources that need anonymization, and apply information anonymity protection to them.
3. Determine, according to the visitor's intention and the accuracy of each knowledge resource in the privacy resource group, the knowledge resources that need anonymization, and apply knowledge anonymity protection to them.
4. Apply group anonymity protection to the group privacy resources contained in the privacy resource group.
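The four protections above can be sketched as a single dispatch over the resource type. The dict keys ('kind', 'content', 'sensitive', 'sensitivity', 'accuracy') and the simple character-shift stand-in for substitution-parameter encryption are assumptions for illustration only:

```python
def anonymize(resource_group, sensitivity_threshold=0.5):
    """Apply the four anonymity protections of S113 to a privacy resource
    group, returning new resource dicts without mutating the originals."""
    protected = []
    for res in resource_group:
        res = dict(res)  # work on a copy
        if res["kind"] == "data" and res.get("sensitive"):
            # data anonymity: placeholder for substitution-parameter encryption
            res["content"] = "".join(chr(ord(c) + 1) for c in res["content"])
        elif res["kind"] == "information" and res.get("sensitivity", 0) > sensitivity_threshold:
            # information anonymity: hide the entity or relationship
            res["content"] = "XX"
        elif res["kind"] == "knowledge":
            # knowledge anonymity: hide the accuracy value
            res["accuracy"] = "XX"
        elif res["kind"] == "group":
            # group anonymity: hide attributes shared by the affinity group
            res["content"] = "XX"
        protected.append(res)
    return protected
```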
Specifically, data anonymity protection means: for data resources (i.e., target resources) containing preset sensitive content, a substitution-parameter encryption method is adopted to reduce the risk of privacy disclosure. Before the transmission process starts, the AI system encrypts the data resources in the privacy resource group required by the visitor according to a certain rule; after the transmission is completed, the visitor decrypts them, thereby reducing the risk of privacy disclosure during transmission.
Specifically, information anonymity protection means: achieving privacy protection by hiding an entity or a relationship in an information resource (a specific form of privacy resource). Which content of the information resource is hidden depends on the visitor's intention and on the content sensitivity of the information resource.
Specifically, knowledge anonymity protection refers to the following: summarizing an entity and predicting its future behavior may involve multiple knowledge resources (a specific representation of privacy resources) with differing accuracy under the same topic. For example, the first knowledge resource and the second knowledge resource both belong to the topic of medicine demand in the knowledge graph, and the logic of knowledge anonymity protection is as follows:
K_DIK1, K_DIK2 from K_Graph(U_a);
{K_DIK1, K_DIK2} ∈ Topic(Medicine Demand);
K_DIK1 = K(U_a, Medicine) = "user U_a needs to purchase drug M_1";
K_DIK1(Val) = 75%;
K_DIK2 = K(U_a, Medicine) = "user U_a needs to purchase drug M_2";
K_DIK2(Val) = 25%.
In the above knowledge anonymity protection logic, K_DIK1 represents the first knowledge resource, K_DIK2 the second knowledge resource, K_Graph(U_a) the personal knowledge map of user U_a, Topic a topic, K_DIK1(Val) the accuracy of the first knowledge resource, and K_DIK2(Val) the accuracy of the second knowledge resource. Both M_1 and M_2 are conventional medicines for treating a particular disease; a patient's suitability for and demand for each medicine depend on the patient's physical condition and medical history, so the accuracy values of the first and second knowledge resources differ. As the accuracy values show, user U_a's purchase demand for drug M_1 is greater than that for drug M_2. Anonymizing the first knowledge resource yields a new first knowledge resource, and anonymizing the second yields a new second knowledge resource, as follows:
K_DIK1^A = K(U_a, Medicine), K_DIK1^A(Val) = XX;
K_DIK2^A = K(U_a, Medicine), K_DIK2^A(Val) = XX.
K_DIK1^A represents the new first knowledge resource and K_DIK2^A represents the new second knowledge resource. When a visitor provides a recommendation service for user U_a after this anonymization, it provides the same recommendation for drug M_1 and drug M_2 according to K_DIK1^A and K_DIK2^A, whose accuracy values are hidden; user U_a can then determine which drug to purchase according to the user's own actual situation and needs.
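The accuracy-hiding step of the medicine-demand example can be sketched as follows; representing each knowledge resource K_DIK as a dict with a 'Val' key is an assumption:

```python
def hide_accuracy(knowledge_resources, mask="XX"):
    """Knowledge anonymity: hide the accuracy value Val so that resources
    under the same topic (e.g. K_DIK1 and K_DIK2) are treated equally by
    the visitor's recommendation service."""
    return [{**k, "Val": mask} for k in knowledge_resources]
```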
It should be emphasized that the anonymity protection mechanism can act not only on individual privacy resources but also on group privacy resources. For example, in a review-class decision, the privacy resource group needs to include the identity resources of the relevant participants so that the subsequent review results can be notified or disclosed; for the sake of fairness, those identity resources must be excluded from the privacy resources referred to by the review-class decision to prevent fraud. At this point the anonymity protection mechanism is started, and its logic is as follows:
Event_1(rule_1) = "top n by score ranking";
Event_1(rule_2) = "different majors are ranked separately";
Event_1(rule_3) = "rating according to the principle of fairness";
Decision(Event_1):
If P_DIK(G)_T = {Name, Age, Gender, Address, Major, Grade, ...}:
V_fairness = F_fairness(Event_1, P_DIK(G)_T, U_price) = False;
P_DIK(G)_T^A = Anonymity(P_DIK(G)_T) = {XX, XX, XX, XX, Major, Grade, ...};
If P_DIK(G)_T == P_DIK(G)_T^A:
V_fairness = F_fairness(Event_1, P_DIK(G)_T, U_price) = True.
In the above logic, Event_1 represents a review-class decision event, V_fairness represents the privacy value, P_DIK(G)_T^A represents the new privacy resource group, U_price represents the user's utility value, F_fairness is a pre-constructed function, and P_DIK(G)_T represents the privacy resource group required by the visitor, including Name, Age, Gender, Address, Major, Grade and so on; privacy resources irrelevant to the decision may hinder its fairness. The new privacy resource group obtained by anonymization excludes the irrelevant privacy resources to the maximum extent, minimizing their influence on the fairness of the review, regardless of whether the review-class decision is completed by the AI system or manually by staff.
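A minimal sketch of Anonymity(P_DIK(G)_T) for the review example, assuming the resource group is a dict of named attributes and that Major and Grade are the only decision-relevant ones:

```python
def anonymize_for_review(resource_group, relevant=("Major", "Grade")):
    """Group anonymity for a review-class decision: attributes not needed
    by the decision rules are masked with "XX", keeping only the relevant
    ones (the choice of Major and Grade follows the example in the text)."""
    return {k: (v if k in relevant else "XX") for k, v in resource_group.items()}
```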
S114: and starting a risk assessment mechanism, and calculating the risk value of each privacy resource in the privacy resource group required by the visitor.
The role of the risk assessment mechanism is to detect whether the privacy resource group required by the visitor carries a risk of privacy resource leakage caused by the privacy resource conversion performed during the storage process. The risk value of a privacy resource is calculated based on the conversion difficulty of the privacy resources in the required privacy resource group and on the attributes of the privacy resources themselves. When the calculated risk value is greater than the preset risk threshold, a privacy leakage risk is determined to exist, and the AI system needs to delete or replace the privacy resources with excessive risk values in the required privacy resource group, thereby ensuring privacy safety.
In the embodiment of the application, the risk assessment mechanism mainly performs its assessment and calculation based on the privacy resource's own attributes, which include the conversion difficulty, the conversion in-degree, the conversion out-degree, and the accuracy of knowledge resources.
Specifically, the risk assessment concerning conversion difficulty evaluates whether the visitor has the ability to convert a privacy resource into another, new privacy resource that is unrelated to the access intention and would infringe the user's privacy. The conversion difficulty can be calculated according to formula (15) above. If its value is infinite, the visitor does not have the ability to convert the privacy resource into a new privacy resource, i.e., the probability of disclosure of the privacy resource is low. If the conversion difficulty is smaller than the preset conversion capability threshold, the privacy resource group required by the visitor is allowed to be transmitted. If the visitor does have the ability to convert the privacy resource into a new privacy resource, there is a probability that unrelated privacy resources are leaked, and the privacy resource should be replaced or deleted. The risk assessment logic of the conversion difficulty is as follows:
D_DIK1, I_DIK1 from DIKW_Graph(U_a);
D_DIK1 = "Photo_1 in the virtual community";
I_DIK1 = "Photo_1 was taken by U_a";
I_DIK2_new = "the shooting place of Photo_1";
If E == Visitor_pro:
T_Difficulty(I_DIK1, I_DIK2_new) = Difficulty(I_DIK1, I_DIK2_new, E) < T_Difficulty_W;
V_Risk < V_Risk_W;
If E == Visitor_common:
T_Difficulty(I_DIK1, I_DIK2_new) = Difficulty(I_DIK1, I_DIK2_new, E) > T_Difficulty_W;
V_Risk > V_Risk_W.
In the risk assessment logic of the conversion difficulty, D_DIK1 represents the first data resource, I_DIK1 the first information resource, DIKW_Graph(U_a) the DIKW map of user U_a, I_DIK2_new the new information resource converted from the first information resource, E the entity acting as the visitor, Visitor_pro a professional visitor (and Visitor_common a common visitor), T_Difficulty the conversion difficulty, Difficulty a pre-constructed function, T_Difficulty_W the preset conversion capability threshold, V_Risk the risk value of the privacy resource, and V_Risk_W the preset risk threshold.
It should be noted that the risk assessment of conversion difficulty places high demands on the computing conditions; in a general decision, complete knowledge of the visitor's abilities and exact calculation of the conversion difficulty cannot be achieved. Based on the conversion-difficulty risk assessment alone, the risk of privacy resource disclosure is therefore difficult to avoid completely.
Both the conversion in-degree and the conversion out-degree of a privacy resource reflect its connectivity in the DIKW map. When multiple privacy resources satisfy the visitor's intention at the same time, the privacy resource with the smallest combined value of the two attributes, i.e., the worst connectivity, can be selected for inclusion in the required privacy resource group, thereby reducing the number of conversions the privacy resource undergoes and its risk of leakage.
In addition, on the premise of satisfying the visitor's intention, the smaller the accuracy value of a resource, the less content it contains and the smaller the risk of privacy leakage.
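The connectivity-based selection can be sketched as picking the candidate with the smallest in-degree plus out-degree; the 'in_degree'/'out_degree' keys are assumed names:

```python
def pick_least_connected(candidates):
    """Among privacy resources that all satisfy the visitor's intention,
    pick the one with the smallest combined conversion in-degree and
    out-degree (i.e., the worst connectivity in the DIKW map), which
    minimizes further conversions and hence leakage risk."""
    return min(candidates, key=lambda r: r["in_degree"] + r["out_degree"])
```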
S115: and deleting the privacy resources with risk values larger than a preset risk threshold in the privacy resource group required by the visitor.
Other privacy resources meeting the conditions can be utilized to replace privacy resources with risk values larger than a preset risk threshold.
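S114 and S115 together can be sketched as the filter below, assuming each resource carries a precomputed risk value and that substitutes are looked up by resource id (both are assumptions of this example):

```python
def filter_by_risk(resource_group, risk_threshold, substitutes=None):
    """Keep resources whose risk value does not exceed the preset risk
    threshold; a too-risky resource is replaced by an eligible substitute
    when one exists, and otherwise deleted."""
    substitutes = substitutes or {}
    result = []
    for res in resource_group:
        if res["risk"] <= risk_threshold:
            result.append(res)
        elif res["id"] in substitutes:
            result.append(substitutes[res["id"]])
        # otherwise the resource is deleted
    return result
```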
S116: and (3) starting a supervision mechanism, and detecting whether logic errors exist in the privacy resources in the privacy resource group required by the visitor.
The supervision mechanism comprises logic supervision, value supervision and rights supervision.
In the embodiment of the application, a supervision mechanism is started, and the process for detecting whether the privacy resources in the privacy resource group required by the visitor have logic errors comprises the following steps:
1. Perform logic supervision on the decision rules and the decision result of the AI system, where the decision result indicates the circulation process of the privacy resource group.
2. When the decision result conforms to the decision rules, determine the value of the logic supervision to be true.
3. Calculate the privacy value of the privacy resources in the privacy resource group, and judge whether the privacy value meets the preset privacy value standard.
4. When the privacy value meets the preset privacy value standard, determine the value of the value supervision to be true.
5. Acquire the AI system's supervision of the right to know, of the participation right and of the forgetting right.
6. When the values of the AI system's supervision of the right to know, of the participation right and of the forgetting right are all true, determine the value of the rights supervision to be true.
7. When the values of the logic supervision, the value supervision and the rights supervision are all true, determine that the privacy resources in the privacy resource group have no logic error.
Specifically, as shown in the formula (20), in the case that the value of the logical supervision is true, the value of the value supervision is true, and the value of the rights supervision is true, it is determined that the privacy resource in the privacy resource group has no logical error.
Supervise = S_logic && S_value && S_right (20)
In formula (20), Supervise is a pre-constructed function, S_logic represents the value of the logic supervision, S_value the value of the value supervision, and S_right the value of the rights supervision.
Specifically, logic supervision refers to: the AI system oversees whether a logic error occurred in the decision process. As shown in formula (21), the value of the logic supervision is the comparison result of the decision rule and the decision result.
Slogic=Supervise(Event(rule),Event(result)) (21)
In formula (21), S logic represents a value of logic supervision, supervise is a function constructed in advance, event (rule) represents a decision rule, and Event (result) represents a decision result. In the embodiment of the application, if the value of the logic supervision is true, the logic error does not occur in the AI decision process, and if the value of the logic supervision is false, the logic error occurs in the AI decision process.
Taking the decision event of workload distribution as an example, logic supervision logic is as follows:
Event_1(rule_1): "the workload totals 100 parts";
Event_1(rule_2): "the number of workers is three";
Event_1(rule_3): "the workload is distributed according to the workers' processing efficiency";
Decision(Event_1):
Event_1(result) = {U_a: 20, U_b: 40, U_c: 50};
S_logic = Supervise(Event_1(rule), Event_1(result)) = False.
In the above logic supervision example, Event_1(rule_1), Event_1(rule_2) and Event_1(rule_3) are the decision rules, Event_1(result) is the decision result, and S_logic represents the value of the logic supervision. The allocated workloads total 110 rather than the 100 parts required by rule 1, so the decision result violates the decision rules and the value of the logic supervision is false.
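Under the stated rules, a minimal check of S_logic for this example might look as follows; rule 3 is omitted because the workers' processing efficiencies are not given in the text:

```python
def logic_supervision(total, workers, result):
    """S_logic for the workload example: rule 1 fixes the total workload
    and rule 2 fixes the number of workers; the result conforms to the
    rules only when both checks pass."""
    return sum(result.values()) == total and len(result) == workers
```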
Specifically, the value supervision means: the AI system monitors whether the decision behavior violates the privacy value, and determines whether the privacy value of the privacy resource meets a preset privacy value standard as shown in a formula (22). When the privacy value of the privacy resource meets the preset privacy value standard, the value of the value supervision is true, namely the representative value supervision result is qualified.
Svalue=Supervise(PDIK(value),PDIK(value)W) (22)
In formula (22), S value represents a value of value supervision, supervise is a function constructed in advance, P DIK (value) represents a privacy value of a privacy resource, and P DIK(value)W represents a preset privacy value standard.
Specifically, rights supervision refers to: the supervision of privacy rights by the AI system comprises the supervision of informed rights by the AI system, the supervision of participation rights by the AI system and the supervision of forgetting rights by the AI system. As shown in the formula (23), when the value of the supervision of the right of the AI system is true, the value of the supervision of the participation right of the AI system is true, and the value of the supervision of the forgetting right of the AI system is true, the value of the right supervision is true, namely, the right supervision result is qualified.
Sright=Sknow&&Sparticipate&&Sforget (23)
In the formula (23), S right represents the value of the right supervision, S know represents the value of the right supervision of the AI system, S participate represents the value of the right supervision of the participation of the AI system, and S forget represents the value of the right supervision of the forget of the AI system.
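Formulas (20) and (23) amount to conjunctions of boolean supervision results, which can be sketched directly:

```python
def rights_supervision(s_know, s_participate, s_forget):
    """Formula (23): rights supervision passes only when the supervision of
    the right to know, the participation right and the forgetting right
    all pass."""
    return s_know and s_participate and s_forget

def supervise(s_logic, s_value, s_right):
    """Formula (20): the overall supervision result is true only when the
    logic, value and rights supervision values are all true."""
    return s_logic and s_value and s_right
```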
S117: in the event that there is no logical error in the privacy resource group, the privacy resource group is transmitted to the visitor.
In summary, compared with the prior art, the scheme of this embodiment uses the AI system in place of manual work, avoiding manual participation in the encryption protection of privacy resources and effectively improving the processing efficiency of privacy resource protection.
It should be noted that S101 mentioned in the foregoing embodiment is an optional implementation of the cross-modal-privacy-protection-oriented AI governance method of the present application, and S107 is likewise an optional implementation. The description of the above embodiment can therefore be generalized to the method shown in fig. 2.
Fig. 2 is a schematic diagram of a cross-modal-privacy-protection-oriented AI governance method according to an embodiment of the present application; the method is applied to an AI system and comprises the following steps:
S201: and extracting privacy resources from the interaction behavior information of the user in the virtual community.
S202: modeling the privacy resource to obtain DIKW maps.
S203: and analyzing the access request of the visitor to obtain the identity and the intention of the visitor and the privacy resource group required by the visitor.
S204: and inquiring from the DIKW map according to the identity and intention of the visitor to obtain the informed privacy resource group of the visitor.
S205: and under the condition that the privacy resource group covers the privacy resource group and the privacy resource group comprises the target resource, starting a preset anonymity protection mechanism to encrypt the target resource.
The target resource is a privacy resource containing preset sensitive content.
S206: and starting a preset risk assessment mechanism, and calculating the risk value of each privacy resource in the privacy resource group.
S207: and deleting the privacy resources with risk values larger than a preset risk threshold in the privacy resource group.
S208: and starting a preset supervision mechanism, and detecting whether the privacy resources in the privacy resource group have logic errors or not.
S209: in the event that there is no logical error in the privacy resource group, the privacy resource group is transmitted to the visitor.
In summary, compared with the prior art, the scheme of this embodiment uses the AI system in place of manual work, avoiding manual participation in the encryption protection of privacy resources and effectively improving the processing efficiency of privacy resource protection.
Corresponding to the above cross-modal-privacy-protection-oriented AI governance method of the embodiment of the present application, the embodiment of the present application further provides a cross-modal-privacy-protection-oriented AI governance device.
As shown in fig. 3, an architecture diagram of an AI governance device for cross-modal privacy protection according to an embodiment of the present application includes:
A setting unit 100, configured to preset the privacy rights of each participant in the circulation process of the privacy resources. The circulation process comprises a sensing process, a storage process, a transmission process and a processing process. The sensing process is: the process in which the AI system extracts privacy resources from the user's interaction behavior information. The storage process is: the process in which the AI system converts the extracted privacy resources, models the converted privacy resources to obtain DIKW maps, and stores the DIKW maps. The transmission process is: the process in which the AI system transmits the privacy resource group to the visitor. The processing process is: the process in which the visitor uses the privacy resource group. The participants comprise the user, the visitor and the AI system; the privacy rights comprise the right to know, the participation right, the forgetting right and the supervision right.
And the extracting unit 200 is used for extracting the privacy resource from the interaction behavior information of the user in the virtual community.
The modeling unit 300 is configured to perform modeling on the privacy resource to obtain DIKW maps.
The modeling unit 300 specifically is configured to: based on the interaction behavior information, calculating to obtain the reservation degree of the user on the privacy resource; filtering the privacy resources with the reserved degree of which the value is larger than a preset reserved threshold value; converting the rest privacy resources to obtain new privacy resources; modeling a new privacy resource to obtain DIKW maps and storing DIKW maps.
The modeling unit 300 converts the remaining privacy resources into new privacy resources as follows: deriving a new privacy resource set, comprising a plurality of new privacy resources, from the set consisting of the remaining privacy resources; regarding a remaining privacy resource as an entity and obtaining the entity's technology and resources; calculating the entity's conversion difficulty according to the technology and resources; and, when the value of the conversion difficulty is not smaller than the preset conversion capability threshold, converting the remaining privacy resource into a new privacy resource using the technology and resources.
The parsing unit 400 is configured to parse the access request of the visitor to obtain the identity and intention of the visitor and the privacy resource group required by the visitor.
And the query unit 500 is configured to query the DIKW map for the set of informed privacy resources of the visitor according to the identity and intention of the visitor.
The query unit 500 specifically is configured to: under the condition that an access request sent by a visitor is received, judging whether the visitor has access rights or not; under the condition that the visitor has access rights, analyzing the access request to obtain the identity and intention of the visitor and the privacy resource group required by the visitor; and under the condition that the visitor does not have access rights, prohibiting the visitor from acquiring the privacy resources in the DIKW map.
The protection unit 600 is configured to enable the preset anonymity protection mechanism to encrypt the target resource when the visitor's informed privacy resource group covers the privacy resource group required by the visitor and the required privacy resource group includes the target resource; the target resource is a privacy resource containing preset sensitive content.
The anonymous protection mechanism comprises data anonymous protection, information anonymous protection, knowledge anonymous protection and group anonymous protection; the privacy resources include data resources, information resources, knowledge resources, and group privacy resources.
The protection unit 600 specifically serves to: the method comprises the steps of marking data resources containing preset sensitive contents in a privacy resource group as target resources, and carrying out data anonymity protection on the target resources; according to the intention of a visitor and the content sensitivity of each information resource in the privacy resource group, determining the information resource needing anonymous processing in the privacy resource group, and carrying out information anonymous protection on the information resource needing anonymous processing; according to the intention of a visitor and the accuracy of each knowledge resource in the privacy resource group, determining the knowledge resources needing anonymous processing in the privacy resource group, and carrying out knowledge anonymous protection on the knowledge resources needing anonymous processing; and carrying out group anonymity protection on group privacy resources contained in the privacy resource group.
The evaluation unit 700 is configured to enable a preset risk evaluation mechanism, and calculate a risk value of each privacy resource in the privacy resource group.
The deleting unit 800 is configured to delete the privacy resources with risk values greater than the preset risk threshold in the privacy resource group.
The detecting unit 900 is configured to enable a preset supervision mechanism, and detect whether a logic error exists in a private resource in the private resource group.
The preset supervision mechanism comprises logic supervision, value supervision and rights supervision.
The detection unit 900 is specifically configured to: perform logic supervision on the decision rules and the decision result of the AI system, the decision result indicating the circulation process of the privacy resource group; when the decision result conforms to the decision rules, determine the value of the logic supervision to be true; calculate the privacy value of the privacy resources in the privacy resource group and judge whether it meets the preset privacy value standard; when the privacy value meets the preset privacy value standard, determine the value of the value supervision to be true; acquire the AI system's supervision of the right to know, of the participation right and of the forgetting right; when the values of all three are true, determine the value of the rights supervision to be true; and, when the values of the logic supervision, the value supervision and the rights supervision are all true, determine that the privacy resources in the privacy resource group have no logic error.
And a transmission unit 1000, configured to transmit the privacy resource group to the visitor when the privacy resource in the privacy resource group has no logic error.
In summary, compared with the prior art, the scheme of this embodiment uses the AI system in place of manual work, avoiding manual participation in the encryption protection of privacy resources and effectively improving the processing efficiency of privacy resource protection.
The application also provides a computer-readable storage medium comprising a stored program, wherein the program, when executed, performs the above AI governance method for cross-modal privacy protection.
The application also provides an AI governance device for cross-modal privacy protection, comprising a processor, a memory, and a bus. The processor is connected with the memory through the bus; the memory is used for storing a program, and the processor is used for running the program, wherein the program, when run, performs the above AI governance method for cross-modal privacy protection.
The functions of the methods of the embodiments of the present application, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored on a computing-device-readable storage medium. Based on this understanding, the part of the present application that contributes over the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computing device (which may be a personal computer, a server, a mobile computing device, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
In this specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts among the embodiments may be referred to one another.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. An AI governance method for cross-modal privacy protection, characterized by being applied to an AI system and comprising the following steps:
Extracting privacy resources from interaction behavior information of users in the virtual community;
modeling the privacy resources to obtain a DIKW map;
Analyzing an access request of a visitor to obtain the identity and the intention of the visitor and a privacy resource group required by the visitor;
inquiring from the DIKW map according to the identity and intention of the visitor to obtain an informed privacy resource group of the visitor;
When the informed privacy resource group covers the privacy resource group required by the visitor and the privacy resource group contains a target resource, starting a preset anonymity protection mechanism to encrypt the target resource, which comprises: identifying the data resources containing preset sensitive content in the privacy resource group as the target resources, and carrying out data anonymity protection on the target resources; determining, according to the intention of the visitor and the content sensitivity of each information resource in the privacy resource group, the information resources that need anonymous processing, and carrying out information anonymity protection on them; determining, according to the intention of the visitor and the accuracy of each knowledge resource in the privacy resource group, the knowledge resources that need anonymous processing, and carrying out knowledge anonymity protection on them; and carrying out group anonymity protection on the group privacy resources contained in the privacy resource group; wherein the target resource is a privacy resource containing preset sensitive content; the anonymity protection mechanism comprises data anonymity protection, information anonymity protection, knowledge anonymity protection and group anonymity protection; and the privacy resources comprise data resources, information resources, knowledge resources and group privacy resources;
Starting a preset risk assessment mechanism, and calculating a risk value for each privacy resource in the privacy resource group based on the conversion difficulty, conversion in-degree, conversion out-degree, and knowledge-resource accuracy of the privacy resource;
Deleting from the privacy resource group the privacy resources whose risk values are greater than a preset risk threshold;
Enabling a preset supervision mechanism, and detecting whether the privacy resources in the privacy resource group have logic errors, which comprises: performing logic supervision on the decision rules and decision results of the AI system, the decision result indicating a circulation process of the privacy resource group; determining that the value of the logic supervision is true when the decision result conforms to the decision rules; calculating the privacy value of the privacy resources in the privacy resource group and judging whether the privacy value meets a preset privacy value standard; determining that the value of the value supervision is true when the privacy value meets the preset privacy value standard; acquiring the AI system's supervision of the right to know, the right to participate, and the right to be forgotten; determining that the value of the rights supervision is true when the supervision of each of these three rights is true; and determining that the privacy resources in the privacy resource group have no logic errors when the values of the logic supervision, the value supervision, and the rights supervision are all true; wherein the preset supervision mechanism comprises logic supervision, value supervision and rights supervision;
And transmitting the privacy resource group to the visitor under the condition that the privacy resources in the privacy resource group have no logic errors.
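The risk-assessment and deletion steps of claim 1 can be sketched as follows. The linear weighting is purely an assumption for illustration; the claim names the four factors but does not specify how they are combined:

```python
def risk_value(difficulty, in_degree, out_degree, accuracy,
               weights=(0.25, 0.25, 0.25, 0.25)):
    """Combine the four factors named in claim 1 into one risk score.

    Assumption (not from the patent): resources that are easy to
    convert (low conversion difficulty), highly connected in the DIKW
    map (high conversion in/out degree), and highly accurate are
    riskier; the factors, all normalized to [0, 1], are combined as a
    simple weighted sum.
    """
    w_d, w_i, w_o, w_a = weights
    return (w_d * (1.0 - difficulty) + w_i * in_degree
            + w_o * out_degree + w_a * accuracy)


def filter_resources(resources, risk_threshold):
    """Keep only resources whose risk value does not exceed the preset
    risk threshold; the rest are deleted from the privacy resource group."""
    return [r for r in resources if risk_value(*r) <= risk_threshold]
```

Under this sketch, a hard-to-convert, weakly connected resource scores low and survives, while an easily converted, well-connected and accurate resource exceeds the threshold and is deleted before transmission.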
2. The method as recited in claim 1, further comprising:
presetting privacy rights of all participants in the circulation process of the privacy resource;
Wherein the circulation process comprises a sensing process, a storage process, a transmission process and a processing process;
The sensing process is a process in which the AI system extracts privacy resources from the interaction behavior information of the user;
the storage process is a process in which the AI system converts the extracted privacy resources, models the converted privacy resources to obtain the DIKW map, and stores the DIKW map;
the transmission process is a process in which the AI system transmits the privacy resource group to the visitor;
the processing process is a process in which the visitor uses the privacy resource group;
the participants include the user, the visitor, and the AI system;
the privacy rights include the right to know, the right to participate, the right to be forgotten, and the right to supervise.
3. The method of claim 1, wherein said modeling the privacy resources to obtain the DIKW map comprises:
calculating the user's retention degree for the privacy resources based on the interaction behavior information;
filtering out the privacy resources whose retention degree is greater than a preset retention threshold;
converting the remaining privacy resources to obtain new privacy resources;
modeling the new privacy resources to obtain the DIKW map, and storing the DIKW map.
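As an illustration of the retention-based filtering in this claim, the sketch below scores a resource's retention degree as the share of interactions in which the user withheld it; the scoring rule and all names are hypothetical, not taken from the patent:

```python
def retention_degree(resource, interactions):
    """Hypothetical score: the fraction of interaction records in
    which the user withheld this resource rather than shared it."""
    if not interactions:
        return 0.0
    withheld = sum(1 for rec in interactions if rec.get("withheld") == resource)
    return withheld / len(interactions)


def filter_by_retention(resources, interactions, retention_threshold):
    # Per the claim: resources the user prefers to retain (retention
    # degree above the preset threshold) are filtered out before
    # conversion and DIKW modeling.
    return [r for r in resources
            if retention_degree(r, interactions) <= retention_threshold]
```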
4. The method according to claim 3, wherein said converting the remaining privacy resources to obtain new privacy resources comprises:
deriving a new privacy resource set from the remaining privacy resources, the new privacy resource set comprising a plurality of new privacy resources and being derived from the privacy resource set consisting of the remaining privacy resources;
regarding the remaining privacy resources as entities, and acquiring the technologies and resources available to the entities;
calculating the conversion difficulty of the entities according to the technologies and resources;
and converting the remaining privacy resources into the new privacy resources by using the technologies and resources when the value of the conversion difficulty is not smaller than a preset conversion capability threshold.
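A minimal sketch of the capability-gated conversion in this claim. The claim compares the computed value against a conversion capability threshold, so the sketch names the score "capability"; the scoring rule and the derivation placeholder are assumptions, not from the patent:

```python
def conversion_capability(technologies, resources):
    # Hypothetical scoring: each technology or resource available to
    # the entity contributes one unit of conversion capability.
    return len(technologies) + len(resources)


def derive_new_resource(privacy_resource, technologies, resources):
    # Placeholder derivation: mark the resource as converted.
    return {"source": privacy_resource, "converted": True}


def try_convert(privacy_resource, technologies, resources, capability_threshold):
    # Per the claim: conversion proceeds only when the computed value
    # is not smaller than the preset conversion capability threshold;
    # otherwise the resource is left unconverted.
    if conversion_capability(technologies, resources) >= capability_threshold:
        return derive_new_resource(privacy_resource, technologies, resources)
    return None
```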
5. The method of claim 1, wherein said parsing the visitor's access request to obtain the identity and intent of the visitor and the set of privacy resources required by the visitor comprises:
judging, upon receiving an access request sent by the visitor, whether the visitor has the access right;
analyzing the access request when the visitor has the access right, to obtain the identity and the intention of the visitor and the privacy resource group required by the visitor;
and prohibiting the visitor from acquiring the privacy resources in the DIKW map when the visitor does not have the access right.
6. An AI governance apparatus for cross-modal privacy protection, characterized by being applied to an AI system and comprising:
the extraction unit is used for extracting privacy resources from interaction behavior information of the user in the virtual community;
the modeling unit is used for modeling the privacy resources to obtain a DIKW map;
The analyzing unit is used for analyzing the access request of the visitor to obtain the identity and the intention of the visitor and the privacy resource group required by the visitor;
The inquiring unit is used for inquiring and obtaining the informed privacy resource group of the visitor from the DIKW map according to the identity and the intention of the visitor;
The protection unit is used for starting a preset anonymity protection mechanism to encrypt the target resource when the informed privacy resource group covers the privacy resource group required by the visitor and the privacy resource group contains the target resource; the target resource is a privacy resource containing preset sensitive content; the anonymity protection mechanism comprises data anonymity protection, information anonymity protection, knowledge anonymity protection and group anonymity protection; the privacy resources comprise data resources, information resources, knowledge resources and group privacy resources;
The protection unit is specifically used for: identifying the data resources containing preset sensitive content in the privacy resource group as the target resources, and carrying out data anonymity protection on the target resources; determining, according to the intention of the visitor and the content sensitivity of each information resource in the privacy resource group, the information resources that need anonymous processing, and carrying out information anonymity protection on them; determining, according to the intention of the visitor and the accuracy of each knowledge resource in the privacy resource group, the knowledge resources that need anonymous processing, and carrying out knowledge anonymity protection on them; and carrying out group anonymity protection on the group privacy resources contained in the privacy resource group;
The evaluation unit is used for starting a preset risk evaluation mechanism and calculating the risk value of each privacy resource in the privacy resource group based on the conversion difficulty, the conversion in degree, the conversion out degree and the accuracy of the knowledge resource of the privacy resource;
a deleting unit, configured to delete the privacy resource whose risk value is greater than a preset risk threshold in the privacy resource group;
the detection unit is used for starting a preset supervision mechanism and detecting whether the privacy resources in the privacy resource group have logic errors; the preset supervision mechanism comprises logic supervision, value supervision and rights supervision;
The detection unit is specifically used for: performing logic supervision on the decision rules and decision results of the AI system, the decision result indicating a circulation process of the privacy resource group; determining that the value of the logic supervision is true when the decision result conforms to the decision rules; calculating the privacy value of the privacy resources in the privacy resource group and judging whether the privacy value meets a preset privacy value standard; determining that the value of the value supervision is true when the privacy value meets the preset privacy value standard; acquiring the AI system's supervision of the right to know, the right to participate, and the right to be forgotten; determining that the value of the rights supervision is true when the supervision of each of these three rights is true; and determining that the privacy resources in the privacy resource group have no logic errors when the values of the logic supervision, the value supervision, and the rights supervision are all true;
and the transmission unit is used for transmitting the privacy resource group to the visitor under the condition that the privacy resources in the privacy resource group have no logic errors.
7. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program, wherein the program performs the AI governance method for cross-modal privacy protection according to any one of claims 1-5.
8. An AI governance device for cross-modal privacy protection, characterized by comprising: a processor, a memory, and a bus; the processor is connected with the memory through the bus;
the memory is used for storing a program, and the processor is used for running the program, wherein the program, when run, performs the AI governance method for cross-modal privacy protection according to any one of claims 1-5.
CN202110908765.5A 2021-08-09 2021-08-09 AI (advanced technology attachment) treatment method and device for cross-modal privacy protection Active CN113742770B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110908765.5A CN113742770B (en) 2021-08-09 2021-08-09 AI (advanced technology attachment) treatment method and device for cross-modal privacy protection


Publications (2)

Publication Number Publication Date
CN113742770A CN113742770A (en) 2021-12-03
CN113742770B true CN113742770B (en) 2024-05-14

Family

ID=78730438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110908765.5A Active CN113742770B (en) 2021-08-09 2021-08-09 AI (advanced technology attachment) treatment method and device for cross-modal privacy protection

Country Status (1)

Country Link
CN (1) CN113742770B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108804950A (en) * 2018-06-09 2018-11-13 海南大学 Based on data collection of illustrative plates, modeling and the data-privacy guard method of Information Atlas and knowledge mapping
CN112685772A (en) * 2020-12-28 2021-04-20 海南大学 Intrinsic-computation-oriented DIKW-mode-crossing relative difference privacy protection method
CN112818382A (en) * 2021-01-13 2021-05-18 海南大学 Essential computing-oriented DIKW private resource processing method and component


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Technical Implementation Framework of AI Governance Policies for Cross-Modal Privacy Protection"; Yuxiao Lei et al.; Collaborative Computing: Networking, Applications and Worksharing; pp. 431-443 *
"Personality Classification and Conversion Method for Virtual Community Users Based on DIKW Graphs"; Lei Yuxiao et al.; Journal of Applied Sciences; vol. 38, no. 5; pp. 803-824 *


Similar Documents

Publication Publication Date Title
Harron et al. Methodological developments in data linkage
Willenborg et al. Elements of statistical disclosure control
Blanco-Justicia et al. Machine learning explainability via microaggregation and shallow decision trees
CA3104119C (en) Systems and methods for enforcing privacy-respectful, trusted communications
Billard Weighted forensics evidence using blockchain
Burmeister et al. Enhancing client welfare through better communication of private mental health data between rural service providers
Kieseberg et al. Protecting anonymity in data-driven biomedical science
CN112231750A (en) Multi-mode privacy protection method integrating fairness, justice and transparent regulation technologization
Livraga et al. Data confidentiality and information credibility in on-line ecosystems
Georgiadou et al. Digital earth ethics
Young et al. Call me big PAPA: An extension of Mason’s information ethics framework to big data
Komarova et al. Identification, data combination, and the risk of disclosure
CN113742770B (en) AI (advanced technology attachment) treatment method and device for cross-modal privacy protection
Harrison et al. Care requirements for clients who present after rape and clients who presented after consensual sex as a minor at a clinic in Harare, Zimbabwe, from 2011 to 2014
Nišević et al. Understanding the legal bases for automated decision-making under the GDPR
Liu et al. Differential privacy performance evaluation under the condition of non-uniform noise distribution
El Emam et al. Concepts and methods for de-identifying clinical trial data
Elliot et al. Data environment analysis and the key variable mapping system
Aidinlis et al. Building a Justice Data Infrastructure
Adamakis et al. Darav: a tool for visualizing de-anonymization risks
Räz Understanding risk with FOTRES?
Halvorsen et al. How attacker knowledge affects privacy risks: an analysis using probabilistic programming
Ojo et al. Public perception of Digital Contact Tracing App and Implications for Technology Acceptance and Use Models.
Alhaqbani Privacy and trust management for electronic health records
Kaspero et al. Criminal record privacy & the structural risks inherent within commercial storehouses in the consumer data industry

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant