WO2020063349A1 - Data protection method and device, apparatus and computer storage medium - Google Patents

Data protection method and device, apparatus and computer storage medium

Info

Publication number
WO2020063349A1
Authority
WO
WIPO (PCT)
Prior art keywords: data, privacy, processed, sub-model
Prior art date
Application number
PCT/CN2019/105390
Other languages
English (en)
Chinese (zh)
Inventor
艾东梅
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Publication of WO2020063349A1 publication Critical patent/WO2020063349A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245Protecting personal data, e.g. for financial or medical purposes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules

Definitions

  • the embodiments of the present application relate to, but are not limited to, privacy data protection technologies, and in particular to a data protection method, device, apparatus, and computer storage medium.
  • a method for protecting user privacy data is not flexible enough, and it cannot determine, according to actual needs, whether privacy protection of the user data is required.
  • the embodiments of the present application provide a data protection method, device, apparatus, and computer storage medium, which can flexibly protect and manage private data.
  • An embodiment of the present application provides a data protection method.
  • the method includes:
  • obtaining n privacy sub-models, wherein each privacy sub-model is a data set representing one privacy attribute, the privacy attributes represented by the n privacy sub-models are different from each other, and n is an integer greater than 1;
  • obtaining data to be processed and determining the privacy sub-model corresponding to the data to be processed; and
  • when the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold, generating an alert message to indicate that the data to be processed requires privacy protection.
  • An embodiment of the present application further provides a data protection device, where the device includes a processor and a memory configured to store a computer program capable of running on the processor; wherein,
  • the processor is configured, when running the computer program, to execute the steps of any one of the data protection methods described above.
  • An embodiment of the present application further provides a data protection device, where the device includes an acquisition module and a decision module, where:
  • the obtaining module is configured to obtain n privacy sub-models; wherein each privacy sub-model is a data set representing a privacy attribute, the privacy attributes represented by the n privacy sub-models are different from each other, and n is an integer greater than 1 ;
  • the decision-making module is configured to obtain the data to be processed and determine the privacy sub-model corresponding to the data to be processed, and to generate warning information to indicate that the data to be processed requires privacy protection when the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold.
  • An embodiment of the present application further provides a computer storage medium storing a computer program; when the computer program is executed by a processor, the steps of any one of the foregoing data protection methods are implemented.
  • In the embodiments of the present application, n privacy sub-models are first obtained, where each privacy sub-model is a data set representing one privacy attribute, the privacy attributes represented by the n privacy sub-models are different from each other, and n is an integer greater than 1; then, the data to be processed is obtained and the privacy sub-model corresponding to the data to be processed is determined; finally, when the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold, an early-warning message is generated to indicate that the data to be processed requires privacy protection.
  • Since the n privacy attributes corresponding to the n privacy sub-models can be flexibly set according to the actual needs of the user, n privacy sub-models that meet the actual needs can be obtained.
  • When the warning information is generated according to these n privacy sub-models that meet the actual needs, the generation of the warning information is in line with the actual requirements; that is, by flexibly and autonomously setting the n privacy attributes in advance, early-warning reminders for the user's privacy data can be achieved with a certain degree of flexibility and autonomy, which can prevent the leakage of private data that requires privacy protection.
  • FIG. 1 is a flowchart of a data protection method according to an embodiment of the present application
  • FIG. 2 is a schematic diagram of a clustering result of training data according to an embodiment of the present application.
  • FIG. 3 is a flowchart of another data protection method according to an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of a data protection device according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a hardware structure of a data protection device according to an embodiment of the present application.
  • the method for protecting user privacy data is not flexible enough, and it cannot determine, according to actual needs, whether the user data requires privacy protection.
  • the embodiments of the present application can be applied to any scenario where privacy protection is required.
  • for example, privacy protection of user data generated when an application runs on a terminal can be implemented based on the technical solution provided in the embodiments of the present application.
  • the embodiments of the present application may be applied to a terminal or other devices, and the terminal or other devices described above may include devices such as a processor and a memory.
  • FIG. 1 is a flowchart of a data protection method according to an embodiment of the present application. As shown in FIG. 1, the process may include:
  • Step 101 Obtain n privacy sub-models; wherein each privacy sub-model is a data set representing a privacy attribute, the privacy attributes represented by the n privacy sub-models are different from each other, and n is an integer greater than 1.
  • training data may be obtained first, where the training data is used to represent user data generated when the application is running; then, the preset n privacy attributes are used as the central objects to cluster the training data to obtain the n privacy sub-models.
  • user raw data generated during application running may be obtained, and the user raw data recorded above may be pre-processed to obtain training data.
  • at least one of the following pre-processing operations may be performed on the recorded user raw data to obtain the training data: word segmentation, and filtering of useless words; the useless words may include punctuation, single characters, symbols, and other meaningless words. It should be noted that the above content is only an example implementation of the pre-processing.
  • the pre-processing may also have other implementation manners, which are not limited in the embodiments of the present application.
  • the above-mentioned user raw data may be user data generated when an application (App) of a mobile terminal is run, and may include various data generated by the user's use of each application of the mobile terminal, such as login information, reading, consumption, and preference details.
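  • As an illustration only (the embodiments do not prescribe a particular tokenizer or useless-word list), a minimal Python sketch of such pre-processing, assuming a simple regex-based word segmentation and a hand-made useless-word set, might look like this:

```python
import re

# Hypothetical useless-word set: punctuation, single characters, and other
# meaningless tokens are filtered out, as described above.
USELESS_WORDS = {"the", "a", "an", "of", "and", "to", "in"}

def preprocess(raw_records):
    """Turn raw user records (strings) into token strings used as training data."""
    training_data = []
    for record in raw_records:
        # Word segmentation: split on any run of non-alphanumeric characters.
        tokens = re.split(r"[^0-9A-Za-z]+", record.lower())
        # Filter useless words: drop empty strings, single characters, stop words.
        tokens = [t for t in tokens if len(t) > 1 and t not in USELESS_WORDS]
        if tokens:
            training_data.append(" ".join(tokens))
    return training_data

# Example raw data generated while using applications on the terminal.
raw = ["Logged in to the shopping app and browsed hiking shoes!",
       "Read an article about diabetes treatment."]
print(preprocess(raw))
```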
  • n privacy attributes can be set in advance according to the actual needs for protecting private data.
  • each of the n privacy attributes indicates a privacy point that the user has determined to be protected (that is, a privacy point that the user cares about most); for example, the n privacy attributes may include "identity", "interest", and so on. n can be regarded as a preset protection degree coefficient: the greater the value of n, the more privacy points the user has determined to be protected. Further, after setting the n privacy attributes, the user can change them according to actual needs, and the training data can then be re-clustered based on the revised privacy attributes to obtain the corresponding privacy sub-models.
  • through the protection degree coefficient n, users can flexibly determine the privacy protection strategy for their personal data, and the protection degree coefficient n in turn affects the scope of the privacy protection categories; users set the corresponding protection degree coefficient according to the degree of privacy protection they require.
  • a user may input a protection degree coefficient n and n privacy attributes through a user interface (UI) of a terminal, which is convenient for user operations.
  • the n privacy attributes and the user's raw data can be used as the input data for constructing the n privacy sub-models; further, this input data can be processed to obtain the n privacy sub-models.
  • clustering-based natural language processing methods commonly used in machine learning can be used to automatically cluster the input data of the n privacy sub-models and to iteratively update the central objects of each round of clustering until the final clustering result is obtained; here, the final clustering result may include n clusters, the privacy attributes of the n clusters in the final clustering result are different from each other, and each cluster in the final clustering result represents one privacy sub-model. It should be noted that the embodiments of the present application do not limit the structure and learning method of the machine learning model.
  • the central object of the cluster is updated so that the preset evaluation index of the current clustering result is higher than the preset evaluation index of the previous clustering result.
  • the preset evaluation index of the clustering result can be used to indicate the proximity of the records within the same cluster of the clustering result and the separation between the records of different clusters; the closer the records within the same cluster and the farther apart the records of different clusters, the higher the preset evaluation index of the clustering result.
  • specifically, a first clustering process is performed on the training data to obtain a first clustering result; let m denote the total number of iterations of the iterative clustering method, and let i take the values 2 to m.
  • for the i-th clustering, taking as the goal that the preset evaluation index of the i-th clustering result is higher than the preset evaluation index of the (i-1)-th clustering result, the central objects of the (i-1)-th clustering are updated to obtain the central objects of the i-th clustering; the i-th clustering process is then performed on the training data to obtain the i-th clustering result.
  • m may be a preset integer greater than 1, or may be determined by a preset iteration termination condition; for example, the preset iteration termination condition may be that the preset evaluation index of the clustering result can no longer be raised by updating the central objects of the previous clustering.
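  • For illustration, a hedged Python sketch of this iterative clustering is given below. It assumes TF-IDF vectors, cosine similarity, and an evaluation index defined as the mean similarity of records to their own cluster centre minus the mean similarity between cluster centres; none of these specific choices are mandated by the embodiments, and the attribute names and example data are made up.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def evaluation_index(docs, labels, centers):
    """Higher when records are close to their own centre and centres are far apart."""
    intra = np.mean([cosine_similarity(docs[i], centers[labels[i]].reshape(1, -1))[0, 0]
                     for i in range(docs.shape[0])])
    inter = np.mean(cosine_similarity(centers))  # includes the constant diagonal
    return intra - inter

def build_privacy_submodels(training_docs, privacy_attributes, max_iter=10):
    """Cluster training records around the n preset privacy attributes (initial central objects)."""
    vec = TfidfVectorizer()
    X = vec.fit_transform(training_docs + privacy_attributes)
    docs, centers = X[:len(training_docs)], X[len(training_docs):].toarray()
    best_index, labels = -np.inf, None
    for _ in range(max_iter):
        # Assign each record to the nearest central object (highest cosine similarity).
        new_labels = cosine_similarity(docs, centers).argmax(axis=1)
        index = evaluation_index(docs, new_labels, centers)
        if index <= best_index:      # evaluation index can no longer be raised: stop
            break
        best_index, labels = index, new_labels
        # Update each central object as the mean vector of the records in its cluster.
        for k in range(len(privacy_attributes)):
            members = docs[labels == k]
            if members.shape[0] > 0:
                centers[k] = np.asarray(members.mean(axis=0)).ravel()
    # Each cluster of records forms one privacy sub-model.
    submodels = {privacy_attributes[k]:
                 [training_docs[i] for i in range(len(training_docs)) if labels[i] == k]
                 for k in range(len(privacy_attributes))}
    return submodels, centers, vec

# Hypothetical privacy attributes and pre-processed training data.
attributes = ["identity", "interest"]
docs = ["identity card number login", "login account identity verification",
        "browsed hiking shoes interest", "interest in photography articles"]
submodels, centers, vec = build_privacy_submodels(docs, attributes)
print(submodels)
```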
  • Step 102: Obtain the data to be processed and determine the privacy sub-model corresponding to the data to be processed; when the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold, determine that the data to be processed requires privacy protection.
  • user data generated during application running can be monitored, and the monitored user data is taken as the data to be processed.
  • the data to be processed is data to be uploaded by the terminal or data to be saved by the terminal.
  • the n privacy sub-models described above are taken as the central objects, and a correlation calculation is performed by a machine learning algorithm to determine the privacy sub-model corresponding to the data to be processed.
  • the correlation between the data to be processed and each of the n privacy sub-models can be determined separately, and the privacy sub-model with the highest correlation with the data to be processed is used as the privacy sub-model corresponding to the data to be processed.
  • the semantic distance between the data to be processed and each privacy sub-model can be calculated, and the correlation between the data to be processed and each privacy sub-model is determined according to that semantic distance; the smaller the semantic distance between the data to be processed and a privacy sub-model, the greater the privacy sensitivity of the data to be processed and the greater the correlation.
  • the magnitude relationship between the correlation between the data to be processed and the corresponding privacy sub-model and a preset correlation threshold can be judged.
  • when the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to the preset correlation threshold, it is determined that the data to be processed requires privacy protection; in this case, an alert message may be generated to indicate that the data to be processed requires privacy protection.
  • when the correlation between the data to be processed and the corresponding privacy sub-model is less than the preset correlation threshold, it is determined that the data to be processed does not require privacy protection, and the process may be ended directly.
  • the semantic distance between the data to be processed and the corresponding privacy sub-model may be calculated.
  • when the semantic distance between the data to be processed and the corresponding privacy sub-model is less than or equal to a preset semantic distance threshold, it is determined that the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to the preset correlation threshold;
  • when the semantic distance between the data to be processed and the corresponding privacy sub-model is greater than the preset semantic distance threshold, it is determined that the correlation between the data to be processed and the corresponding privacy sub-model is less than the preset correlation threshold.
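  • A possible sketch of this decision step, reusing the vectoriser and centre vectors produced by the clustering sketch above and taking cosine distance as a stand-in for the unspecified semantic distance (the threshold value 0.6 is purely illustrative), is:

```python
from sklearn.metrics.pairwise import cosine_similarity

def needs_privacy_protection(pending_text, vec, centers, attribute_names,
                             distance_threshold=0.6):
    """Return (corresponding attribute, whether privacy protection is needed).

    The privacy sub-model with the highest correlation (smallest semantic distance)
    is taken as the sub-model corresponding to the data to be processed; if that
    distance does not exceed the preset threshold, the data is flagged.
    """
    x = vec.transform([pending_text])
    similarities = cosine_similarity(x, centers)[0]
    best = int(similarities.argmax())
    semantic_distance = 1.0 - similarities[best]
    return attribute_names[best], semantic_distance <= distance_threshold

attribute, flagged = needs_privacy_protection(
    "upload identity card photo", vec, centers, attributes)
print(attribute, flagged)  # if flagged, warning information would be generated
```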
  • since the n privacy attributes corresponding to the n privacy sub-models can be flexibly set according to actual needs, n privacy sub-models that meet actual needs can be obtained; further, when the warning information is generated according to these n privacy sub-models, the generation of the warning information is in line with actual needs. That is, by flexibly and autonomously setting the n privacy attributes in advance, early-warning reminders for the user's privacy data can be achieved with a certain degree of flexibility and autonomy, which can prevent the leakage of private data that requires privacy protection.
  • the correlation between the data to be processed and the corresponding privacy sub-model is less than a preset correlation threshold, the data to be processed can be ignored, so that a secure channel and powerful guarantee can be provided for data that does not need privacy protection.
  • the data protection method of the first embodiment of the present application may be implemented based on a processor of a terminal or the like.
  • the method for protecting user privacy data is not flexible enough, and it cannot determine, according to actual needs, whether the user data requires privacy protection.
  • a machine learning method is used to automatically extract and aggregate the user's raw data on the terminal according to the preset n privacy attributes, so as to generate a privacy protection scheme that meets the actual needs of an individual user. On this basis, it is detected whether the data generated during the use of an application meets the degree of openness of private information expected by the user, and corresponding measures are then taken. It can be seen that, by setting the n privacy attributes, the data generated during the use of an application can be filtered and discriminated, which protects private data from misuse and even attack while still allowing effective use of the data. That is, the embodiments of the present application start from the user's perspective and adhere to the purpose of "letting users take charge of their own data": machine learning is used to automatically build a privacy data protection solution that meets the user's needs, and the privacy data that the user may care about is then decided upon and managed, so that users can protect their privacy while providing information to enjoy services.
  • FIG. 3 is a flowchart of another data protection method according to an embodiment of the present application. As shown in FIG. 3, the process may include:
  • Step 301 Obtain data to be processed and n privacy sub-models.
  • Step 302 Determine a privacy sub-model corresponding to the data to be processed.
  • The implementation of this step has been described in step 102 and is not repeated here.
  • Step 303 Determine whether the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold, and if yes, execute step 304; if not, end the process.
  • Step 304 Perform early warning or other processing on the data to be processed.
  • when the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to the preset correlation threshold, it can be considered that the data to be processed most likely belongs to a privacy category that the user cares about; early-warning information can be generated to prompt that there is a risk of privacy information leakage, or privacy protection may be performed directly on the data to be processed.
  • as for the display method of the warning information, for example, the UI of the terminal or other forms may be used to display the warning information.
  • when the data to be processed is about to be saved, uploaded, or subjected to other operations that may cause privacy leakage, the corresponding operation on the data to be processed can be prevented, and the user can be warned or reminded.
  • the data to be processed may also be added to the corresponding privacy sub-model to make the privacy sub-model more complete.
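  • Tying these actions together, the following hedged sketch (hypothetical warn and block_operation callbacks; the embodiments do not specify how save/upload operations are intercepted) builds on the needs_privacy_protection helper above to warn, block the operation, and supplement the corresponding privacy sub-model:

```python
def handle_pending_data(pending_text, submodels, vec, centers, attribute_names,
                        warn, block_operation, distance_threshold=0.6):
    """Decide on the pending data, warn/block if needed, and supplement the sub-model."""
    attribute, flagged = needs_privacy_protection(
        pending_text, vec, centers, attribute_names, distance_threshold)
    if not flagged:
        return False                      # no privacy protection needed; let it through
    # Early warning: show the leakage risk to the user, e.g. on the terminal UI.
    warn(f"'{pending_text[:30]}...' may expose your '{attribute}' privacy")
    # Optionally prevent the save/upload operation that triggered the check.
    block_operation()
    # Supplement and expand the corresponding privacy sub-model with this data.
    submodels[attribute].append(pending_text)
    return True

handled = handle_pending_data("upload identity card photo", submodels, vec, centers,
                              attributes, warn=print, block_operation=lambda: None)
```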
  • a fourth embodiment of the present application provides a data protection device.
  • FIG. 4 is a schematic structural diagram of a data protection device according to an embodiment of the present application. As shown in FIG. 4, the device includes an obtaining module 401 and a decision module 402, where:
  • the obtaining module 401 is configured to obtain n privacy sub-models; wherein each privacy sub-model is a data set representing a privacy attribute, the privacy attributes represented by the n privacy sub-models are different from each other, and n is an integer greater than 1;
  • the decision module 402 is configured to obtain the data to be processed and determine the privacy sub-model corresponding to the data to be processed, and to generate warning information to indicate that the data to be processed requires privacy protection when the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold.
  • the obtaining module 401 is specifically configured to obtain training data, where the training data is used to represent user data generated when an application is running; preset n privacy attributes are used as a central object for the training. The data is clustered to obtain n privacy sub-models.
  • the acquisition module 401 is specifically configured to take the preset n privacy attributes as the central objects and adopt an iterative clustering method to perform multiple rounds of clustering on the training data to obtain the n privacy sub-models; wherein, for each clustering round other than the first, the central objects of the clustering are updated so that the preset evaluation index of the current clustering result is higher than the preset evaluation index of the previous clustering result.
  • the preset evaluation index of the clustering result may be used to indicate: the proximity of the records within the same cluster of the clustering result, and the separation between the records of different clusters of the clustering result.
  • the data to be processed is data to be uploaded by the terminal or data to be saved by the terminal.
  • the decision module 402 is specifically configured to use, among the n privacy sub-models, a privacy sub-model with the highest correlation with the data to be processed as the privacy sub-model corresponding to the data to be processed.
  • the decision module 402 is further configured to perform privacy protection on the data to be processed when the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold.
  • the decision module 402 is further configured to, when the correlation between the to-be-processed data and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold, add the to-be-processed data to the corresponding privacy sub-model, so as to supplement and expand the corresponding privacy sub-model.
  • the decision module 402 is further configured to determine that the data to be processed does not require privacy protection when the correlation between the data to be processed and the corresponding privacy sub-model is less than a preset correlation threshold.
  • the above-mentioned obtaining module 401 and decision module 402 may be implemented by a central processing unit (CPU), a microprocessor (MPU), a digital signal processor (DSP), or a field programmable gate array (FPGA) located in the terminal.
  • the functional modules in this embodiment may be integrated into one processing unit, or each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware or in the form of software functional modules.
  • the integrated unit is implemented in the form of a software functional module and is not sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • the technical solution of this embodiment, in essence, or the part that contributes to the existing technology, or all or part of the technical solution, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to perform all or part of the steps of the method described in this embodiment.
  • the foregoing storage media include media that can store program code, such as USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, or optical discs.
  • the computer program instructions corresponding to the data protection method in this embodiment may be stored on a storage medium such as an optical disc, a hard disk, or a USB flash drive; when the computer program instructions corresponding to the data protection method stored in the storage medium are read or executed by an electronic device, the steps of any one of the data protection methods in the foregoing embodiments are implemented.
  • FIG. 5 shows a data protection device 50 provided by an embodiment of the present application.
  • the device may include: a memory 51, a processor 52, and a bus 53;
  • the bus 53 is configured to connect the memory 51 and the processor 52 and to enable mutual communication between these devices;
  • the memory 51 is configured to store a computer program and data
  • the processor 52 is configured to execute a computer program stored in the memory to implement the steps of any one of the data protection methods in the foregoing embodiments.
  • the above-mentioned memory 51 may be a volatile memory (for example, RAM), or a non-volatile memory (for example, ROM, flash memory, a hard disk drive (HDD), or a solid-state drive (SSD)), or a combination of the above types of memory, and provides instructions and data to the processor 52.
  • the processor 52 may be at least one of an application specific integrated circuit (ASIC), a DSP, a digital signal processing device (DSPD), a programmable logic device (PLD), an FPGA, a CPU, a controller, a microcontroller, and a microprocessor. It can be understood that, for different devices, the electronic device used to implement the processor function may be another device, which is not specifically limited in the embodiments of the present application.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Therefore, this application may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) containing computer-usable program code.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, and the instruction device implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce a computer-implemented process, and the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • Since the n privacy attributes corresponding to the n privacy sub-models can be flexibly set according to the actual needs of the user, n privacy sub-models that meet the actual needs can be obtained.
  • When the warning information is generated according to these n privacy sub-models that meet the actual needs, the generation of the warning information is in line with the actual requirements; that is, by flexibly and autonomously setting the n privacy attributes in advance, early-warning reminders for the user's privacy data can be achieved with a certain degree of flexibility and autonomy, which can prevent the leakage of private data that requires privacy protection.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

Embodiments of the present invention relate to a data protection method and device, an apparatus, and a computer storage medium. The method comprises: acquiring n privacy sub-models, each privacy sub-model being a data set representing one type of privacy attribute, the privacy attributes represented by the n privacy sub-models being different from each other, and n being an integer greater than one; acquiring data to be processed, and determining a privacy sub-model corresponding to said data; and, when the correlation between said data and the corresponding privacy sub-model is equal to or greater than a predefined correlation threshold, generating early-warning information so as to prompt that privacy protection be performed on said data.
PCT/CN2019/105390 2018-09-30 2019-09-11 Procédé et dispositif de protection des données, appareil et support de stockage informatique WO2020063349A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811162220.9 2018-09-30
CN201811162220.9A CN110968889A (zh) 2018-09-30 2018-09-30 一种数据保护方法、设备、装置和计算机存储介质

Publications (1)

Publication Number Publication Date
WO2020063349A1 (fr) 2020-04-02

Family

ID=69951173

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/105390 WO2020063349A1 (fr) 2018-09-30 2019-09-11 Procédé et dispositif de protection des données, appareil et support de stockage informatique

Country Status (2)

Country Link
CN (1) CN110968889A (fr)
WO (1) WO2020063349A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112818390A (zh) * 2021-01-26 2021-05-18 支付宝(杭州)信息技术有限公司 一种基于隐私保护的数据信息发布方法、装置及设备
CN113742781B (zh) * 2021-09-24 2024-04-05 湖北工业大学 一种k匿名聚类隐私保护方法、***、计算机设备、终端

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014169600A1 (fr) * 2013-09-06 2014-10-23 中兴通讯股份有限公司 Procédé, dispositif, et terminal pour le traitement d'un dossier de fichiers caché
CN104200170A (zh) * 2014-04-15 2014-12-10 中兴通讯股份有限公司 一种电子设备的隐私保护方法及电子设备
CN106599709A (zh) * 2015-10-15 2017-04-26 中兴通讯股份有限公司 一种防隐私信息泄露的方法、装置及终端
WO2017187207A1 (fr) * 2016-04-29 2017-11-02 Privitar Limited Système et procédé d'ingénierie de confidentialité mis en œuvre par ordinateur
CN107563204A (zh) * 2017-08-24 2018-01-09 西安电子科技大学 一种匿名数据的隐私泄露风险评估方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102231277A (zh) * 2011-06-29 2011-11-02 电子科技大学 基于声纹识别的移动终端隐私保护方法
WO2016206041A1 (fr) * 2015-06-25 2016-12-29 宇龙计算机通信科技(深圳)有限公司 Procédé et appareil de protection de données de terminal
CN106709588B (zh) * 2015-11-13 2022-05-17 日本电气株式会社 预测模型构建方法和设备以及实时预测方法和设备
GB201610883D0 (en) * 2016-06-22 2016-08-03 Microsoft Technology Licensing Llc Privacy-preserving machine learning
CN107358111B (zh) * 2017-08-28 2019-11-22 维沃移动通信有限公司 一种隐私保护方法和移动终端
CN107819945B (zh) * 2017-10-30 2020-11-03 同济大学 综合多种因素的手持设备浏览行为认证方法及***

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014169600A1 (fr) * 2013-09-06 2014-10-23 中兴通讯股份有限公司 Procédé, dispositif, et terminal pour le traitement d'un dossier de fichiers caché
CN104200170A (zh) * 2014-04-15 2014-12-10 中兴通讯股份有限公司 一种电子设备的隐私保护方法及电子设备
CN106599709A (zh) * 2015-10-15 2017-04-26 中兴通讯股份有限公司 一种防隐私信息泄露的方法、装置及终端
WO2017187207A1 (fr) * 2016-04-29 2017-11-02 Privitar Limited Système et procédé d'ingénierie de confidentialité mis en œuvre par ordinateur
CN107563204A (zh) * 2017-08-24 2018-01-09 西安电子科技大学 一种匿名数据的隐私泄露风险评估方法

Also Published As

Publication number Publication date
CN110968889A (zh) 2020-04-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19867696

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 24.08.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19867696

Country of ref document: EP

Kind code of ref document: A1