CN114443184A - Intelligent terminal privacy protection method and device - Google Patents

Intelligent terminal privacy protection method and device

Info

Publication number
CN114443184A
Authority
CN
China
Prior art keywords
privacy
intelligent terminal
protection
user
overhead
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011194847.XA
Other languages
Chinese (zh)
Inventor
李慧芳
杨穗珊
庞涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Telecom Corp Ltd
Original Assignee
China Telecom Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Telecom Corp Ltd filed Critical China Telecom Corp Ltd
Priority to CN202011194847.XA priority Critical patent/CN114443184A/en
Publication of CN114443184A publication Critical patent/CN114443184A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/602Providing cryptographic facilities or services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6209Protecting access to data via a platform, e.g. using keys or access control rules to a single file or object, e.g. in a secure envelope, encrypted and accessed using a key, or with access control rules appended to the object itself
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/64Protecting data integrity, e.g. using checksums, certificates or signatures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Storage Device Security (AREA)

Abstract

The invention relates to a method and a device for protecting privacy information of an intelligent terminal, wherein the method comprises the steps of quantitatively evaluating privacy risks faced by the intelligent terminal under the existing configuration to generate a privacy risk level L; collecting privacy preferences of intelligent terminal users to generate user privacy preferences P; calculating actual privacy overhead K after various privacy protection technologies are adopted based on the privacy risk level L and the user privacy preference degree P; comparing the actual privacy overhead K with a preset threshold value T, and if the actual privacy overhead K is less than or equal to the preset threshold value T, determining the combination of various adopted privacy protection technologies as a privacy protection strategy S; sending the privacy protection strategy S to an intelligent terminal operating system to inquire whether a user agrees with the privacy protection strategy S; and mapping the privacy protection strategy S into a combination of various privacy protection technologies under the condition that the user agrees to the privacy protection strategy S, and informing the intelligent terminal operating system to start various privacy protection technologies.

Description

Intelligent terminal privacy protection method and device
Technical Field
The present invention relates to the field of mobile terminal applications, and more particularly, to a method and an apparatus for protecting privacy of an intelligent terminal user.
Background
With the development of artificial intelligence, technologies such as machine learning have greatly expanded the scenarios, scope, and volume of personal information collection. Applying AI cognitive technologies such as image recognition, speech recognition, and semantic understanding on intelligent terminals enables the acquisition of massive amounts of unstructured data. In an intelligent era in which everything is interconnected and everything is a medium, any AI terminal can serve both as a source of content and as a window for receiving it, and stores a large amount of minable data, so personal privacy disclosure has become a prominent risk in AI governance.
Compared with traditional big data applications, which mainly collect information such as users' browsing habits and consumption records, the personal information that applications on AI terminals can collect is richer and more sensitive, including sensitive biometric data such as faces, fingerprints, voiceprints, irises, heartbeats, and genetic information. Such information is private, unique, and immutable; once disclosed or abused, it can have a serious impact on the personal and property security of citizens.
In order to attract more developers, existing terminal application ecosystems such as Android and iOS expose an increasing number of platform-level APIs. These APIs provide access to the terminal's hardware functions (including GPS, accelerometer, and camera) and user data (including unique identifiers, location, and social media accounts), most of which are privacy-sensitive. Although both Android and iOS provide permission control mechanisms that let users govern access to sensitive data and functions, on the one hand the permission manager does not provide enough information to help users make sound decisions, and on the other hand, for less prominent sensor permissions (such as the gravity sensor), no permission management is provided at all and applications can use them freely. The operating system's permission manager alone is therefore far from sufficient for secure privacy protection.
Existing intelligent terminals thus face the privacy and security problem that users' sensitive personal information can be leaked and abused.
Disclosure of Invention
In order to overcome the above drawbacks, there is a need for a method and apparatus for protecting the privacy of intelligent terminal users that better protect user privacy while still providing the applications and services the user requires.
The following presents a simplified summary of the disclosure in order to provide a basic understanding of some aspects of the disclosure. However, it should be understood that this summary is not an exhaustive overview of the disclosure. It is not intended to identify key or critical elements of the disclosure or to delineate the scope of the disclosure. Its sole purpose is to present some concepts of the disclosure in a simplified form as a prelude to the more detailed description that is presented later.
According to one aspect of the disclosure, a method for protecting privacy of an intelligent terminal user is provided, which quantitatively evaluates privacy risks faced by an intelligent terminal under the existing configuration to generate a privacy risk level L; collecting privacy preferences of intelligent terminal users to generate user privacy preferences P; calculating actual privacy overhead K after various privacy protection technologies are adopted based on the privacy risk level L and the user privacy preference degree P; comparing the actual privacy overhead K with a preset threshold value T, and if the actual privacy overhead K is less than or equal to the preset threshold value T, determining the combination of various adopted privacy protection technologies as a privacy protection strategy S; sending the privacy protection strategy S to an intelligent terminal operating system to inquire whether a user agrees with the privacy protection strategy S; and mapping the privacy protection strategy S into a combination of various privacy protection technologies under the condition that the user agrees to the privacy protection strategy S, and informing the intelligent terminal operating system to start various privacy protection technologies.
According to another aspect of the disclosure, a device for protecting privacy of an intelligent terminal user is provided, which includes a privacy risk evaluation unit, configured to quantitatively evaluate a privacy risk faced by an intelligent terminal in its existing configuration to generate a privacy risk level L; the privacy preference collecting unit is used for collecting the privacy preferences of the intelligent terminal user to generate a user privacy preference degree P; the privacy policy generation unit is used for calculating actual privacy expenses K after various privacy protection technologies are adopted based on the received terminal privacy risk level L from the privacy risk evaluation unit and the user privacy preference P from the privacy preference collection unit, determining the combination of the various privacy protection technologies as a privacy protection policy S under the condition that the actual privacy expenses K are less than or equal to a preset threshold value T, and sending the privacy protection policy S to the intelligent terminal operation system to inquire whether the user agrees to the privacy protection policy S; and the privacy management implementation unit is used for mapping the privacy protection strategy S received from the privacy strategy generation unit into a combination of various privacy protection technologies and informing the intelligent terminal operating system to start the various privacy protection technologies under the condition that the user agrees to the privacy protection strategy S generated by the privacy strategy generation unit.
According to still another aspect of the present disclosure, there is provided an intelligent terminal privacy information protection apparatus, including: a memory having instructions stored thereon; and a processor configured to execute instructions stored on the memory to perform a method according to the above aspects of the disclosure.
According to yet another aspect of the present disclosure, there is provided a computer-readable storage medium comprising computer-executable instructions that, when executed by one or more processors, cause the one or more processors to perform a method according to the above-mentioned aspect of the present disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
The present disclosure may be more clearly understood from the following detailed description with reference to the accompanying drawings, in which:
FIG. 1 shows a schematic diagram of the overall architecture of an intelligent end-user privacy preserving apparatus, according to one embodiment of the present disclosure;
FIG. 2 illustrates a flow diagram of a method of intelligent end-user privacy protection in accordance with one embodiment of the present disclosure; and
FIG. 3 illustrates an exemplary configuration in which an intelligent end-user privacy preserving apparatus according to embodiments of the present disclosure may be implemented.
Detailed Description
The following detailed description is made with reference to the accompanying drawings and is provided to assist in a comprehensive understanding of various exemplary embodiments of the disclosure. The following description includes various details to aid understanding, but these details are to be regarded as examples only and are not intended to limit the disclosure, which is defined by the appended claims and their equivalents. The words and phrases used in the following description are intended only to provide a clear and consistent understanding of the disclosure. In addition, descriptions of well-known structures, functions, and configurations may be omitted for clarity and conciseness. Those of ordinary skill in the art will recognize that various changes and modifications of the examples described herein can be made without departing from the spirit and scope of the disclosure.
Fig. 1 shows a schematic diagram of the overall architecture of an intelligent terminal privacy preserving apparatus 100 according to one embodiment of the present disclosure. The smart terminal may be, but is not limited to being, a desktop computer, a laptop computer, a tablet computer, a Personal Data Assistant (PDA), a smart phone, an in-vehicle computer, or a combination thereof.
The intelligent terminal privacy protecting apparatus 100 includes a privacy risk evaluating unit 101, a privacy preference collecting unit 102, a privacy policy generating unit 103, and a privacy management implementing unit 104.
The privacy risk evaluation unit 101 quantitatively evaluates the privacy risk the intelligent terminal faces in its existing configuration, for example by evaluating how the applications installed on the terminal use the various privacy permissions (such as the number and frequency of permission invocations), and generates a quantitative expression of the terminal's privacy risk, namely the privacy risk level L.
The privacy preference collecting unit 102 collects the privacy preferences of the intelligent terminal's user. A privacy preference represents how much the user values their private information and reflects the user's subjective intention to protect it. The preferences include how often the user allows the terminal or its applications to collect private information, which private information may not be collected, and which may. After collecting the user's preferences, the privacy preference collecting unit 102 quantifies them through a formal definition comprising two items, "privacy content" and "privacy weight": all privacy content items are quantified and combined according to their privacy weights to produce a quantitative expression of the user's privacy preferences, the user privacy preference degree P. The larger P is, the higher the user's demand for protection of private data; the smaller P is, the lower that demand.
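A minimal sketch of the weighted-combination model just described: each piece of "privacy content" carries an importance score and a "privacy weight", and P is their weighted combination scaled to [0, 10]. The item names, scores, and weights below are illustrative assumptions; the patent does not specify concrete values.

```python
def privacy_preference(items):
    """items: list of (content_name, importance in [0,1], weight in [0,1])."""
    total_weight = sum(w for _, _, w in items)
    if total_weight == 0:
        return 0.0
    # Weighted average of importance scores, mapped onto the 0-10 scale.
    return 10.0 * sum(imp * w for _, imp, w in items) / total_weight

P = privacy_preference([
    ("face_data",  1.0, 0.5),  # collection entirely forbidden by the user
    ("location",   0.6, 0.3),  # collection allowed only occasionally
    ("usage_logs", 0.2, 0.2),  # user is largely indifferent
])
# P = 10 * (0.5 + 0.18 + 0.04) / 1.0 = 7.2, a high privacy preference
```

Any monotone combination rule would fit the text equally well; the weighted average is chosen here only because it keeps P on the same [0, 10] scale as the risk level L.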
The privacy policy generation unit 103 receives the privacy risk level L from the privacy risk evaluation unit 101 and the user privacy preference degree P from the privacy preference collecting unit 102, and from these two indexes generates an index K0 named the "initial privacy overhead": K0 = αL + βP, where 0 ≤ α ≤ 1, 0 ≤ β ≤ 1, and α + β = 1.
The purpose of the privacy policy generation unit 103 is to generate a privacy protection policy S that brings the initial privacy overhead K0 below a preset threshold T. The threshold T represents the maximum actual privacy overhead the user is willing to accept after employing various privacy protection techniques. Assuming each privacy protection technique i (e.g., system permission invocation rules, or encryption or perturbation of the user's raw private data and terminal device information, including but not limited to the IMEI number and MAC address) can reduce the privacy overhead by Ki (i an integer, i = 1, …, n), the actual privacy overhead K is obtained by subtracting from the initial overhead K0 the overhead saved by the employed techniques:

K = K0 − Σ(i=1..n) ωi Ki

where ωi is a weight coefficient with 0 ≤ ωi ≤ 1. If the actual privacy overhead K is less than or equal to the threshold T, i.e., K ≤ T, the combination of the privacy protection techniques participating in the calculation of K constitutes a privacy protection policy S that satisfies the privacy protection requirements.
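One possible reading of this search, written as a greedy sketch: compute K0 = αL + βP, subtract the weighted saving ωi·Ki of each enabled technique, and accept the current combination as policy S as soon as K drops to T or below. The candidate order, the greedy acceptance rule, and the technique names are assumptions not specified by the patent.

```python
def initial_overhead(L, P, alpha=0.5, beta=0.5):
    # K0 = alpha*L + beta*P, with alpha + beta = 1
    return alpha * L + beta * P

def actual_overhead(k0, enabled):
    """enabled: list of (name, K_i saving, omega_i weight)."""
    return k0 - sum(w * k for _, k, w in enabled)

def select_policy(k0, candidates, threshold):
    """Enable candidate techniques in order until K <= T; may stop early."""
    enabled = []
    for tech in candidates:
        enabled.append(tech)
        k = actual_overhead(k0, enabled)
        if k <= threshold:
            return [name for name, _, _ in enabled], k
    return None, actual_overhead(k0, enabled)  # no acceptable policy found

k0 = initial_overhead(L=8, P=6)  # 7.0, as in the embodiment below
policy, k = select_policy(
    k0,
    [("halve_location_calls", 3, 0.5),
     ("halve_camera_calls",   2, 0.3),
     ("halve_contacts_calls", 1, 0.2)],
    threshold=5,
)
```

Note that a greedy search can stop before enabling all candidates (here K already reaches 4.9 after two techniques); the patent's embodiment instead enables all three techniques at once, which is equally valid under the K ≤ T criterion.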
The privacy policy generation unit 103 transmits the privacy protection policy S to the intelligent terminal operating system in real time to ask whether the user agrees with it and, if the user agrees, transmits the privacy protection policy S to the privacy management implementing unit 104.
The privacy management implementing unit 104 is configured to map the privacy protection policy S into a combination of corresponding various privacy protection technologies and notify the smart terminal operating system to enable the privacy protection technologies when the user agrees to the privacy protection policy S generated by the privacy policy generating unit 103.
A flowchart of a smart terminal privacy protection method according to one embodiment of the present disclosure will now be described with reference to fig. 2.
The intelligent terminal invokes this privacy protection method each time a new Internet application is about to be installed.
First, the privacy risk assessment unit 101 and the privacy preference collection unit 102 may perform operations in parallel to generate the privacy risk level L and the user privacy preference P, respectively.
In step S21.1, the privacy risk assessment unit 101 quantitatively assesses the privacy risk the intelligent terminal faces in its existing configuration: it evaluates how the applications installed on the terminal use the various privacy permissions, such as the number and frequency of permission invocations, and generates a quantitative expression of the terminal's privacy risk, namely the privacy risk level L. L is set to a value in [0, 10], where [0, 3] is low risk, [4, 6] is medium risk, and [7, 10] is high risk. In the present embodiment the privacy risk level is high, with L = 8.
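The patent does not give a concrete scoring formula for L, so the sketch below is an illustrative assumption: weight each permission's daily invocation count by an assumed sensitivity, then clip the normalized score to the [0, 10] scale used in the text.

```python
# Assumed per-permission sensitivity weights (not from the patent).
SENSITIVITY = {"location": 1.0, "camera": 0.8, "read contact": 0.6, "gravity": 0.1}

def risk_level(permission_calls_per_day):
    """permission_calls_per_day: permission name -> average daily invocations."""
    score = sum(SENSITIVITY.get(p, 0.3) * calls  # unknown permissions: assumed 0.3
                for p, calls in permission_calls_per_day.items())
    # Normalization assumption: 50 weighted calls/day saturates the scale.
    return min(10.0, 10.0 * score / 50.0)

L = risk_level({"location": 200, "camera": 40, "read contact": 30})
# weighted score 200 + 32 + 18 = 250 -> clipped to 10.0 (high risk)
```

A terminal that only touches a low-sensitivity sensor, e.g. `risk_level({"gravity": 5})`, lands near the bottom of the scale, matching the low/medium/high banding described above.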
Then, in step S22.1, the privacy risk assessment unit 101 sends the generated intelligent terminal privacy risk level L to the privacy policy generation unit 103 of the intelligent terminal.
Step S21.2 may be performed in parallel with step S21.1: the privacy preference collecting unit 102 collects the privacy preferences of the intelligent terminal's user. A privacy preference represents how much the user values their private information and reflects the user's subjective intention to protect it; it includes how often the user allows the terminal or its applications to collect private information, which private information may not be collected, and which may. After collecting the user's preferences, the privacy preference collecting unit 102 quantifies them through a formal definition comprising the two items "privacy content" and "privacy weight", quantifying and combining all privacy content items according to their weights to produce a quantitative expression of the user's privacy preferences, namely the user privacy preference degree P.
The larger P is, the higher the user's demand for protection of private data; the smaller P is, the lower that demand. P is set to a value in [0, 10], where [0, 3] represents a low privacy preference, [4, 6] a medium privacy preference, and [7, 10] a high privacy preference. In the present embodiment the user privacy preference is medium, with P = 6.
Then, in step S22.2, the privacy preference collecting unit 102 sends the generated user privacy preference degree P to the privacy policy generating unit 103 of the intelligent terminal.
Next, in step S23, the privacy policy generation unit 103 calculates the initial privacy overhead K0 from the received privacy risk level L and user privacy preference degree P according to the following formula:

K0 = αL + βP, where 0 ≤ α ≤ 1, 0 ≤ β ≤ 1, and α + β = 1

In this embodiment α = β = 0.5, and the calculation yields:

K0 = 0.5 × 8 + 0.5 × 6 = 7
in addition, in another embodiment, the values of α and β may be:
α=1,β=0
in this case, the privacy policy generation unit 103 generates the privacy protection policy S in consideration of only the privacy risk level L generated by quantitative evaluation of the privacy risk that the intelligent terminal faces in its existing configuration.
In another embodiment, the values of α and β may be:
α=0,β=1
in this case, the privacy policy generation unit 103 generates the privacy protection policy S in consideration of only the user privacy preference degree P generated for the privacy preference of the smart terminal user.
The purpose of the privacy policy generation unit 103 is to generate a privacy protection policy S that brings the initial privacy overhead K0 below a preset threshold T, where T represents the maximum actual privacy overhead the user is willing to accept after employing various privacy protection techniques. In this embodiment the user's preset privacy overhead threshold is T = 5; since the computed initial privacy overhead K0 = 7 exceeds 5, some privacy protection techniques must be employed to reduce the actual privacy overhead K. Assuming each privacy protection technique i (e.g., system permission invocation rules, or encryption or perturbation of the user's raw private data and terminal device information, including but not limited to the IMEI number and MAC address) can reduce the privacy overhead by Ki (i an integer, i = 1, …, n), the actual privacy overhead K is obtained by subtracting from K0 the overhead saved by the employed techniques:

K = K0 − Σ(i=1..n) ωi Ki

where ωi is a weight coefficient with 0 ≤ ωi ≤ 1. If the actual privacy overhead K is less than or equal to the threshold T, i.e., K ≤ T, the combination of the privacy protection techniques participating in the calculation of K constitutes a privacy protection policy S that satisfies the privacy protection requirements.
In this embodiment, analysis of the applications installed on the intelligent terminal shows that most of them are non-real-time interactive applications, so a policy of reducing the invocation frequency of the system's privacy permissions is enabled for the three important privacy permissions "location", "camera", and "read contact". Assume that halving the "location" permission invocation frequency reduces the privacy overhead by K1 = 3 with weight ω1 = 0.5; halving the "camera" permission invocation frequency reduces it by K2 = 2 with weight ω2 = 0.3; and halving the "read contact" permission invocation frequency reduces it by K3 = 1 with weight ω3 = 0.2. The actual privacy overhead K is then:

K = 7 − (0.5 × 3 + 0.3 × 2 + 0.2 × 1) = 7 − 2.3 = 4.7
Since the calculated actual privacy overhead K = 4.7 is less than the preset threshold T = 5, the combination of the three frequency-halving techniques is determined to constitute the privacy protection policy S, i.e., S = {halve the invocation frequency of the three system permissions "location", "camera", and "read contact"}.
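The embodiment's arithmetic can be reproduced directly as a check, using exactly the values given in the text: L = 8, P = 6, α = β = 0.5, T = 5, and the three halved permissions with (K1, ω1) = (3, 0.5), (K2, ω2) = (2, 0.3), (K3, ω3) = (1, 0.2).

```python
alpha, beta = 0.5, 0.5
L_risk, P_pref, T = 8, 6, 5

K0 = alpha * L_risk + beta * P_pref       # 0.5*8 + 0.5*6 = 7.0
# (K_i, omega_i) for "location", "camera", "read contact"
savings = [(3, 0.5), (2, 0.3), (1, 0.2)]
K = K0 - sum(w * k for k, w in savings)   # 7.0 - (1.5 + 0.6 + 0.2) = 4.7
assert K <= T                             # 4.7 <= 5: policy S is acceptable
```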
Then, in step S24, the privacy policy generation unit 103 transmits the determined privacy protection policy S to the smart terminal operating system in real time.
Next, after receiving the privacy policy S, the intelligent terminal operating system actively prompts the user that, in order to control the privacy overhead, the calling frequency of the three system permissions of "location", "camera", and "reading contact" needs to be reduced by half, and asks the user whether to agree with the privacy protection policy S.
After obtaining the feedback that the user agrees to the privacy protection policy S, the intelligent terminal operating system feeds back the inquiry agreement result of the user on the privacy protection policy S to the privacy policy generation unit 103 in step S25.
Upon receiving the user's approval of the privacy protection policy S fed back from the intelligent terminal operating system, the privacy policy generation unit 103 transmits the policy S = {halve the invocation frequency of the three system permissions "location", "camera", and "read contact"} to the privacy management enforcement unit 104 in step S26.
Finally, in step S27, the privacy management enforcement unit 104 maps the received privacy protection policy S to a corresponding combination of various privacy protection techniques, and notifies the intelligent terminal operating system to enable these privacy protection techniques, for example, to reduce the calling frequency of the three system authorities "location", "camera", and "read contact" by half.
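A hypothetical dispatch sketch of step S27: each entry of policy S maps to a callable that notifies the operating system to enable the corresponding technique. The entry names and the OS hook are assumed placeholders; real mobile platforms expose no such throttling API directly.

```python
def halve_permission_frequency(permission):
    # Placeholder for the OS notification described in step S27.
    return f"throttle '{permission}' permission calls to half rate"

# Policy entry -> concrete privacy protection technique (assumed names).
TECHNIQUE_MAP = {
    "halve_location_calls": lambda: halve_permission_frequency("location"),
    "halve_camera_calls":   lambda: halve_permission_frequency("camera"),
    "halve_contacts_calls": lambda: halve_permission_frequency("read contact"),
}

def enforce(policy_s):
    """Map policy S to techniques and 'enable' each via the OS hook."""
    return [TECHNIQUE_MAP[entry]() for entry in policy_s]

actions = enforce(["halve_location_calls",
                   "halve_camera_calls",
                   "halve_contacts_calls"])
```

A table-driven mapping like this keeps the policy representation (abstract entries chosen by unit 103) decoupled from the concrete techniques the enforcement unit 104 activates, which is the separation the architecture above describes.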
It is to be understood that the present invention is not limited to the above-described steps: steps may be added or removed, and the order of the steps may differ. Nor must all of the above steps be performed; only some of them may be performed for a particular purpose or purposes.
It is worth noting that the intelligent terminal triggers the intelligent terminal privacy risk assessment in the disclosure whenever a new application is installed, modifies the privacy protection policy in real time according to the latest assessment result, and enables a corresponding privacy protection technology. These privacy protection techniques include, but are not limited to: the method comprises the steps of adjusting the application authority on the terminal, processing original privacy data and terminal equipment information (including but not limited to IMEI numbers and MAC addresses) by adopting technologies such as encryption and disturbance, and the like, so as to ensure timely protection of the privacy of users.
Therefore, according to the method and the device for protecting the privacy of the intelligent terminal user, the privacy risk under the existing configuration of the intelligent terminal is quantitatively evaluated, and the privacy protection strategy is generated by considering the personalized privacy preference of the user, so that different levels of privacy protection functions can be provided for terminals with different configurations and users with different privacy preferences. In addition, the privacy protection mechanism of the invention can provide different levels of privacy protection only by considering the privacy risks under the existing configuration of the intelligent terminal, or provide different levels of privacy protection only by considering the privacy preferences of the users, so as to meet the individual requirements of different users on the privacy protection.
FIG. 3 illustrates an exemplary configuration in which an intelligent end-user privacy preserving apparatus according to embodiments of the present invention may be implemented.
As shown in fig. 3, the intelligent end-user privacy protection arrangement may include one or more elements connected to or in communication with bus 302, possibly via one or more interfaces. Bus 302 can include, but is not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus, among others. The intelligent end-user privacy protection apparatus may include, for example, one or more processors 304, one or more input devices 306, and one or more output devices 308. The one or more processors 304 may be any kind of processor and may include, but are not limited to, one or more general purpose processors or special purpose processors (such as special purpose processing chips). Input device 306 may be any type of input device capable of inputting information to a computing device and may include, but is not limited to, a mouse, a keyboard, a touch screen, a microphone, and/or a remote control. Output device 308 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, a video/audio output terminal, a vibrator, and/or a printer.
The intelligent end-user privacy protecting apparatus may also include or be connected to a non-transitory storage device 314, which may be any non-transitory device capable of storing data and may include, but is not limited to, a disk drive, an optical storage device, solid-state memory, a floppy disk, a flexible disk, a hard disk, a tape or any other magnetic medium, a compact disc or any other optical medium, a cache memory and/or any other memory chip or unit, and/or any other medium from which a computer can read data, instructions and/or code. The apparatus may further comprise a random access memory (RAM) 310 and a read-only memory (ROM) 312. The ROM 312 may store programs, utilities, or processes to be executed in a non-volatile manner, while the RAM 310 may provide volatile data storage and store instructions related to the operation of the apparatus. The apparatus may also include a network/bus interface 316 coupled to a data link 318. The network/bus interface 316 may be any kind of device or system capable of enabling communication with external devices and/or networks, and may include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities, etc.).
Various aspects, embodiments, implementations, or features of the foregoing embodiments may be used alone or in any combination. Various aspects of the foregoing embodiments may be implemented by software, hardware, or a combination of hardware and software.
For example, the foregoing embodiments may be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of a computer readable medium include read-only memory, random-access memory, CD-ROMs, DVDs, magnetic tape, hard drives, solid state drives, and optical data storage devices. The computer readable medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
For example, the foregoing embodiments may take the form of hardware circuitry. Hardware circuitry may include any combination of combinational logic circuitry, clocked storage devices (such as flip-flops, latches, etc.), finite state machines, memory such as static random access memory or embedded dynamic random access memory, custom designed circuits, programmable logic arrays, and so on.
Although some embodiments of the present invention have been described in detail by way of illustration, it should be understood by those skilled in the art that the foregoing examples are for purposes of illustration only and are not intended to limit the scope of the present invention. It will be appreciated by those skilled in the art that modifications may be made to the above embodiments without departing from the scope and spirit of the invention. The scope of the invention is defined by the appended claims.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In some cases, multitasking and parallel processing may be advantageous.

Claims (12)

1. An intelligent terminal privacy information protection method, comprising:
quantitatively evaluating the privacy risk faced by the intelligent terminal under its existing configuration to generate a privacy risk level L;
collecting the privacy preferences of the intelligent terminal user to generate a user privacy preference degree P;
calculating the actual privacy overhead K after various privacy protection techniques are adopted, based on the privacy risk level L and the user privacy preference degree P;
comparing the actual privacy overhead K with a preset threshold value T, and if the actual privacy overhead K is less than or equal to the preset threshold value T, determining the combination of the adopted privacy protection techniques as a privacy protection policy S;
sending the privacy protection policy S to the intelligent terminal operating system to inquire whether the user agrees to the privacy protection policy S; and
in the case that the user agrees to the privacy protection policy S, mapping the privacy protection policy S to the combination of the various privacy protection techniques and notifying the intelligent terminal operating system to enable the various privacy protection techniques.
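The claimed flow can be sketched end to end as follows. This is only an illustration: the risk-evaluation, preference-collection, and overhead-computation callables below are hypothetical stand-ins for the quantitative steps of the claim, not anything specified by the patent.

```python
# Minimal sketch of the claim-1 flow (illustrative only; the helper
# callables are hypothetical stand-ins, not part of the patent).

def protect(evaluate_risk, collect_preference, compute_overhead,
            techniques, T, ask_user, enable):
    L = evaluate_risk()                      # privacy risk level L
    P = collect_preference()                 # user privacy preference degree P
    K = compute_overhead(L, P, techniques)   # actual privacy overhead K
    if K > T:                                # overhead exceeds threshold T:
        return None                          # no acceptable policy
    S = list(techniques)                     # policy S = combination of techniques
    if not ask_user(S):                      # OS asks the user to approve S
        return None
    for technique in S:                      # map S back to its techniques and
        enable(technique)                    # tell the OS to enable each one
    return S
```

For example, with stubs returning risk L = 7, preference P = 6, an overhead of 4.0 against threshold T = 5.0, and a user who approves, `protect` enables both techniques and returns them as the policy S.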
2. The intelligent terminal privacy information protection method of claim 1, wherein calculating the actual privacy overhead K after the various privacy protection techniques are adopted, based on the privacy risk level L and the user privacy preference degree P, comprises:
calculating the initial privacy overhead K₀ as follows:
K₀ = αL + βP, wherein 0 ≤ α ≤ 1, 0 ≤ β ≤ 1, and α + β = 1;
if K₀ > the threshold value T, subtracting from the initial privacy overhead K₀ the privacy overhead Kᵢ saved by employing each of the various privacy protection techniques, to obtain the actual privacy overhead K:
K = K₀ − Σᵢ ωᵢKᵢ,
wherein ωᵢ is a weight coefficient, 0 ≤ ωᵢ ≤ 1, i is an integer, and i = 1, …, n; and
the threshold value T represents the maximum actual privacy overhead that the user can accept for protecting privacy by employing the various privacy protection techniques.
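A sketch of the claim-2 overhead computation follows. Note the hedging: the formula images of the original filing are not reproduced in this text, so K₀ = αL + βP is a reconstruction from the stated constraints (weights α and β with α + β = 1), and all numeric values below are hypothetical.

```python
# Sketch of the claim-2 overhead computation. K0 = alpha*L + beta*P is
# reconstructed from the stated constraints (alpha + beta = 1); all
# numeric values are hypothetical examples.

def initial_overhead(L, P, alpha=0.5, beta=0.5):
    assert abs((alpha + beta) - 1.0) < 1e-9   # claim requires alpha + beta = 1
    return alpha * L + beta * P               # K0

def actual_overhead(K0, saved, weights, T):
    # If K0 exceeds the acceptable threshold T, subtract the weighted
    # overhead K_i saved by each adopted protection technique.
    if K0 <= T:
        return K0
    return K0 - sum(w * k for w, k in zip(weights, saved))

K0 = initial_overhead(L=7, P=6)                # K0 = 6.5 > T, so deduct savings
K = actual_overhead(K0, saved=[2.0, 1.5],
                    weights=[0.8, 0.6], T=5.0) # K is about 4.0, i.e. <= T
```

With these hypothetical numbers the savings bring K below T, so the combination of the two techniques would qualify as an acceptable policy S.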
3. The intelligent terminal privacy information protection method according to claim 1 or 2, wherein quantitatively evaluating the privacy risk faced by the intelligent terminal under its existing configuration to generate the privacy risk level L comprises:
quantitatively evaluating, for the applications installed in the intelligent terminal, the number and frequency of calls to the permissions for various kinds of private data, and generating a quantitative expression of the privacy risk of the intelligent terminal, namely the privacy risk level L.
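One way such a per-permission quantification could be realized is a simple scoring scheme, entirely hypothetical since the patent specifies no weights or formula: score each privacy permission by how many installed applications call it and how often, then clip to a [0, 10] scale.

```python
# Hypothetical risk scoring for the evaluation step: each installed app
# reports, per privacy permission, a call count over some observation
# window; risk grows with both the number of calling apps (breadth) and
# the total call volume (intensity). Weights are illustrative only.

def risk_level(permission_calls, weight_per_call=0.01, weight_per_app=0.5):
    # permission_calls: {permission: [calls_by_app1, calls_by_app2, ...]}
    score = 0.0
    for calls in permission_calls.values():
        score += weight_per_app * len(calls)   # breadth: apps using the permission
        score += weight_per_call * sum(calls)  # intensity: total call volume
    return min(10.0, score)                    # clip to the [0, 10] risk scale

L = risk_level({"location": [120, 40], "contacts": [15]})
```

Here two apps calling the location permission 160 times in total and one app reading contacts 15 times yield L = 3.25, a low-to-medium risk on the scale of claim 5.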
4. The intelligent terminal privacy information protection method according to claim 1 or 2, wherein the privacy preferences of the user include privacy contents and privacy weights, and the user privacy preference degree P is obtained by quantitatively combining the privacy contents according to the privacy weights.
5. The intelligent terminal privacy information protection method according to claim 1 or 2, wherein the privacy risk level L is a numerical value in the range [0, 10], wherein [0, 3] indicates low risk, [4, 6] medium risk, and [7, 10] high risk.
6. The intelligent terminal privacy information protection method according to claim 1 or 2, wherein the user privacy preference degree P is a numerical value in the range [0, 10], wherein [0, 3] represents a low privacy preference, [4, 6] a medium privacy preference, and [7, 10] a high privacy preference.
7. The intelligent terminal privacy information protection method according to claim 1 or 2, wherein the privacy protection techniques comprise system permission calling rules, and enabling encryption or perturbation processing of the user's original privacy data and the terminal device information.
8. The intelligent terminal privacy information protection method according to claim 1 or 2, wherein, for non-real-time interactive applications installed in the intelligent terminal, a privacy protection technique that reduces the frequency of system privacy permission calls is enabled.
9. An intelligent terminal privacy information protection apparatus (100), comprising:
a privacy risk evaluation unit (101) for quantitatively evaluating the privacy risk faced by the intelligent terminal under its existing configuration to generate a privacy risk level L;
a privacy preference collection unit (102) for collecting the privacy preferences of the intelligent terminal user to generate a user privacy preference degree P;
a privacy policy generation unit (103) for calculating the actual privacy overhead K after various privacy protection techniques are adopted, based on the privacy risk level L received from the privacy risk evaluation unit (101) and the user privacy preference degree P received from the privacy preference collection unit (102), and, in the case that the actual privacy overhead K is less than or equal to a preset threshold value T, determining the combination of the various privacy protection techniques as a privacy protection policy S and sending the privacy protection policy S to the intelligent terminal operating system to inquire whether the user agrees to the privacy protection policy S; and
a privacy management implementation unit (104) for, in the case that the user agrees to the privacy protection policy S generated by the privacy policy generation unit (103), mapping the privacy protection policy S received from the privacy policy generation unit (103) to the combination of the various privacy protection techniques and notifying the intelligent terminal operating system to enable the various privacy protection techniques.
10. The intelligent terminal privacy information protection apparatus of claim 9, wherein calculating the actual privacy overhead K after the various privacy protection techniques are adopted, based on the privacy risk level L and the user privacy preference degree P, comprises:
calculating the initial privacy overhead K₀ as follows:
K₀ = αL + βP, wherein 0 ≤ α ≤ 1, 0 ≤ β ≤ 1, and α + β = 1;
if K₀ > the threshold value T, subtracting from the initial privacy overhead K₀ the privacy overhead Kᵢ saved by employing each of the various privacy protection techniques, to obtain the actual privacy overhead K:
K = K₀ − Σᵢ ωᵢKᵢ,
wherein ωᵢ is a weight coefficient and 0 ≤ ωᵢ ≤ 1; and
the threshold value T represents the maximum actual privacy overhead that the user can accept for protecting privacy by employing the various privacy protection techniques.
11. An intelligent terminal privacy information protection device, comprising:
a memory having instructions stored thereon; and
a processor configured to execute instructions stored on the memory to perform the method of any of claims 1 to 8.
12. A computer-readable storage medium comprising computer-executable instructions that, when executed by one or more processors, cause the one or more processors to perform the method of any one of claims 1-8.
CN202011194847.XA 2020-10-30 2020-10-30 Intelligent terminal privacy protection method and device Pending CN114443184A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011194847.XA CN114443184A (en) 2020-10-30 2020-10-30 Intelligent terminal privacy protection method and device


Publications (1)

Publication Number Publication Date
CN114443184A true CN114443184A (en) 2022-05-06

Family

ID=81358417

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011194847.XA Pending CN114443184A (en) 2020-10-30 2020-10-30 Intelligent terminal privacy protection method and device

Country Status (1)

Country Link
CN (1) CN114443184A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9356961B1 (en) * 2013-03-11 2016-05-31 Emc Corporation Privacy scoring for cloud services
CN109388963A (en) * 2017-08-08 2019-02-26 武汉安天信息技术有限责任公司 A kind of mobile terminal user's private data means of defence and device
CN109684865A (en) * 2018-11-16 2019-04-26 中国科学院信息工程研究所 A kind of personalization method for secret protection and device
CN110365679A (en) * 2019-07-15 2019-10-22 华瑞新智科技(北京)有限公司 Context aware cloud data-privacy guard method based on crowdsourcing assessment
CN110519218A (en) * 2019-07-05 2019-11-29 中国科学院信息工程研究所 A kind of method for protecting privacy and system based on privacy leakage assessment
US20200175193A1 (en) * 2018-11-29 2020-06-04 GM Global Technology Operations LLC Systems and methods for preserving the privacy of collected vehicular data


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhang Panpan; Peng Changgen; Hao Chenyan: "A Privacy Protection Model Based on Privacy Preference and Its Quantification Method" (一种基于隐私偏好的隐私保护模型及其量化方法), Computer Science (计算机科学), vol. 45, no. 06, 15 June 2018 (2018-06-15), pages 130-134 *
Zhu Guang; Feng Mining; Chen Ye; Yang Jiayun: "Fuzzy Evaluation of Social Network Privacy Risk in the Big Data Environment" (大数据环境下社交网络隐私风险的模糊评估研究), Information Science (情报科学), vol. 34, no. 09, 5 September 2016 (2016-09-05), pages 94-98 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination