CN112784323B - Information protection device and electronic equipment - Google Patents

Information protection device and electronic equipment

Info

Publication number
CN112784323B
CN112784323B (application CN202011633518.0A)
Authority
CN
China
Prior art keywords
viewer
information
display screen
face
peeping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011633518.0A
Other languages
Chinese (zh)
Other versions
CN112784323A (en)
Inventor
区国雄
Current Assignee
Shenzhen Fushi Technology Co Ltd
Original Assignee
Shenzhen Fushi Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Fushi Technology Co Ltd filed Critical Shenzhen Fushi Technology Co Ltd
Priority to CN202011633518.0A priority Critical patent/CN112784323B/en
Publication of CN112784323A publication Critical patent/CN112784323A/en
Application granted granted Critical
Publication of CN112784323B publication Critical patent/CN112784323B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70 - Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82 - Protecting input, output or interconnection devices
    • G06F21/84 - Protecting input, output or interconnection devices; output devices, e.g. displays or monitors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/4401 - Bootstrapping
    • G06F9/4418 - Suspend and resume; Hibernate and awake
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 - Arrangements for program control, e.g. control units
    • G06F9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 - Arrangements for executing specific programs
    • G06F9/451 - Execution arrangements for user interfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 - Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Hardware Design (AREA)
  • Computer Security & Cryptography (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides an information protection device for protecting information displayed on a display screen from being peeped at by unauthorized persons. The device comprises: an information collector for acquiring three-dimensional data of a viewer's face; and a processor for analyzing whether the acquired three-dimensional facial data satisfies a preset peeping judgment condition to determine whether the viewer poses a peeping risk, for identifying, once a peeping risk has been determined, whether the viewer is a preset authorized person by matching the viewer's three-dimensional facial data against a preset authorized-person identity feature template, and for executing an information protection operation when the at-risk viewer is identified as an unauthorized person. The application also provides an electronic device incorporating the information protection device.

Description

Information protection device and electronic equipment
Technical Field
The application belongs to the field of biological identification, and particularly relates to an information protection device and electronic equipment.
Background
With the growing functionality of electronic devices such as notebook computers, tablet computers, mobile phones, and self-service terminals, more and more important tasks are handled on these devices, which in turn raises the risk that important information will leak from them. For example, users operating electronic devices in public places are easily peeped at by bystanders, leading to information leakage. How to prevent such peeping while an electronic device is in use is therefore an important problem that information protection must address.
Disclosure of Invention
The technical problem the application aims to solve is to provide an information protection device and an electronic device that prevent information leakage caused by bystanders peeping while the electronic device is in use.
An embodiment of the application provides an information protection device for protecting information displayed on a display screen from being peeped at by unauthorized persons, comprising: an information collector for acquiring three-dimensional data of a viewer's face; and a processor for analyzing whether the acquired three-dimensional facial data satisfies a preset peeping judgment condition to determine whether the viewer poses a peeping risk, for identifying, once a peeping risk has been determined, whether the viewer is a preset authorized person by matching the viewer's three-dimensional facial data against a preset authorized-person identity feature template, and for executing an information protection operation when the at-risk viewer is identified as an unauthorized person.
In some embodiments, the peeping judgment conditions include a first judgment condition: the distance between the viewer's face and the display screen is smaller than a preset peeping distance threshold;
the peeping judgment conditions include a second judgment condition: the viewer's visible area overlaps a preset key area of the display screen;
the processor judges that the viewer poses a peeping risk when either the first or the second judgment condition is met; or alternatively,
the processor judges that the viewer poses a peeping risk only when the first and second judgment conditions hold simultaneously.
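The two judgment conditions and their OR/AND combination described above can be sketched as follows. This is an illustrative sketch only: the function name, parameters, and the 60 cm default threshold are assumptions, not taken from the patent (which elsewhere lists 20 cm to 80 cm as example thresholds).

```python
def peeping_risk(distance_cm, overlaps_key_area,
                 distance_threshold_cm=60.0, require_both=False):
    """Judge peeping risk from the two preset conditions.

    First judgment condition: the face-to-screen distance is below the
    preset peeping distance threshold.  Second judgment condition: the
    viewer's visible area overlaps the key area of the display screen.
    """
    cond1 = distance_cm < distance_threshold_cm
    cond2 = overlaps_key_area
    # Either condition alone suffices, or both must hold simultaneously,
    # depending on the configured mode.
    return (cond1 and cond2) if require_both else (cond1 or cond2)
```

In OR mode a nearby viewer triggers protection even while looking away from the key area; in AND mode both proximity and overlap are required before the viewer is judged to pose a risk.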
In some embodiments, the processor is configured to construct the viewer's eye-visible angle range by deviating from the viewer's face orientation, taken as the center, by a set eye-visibility angle threshold along each preset direction, and to take the area this range covers on the plane of the display screen as the viewer's visible area at the current position and face orientation.
In some embodiments, the processor calculates the face orientation of the viewer from the acquired three-dimensional data of the face of the viewer.
In some embodiments, the processor extracts feature points of the viewer's face, connects the extracted feature points to construct a corresponding face reference plane, and calculates a vertical vector of the face reference plane as the viewer's face orientation from three-dimensional data of the feature points of the face reference plane.
In some embodiments, the information collector acquires three-dimensional data of the face of the viewer according to one or more of structured light sensing principles, time-of-flight sensing principles, binocular vision sensing principles.
In some embodiments, the information protection operation includes turning off the display screen, popping up an anti-peeping text prompt, changing the brightness of the display screen, issuing an anti-peeping sound prompt, recording evidence related to the peeping risk, and automatically raising an alarm.
In some embodiments, the information protection device further includes an auto-sensor, configured to obtain environmental information of an environment in which the display screen is located and/or status information of the display screen, and the processor wakes up the information protection device according to a comparison result of the obtained environmental information and/or status information and preset sensing reference information.
In some embodiments, the sensing reference information includes an audio feature template, a proximity distance threshold, a face-count threshold, and an acceleration change threshold. The processor compares the environmental sound information acquired by the automatic sensor with an audio feature template trained in advance for a specific scene, and wakes up the information protection device when the comparison indicates the display screen is in that preset scene;
or the automatic sensor acquires the distance between the viewer and the display screen, and the processor wakes up the information protection device when that distance is smaller than the preset proximity distance threshold;
or the automatic sensor acquires image information in front of the display screen, and the processor wakes up the information protection device when the number of faces appearing in the image exceeds the preset face-count threshold;
or the automatic sensor acquires the acceleration changes of the display screen, and the processor wakes up the information protection device when the sensed change amplitude exceeds the preset acceleration change threshold.
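The wake-up triggers above can be sketched as a single check in which any one trigger suffices. All parameter names and default thresholds here are illustrative assumptions, not values from the patent:

```python
def should_wake(sound_matches_template=False, proximity_cm=None,
                face_count=0, accel_delta=0.0,
                proximity_threshold_cm=100.0, face_count_threshold=1,
                accel_threshold=2.0):
    """Wake the information protection device when any trigger fires."""
    if sound_matches_template:                # preset scene recognized from audio
        return True
    if proximity_cm is not None and proximity_cm < proximity_threshold_cm:
        return True                           # a viewer has come close to the screen
    if face_count > face_count_threshold:     # more faces than expected in front
        return True
    if accel_delta > accel_threshold:         # the screen moved or shook abruptly
        return True
    return False
```

Keeping the device asleep until one of these lightweight sensors fires is what lets the power-hungry three-dimensional identification run only when needed.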
In certain embodiments, the auto-sensor comprises one or a combination of more of a microphone, an image sensor, a proximity sensor, an acceleration sensor.
The application also provides an electronic device, comprising a display screen for displaying information, and the information protection device according to the above embodiments.
According to the information protection device and its corresponding information protection method provided by the application, the peeping risk of a viewer in front of the display screen is judged first from the acquired three-dimensional data, and only then is the viewer's identity recognized; this avoids frequent, power-hungry identification based on three-dimensional data. Furthermore, when an at-risk viewer is identified as an unauthorized person, a corresponding information protection operation is executed automatically to prevent important information on the display screen from being peeped at.
Additional aspects and advantages of embodiments of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the application.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device provided with an information protection device according to an embodiment of the present application.
Fig. 2 is a schematic diagram of functional modules of the electronic device shown in fig. 1.
Fig. 3 is a schematic diagram of the calculation of the viewable area of a viewer in an embodiment of the application.
FIG. 4 is a schematic diagram of the calculation of the orientation of a viewer's face in an embodiment of the application.
Fig. 5 is a schematic diagram of the calculation of the orientation of the face of a viewer in another embodiment of the present application.
Fig. 6 is a schematic functional block diagram of the electronic device according to another embodiment of the present application.
Fig. 7 is a flowchart of an information protection method based on the information protection device according to an embodiment of the present application.
Fig. 8 is a substep flow chart of step S102 in fig. 7.
Fig. 9 is a substep flowchart of step S103 in fig. 7.
Fig. 10 is a substep flowchart of step S104 provided in the embodiment of the present application.
Fig. 11 is a substep flowchart of step S104 according to another embodiment of the present application.
Fig. 12 is a substep flowchart of step S104 according to still another embodiment of the present application.
Fig. 13 is a flowchart of an information protection method based on the information protection device according to another embodiment of the present application.
Detailed Description of Embodiments
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application. In the description of the present application, it should be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be interpreted as indicating or implying a relative importance or order of such features. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described feature. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the description of the present application, it should be noted that, unless explicitly specified or limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically connected, electrically connected or communicated with each other; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements or interaction relationship between the two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances.
The following disclosure provides many different embodiments, or examples, for implementing different structures of the application. To simplify the disclosure, only the components and arrangements of specific examples are described below. They are, of course, merely examples and are not intended to limit the application. Furthermore, the present application may repeat reference numerals and/or letters in the various examples for simplicity and clarity; such repetition does not itself indicate any particular relationship between the embodiments and/or configurations discussed. In addition, the specific processes and materials provided below are merely examples of implementing the technical solutions of the present application; those of ordinary skill in the art will recognize that the technical solutions may also be implemented with other processes and/or materials not described here.
Further, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the application. It will be appreciated, however, by one skilled in the art that the inventive aspects may be practiced without one or more of the specific details, or with other structures, components, etc. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring aspects of the application.
Referring to fig. 1 and 2 together, an embodiment of the present application provides an information protection device 1 for preventing information leakage caused by an unauthorized person peeping while an electronic device 2 is being operated. The electronic device 2 may be, but is not limited to, a computer, a mobile phone, or a self-service terminal, such as a bank automatic teller machine (ATM) or a touch-interactive terminal for self-service business set up in a bank hall or government office hall. The electronic device 2 comprises a display screen 3 arranged to display information during operation. When the information protection device 1 detects that an unauthorized person satisfying the peeping judgment condition has appeared in front of the display screen 3, it issues a prompt or hides the displayed information, thereby protecting the information shown on the display screen 3 from being peeped at.
The information protection device 1 comprises an information collector 12, a processor 14, and a control system 16. The information collector 12 is configured to collect depth information in a scene, which is then analyzed for purposes such as identity recognition or state sensing. Optionally, in some embodiments, the depth information includes the distance between the viewer and the display screen 3 and three-dimensional data of the viewer's face. The distance between the viewer and the display screen 3 is used to determine whether the viewer has entered peeping range, and hence whether the information protection device 1 needs to be woken up. From the three-dimensional data of the viewer's face, the processor 14 can reconstruct the face in three dimensions, building a three-dimensional point cloud of it. The information protection device 1 can then perform identity recognition by matching the acquired three-dimensional data of the viewer's face against the three-dimensional facial data of an authorized person. The authorized person's data is obtained by the information collector 12 three-dimensionally scanning the face of a legitimately authorized user of the electronic device 2, and can be stored for later recognition. In addition, in some embodiments, the information collector 12 may use infrared light for acquisition and recognition, so that three-dimensional facial data can be obtained accurately even in dark environments, giving better adaptability and stability.
Moreover, because human skin reflects and absorbs infrared light differently from other materials, the information collector 12 can also distinguish real human skin from a counterfeit face model or face photograph, improving the reliability of identity recognition.
Optionally, in some embodiments, the information collector 12 is disposed on the display screen 3. The information collector 12 and the display screen 3 have a fixed relative position relationship, and the depth information of the external object acquired by the information collector 12 relative to the information collector 12 can obtain the depth information of the corresponding external object relative to the display screen 3 through geometric conversion. Alternatively, in other embodiments, the information collector 12 may be disposed at other locations of the electronic device 2, which is not limited by the present application.
It will be appreciated that in some embodiments, the information collector 12 may include one or more sets of components corresponding to one or more three-dimensional sensing principles employed. For example: a structured light emitter (not shown) and an image sensor (not shown) according to the structured light sensing principle, at least two image sensors (not shown) according to the binocular vision sensing principle, or a light emitter (not shown) and a light receiver (not shown) according to the Time of Flight (TOF) sensing principle.
Fig. 2 is a schematic diagram of the functional modules of the electronic device 2 provided with the information protection device 1 according to an embodiment of the present application. The electronic device 2 comprises a storage medium 22 and a power supply 24. The information collector 12, processor 14, power supply 24, storage medium 22, and control system 16 may be interconnected by a bus to exchange data and signals.
The power supply 24 may supply power to the various components of the electronic device 2 by connecting to mains and performing a corresponding adapting process. The power supply 24 may also include an energy storage element such as a battery to provide power to the various components via the battery.
The storage medium 22 includes, but is not limited to, flash memory, electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), and hard disk. The storage medium 22 is used to store the preset authorized person's identity feature template, intermediate data generated during identity recognition, computer software code implementing the recognition and control functions, and the like.
Optionally, in some embodiments, the control system 16 includes one or more functional modules 160, the functional modules 160 including, but not limited to, a setup module 161, a depth information acquisition module 162, a face orientation calculation module 163, a viewable area calculation module 164, an identification module 165, a judgment module 166, and an information protection module 167. The functional module 160 may be firmware solidified in the corresponding storage medium 22 or computer software code stored in the storage medium 22. The functional modules 160 are executed by the corresponding one or more processors 14 to control the relevant components to implement the corresponding functions, such as: and an information protection function.
It will be appreciated that in other embodiments, the corresponding functions of one or more of the functional modules 160 in the control system 16 may also be implemented in hardware, such as any one or combination of the following: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.
The setting module 161 is configured to preset various parameter information that is required in the information protection process. The parameter information includes, but is not limited to, induction reference information required by auto-induction wake-up, peeping judgment conditions according to which peeping judgment is performed, reference data and identity feature templates required by identity recognition. The parameter information may be stored in the storage medium 22. It can be appreciated that the parameter information can be preset by the manufacturer before leaving the factory, or can be set or adjusted by the user during the use of the product.
As shown in fig. 3, in some embodiments, the reference data includes, but is not limited to, a screen reference direction Z, a distance threshold between the viewer's face and the display screen 3, and an eye-visibility angle threshold α. The screen reference direction Z is the direction perpendicular to the surface of the display screen 3. Because the information collector 12 is fixed relative to the display screen 3, the display screen 3 has definite coordinate values in a coordinate system established with the information collector 12 as reference point, so a vector perpendicular to the surface of the display screen 3 can be defined as the screen reference direction Z. The distance H between the viewer's face and the display screen 3 refers to the minimum distance between the two as acquired by the information collector 12. If this distance is smaller than a preset peeping distance threshold, the viewer is considered to pose a peeping risk. Optionally, the peeping distance threshold may be 20 cm, 30 cm, 45 cm, 60 cm, 80 cm, or the like.
Because a person's eyes can rotate within a certain angular range, a viewer can observe areas deviating from the face orientation F within that range while the head remains stationary. For this purpose, the eye-visibility angle threshold may be defined as the maximum angle by which the viewer's eye gaze E can deviate from the face orientation F while still observing. To effectively prevent peeping, it must be ensured that the visible area S (the region of the display screen's plane covered by the viewer's eye-visible angle range) does not overlap the display screen 3 or a specified partial area within it.
Optionally, in some embodiments, the eye visibility angle threshold may be 45 degrees, 50 degrees, 60 degrees, 70 degrees, or 80 degrees. It will be appreciated that the eye visibility angle threshold may take the same angle value or different angle values along different directions, for example: the threshold of eye visibility in the horizontal direction of the head is 60 degrees, and the threshold of eye visibility in the vertical direction of the head is 45 degrees. Therefore, a cone-shaped visual angle range of the eyes of the viewer can be constructed by setting the visual angle thresholds of the eyes along different directions of the head, and the area covered by the visual angle range of the eyes of the viewer on the plane of the display screen 3 is the visual area S of the viewer under the current face direction F. If the visual area S overlaps the display screen 3, it may be considered that the viewer has a peeping risk in the state where the viewer is currently located and the face is oriented to F; on the contrary, if the visual area S does not overlap with the display screen 3, it may be considered that the viewer is not at risk of peeping in the state of the current position and the face orientation F.
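The overlap test above can be sketched as follows: a point on the screen lies inside the viewer's cone-shaped visual range if the angle between the face orientation F and the face-to-point vector stays within the eye-visibility angle threshold. This is a simplified single-threshold (circular cone) version of the text's construction, which also allows different thresholds per direction; the names are illustrative:

```python
import math

def point_visible(face_pos, face_dir, screen_point, half_angle_deg):
    """True if screen_point lies inside the viewer's visual cone.

    The cone is centered on the face orientation face_dir with the given
    half-angle (the eye-visibility angle threshold).
    """
    # Vector from the viewer's face to the candidate point on the screen.
    v = [p - f for p, f in zip(screen_point, face_pos)]
    dot = sum(vi * di for vi, di in zip(v, face_dir))
    norm_v = math.sqrt(sum(vi * vi for vi in v))
    norm_d = math.sqrt(sum(di * di for di in face_dir))
    cos_a = max(-1.0, min(1.0, dot / (norm_v * norm_d)))
    return math.degrees(math.acos(cos_a)) <= half_angle_deg
```

Testing each point of the key area K this way yields the overlap decision: the viewer poses a peeping risk as soon as any point of K is visible.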
Alternatively, in some embodiments, it may not be necessary to shield the entire display screen 3 from view; only a local area of the display screen 3 used for displaying important information may be restricted. In this case, the viewer is considered to pose a peeping risk only when the viewer's visible area S overlaps a preset local area within the display screen 3. For convenience of explanation, the area of the display screen 3 that must not be peeped at is uniformly defined as the key area K, which may be the entire display screen 3 or a local area within it used for displaying important information.
Optionally, in some embodiments, the peeping judgment condition includes a first judgment condition and/or a second judgment condition. The first judgment condition is that the distance H between the viewer's face and the display screen 3 is smaller than a preset peeping distance threshold. The second judgment condition is that the viewer's visible area S overlaps the display screen 3 or a preset local area within it. It will be appreciated that in some embodiments, the viewer is judged to pose a peeping risk when either of the two judgment conditions is met; optionally, in other embodiments, both the first and second judgment conditions must hold simultaneously before the viewer is judged to be peeping.
The depth information acquisition module 162 is configured to control the information collector 12 to collect depth data in a scene, and to analyze and process that data to obtain the scene's depth information. Optionally, in some embodiments, the depth information includes the distance H between the viewer and the display screen 3 and three-dimensional data of the viewer's face. How the depth information is acquired depends on the three-dimensional sensing principle the information collector 12 employs. For example, in some embodiments the information collector 12 uses the structured light sensing principle: it projects a patterned light beam, such as a speckle pattern, toward the viewer or the space the viewer occupies, and then captures the light pattern that the beam forms there. The depth information acquisition module 162 obtains the depth information of the viewer's face or of the space by calculating the distortion between the captured light pattern and a preset reference-plane light pattern. In other embodiments, the information collector 12 uses the TOF principle, transmitting light beams at a specific frequency or over a specific period toward the viewer or the space the viewer occupies, then receiving the beams reflected back. The depth information acquisition module 162 obtains the depth information by calculating the time elapsed between transmitting and receiving each beam. In still other embodiments, the information collector 12 and the depth information acquisition module 162 may use the binocular vision sensing principle to acquire the depth of the viewer or the space the viewer occupies.
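For the TOF case, the depth follows directly from the round-trip time of the light: the beam travels to the viewer and back, so the one-way distance is half the speed of light times the elapsed time. A minimal sketch with illustrative names:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth_m(round_trip_seconds):
    """Depth from a time-of-flight measurement.

    The beam travels to the target and back, so the one-way distance is
    half the speed of light multiplied by the measured round-trip time.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0
```

For instance, a round trip of 4 ns corresponds to a depth of roughly 0.6 m, which is on the order of a typical face-to-screen distance.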
The face orientation calculation module 163 is configured to calculate the viewer's face orientation F from the acquired three-dimensional data of the viewer's face. Optionally, in some embodiments, as shown in fig. 4, the module extracts feature points of the viewer's face, acquires their three-dimensional data, connects the extracted feature points to construct a corresponding face reference plane, and calculates the perpendicular vector of that plane from the three-dimensional data of its constituent feature points as the vector of the face orientation F. Optionally, the facial feature points may be set to, but are not limited to, the left eye, right eye, nose tip, and mouth corners; the left and right eyes may also be represented by the corresponding eye corners.
The number of facial feature points connected by the face orientation F calculation module 163 is not limited in principle, but the feature points used to construct a face reference plane must lie in the same plane so that the constructed face reference plane is indeed planar. Optionally, in some embodiments, the face orientation F calculation module 163 may construct the face reference plane by connecting three feature points of the viewer's face. For example, as shown in fig. 4, the left eye, the right eye, and the left mouth corner form a face reference plane, and the face orientation F of the viewer is obtained by calculating the perpendicular vector of that plane.
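Assuming the three feature points are given as 3D coordinates (a common convention, though the patent does not fix one), the perpendicular vector of the face reference plane can be computed as the cross product of two edge vectors:

```python
import numpy as np

def face_orientation(left_eye, right_eye, mouth_corner):
    """Perpendicular (normal) vector of the plane through three facial
    feature points, returned as a unit vector."""
    p0, p1, p2 = (np.asarray(p, dtype=float)
                  for p in (left_eye, right_eye, mouth_corner))
    normal = np.cross(p1 - p0, p2 - p0)   # perpendicular to both edge vectors
    return normal / np.linalg.norm(normal)

# Feature points lying in the z = 0 plane: the orientation is perpendicular
# to that plane, i.e. along the z axis.
f = face_orientation([0, 0, 0], [6, 0, 0], [3, -5, 0])
```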
It will be appreciated that the face orientation F calculated from different choices of facial feature points (and thus different face reference planes) will differ slightly. Optionally, in some embodiments, as shown in fig. 5, the face orientations F1 and F2 may be calculated for two different face reference planes, and the direction of the sum of the two vectors taken as the face orientation F. Similarly, the perpendicular vectors of the face reference planes constructed from different sets of facial feature points may be added in turn to obtain the final face orientation F.
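The vector-summing step above can be sketched as follows (a simple illustration; the patent does not prescribe a particular normalization):

```python
import numpy as np

def combined_orientation(*normals):
    """Sum the perpendicular vectors of several face reference planes and
    return the direction of the resulting vector as a unit vector."""
    total = np.sum([np.asarray(n, dtype=float) for n in normals], axis=0)
    return total / np.linalg.norm(total)

# Two slightly different estimates F1 and F2 combined into one direction.
F = combined_orientation([0.0, 0.1, 1.0], [0.0, -0.1, 1.0])
```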
The visible area calculating module 164 is configured to calculate the range of the viewer's visible area S on the plane of the display screen 3 according to the viewer's face orientation F and a preset eye-visibility angle threshold. Optionally, in some embodiments, as shown in fig. 3, the visible area calculating module 164 constructs the viewer's eye-visibility angle range by taking the face orientation F as the center and deviating from it by the set eye-visibility angle threshold along each preset direction. Optionally, the viewer's eye-visibility angle range may generally be a three-dimensional cone. The visible area calculating module 164 then obtains the area S covered by this angle range on the plane of the display screen 3, for example by calculating the coordinates of the points on and within the boundary of the area S, and takes this area as the viewer's visible area S at the current position and face orientation F. It will be appreciated that the above calculation of the visible area S is performed within a coordinate system established with the information collector 12 as the reference point.
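One way to realize this, sketched here under assumed geometry (viewer position, screen sample points, and angle units are illustrative), is to test which points of the screen plane fall inside the viewing cone:

```python
import math
import numpy as np

def visible_region(viewer_pos, face_dir, screen_points, half_angle_deg):
    """Return the screen points that lie inside the viewer's viewing cone,
    i.e. within the eye-visibility angle threshold of the face orientation F."""
    v = np.asarray(viewer_pos, dtype=float)
    f = np.asarray(face_dir, dtype=float)
    f = f / np.linalg.norm(f)
    cos_limit = math.cos(math.radians(half_angle_deg))
    visible = []
    for p in screen_points:
        ray = np.asarray(p, dtype=float) - v
        ray = ray / np.linalg.norm(ray)
        if np.dot(ray, f) >= cos_limit:  # angle to F within the threshold
            visible.append(p)
    return visible

# Viewer half a meter in front of the screen plane z = 0, looking straight
# at it: nearby screen points fall inside the cone, far ones do not.
S = visible_region([0, 0, 0.5], [0, 0, -1],
                   [[0, 0, 0], [0.1, 0, 0], [2.0, 0, 0]], half_angle_deg=30)
```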
The judging module 166 is configured to judge whether the current viewer poses a peeping risk according to a preset peeping judgment condition. Optionally, if the peeping judgment condition is set so that only the first judgment condition needs to be satisfied, the judging module 166 compares the distance between the viewer's face and the display screen 3, acquired by the information collector 12, with a preset peeping distance threshold, and judges that the current viewer poses a peeping risk when that distance is smaller than the threshold.
Optionally, if the peeping judgment condition is set so that only the second judgment condition needs to be satisfied, the judging module 166 compares the viewer's visible area S at the current position and face orientation F with the preset key area K of the display screen 3. If the visible area S overlaps the key area K, the judging module 166 judges that the viewer poses a peeping risk; if it does not, the judging module 166 judges that the viewer poses no peeping risk. It will be appreciated that the viewer's current visible area S and the key area K of the display screen 3 may both be marked in the coordinate system established with the information collector 12 as the reference point; the judging module 166 may then determine whether the two overlap according to whether any coordinate point of the visible area S lies within the coordinate range of the key area K, or equivalently whether any coordinate point of the key area K lies within the coordinate range of the visible area S.
Optionally, if the peeping judgment condition is set so that the first and second judgment conditions must be satisfied at the same time, the judging module 166 judges that the current viewer poses a peeping risk when the distance between the viewer's face and the display screen 3 is smaller than the preset peeping distance threshold and the viewer's visible area S overlaps the key area K of the display screen 3.
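The combined condition can be sketched as a small predicate; the region representation (sets of coordinate points) and all names are illustrative assumptions:

```python
def peeping_risk(distance_m, visible_area, key_area, distance_threshold_m):
    """First condition: the viewer is closer than the peeping distance
    threshold. Second condition: the visible area S shares at least one
    coordinate point with the key area K. Both must hold."""
    too_close = distance_m < distance_threshold_m
    overlaps = any(point in key_area for point in visible_area)
    return too_close and overlaps

# Visible area and key area given as sets of screen coordinate points.
risky = peeping_risk(0.4, {(1, 1), (2, 2)}, {(2, 2), (3, 3)},
                     distance_threshold_m=0.6)
```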
The identifying module 165 is configured to identify whether the current viewer is a preset authorizer once it has been judged that the current viewer poses a peeping risk. Optionally, in some embodiments, the identifying module 165 identifies whether the viewer is a preset authorizer by matching the three-dimensional data of the viewer's face acquired by the information collector 12 against a preset authorizer identity template and analyzing the differences.
The information protection module 167 is configured to control the electronic device 2 to perform a corresponding information protection operation when the viewer posing a peeping risk is identified as not being a preset authorizer. Optionally, in some embodiments, the information protection operations include, but are not limited to: turning off the display screen 3, popping up an anti-peeping text prompt, changing the brightness of the display screen 3, issuing an anti-peeping sound prompt, recording evidence relating to the viewer posing the peeping risk, and raising an automatic alarm.
Optionally, in some embodiments, as shown in fig. 6, the information protection device 1 may further comprise an auto-sensor 16. The auto-sensor 16 is configured to obtain environmental information about the environment in which the electronic device 2 is located and/or status information of the electronic device 2. The environmental information includes sound information, image information, depth information, and the like. The status information includes acceleration information and the like.
Accordingly, in some embodiments, the control system 16 further includes an auto-sensing module 168. The setting module 161 is preset with sensing reference information, which may be pre-stored in the storage medium 22 and is used to trigger the wake-up function of the information protection device 1. The auto-sensing module 168 compares the environmental information and/or status information with the preset sensing reference information to wake up or shut down the information protection device 1. Optionally, the sensing reference information includes an audio feature template, a proximity distance threshold, a face-count threshold, an acceleration change threshold, and the like. For example, the auto-sensing module 168 may compare the acquired environmental sound information with an audio feature template trained in advance on a specific scene, and automatically wake up or shut down the information protection device 1 when the comparison determines that the electronic device 2 is in that scene. Or the auto-sensing module 168 analyzes the acquired distance between the viewer and the display screen 3 and automatically wakes up the information protection device 1 when that distance is smaller than a preset proximity distance threshold; optionally, the proximity distance threshold may be 1 meter, 2 meters, 3 meters, etc. Or the auto-sensing module 168 analyzes the acquired image information in front of the display screen 3 and automatically wakes up the information protection device 1 when the number of faces appearing in that image exceeds a preset face-count threshold; optionally, the face-count threshold may be 1, 2, 3, etc.
Or, according to the sensed acceleration change of the electronic device 2, when the sensed change in acceleration exceeds a preset acceleration change threshold, the electronic device 2 is judged to have been picked up, and the information protection device 1 is then automatically awakened.
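A minimal sketch of these wake-up triggers, with all thresholds and sensor readings as illustrative assumptions rather than values from the patent:

```python
def should_wake(distance_m=None, face_count=None, accel_delta=None,
                proximity_threshold_m=1.0, face_count_threshold=1,
                accel_threshold=2.0):
    """Wake the information protection device when any configured sensing
    reference threshold is crossed (proximity, face count, or acceleration)."""
    if distance_m is not None and distance_m < proximity_threshold_m:
        return True   # a viewer is closer than the proximity distance threshold
    if face_count is not None and face_count > face_count_threshold:
        return True   # more faces in front of the screen than allowed
    if accel_delta is not None and accel_delta > accel_threshold:
        return True   # acceleration jump: the device was picked up
    return False

# Two faces in front of the screen exceeds a face-count threshold of one.
wake = should_wake(face_count=2)
```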
Correspondingly, in some embodiments, the auto-sensor 16 includes, but is not limited to, one or more of a microphone, an image sensor, a proximity sensor, and an acceleration sensor, or a combination thereof. It will be appreciated that the image information used for auto-sensing described above may be obtained by an image sensor in the information collector 12, or by a separately provided image sensor.
It can be appreciated that the operating power consumption of the auto-sensor 16 is significantly lower than that of the information collector 12. By providing the auto-sensor 16 and the corresponding functional module 160, the electronic device 2 can wake the information protection device 1 to collect three-dimensional data through the information collector 12 only when peeping is possible, preventing the higher-power information collector 12 from operating for long periods and minimizing the overall power consumption of the electronic device 2 while still providing the information protection function.
Referring to fig. 2 and fig. 7 together, the embodiment of the application further provides a method for protecting information of the electronic device 2 by using the information protection device 1. The information protection method comprises the following steps:
Step S101: acquire three-dimensional data of the viewer's face. Optionally, in some embodiments, the processor 14 controls the information collector 12 to acquire three-dimensional data of the face of the viewer in front of the display screen 3 by executing the depth information acquiring module 162. How the depth information is obtained depends on the three-dimensional sensing principle adopted by the information collector 12. For example, in some embodiments the information collector 12 uses the structured-light sensing principle: it projects a patterned beam of light (for example, a speckle pattern) toward the viewer or the space in which the viewer is located, and then collects the light pattern that the patterned beam forms on the viewer or in that space. The depth information acquiring module 162 obtains the depth information of the viewer's face or of the space by calculating the distortion between the collected light pattern and a preset reference-plane light pattern. In other embodiments, the information collector 12 uses the TOF principle: it transmits light beams at a specific frequency or in specific time periods toward the viewer or the space in which the viewer is located, and then receives the light beams reflected back. The depth information acquiring module 162 obtains the depth information by calculating the time the light beam takes to travel from transmission to reception. In still other embodiments, the information collector 12 and the depth information acquiring module 162 may use the binocular vision sensing principle to obtain the depth of the viewer or of the space in which the viewer is located.
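For the binocular vision case mentioned above, depth is commonly recovered from the disparity between two rectified camera views. The formula below is the standard pinhole-stereo relation; the focal length and baseline values are illustrative:

```python
def binocular_depth(focal_px: float, baseline_m: float,
                    disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair: focal length f in
    pixels, camera baseline B in meters, disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A 40-pixel disparity with f = 800 px and a 5 cm baseline gives 1 m depth.
depth_m = binocular_depth(800.0, 0.05, 40.0)
```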
Step S102, the face orientation F of the viewer is calculated. Optionally, in some embodiments, the processor 14 calculates the face orientation F of the viewer from the acquired three-dimensional data of the face of the viewer by executing the face orientation F calculation module 163. Optionally, in some embodiments, referring to fig. 4, 5 and 8, the step S102 may further include the following sub-steps:
Step S1021: extract three-dimensional data of the viewer's facial feature points. Optionally, in some embodiments, the processor 14 extracts the three-dimensional data of the facial feature points from the acquired three-dimensional data of the viewer's face by executing the face orientation F calculation module 163. Optionally, the facial feature points may be, but are not limited to, the left eye, the right eye, the nose tip, and the mouth corners; the left/right eyes may also be replaced by the corresponding eye corners.
Step S1022: connect the extracted facial feature points to construct a corresponding face reference plane. Optionally, in some embodiments, the processor 14 connects the extracted facial feature points of the viewer by executing the face orientation F calculation module 163 to construct a corresponding face reference plane. In this step, the number of connected facial feature points is not limited in principle, but the feature points used to construct a face reference plane must lie in the same plane so that the constructed face reference plane is indeed planar. Optionally, in some embodiments, the face reference plane may be constructed by connecting three feature points of the viewer's face. For example, as shown in fig. 4, the left eye, the right eye, and the left mouth corner form a face reference plane, and the face orientation F of the viewer is obtained by calculating the perpendicular vector of that plane.
Step S1023: calculate the perpendicular vector of the face reference plane as the vector of the face orientation F. Optionally, in some embodiments, the processor 14 calculates the perpendicular vector of the face reference plane as the vector of the face orientation F from the three-dimensional data of the facial feature points constructing the plane, by executing the face orientation F calculation module 163.
It will be appreciated that the face orientation F calculated from different choices of facial feature points (and thus different face reference planes) will differ slightly. Optionally, in some embodiments, the face orientations may be calculated for two different face reference planes, and the direction of the sum of the two vectors taken as the face orientation F. Similarly, the perpendicular vectors of the face reference planes constructed from different sets of facial feature points may be added in turn to obtain the final face orientation F.
Step S103: calculate the range of the viewer's visible area S on the plane of the display screen 3 according to the viewer's face orientation F and the preset eye-visibility angle threshold. The processor 14 executes the visible area calculating module 164 to calculate the viewer's visible area S on the plane of the display screen 3 from the obtained face orientation F of the viewer and the preset eye-visibility angle threshold. Optionally, in some embodiments, referring to fig. 3 and fig. 9 together, step S103 may further include the following sub-steps:
Step S1031: construct the viewer's eye-visibility angle range. Optionally, in some embodiments, the processor 14 executes the visible area calculating module 164 to construct the viewer's eye-visibility angle range by taking the viewer's face orientation F as the center and deviating from it by the set eye-visibility angle threshold along each preset direction. Optionally, the viewer's eye-visibility angle range may generally be a three-dimensional cone.
Step S1032: calculate the area covered by the viewer's eye-visibility angle range on the plane of the display screen 3 and use it as the viewer's visible area S at the current position and face orientation F. Optionally, in some embodiments, the processor 14 executes the visible area calculating module 164 to obtain the area covered by the constructed eye-visibility angle range on the plane of the display screen 3, for example by calculating the coordinates of the points on and within the boundary of that area, and takes the area as the viewer's visible area S at the current position and face orientation F. It will be appreciated that the above calculation of the visible area S is performed within a coordinate system established with the information collector 12 as the reference point.
Step S104: judge whether the current viewer poses a peeping risk according to the preset peeping judgment condition. Optionally, in some embodiments, the processor 14 executes the judging module 166 to judge whether the current viewer poses a peeping risk according to the preset peeping judgment condition.
Optionally, in some embodiments, as shown in fig. 10, if the peeping judgment condition is set so that only the first judgment condition needs to be satisfied (the distance between the viewer's face and the display screen 3 is smaller than a preset distance threshold), step S104 includes the following sub-steps:
In step S1041, a distance between the face of the current viewer and the display screen 3 is acquired. Optionally, in some embodiments, the processor 14 executes the determination module 166 to obtain the distance between the current viewer's face and the display screen 3 via the information collector 12.
In step S1042, the obtained distance between the face of the current viewer and the display screen 3 is compared with a preset peeping distance threshold. Optionally, in some embodiments, the processor 14 executes the determination module 166 to compare the distance between the current viewer's face, as acquired by the information collector 12, and the display screen 3 to a preset peeping distance threshold.
Step S1043: when the distance between the viewer's face and the display screen 3 is smaller than the preset peeping distance threshold, judge that the current viewer poses a peeping risk. When the distance between the viewer's face and the display screen 3 is greater than the preset peeping distance threshold, judge that the current viewer poses no peeping risk.
Optionally, in some embodiments, as shown in fig. 11, if the peeping judgment condition is set so that only the second judgment condition needs to be satisfied (the current viewer's visible area S overlaps the preset key area of the display screen 3), step S104 includes the following sub-steps:
Step S1044: compare the current viewer's visible area S with the preset key area of the display screen 3. The processor 14 executes the judging module 166 to compare the viewer's visible area S at the current position and face orientation F with the key area of the display screen 3. It will be appreciated that the viewer's current visible area S and the key area of the display screen 3 may both be marked in the coordinate system established with the information collector 12 as the reference point; whether the two overlap may be determined according to whether any coordinate point of the visible area S lies within the coordinate range of the key area, or equivalently whether any coordinate point of the key area lies within the coordinate range of the visible area S.
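The coordinate-range overlap check can be sketched as follows, with both regions modeled as axis-aligned coordinate ranges for illustration (the patent does not fix a region representation):

```python
def regions_overlap(region_a, region_b):
    """Each region is ((x_min, y_min), (x_max, y_max)) on the screen plane.
    The regions overlap exactly when their coordinate ranges intersect on
    both the x axis and the y axis."""
    (ax0, ay0), (ax1, ay1) = region_a
    (bx0, by0), (bx1, by1) = region_b
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

# A visible area S that reaches into the key area K of the display screen.
overlap = regions_overlap(((0, 0), (5, 5)), ((4, 4), (9, 9)))
```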
Step S1045: if the visible area S overlaps the key area of the display screen 3, judge that the current viewer poses a peeping risk. If the visible area S does not overlap the key area of the display screen 3, the judging module 166 judges that the viewer poses no peeping risk.
Optionally, in some embodiments, as shown in fig. 12, if the peeping judgment condition is set so that the first and second judgment conditions must be satisfied at the same time, step S104 includes the following sub-steps:
Step S1046: acquire the distance between the current viewer's face and the display screen 3. The processor 14 executes the judging module 166 to acquire this distance through the information collector 12.
Step S1047: compare the acquired distance between the current viewer's face and the display screen 3 with the preset peeping distance threshold. The processor 14 executes the judging module 166 to compare the distance acquired by the information collector 12 with the preset peeping distance threshold.
Step S1048: when the distance between the viewer's face and the display screen 3 is smaller than the preset peeping distance threshold, compare the obtained visible area S of the viewer with the preset key area of the display screen 3.
Step S1049: if the distance between the current viewer's face and the display screen 3 is smaller than the preset distance threshold and the current viewer's visible area S overlaps the key area of the display screen 3, judge that the current viewer poses a peeping risk. If the distance is greater than the preset distance threshold, or the viewer's visible area S does not overlap the key area of the display screen 3, judge that the current viewer poses no peeping risk.
Step S105, if it is determined that the current viewer has a peeping risk, identifying whether the identity of the viewer is a preset authorizer.
Optionally, in some embodiments, the processor 14 identifies whether the viewer is a preset authorizer by executing the identifying module 165 to match the three-dimensional data of the viewer's face acquired by the information collector 12 against a preset authorizer identity template and analyze the differences.
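One simple (illustrative) way to realize this matching is to compare the acquired 3D face data with the stored template point by point and accept the viewer as the authorizer when the mean deviation stays below a threshold. The metric and threshold here are assumptions, not taken from the patent:

```python
import math

def is_authorizer(face_points, template_points, max_mean_dev=0.01):
    """Match acquired 3D facial data against the preset authorizer identity
    template: compute the mean Euclidean deviation between corresponding
    points and accept when it is below the threshold (same length units)."""
    if len(face_points) != len(template_points):
        return False
    total = 0.0
    for p, q in zip(face_points, template_points):
        total += math.dist(p, q)
    return total / len(face_points) < max_mean_dev

# Acquired data nearly identical to the stored template: authorized.
ok = is_authorizer([(0, 0, 0), (1, 0, 0)], [(0, 0, 0.001), (1, 0, 0)])
```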
Step S106: if the viewer posing the peeping risk is identified as not being the preset authorizer, control the electronic device 2 to execute the corresponding information protection operation. Optionally, in some embodiments, the processor 14 implements the corresponding information protection operation by executing the information protection module 167. The information protection operations include, but are not limited to: turning off the display screen 3, popping up an anti-peeping text prompt, changing the brightness of the display screen 3, issuing an anti-peeping sound prompt, recording evidence relating to the viewer posing the peeping risk, raising an automatic alarm, etc.
With the information protection device 1 and the corresponding information protection method provided by the application, the peeping risk of the viewer is first judged from the acquired three-dimensional data of the viewer in front of the display screen 3, and the viewer's identity is identified only afterwards, which avoids frequent, power-hungry identification based on three-dimensional data. Furthermore, when a viewer posing a peeping risk is identified as an unauthorized person, a corresponding information protection operation can be performed automatically to prevent important information on the display screen 3 from being peeped at.
Referring to fig. 6 and fig. 13 together, in some other embodiments, the information protection method provided by the present application further includes step S100 executed before step S101:
Step S100: start steps S101 to S106 of the information protection method according to the environmental information about the environment in which the electronic device 2 is located and/or the status information of the electronic device 2. The processor 14 starts or stops the information protection method by executing the auto-sensing module 168, based on the environmental information and status information of the electronic device 2 acquired by the auto-sensor 16. The environmental information includes sound information, image information, depth information, and the like. The status information includes acceleration information and the like.
Specifically, the environmental information and/or status information of the electronic device 2 is acquired by the auto-sensor 16, and the acquired information is compared with the preset sensing reference information. The processor 14 compares the acquired environmental information and/or status information with the preset sensing reference information by executing the auto-sensing module 168 and wakes up or shuts down the information protection device 1 according to the result of the comparison.
Optionally, the sensing reference information includes an audio feature template, a proximity distance threshold, a face-count threshold, an acceleration change threshold, and the like. For example, the auto-sensing module 168 may compare the acquired environmental sound information with an audio feature template trained in advance on a specific scene, and automatically wake up or shut down the information protection device 1 when the comparison determines that the electronic device 2 is in that scene. Or the auto-sensing module 168 analyzes the acquired distance between the viewer and the display screen 3 and automatically wakes up the information protection device 1 when that distance is smaller than a preset proximity distance threshold; optionally, the proximity distance threshold may be 1 meter, 2 meters, 3 meters, etc. Or the auto-sensing module 168 analyzes the acquired image information in front of the display screen 3 and automatically wakes up the information protection device 1 when the number of faces appearing in that image exceeds a preset face-count threshold; optionally, the face-count threshold may be 1, 2, 3, etc. Or, according to the sensed acceleration change of the electronic device 2, when the sensed change in acceleration exceeds a preset acceleration change threshold, the electronic device 2 is judged to have been picked up, and the information protection device 1 is then automatically awakened.
It will be appreciated that the operating power consumption of the auto-sensor 16 is significantly lower than that of the information collector 12. With the information protection method provided by the application, the information protection device 1 is awakened to collect three-dimensional data through the information collector 12 only when possible peeping is sensed, which avoids long-term operation of the higher-power information collector 12 and minimizes the overall power consumption of the electronic device 2 while providing the information protection function.
In the description of the present specification, reference to the terms "one embodiment," "certain embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the application also includes further implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in the reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art to which the application pertains.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processing module, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by, or in connection with, the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium may even be paper or another suitable medium on which the program is printed, as the program can be captured electronically, for instance by optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer storage medium.
It is to be understood that portions of embodiments of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented by software or firmware stored in the storage medium 22 and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, they may be implemented using any one of the following techniques, or a combination thereof, all well known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and the like.
The foregoing description of the preferred embodiments of the application is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the application.

Claims (6)

1. An information protection device for protecting information displayed on a display screen from being peeped at by an unauthorized person, the information protection device comprising:
an information collector configured to collect depth information in a scene according to a three-dimensional sensing principle, the depth information including three-dimensional data of a viewer's face; and
a processor configured to analyze whether the acquired three-dimensional data of the viewer's face satisfies a preset peeping judgment condition so as to judge whether the viewer poses a peeping risk, to identify, after judging that the viewer poses a peeping risk, whether the viewer is a preset authorizer by matching the difference between the three-dimensional data of the viewer's face and a preset authorizer identity feature template, and to execute an information protection operation when the viewer posing the peeping risk is identified as an unauthorized person;
wherein the processor extracts feature points of the viewer's face, connects the extracted facial feature points to construct a corresponding face reference plane, and calculates the perpendicular vector of the face reference plane as the viewer's face orientation from the three-dimensional data of the facial feature points constructing the face reference plane;
wherein the processor takes the viewer's face orientation as the center, deviates from the face orientation by a set eye-visibility angle threshold along each preset direction to construct the viewer's eye-visibility angle range, and takes the area covered by the eye-visibility angle range on the plane of the display screen as the viewer's visible area at the current position and face orientation;
wherein the peeping judgment condition comprises a first judgment condition and a second judgment condition, the first judgment condition being that the distance between the viewer's face and the display screen is smaller than a preset peeping distance threshold, and the second judgment condition being that the viewer's visible area overlaps a preset key area of the display screen;
wherein the processor judges that the viewer poses a peeping risk when the first judgment condition and the second judgment condition hold at the same time, in which case the depth information further includes the distance between the viewer's face and the display screen; or alternatively
the processor judges that the viewer poses a peeping risk when only the second judgment condition is satisfied.
2. The information protection device according to claim 1, wherein the information protection operation comprises turning off the display screen, popping up a text prompt warning against peeping, changing the brightness of the display screen, issuing an audible anti-peeping alert, recording evidence related to the peeping risk, and automatically raising an alarm.
3. The information protection device according to claim 1 or 2, further comprising an automatic sensor configured to acquire environmental information about the environment in which the display screen is located and/or state information of the display screen, wherein the processor wakes up the information protection device according to a comparison between the acquired environmental and/or state information and preset sensing reference information.
4. The information protection device according to claim 3, wherein the sensing reference information includes an audio feature template, a proximity distance threshold, a face-count threshold, and an acceleration-change threshold; the processor compares environmental sound information acquired by the automatic sensor with the audio feature template trained in advance in a specific scene, and wakes up the information protection device when the comparison determines that the display screen is in the preset specific scene;
or the automatic sensor acquires the distance between the viewer and the display screen, and the processor wakes up the information protection device when that distance is smaller than the preset proximity distance threshold;
or the automatic sensor acquires image information in front of the display screen, and the processor wakes up the information protection device when the number of faces appearing in the image exceeds the preset face-count threshold;
or the automatic sensor acquires the acceleration of the display screen, and the processor wakes up the information protection device when the sensed acceleration change exceeds the preset acceleration-change threshold.
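The four alternative wake-up triggers of claim 4 amount to a disjunction over sensor readings against the sensing reference information. A minimal sketch (threshold values and parameter names are illustrative assumptions, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class SensingReference:
    # Thresholds corresponding to the sensing reference information of claim 4;
    # the concrete values here are assumptions for illustration only.
    proximity_threshold_m: float = 0.8
    face_count_threshold: int = 2
    accel_change_threshold: float = 3.0  # m/s^2

def should_wake(ref, distance_m=None, face_count=None,
                accel_change=None, audio_matches_scene=False):
    # Any single trigger wakes the device, mirroring the "or" between
    # the claim's alternatives; None means the reading is unavailable.
    if audio_matches_scene:                                    # audio template matched
        return True
    if distance_m is not None and distance_m < ref.proximity_threshold_m:
        return True                                            # viewer approached
    if face_count is not None and face_count > ref.face_count_threshold:
        return True                                            # extra faces in front
    if accel_change is not None and accel_change > ref.accel_change_threshold:
        return True                                            # device was moved
    return False
```

For example, `should_wake(SensingReference(), face_count=3)` wakes the device because three faces exceed the assumed face-count threshold of two.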
5. The information protection device according to claim 3, wherein the automatic sensor comprises one or more of a microphone, an image sensor, a proximity sensor, and an acceleration sensor.
6. An electronic device, comprising:
a display screen for displaying information; and
an information protection device according to any one of claims 1 to 5, for protecting the information displayed on the display screen from being peeped by an unauthorized person.
CN202011633518.0A 2020-12-31 2020-12-31 Information protection device and electronic equipment Active CN112784323B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011633518.0A CN112784323B (en) 2020-12-31 2020-12-31 Information protection device and electronic equipment


Publications (2)

Publication Number Publication Date
CN112784323A CN112784323A (en) 2021-05-11
CN112784323B true CN112784323B (en) 2024-05-03

Family

ID=75754774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011633518.0A Active CN112784323B (en) 2020-12-31 2020-12-31 Information protection device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112784323B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023156475A1 (en) * 2022-02-15 2023-08-24 Trinamix Gmbh Method for protecting information displayed on a display device and display device
CN115294925A (en) * 2022-09-16 2022-11-04 浙江亿洲电子科技有限公司 LED display screen control system for privacy protection according to environment detection

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010277205A (en) * 2009-05-26 2010-12-09 Nec Corp Information processing device, method and program
JP2012008773A (en) * 2010-06-24 2012-01-12 Hitachi Omron Terminal Solutions Corp Cash automatic transaction device, program and transaction method
CN102610035A (en) * 2012-04-05 2012-07-25 广州广电运通金融电子股份有限公司 Financial self-service device and anti-peeping system and anti-peeping method thereof
CN106156663A (en) * 2015-04-14 2016-11-23 小米科技有限责任公司 A kind of terminal environments detection method and device
CN106682540A (en) * 2016-12-06 2017-05-17 上海斐讯数据通信技术有限公司 Intelligent peep-proof method and device
CN107908983A (en) * 2017-11-14 2018-04-13 维沃移动通信有限公司 A kind of method and mobile terminal for controlling mobile terminal screen
CN108322596A (en) * 2017-12-26 2018-07-24 努比亚技术有限公司 A kind of display control method, terminal and computer readable storage medium
CN109376518A (en) * 2018-10-18 2019-02-22 深圳壹账通智能科技有限公司 Privacy leakage method and relevant device are prevented based on recognition of face
CN110619240A (en) * 2019-09-23 2019-12-27 珠海格力电器股份有限公司 Anti-peeping device, anti-peeping method and display terminal equipment
CN110955912A (en) * 2019-10-29 2020-04-03 平安科技(深圳)有限公司 Privacy protection method, device and equipment based on image recognition and storage medium thereof
CN111611630A (en) * 2020-04-14 2020-09-01 上海卓易科技股份有限公司 Mobile terminal peeping prevention method and mobile terminal


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xie Jianbin et al. (eds.), Visual Perception and Intelligent Video Surveillance (《视觉感知与智能视频监控》), National University of Defense Technology Press, 2012, pp. 224-225. *

Also Published As

Publication number Publication date
CN112784323A (en) 2021-05-11

Similar Documents

Publication Publication Date Title
EP3872689B1 (en) Liveness detection method and device, electronic apparatus, storage medium and related system using the liveness detection method
EP3647129A1 (en) Vehicle, vehicle door unlocking control method and apparatus, and vehicle door unlocking system
WO2020135096A1 (en) Method and device for determining operation based on facial expression groups, and electronic device
EP3453794B1 (en) Foreign matter recognition method and device
US9436862B2 (en) Electronic apparatus with segmented guiding function and small-width biometrics sensor, and guiding method thereof
CN112784323B (en) Information protection device and electronic equipment
JP6702045B2 (en) Monitoring device
CN108446638B (en) Identity authentication method and device, storage medium and electronic equipment
CN104091128B (en) A kind of terminal
US9877196B2 (en) User authentication systems and methods
CN110209273A (en) Gesture identification method, interaction control method, device, medium and electronic equipment
CN108647504B (en) Method and system for realizing information safety display
CN102129554A (en) Method for controlling password input based on eye-gaze tracking
CN112632510A (en) Information protection method and storage medium
CN110619239A (en) Application interface processing method and device, storage medium and terminal
KR101958878B1 (en) Method for security unlocking of terminal and terminal thereof
CN108629278B (en) System and method for realizing information safety display based on depth camera
CN212160784U (en) Identity recognition device and entrance guard equipment
CA2955072A1 (en) Reflection-based control activation
CN104796539A (en) Terminal state control method
CN111695509A (en) Identity authentication method, identity authentication device, machine readable medium and equipment
CN103745199A (en) Risk prevention financial self-help acceptance device and method on basis of face recognition technology
CN109144221B (en) Information prompting method, storage medium and electronic equipment
CN111223219A (en) Identity recognition method and storage medium
JP2009156948A (en) Display control device, display control method, and display control program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant