CN113051538A - Information unlocking method and electronic equipment - Google Patents


Info

Publication number
CN113051538A
CN113051538A (application CN202110345690.4A; granted as CN113051538B)
Authority
CN
China
Prior art keywords
parameters, space, unlocking, motion, information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110345690.4A
Other languages
Chinese (zh)
Other versions
CN113051538B (en)
Inventor
刘曾发
关硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110345690.4A priority Critical patent/CN113051538B/en
Publication of CN113051538A publication Critical patent/CN113051538A/en
Application granted granted Critical
Publication of CN113051538B publication Critical patent/CN113051538B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/602 Providing cryptographic facilities or services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/64 Protecting data integrity, e.g. using checksums, certificates or signatures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Bioethics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an information unlocking method and an electronic device, belonging to the technical field of device unlocking. The information unlocking method includes: receiving a first input requesting to unlock target information; acquiring parameters of a first spatial trajectory drawn in space by a second device paired with the first device, where the parameters of the first spatial trajectory are determined from motion parameters detected by a spatial motion sensor configured on the second device; and unlocking the target information when the parameters of the first spatial trajectory are determined to match a second spatial trajectory. Because the trajectory of the second device moving in space is difficult for others to observe, the unlocking key is kept well concealed and the security of private information is improved.

Description

Information unlocking method and electronic equipment
Technical Field
The present application belongs to the technical field of device unlocking, and in particular relates to an information unlocking method and an electronic device.
Background
At present, information in a device is typically encrypted by setting a string password, setting an unlocking gesture, or the like. When a string password is entered, people around the user can easily observe the keyboard, and an unlocking gesture is displayed directly on the screen. These unlocking methods therefore make it easy for others to see the unlocking key, leaving the information insecure.
Disclosure of Invention
The embodiments of the present application aim to provide an information unlocking method and an electronic device that solve the problem of the unlocking key being easily observed, which leaves information insecure.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an information unlocking method, the method including: receiving a first input requesting to unlock target information; acquiring parameters of a first spatial trajectory drawn in space by a second device paired with the first device, where the parameters of the first spatial trajectory are determined from motion parameters detected by a spatial motion sensor configured on the second device; and unlocking the target information when the parameters of the first spatial trajectory are determined to match a second spatial trajectory.
In a second aspect, an embodiment of the present application provides an information unlocking apparatus, including: a first receiving unit, configured to receive a first input requesting to unlock target information; a first acquiring unit, configured to acquire parameters of a first spatial trajectory drawn in space by a second device paired with the first device, where the parameters of the first spatial trajectory are determined from motion parameters detected by a spatial motion sensor configured on the second device; and an unlocking unit, configured to unlock the target information when the parameters of the first spatial trajectory are determined to match a second spatial trajectory.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or an instruction stored on the memory and executable on the processor, and when the program or the instruction is executed by the processor, the method for unlocking information according to the first aspect is implemented.
In a fourth aspect, an embodiment of the present application provides an information unlocking system, where the system includes: a first device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the information unlocking method according to the first aspect; a second device comprising a spatial motion sensor for detecting a motion parameter.
In a fifth aspect, the present application provides a readable storage medium, on which a program or instructions are stored, and when executed by a processor, the program or instructions implement the steps of the information unlocking method according to the first aspect.
In a sixth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the information unlocking method according to the first aspect.
In the embodiments of the application, a user can draw a first spatial trajectory in space with the second device, and the first device matches it against a known second spatial trajectory that serves as the key for unlocking information. This provides a new unlocking method: because the trajectory of the second device moving in space is difficult for bystanders to observe, the unlocking key is kept well concealed and the security of private information is improved. When the method is applied to a head-mounted device, others cannot see the stereoscopic figure imaged in the glasses, which further improves the security of the key.
Drawings
Fig. 1 is a schematic architecture diagram of an information unlocking system according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an information unlocking method provided in an embodiment of the present application;
fig. 3 is a schematic diagram of a first spatial trajectory in an information unlocking method provided in an embodiment of the present application;
fig. 4 is a first schematic diagram of a second spatial trajectory in an information unlocking method provided in the embodiment of the present application;
fig. 5 is a schematic diagram illustrating a combination of a first spatial trajectory and a second spatial trajectory in an information unlocking method provided in an embodiment of the present application;
fig. 6 is a second schematic diagram of a second spatial trajectory in the information unlocking method according to the embodiment of the present application;
fig. 7 is a block diagram of an information unlocking device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It is to be understood that the data so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The information unlocking method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
An embodiment of the present application provides an information unlocking system, as shown in fig. 1, which is a schematic diagram of an architecture of an optional information unlocking system, and includes a first device 201 and a second device 202.
The first device 201 comprises a processor, a memory and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the information unlocking method according to the first aspect; the second device 202 comprises a spatial motion sensor for detecting a motion parameter.
The first device 201 may pair and communicate with the second device 202 in a wired or wireless manner, for example establishing the pairing relationship and the communication connection via Bluetooth, infrared, WLAN, or similar communication methods. After the communication connection is established, the first device 201 and the second device 202 can exchange data. The communication method used to establish the pairing relationship may differ from the one used to establish the communication connection.
The information unlocking method provided by the embodiment of the application can be applied to the first device 201 and executed by the first device 201.
As an example, the memory of the first device 201 stores instructions, and the instructions are executed by the processor to control the first device 201 to execute the information unlocking method provided in the embodiment of the present application. The first device 201 may be an electronic device such as a mobile phone, a tablet computer, an all-in-one machine, a notebook computer, and a smart television, or may also be a head-mounted electronic device, for example, 3D smart glasses, VR glasses, and the like, where the lens may be used as a display screen.
The second device 202 may be an electronic device configured with a spatial motion sensor. Based on the motion parameters acquired by the spatial motion sensor, a trajectory of the second device 202 moving in space may be determined. The spatial motion sensor may include a gravitational acceleration sensor, a gyroscope, a geomagnetic sensor, and the like. Illustratively, the second device 202 may be a wearable device, such as a ring, bracelet, or the like, or the second device may be a smart stylus designed in a pen-like fashion.
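As an illustrative sketch only, and not part of the disclosure, a trajectory could be reconstructed from such motion parameters by dead reckoning. The `MotionSample` format and the double-integration scheme below are assumptions; a real device would fuse gyroscope and geomagnetic data with the accelerometer readings and correct for drift:

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    # Hypothetical sample format: timestamp in seconds, plus linear
    # acceleration (gravity already removed) in m/s^2 per axis.
    t: float
    ax: float
    ay: float
    az: float

def integrate_trajectory(samples):
    """Dead-reckon a rough spatial trajectory by double-integrating
    acceleration between consecutive samples. Illustrative only:
    real systems need sensor fusion and drift correction."""
    points = [(0.0, 0.0, 0.0)]
    vx = vy = vz = 0.0
    for prev, cur in zip(samples, samples[1:]):
        dt = cur.t - prev.t
        vx += prev.ax * dt           # integrate acceleration -> velocity
        vy += prev.ay * dt
        vz += prev.az * dt
        x, y, z = points[-1]
        points.append((x + vx * dt,  # integrate velocity -> position
                       y + vy * dt,
                       z + vz * dt))
    return points
```

Either the second device or the first device could run such a routine, depending on where the trajectory is generated.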
Fig. 2 is a schematic flow chart of an example implementation of an information unlocking method provided in an embodiment of the present application, including the following steps 101 to 103:
step 101, a first input for requesting unlocking target information is received.
The target information may be any information on the first device. It is locked in advance so that its content cannot be viewed; the content becomes viewable after the target information is unlocked. For example, the target information may be the system of the first device, so that unlocking the target information amounts to unlocking the system; or it may be an application installed on the first device, so that unlocking the target information amounts to unlocking that application; or it may be a folder on the first device, so that unlocking the target information amounts to unlocking that folder.
The first input is used to request unlocking of the target information. It may be received through an input device configured on the first device or an input device communicatively coupled to the first device. For example, if the first device is a notebook computer, the input device may be a keyboard or a touch screen, and the first input may be a click of any keyboard key that triggers the request to unlock the target information; or, if the first device is a pair of 3D smart glasses, the input device may be a smart band or a stylus pen paired with the glasses, and the first input may be a tap on the smart band's screen or a press of a button on the stylus.
And 102, acquiring parameters of a first space track drawn by the motion of a second device paired with the first device in space.
As an alternative embodiment, the second device may be the device that receives the first input. The second device can communicate with the first device over a wired connection (for example, a serial connection such as USB) or a wireless connection (for example, Bluetooth, infrared, or WIFI). The second device may be paired with the first device in advance; for example, the pairing may be performed by installing a driver for the second device, by entering on the first device a pairing password provided by the second device, by entering on the second device a pairing password provided by the first device, by connecting the second device to a wireless local area network provided by the first device, and so on. The possibilities cannot be exhausted and are not described further.
It should be noted that the second device is an input device provided with a spatial motion sensor. The second device or the first device may determine a trajectory of the second device moving in the space according to the motion parameters acquired by the space motion sensor. The spatial motion sensor may include a gravitational acceleration sensor, a gyroscope, a geomagnetic sensor, and the like. Illustratively, the second device may be a wearable device, such as a ring, bracelet, or the like, or the second device may be a smart stylus designed in a pen-like fashion.
Based on the motion parameters acquired by the second device, the parameters of the first spatial trajectory drawn by the second device may be determined. Since the second device acquires the motion parameters of any movement in space, the user can indicate through a specified input when drawing starts and when it ends, in order to reduce the influence of accidental movements of the second device on the drawing.
For example, the second device may be a smart stylus fitted with a switch button that toggles between on and off with each click. Movement is recorded as drawing only while the switch is on, so the user can turn the switch on before starting each stroke and off after finishing it, preventing the incidental movements made between strokes (for example, when moving to draw a different face) from being recorded in the first spatial trajectory.
The first spatial trajectory may be any spatial trajectory, for example, the first spatial trajectory may be a movement trajectory caused by a user waving his arm in the air while wearing the second device at a wrist/arm or the like.
In one example, the first spatial trajectory may be an outline of a solid figure in space. For example, the first spatial trajectory may be an outline of a polyhedron, each face of which is a polygon, so that the user may draw the outline of the above-mentioned solid figure in space when drawing the first spatial trajectory. In this case, the parameters of the first spatial trajectory may include data of vertex coordinates and/or side lengths of the first spatial trajectory, and the like.
One specific way to obtain the parameters of the first spatial trajectory is to determine them after the first spatial trajectory has been generated.
Alternatively, the parameters of the first spatial trajectory may be determined directly from the motion parameters detected by the spatial motion sensor configured on the second device. For example, the position parameters of the spatial points whose deflection angles are greater than a preset threshold may be determined from those motion parameters, yielding the coordinates of the inflection points of the first spatial trajectory.
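The deflection-angle idea above can be sketched as follows. The point-list input format and the 30-degree threshold are illustrative assumptions, not values taken from the disclosure:

```python
import math

def deflection_angle(p0, p1, p2):
    """Angle (radians) between segments p0->p1 and p1->p2."""
    v1 = [b - a for a, b in zip(p0, p1)]
    v2 = [b - a for a, b in zip(p1, p2)]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    if n1 == 0 or n2 == 0:
        return 0.0
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.acos(cos_a)

def find_inflection_points(points, threshold=math.radians(30)):
    """Keep the endpoints plus every point whose deflection angle
    exceeds the preset threshold; these approximate the vertices
    (inflection points) of the drawn figure."""
    corners = [points[0]]
    for p0, p1, p2 in zip(points, points[1:], points[2:]):
        if deflection_angle(p0, p1, p2) > threshold:
            corners.append(p1)
    corners.append(points[-1])
    return corners
```

For a trajectory that traces a polyhedron's edges, the surviving points approximate its vertex coordinates, which step 103 can then compare against the stored key.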
The first spatial trajectory, or its parameters, may be generated directly by the second device from the detected motion parameters, or the second device may send the motion parameters to the first device, which then generates them.
And 103, under the condition that the parameters of the first space track are determined to be matched with the second space track, unlocking the target information.
The second spatial trajectory is a preset trajectory, and may specifically be preset by a user or preset by a factory. In one example, the second spatial trajectory may be an outline of a solid figure, or, for example, a trajectory of a user's wrist/finger/arm movement while performing a pre-specified set of gymnastics, taijiquan, dance, etc., when the second device is worn at the wrist/finger/arm location.
In one example, determining that the parameter of the first spatial trajectory matches the second spatial trajectory may be that the parameter of the first spatial trajectory differs from the corresponding parameter of the second spatial trajectory by less than a specified threshold.
For example, the parameter may be a motion parameter of a spatial trajectory, the motion parameter of the first spatial trajectory may be a motion parameter of a spatial trajectory drawn by the user when unlocking, and the motion parameter of the second spatial trajectory may be a stored motion parameter of a spatial trajectory drawn by the user when setting the key.
For another example, the obtained parameters of the first spatial trajectory may include the inflection point coordinates and/or side lengths of the first spatial trajectory. Determining that the parameters of the first spatial trajectory match the second spatial trajectory may then include: determining a match when the errors between the inflection point coordinates and/or side lengths of the first spatial trajectory and those of the second spatial trajectory are within a first preset range.
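A minimal sketch of such a match test, assuming the parameters are lists of corresponding 3D inflection points and using a hypothetical tolerance `tol` as the "first preset range". A practical system would also normalize the drawn trajectory for position, scale, and orientation before comparing:

```python
import math

def trajectories_match(params_a, params_b, tol=0.05):
    """Return True when every pair of corresponding inflection
    points differs by less than `tol` (the first preset range).
    Hypothetical parameter format: lists of (x, y, z) points in
    drawing order."""
    if len(params_a) != len(params_b):
        return False
    for pa, pb in zip(params_a, params_b):
        if math.dist(pa, pb) >= tol:   # per-vertex error check
            return False
    return True
```

The same comparison pattern applies to side lengths: compute each edge length from consecutive inflection points and check the differences against the preset range.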
In another example, the second spatial trajectory may be the outline of a preset stereoscopic figure. After the first input is received, part of the preset first stereoscopic graphic may be displayed on the display screen of the first device, and the user then draws the remaining part so that, together with the displayed part, it completes the preset first stereoscopic graphic. For example, fig. 3 shows an exemplary first stereoscopic graphic; after the first input is received, the second stereoscopic graphic shown in fig. 4 may be displayed, the second stereoscopic graphic being a portion of the first stereoscopic graphic of fig. 3. If the difference between the parameters of the first spatial trajectory drawn by the user and the solid-line spatial trajectory shown in fig. 5 is smaller than a specified threshold, the drawn first spatial trajectory can be combined with the partial figure shown in fig. 4 to form the preset figure shown in fig. 3, and the parameters of the first spatial trajectory are determined to match the second spatial trajectory.
The following further describes an exemplary information unlocking method provided in the embodiment of the present application.
In an example embodiment, the step 102 of acquiring parameters of a first space track drawn by a second device paired with a first device through motion in a space may include the following steps 1021-1024:
step 1021, receiving the motion parameter sent by the second device.
And step 1022, generating a third space track drawn by the second device through motion in the space according to the motion parameters.
And step 1023, correcting the third space track into a smooth straight line or a smooth curve to obtain the first space track.
Step 1024, determining parameters of the first spatial trajectory.
After the spatial motion sensor of the second device detects the motion parameters, it can transmit them to the first device in real time. The first device receives the motion parameters and generates the third spatial trajectory from them. Since the second device is operated by hand, the third spatial trajectory may not be smooth; in order to generate a smooth figure, it may be corrected into a smooth straight line or curve, and the corrected third spatial trajectory is the first spatial trajectory. The parameters of the first spatial trajectory, which may be pre-specified, can then be determined; for example, if the first spatial trajectory outlines a stereoscopic figure, its parameters may be the vertex coordinates and/or side lengths of that figure. This implementation reduces the amount of computation performed by the second device, lowering its computational cost, and yields a smooth contour for the first spatial trajectory that better matches the user's drawing intent.
Further, regarding step 1022: to prevent incidental movements of the second device made while the user draws the first spatial trajectory from entering the third spatial trajectory generated from the motion parameters, whether a movement of the second device is a drawing operation may be determined by whether the second device has received a pre-specified target operation.
Thus, step 1022 may include the following steps 201-203:
step 201, receiving a drawing switch identifier for the motion parameter sent by the second device.
And step 202, determining effective parameters in the motion parameters according to the drawing switch identification.
And step 203, generating a third space track according to the effective parameters.
The drawing switch identifier indicates whether a motion parameter is a valid parameter. Since the motion parameters are obtained in real time, they may include, in addition to the valid parameters captured during drawing, invalid parameters captured in the non-drawing state. The drawing switch identifier may be generated based on a target operation received by the second device. For example, when the user presses a drawing button on the smart stylus (the second device), or clicks the drawing button to turn the drawing function on, the smart stylus sets the drawing switch identifier to on; when the user releases the button or turns the drawing function off, the smart stylus sets it to off. The drawing switch identifier and the motion parameters can be sent to the first device synchronously, so that the valid parameters among the motion parameters can be determined according to the drawing switch identifier and the contour of the stereoscopic figure can be generated from the valid parameters, yielding the third spatial trajectory.
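Steps 201 to 203 can be sketched as follows, assuming (hypothetically) that the second device streams `(draw_switch_on, motion_parameter)` pairs:

```python
def split_strokes(samples):
    """Group the valid motion parameters into strokes: a new stroke
    starts each time the drawing switch turns on. Samples taken while
    the switch is off (e.g. moving the stylus to another face of the
    figure) are discarded as invalid parameters."""
    strokes = []
    prev_on = False
    for switch_on, param in samples:
        if switch_on:
            if not prev_on:
                strokes.append([])   # switch just turned on: new stroke
            strokes[-1].append(param)
        prev_on = switch_on
    return strokes
```

The resulting per-stroke parameter lists are the valid parameters of step 202, from which the third spatial trajectory of step 203 can be generated.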
In an exemplary embodiment, the second spatial trajectory is the outline of the first stereoscopic graphic. The first stereoscopic graphic may be a polyhedron, each face of which is a polygon. After the first input requesting to unlock the target information is received in step 101, an image of the outline of a second stereoscopic graphic, rendered from the target viewing angle, may also be displayed on the display screen of the first device, where the second stereoscopic graphic is a part of the first stereoscopic graphic, such as the first stereoscopic graphic shown in fig. 3 and the second stereoscopic graphic shown in fig. 4. In this case, the acquired first spatial trajectory may be the outline of the stereoscopic figure drawn by the user. Accordingly, in step 103, one way to determine that the parameters of the first spatial trajectory match the second spatial trajectory is to combine the first spatial trajectory with the contour of the second stereoscopic graphic to obtain the contour of a third stereoscopic graphic, and then check whether the error between the contour of the third stereoscopic graphic and the contour of the first stereoscopic graphic is within a second preset range. If it is, the parameters of the first spatial trajectory are determined to match the second spatial trajectory.
It should be noted that the target viewing angle may be a preset viewing angle. When the first device is a head-mounted device, the target view angle is a view angle of a preset direction (for example, the preset direction may be a direction toward the front in the middle of two lenses of the head-mounted device) in a coordinate system of the head-mounted device at a spatial position where the head-mounted device is located.
In one example, after acquiring the parameters of the first spatial trajectory drawn by the second device paired with the first device through motion in space in step 102, the imaging of the parameters of the first spatial trajectory based on the target viewing angle may also be displayed on the display screen of the head-mounted device, where the display screen of the head-mounted device is a lens of the head-mounted device, and the target viewing angle is a viewing angle at a spatial position where the head-mounted device is located and facing a preset direction in a coordinate system of the head-mounted device.
In an example embodiment, the user may also set the second spatial trajectory before receiving the first input requesting the unlock target information in step 101. Specifically, before the step 101 is executed, the method further comprises the following steps 104-105:
step 104, receiving a second input requesting to set a second spatial trajectory.
And 105, acquiring a second space track drawn by the motion of the second device in the space.
The interaction mode between the first device and the second device when the second spatial trajectory is drawn may be similar to that when the first spatial trajectory is drawn, and details are not repeated here. Optionally, in order to prevent jitter of the spatial trajectory caused by unstable user operation, after obtaining the spatial trajectory drawn by the user operating the second device, smooth correction may be performed to obtain the second spatial trajectory.
According to the information unlocking method provided by the embodiments of the present application, a user can draw a first spatial trajectory in space with the second device, and the first device matches it against a known second spatial trajectory that serves as the key for unlocking information. This provides a new unlocking method: because the trajectory of the second device moving in space is difficult for bystanders to observe, the unlocking key is kept well concealed and the security of private information is improved. When the method is applied to a head-mounted device, others cannot see the stereoscopic figure imaged in the glasses, which further improves the security of the key.
The information unlocking method provided by the embodiment of the present application is exemplarily described below with reference to an optional specific application scenario.
In this application scenario, the first device is a pair of 3D (three-dimensional) smart glasses and the second device is a smart stylus. First, the smart stylus and the 3D smart glasses establish a connection through Bluetooth or WIFI; after the connection is established, data can be exchanged between the smart stylus and the smart glasses in real time. The smart stylus is provided with sensors such as a gravity acceleration sensor, a gyroscope and a geomagnetic sensor, and the processor on the smart stylus can calculate the motion trajectory of the stylus from the data provided by these sensors; meanwhile, the 3D graphic trajectory drawn with the smart stylus can be seen in real time through the smart glasses. It should be noted that the trajectory displayed on the lens screen of the 3D smart glasses is an image rendered based on the viewing angle of the 3D smart glasses.
First, the user can set a stereoscopic graphic (the first stereoscopic graphic) as the password. The following are example flow steps of the operations when setting the password:
Step 501: the user enters the password setting menu of the 3D smart glasses.
Step 502: the software system of the 3D smart glasses prompts the user to draw a 3D graphic.
Step 503: the user draws the 3D graph for the first time by using the intelligent handwriting pen successfully paired with the 3D intelligent glasses.
Step 504: the 3D smart glasses store 3D graphics.
Step 505: the software system of the 3D smart glasses prompts the user to draw the 3D graphic again.
Step 506: the software system of the 3D smart glasses compares the two 3D graphics drawn; if they are consistent (the difference is within a preset range), the flow proceeds to step 507, and if not, it returns to step 505.
Step 507: the user is reminded that the password has been set successfully.
Then, after the user has set the second spatial trajectory as the password, two exemplary embodiments of unlocking are provided.
The first exemplary embodiment includes the following process steps:
Step 601: the user initiates a request in the software system of the 3D smart glasses to access an already encrypted file system (the target information to be unlocked).
Step 602: the software system of the 3D smart glasses prompts the user to enter an unlocking password.
Step 603: the user hand-draws 3D graphics with the smart stylus.
Step 604: the software system of the 3D smart glasses compares the 3D graphic drawn by the user's hand (a graphic formed by the outline of the first spatial trajectory) with the graphic preset in the software system of the 3D smart glasses (the first stereoscopic graphic). If they are consistent, the flow proceeds to step 605; if not, it returns to step 602.
Step 605: unlock and enter the file system.
In the first embodiment, when unlocking is needed, a 3D graphic is drawn with the smart stylus; unlocking succeeds if the drawn 3D graphic is consistent with the preset 3D graphic, and fails otherwise. The preset 3D graphic may be a specific 3D graphic customized with the smart stylus and preset as the encryption key of the file system; the specific process of setting the key may be as described in steps 501 to 507.
The second exemplary embodiment includes the following process steps:
Step 701: the user initiates a request in the software system of the 3D smart glasses to access an already encrypted file system (the target information to be unlocked).
Step 702: the 3D smart glasses project a 3D stereoscopic graphic (the second stereoscopic graphic) on the screen (lens).
Step 703: the user completes the remaining part of the 3D graphic with the smart stylus.
Step 704: the software system of the 3D smart glasses compares the 3D graphic obtained after the user's completion (the third stereoscopic graphic) with the graphic preset in the software system of the 3D smart glasses (the first stereoscopic graphic). If they are consistent, the flow proceeds to step 705; if not, it returns to step 703.
Step 705: unlock and enter the file system.
In the second embodiment, when the user initiates an unlocking request, the software system of the 3D smart glasses first projects a partial graphic (the second stereoscopic graphic) of the unlocking key (the first stereoscopic graphic); the user then completes the remaining portion with the smart stylus. If the final completed graphic (the third stereoscopic graphic) is consistent with the preset 3D graphic (the first stereoscopic graphic), unlocking succeeds; otherwise, unlocking fails.
Compared with unlocking modes in the related art such as passwords and unlocking gestures, the implementation provided by this example exploits the fact that the user wears the 3D smart glasses: the user can draw the 3D graphic accurately while people around cannot observe the drawn graphic with the naked eye.
In one example, the smart stylus may be paired with a plurality of 3D smart glasses. If a second user around the first user also wears 3D smart glasses, then even though the second user may see the stereoscopic graphic drawn by the first user, what the second user sees is an image rendered based on the second user's own viewing angle; the second user therefore still cannot accurately know the 3D graphic drawn by the first user, which improves privacy.
The process and related algorithms for drawing 3D graphics with the smart stylus and the 3D smart glasses are illustrated below, taking the solid triangular pyramid shown in fig. 6 as an example of the first stereoscopic graphic.
First, the smart stylus and the 3D smart glasses are paired and connected through Bluetooth or WIFI; after the connection succeeds, data can be exchanged between the smart stylus and the smart glasses in real time.
When the user moves the smart stylus from point A to point B shown in fig. 6 with an initial velocity of 0, the processor on the smart stylus synchronously records the component a of the acceleration sensor (one of the spatial motion sensors) in the direction AB, and the length L of segment AB can be calculated with the formula L = (1/2) × a × t², where t is the time taken for the smart stylus to move from point A to point B. The lengths of segments AC and AD can be calculated on the same principle.
While the smart stylus moves from A to B, the processor on the smart stylus can obtain the geomagnetic deflection angle β1 through the geomagnetic sensor (one of the spatial motion sensors); while it moves from A to C, the processor can obtain the geomagnetic deflection angle β2; the angle β between lines AB and AC can thus be calculated as β = |β1 - β2|.
While the smart stylus moves from A to C, the processor on the smart stylus can obtain the gyroscope deflection angle α1 through the gyroscope sensor (one of the spatial motion sensors); while it moves from A to D, the processor can obtain the gyroscope deflection angle α2; the angle α between lines AC and AD can thus be calculated as α = |α1 - α2|.
From the above data, the processor on the smart stylus can obtain all the information of the drawn 3D graphic; this information is transmitted to the 3D smart glasses in real time for display and can be stored in the software system of the 3D smart glasses.
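The length and angle computations above can be sketched as follows, under the idealised assumptions the description makes (motion from rest with constant acceleration along the segment, and noise-free sensor readings):

```python
def segment_length(a, t):
    """Length of a segment traced from rest under constant acceleration a
    (m/s^2) for time t (s), per L = (1/2) * a * t**2."""
    return 0.5 * a * t ** 2

def angle_between(deflection_1, deflection_2):
    """Angle between two segments from two sensor deflection readings
    (degrees), per beta = |beta1 - beta2|."""
    return abs(deflection_1 - deflection_2)

# Example: accelerating at 2 m/s^2 for 1 s traces a 1 m segment;
# deflection readings of 30 and 75 degrees on the two strokes give a
# 45-degree angle between them.
length_ab = segment_length(2.0, 1.0)
beta = angle_between(30.0, 75.0)
```

In a real stylus the acceleration would be integrated sample by sample rather than assumed constant; the closed-form formula is the idealisation used in this example.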
The procedure for comparing two 3D graphics in the embodiment of the present application is described in detail below, again using the solid triangular pyramid of fig. 6 as an illustration.
First, the software system of the 3D smart glasses places the 3D graphic drawn by the user (the stereoscopic graphic formed by the outline of the first spatial trajectory) and the 3D graphic preset in the software system (the first stereoscopic graphic) in the same coordinate system, with their A points overlapped.
After the above operation, the coordinates of the three points B, C and D are compared. Taking point B as an example: if the coordinates of point B of the preset graphic are (Xb, Yb, Zb) and the coordinates of point B of the graphic drawn by the user are (Xb', Yb', Zb'), the absolute differences |Xb - Xb'|, |Yb - Yb'| and |Zb - Zb'| are calculated; if each of these values is less than 5 mm, the coordinate values are considered the same. With the same algorithm, the coordinate values of the other key points such as C and D can be compared.
When the coordinate values are the same, the line lengths of the key contours are compared. For example, if the lengths of segments AB, AC and AD of the preset graphic are Lab, Lac and Lad, and the corresponding lengths of the graphic drawn by the user are Lab', Lac' and Lad', the absolute differences |Lab - Lab'|, |Lac - Lac'| and |Lad - Lad'| are calculated; if each of these values is less than 5 mm (a preset threshold), the lengths are considered the same.
If both the coordinate differences and the length differences meet the requirements, the graphics are considered consistent; if one or more of the differences exceed the threshold, the graphics are considered inconsistent.
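A sketch of this comparison procedure, assuming both graphics are given as dictionaries of key-point coordinates in millimetres with their A points already aligned in a shared coordinate system (the helper names are illustrative):

```python
import math

TOLERANCE_MM = 5.0  # the preset threshold from the comparison procedure

def coords_match(p, q, tol=TOLERANCE_MM):
    """Per-axis absolute differences must each stay within the tolerance."""
    return all(abs(a - b) < tol for a, b in zip(p, q))

def graphics_match(preset, drawn, tol=TOLERANCE_MM):
    """Compare two 3D graphics given as {name: (x, y, z)} key points.
    Both vertex coordinates and edge lengths from A must agree within tol."""
    if preset.keys() != drawn.keys():
        return False
    for name in preset:
        if not coords_match(preset[name], drawn[name], tol):
            return False
        # Compare the contour line length from A to this key point.
        length_preset = math.dist(preset["A"], preset[name])
        length_drawn = math.dist(drawn["A"], drawn[name])
        if abs(length_preset - length_drawn) >= tol:
            return False
    return True
```

With a tetrahedron, checking the three vertex coordinates plus the three edge lengths from A follows the two-stage test described above; additional edges (BC, BD, CD) could be compared the same way for a stricter match.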
The embodiment of the application also provides an information unlocking device. It should be noted that, for the information unlocking method provided in the embodiment of the present application, the execution body may be an information unlocking device, or a control module in the information unlocking device for executing the information unlocking method. In the embodiment of the present application, an information unlocking device executing the information unlocking method is taken as an example to describe the information unlocking device provided in the embodiment of the present application.
The information unlocking device provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof. For the content that is not described in detail in the information unlocking device provided in the embodiment of the present application, reference may be made to the information unlocking method provided in the embodiment of the present application, and details are not described here again.
The information unlocking device provided in the embodiment of the present application may be applied to a first device, and as shown in fig. 7, the information unlocking device 60 may include a first receiving unit 61, a first obtaining unit 62, and an unlocking unit 63.
The first receiving unit 61 is configured to receive a first input for requesting unlocking target information; the first acquiring unit 62 is configured to acquire parameters of a first spatial trajectory drawn by a second device paired with the first device through motion in space, where the parameters of the first spatial trajectory are determined according to motion parameters detected by a spatial motion sensor configured to the second device; the unlocking unit 63 is configured to unlock the target information if it is determined that the parameter of the first spatial trajectory matches the second spatial trajectory.
In an alternative embodiment, the unlocking unit may include: and the first determining unit is used for determining that the parameters of the first spatial track are matched with the second spatial track under the condition that the error between the parameters of the first spatial track and the parameters of the second spatial track is determined to be within a first preset range.
In an optional embodiment, the first obtaining unit may include: the second receiving unit is used for receiving the motion parameters sent by the second equipment; the first generating unit is used for generating a third space track drawn by the second equipment in the space according to the motion parameters; the correction unit is used for correcting the third space track into a smooth straight line or curve to obtain a first space track; a second determination unit for determining a parameter of the first spatial trajectory.
In an alternative embodiment, the first generating unit may include: the third receiving unit is used for receiving a drawing switch identifier which is sent by the second device and aims at the motion parameter, wherein the drawing switch identifier is used for identifying whether the motion parameter is an effective parameter or not, and the drawing switch identifier is generated based on the target operation received by the second device; the third determining unit is used for determining effective parameters in the motion parameters according to the drawing switch identification; and the second generating unit is used for generating a third space track according to the effective parameters.
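A minimal sketch of how the drawing switch identifier could gate the motion parameters (the data layout is an assumption; the description only states that the identifier marks which motion parameters are effective):

```python
def valid_motion_parameters(samples):
    """Keep only motion-parameter samples flagged as effective by the
    drawing switch identifier accompanying each sample.

    samples: iterable of (drawing_switch_on, motion_parameter) pairs,
    where drawing_switch_on is True while the target operation on the
    second device (e.g. a pressed stylus button) indicates the user is
    actively drawing. Pair layout and names are illustrative.
    """
    return [param for switch_on, param in samples if switch_on]

# Only samples captured while the switch was on contribute to the
# third spatial trajectory.
stream = [(False, "p0"), (True, "p1"), (True, "p2"), (False, "p3")]
effective = valid_motion_parameters(stream)
```

This filtering lets the user lift the stylus between strokes without the idle motion being folded into the drawn trajectory.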
In an alternative embodiment, the second spatial trajectory is an outline of the first stereogram, and the apparatus may further include: a first display unit configured to display, on a display screen of the first device, imaging of a second stereoscopic graphic based on a target view angle after receiving a first input requesting unlocking of target information, wherein the second stereoscopic graphic is a part of the first stereoscopic graphic; the first determination unit includes: the combining unit is used for combining the first space track with the outline of the second three-dimensional graph to obtain the outline of a third three-dimensional graph; and the fourth determining unit is used for determining that the parameters of the first space track are matched with the second space track under the condition that the error between the contour of the third stereo figure and the contour of the first stereo figure is determined to be in a second preset range.
In an optional embodiment, the apparatus may further comprise: a fourth receiving unit for receiving a second input for requesting setting of a second spatial trajectory before receiving the first input for requesting unlocking of the target information; and the second acquisition unit is used for acquiring a second space track drawn by the motion of the second equipment in the space.
In an alternative embodiment, the first device may be a head-mounted device, and the apparatus may further include: and the second display unit is used for displaying imaging of the first space track based on a target visual angle on a display screen of the head-mounted device after acquiring parameters of the first space track drawn by the second device which is paired with the first device through motion in the space, wherein the display screen of the head-mounted device is a lens of the head-mounted device, and the target visual angle is a visual angle facing a preset direction in a coordinate system of the head-mounted device at a spatial position where the head-mounted device is located.
Through the information unlocking device provided by the embodiment of the application, the user can draw the first spatial trajectory in space through the second device, and the first device matches it against the known second spatial trajectory, which serves as the key for unlocking the information. This provides a new unlocking mode; because the trajectory of the second device moving in space is difficult for onlookers to observe, the unlocking key is kept well concealed and the security of information privacy is improved. When applied to a head-mounted device scenario, other people cannot see the imaging of the stereoscopic graphic in the glasses, which further improves the security of the key.
The information unlocking device in the embodiment of the present application may be a device, or may also be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The information unlocking device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The information unlocking device provided by the embodiment of the application can realize each process realized by the information unlocking device in the embodiment of the application method, and is not repeated here to avoid repetition.
Optionally, an embodiment of the present application further provides an electronic device, which includes a processor, a memory, and a program or an instruction stored in the memory and capable of running on the processor, where the program or the instruction is executed by the processor to implement each process of the information unlocking method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application. The electronic device may be the first device described above.
The electronic device 400 includes, but is not limited to: a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, and a processor 410.
Those skilled in the art will appreciate that the electronic device 400 may further include a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The input unit 404 may include a graphics processor, a microphone, and the like. The display unit 406 may include a display panel. The user input unit 407 may include a touch panel and other input devices, and the like. The memory 409 may store an application program, an operating system, and the like. The electronic device structure shown in fig. 8 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, combine some components, or arrange components differently, and thus the description is omitted here.
The user input unit 407 is configured to receive a first input requesting unlocking target information.
The processor 410 is configured to obtain parameters of a first spatial trajectory drawn by a second device paired with the first device through motion in space, wherein the parameters of the first spatial trajectory are determined according to motion parameters detected by a spatial motion sensor configured in the second device; and unlocking the target information under the condition that the parameters of the first space track are determined to be matched with the second space track.
Optionally, when determining that the parameter of the first spatial trajectory matches the second spatial trajectory, the processor 410 may further perform the following steps: determining that the parameter of the first spatial trajectory matches the second spatial trajectory if it is determined that the error between the parameter of the first spatial trajectory and the parameter of the second spatial trajectory is within a first preset range.
Optionally, the radio frequency unit 401 may be configured to receive the motion parameter sent by the second device; the processor 410, when executing the step of obtaining parameters of a first spatial trajectory drawn by a second device in space, which has been paired with the first device, may further comprise the step of: generating a third space track drawn by the second equipment in the space according to the motion parameters; correcting the third space trajectory into a smooth straight line or curve to obtain a first space trajectory; parameters of the first spatial trajectory are determined.
Optionally, the radio frequency unit 401 may be configured to receive a drawing switch identifier sent by the second device and aiming at the motion parameter, where the drawing switch identifier is used to identify whether the motion parameter is an effective parameter, and the drawing switch identifier is generated based on a target operation received by the second device; the processor 410, when executing generating the third spatial trajectory drawn by the second device moving in the space according to the motion parameter, may include executing the following steps: determining effective parameters in the motion parameters according to the drawing switch identification; and generating a third space track according to the effective parameters.
Alternatively, the second spatial trajectory may be an outline of the first stereoscopic graphic, and after the processor 410 executes receiving the first input requesting to unlock the target information, the display unit 406 may display, on the display screen of the first device, an image of the outline of the second stereoscopic graphic based on the target viewing angle, where the second stereoscopic graphic is a part of the first stereoscopic graphic; the processor 410 performing the step of determining that the parameters of the first spatial trajectory match the second spatial trajectory may comprise performing the steps of: combining the first space track with the outline of the second three-dimensional graph to obtain the outline of a third three-dimensional graph; and under the condition that the error between the outline of the third stereo figure and the outline of the first stereo figure is determined to be in a second preset range, determining that the parameters of the first space track are matched with the second space track.
Optionally, before the processor 410 performs receiving the first input requesting to unlock the target information, the user input unit 407 may be further configured to receive a second input requesting to set a second spatial trajectory; the processor 410 is further configured to perform the obtaining a second spatial trajectory drawn by the second device moving in space.
Alternatively, in a case that the first device is a head-mounted device, after the processor 410 performs the step of acquiring parameters of a first spatial trajectory drawn by a second device paired with the first device through motion in space, the display unit 406 may display an image of the first spatial trajectory based on a target viewing angle on a display screen of the head-mounted device, where the display screen of the head-mounted device is a lens of the head-mounted device, and the target viewing angle is a viewing angle facing a preset direction in a coordinate system of the head-mounted device at a spatial position where the head-mounted device is located.
According to the electronic device provided by the embodiment of the application, the user can operate the second device to draw the first spatial trajectory in space, and the electronic device matches it against the known second spatial trajectory, which serves as the key for unlocking the information. This provides a new unlocking mode; because the trajectory of the second device moving in space is difficult for onlookers to observe, the unlocking key is kept well concealed and the security of information privacy is improved. When applied to a head-mounted device scenario, other people cannot see the imaging of the stereoscopic graphic in the glasses, which further improves the security of the key.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above information unlocking method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer-readable storage medium, which may include a memory such as a Read-Only Memory (ROM) or a Random Access Memory (RAM), or a magnetic disk or an optical disk.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above information unlocking method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Aspects of the present application are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such a processor may be, but is not limited to, a general purpose processor, a special purpose processor, an application specific processor, or a field programmable logic circuit. It will also be understood that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware for performing the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (11)

1. An information unlocking method, characterized in that the method comprises:
receiving a first input requesting unlocking target information;
acquiring parameters of a first space track drawn by a second device paired with a first device through motion in space, wherein the parameters of the first space track are determined according to motion parameters detected by a space motion sensor configured on the second device;
and unlocking the target information under the condition that the parameters of the first spatial track are determined to be matched with the second spatial track.
2. The information unlocking method according to claim 1, wherein the determining that the parameters of the first spatial trajectory match the second spatial trajectory includes:
and under the condition that the error between the parameters of the first spatial trajectory and the parameters of the second spatial trajectory is determined to be within a first preset range, determining that the parameters of the first spatial trajectory match the second spatial trajectory.
3. The information unlocking method according to claim 1, wherein the obtaining of the parameters of a first spatial trajectory drawn by a second device paired with the first device moving in space comprises:
receiving the motion parameters sent by the second device;
generating a third space track drawn by the second device in a moving mode in the space according to the motion parameters;
correcting the third space trajectory into a smooth straight line or curve to obtain the first space trajectory;
parameters of the first spatial trajectory are determined.
4. The information unlocking method according to claim 3, wherein the generating, according to the motion parameters, a third spatial trajectory drawn by the second device through motion in space comprises:
receiving a drawing switch identifier for the motion parameters sent by the second device, wherein the drawing switch identifier is used to identify whether the motion parameters are valid parameters, and the drawing switch identifier is generated based on a target operation received by the second device;
determining the valid parameters among the motion parameters according to the drawing switch identifier;
and generating the third spatial trajectory according to the valid parameters.
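The drawing switch identifier in claim 4 effectively tags each motion sample as valid or invalid (e.g. whether the user was holding a draw button during the target operation). A hedged sketch of the filtering step, assuming samples arrive as (motion, flag) pairs:

```python
def filter_valid_parameters(samples):
    """Keep only the motion samples whose drawing-switch flag marks
    them as valid, i.e. recorded while drawing was switched on."""
    return [motion for motion, drawing_on in samples if drawing_on]
```

The third trajectory would then be generated from the returned samples only.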
5. The information unlocking method according to claim 1, wherein the second spatial trajectory is an outline of a first three-dimensional figure, and
after receiving the first input requesting to unlock the target information, the method further comprises: displaying, on a display screen of the first device, an image of an outline of a second three-dimensional figure based on a target view angle, wherein the second three-dimensional figure is a part of the first three-dimensional figure; and
the determining that the parameters of the first spatial trajectory match the second spatial trajectory comprises: combining the first spatial trajectory with the outline of the second three-dimensional figure to obtain an outline of a third three-dimensional figure; and determining that the parameters of the first spatial trajectory match the second spatial trajectory in a case where it is determined that an error between the outline of the third three-dimensional figure and the outline of the first three-dimensional figure is within a second preset range.
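The outline-completion check of claim 5 — the user draws the missing portion of a displayed partial figure, and the combined outline is compared to the secret full outline — can be sketched as below. The concatenation-based combining, the point-list representation, and the maximum-distance error metric are assumptions for illustration, not the claimed implementation.

```python
def unlock_by_completion(drawn, partial_outline, full_outline, second_preset_range=0.1):
    """Combine the user's drawn trajectory with the displayed partial
    outline (the 'third figure') and compare it to the full outline."""
    combined = partial_outline + drawn
    if len(combined) != len(full_outline):
        return False
    # Worst-case point-to-point deviation between the outlines.
    err = max(
        ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2) ** 0.5
        for a, b in zip(combined, full_outline)
    )
    return err <= second_preset_range
```

Unlocking succeeds only when the drawn portion completes the partial outline to within the second preset range.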
6. The information unlocking method according to claim 1, wherein the first device is a head-mounted device, and after acquiring the parameters of the first spatial trajectory drawn by the second device paired with the first device through motion in space, the method further comprises:
displaying, on a display screen of the head-mounted device, an image of the first spatial trajectory based on a target view angle, wherein the display screen of the head-mounted device is a lens of the head-mounted device, and the target view angle is a view angle in a preset direction in the coordinate system of the head-mounted device, at the spatial position where the head-mounted device is located.
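Rendering the trajectory "based on a target view angle" in claim 6 amounts to projecting the 3D points onto the lens plane. The simplest illustrative assumption is an orthographic projection along the preset viewing direction, taken here as the z axis of the headset's coordinate system; a real renderer would use a full view/projection transform.

```python
def project_to_display(points):
    """Orthographic projection of a 3D trajectory onto the lens plane:
    drop the depth (z) axis of the head-mounted device's coordinate
    system, i.e. view the trajectory along the preset direction."""
    return [(x, y) for x, y, _z in points]
```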
7. An information unlocking apparatus, characterized in that the apparatus comprises:
a first receiving unit, configured to receive a first input requesting to unlock target information;
a first acquisition unit, configured to acquire parameters of a first spatial trajectory drawn by a second device, paired with a first device, through motion in space, wherein the parameters of the first spatial trajectory are determined according to motion parameters detected by a spatial motion sensor configured on the second device;
and an unlocking unit, configured to unlock the target information in a case where it is determined that the parameters of the first spatial trajectory match a second spatial trajectory.
8. The information unlocking apparatus according to claim 7, wherein the first acquisition unit comprises:
a second receiving unit, configured to receive the motion parameters sent by the second device;
a first generating unit, configured to generate, according to the motion parameters, a third spatial trajectory drawn by the second device through motion in space;
a correction unit, configured to correct the third spatial trajectory into a smooth straight line or curve to obtain the first spatial trajectory;
and a second determining unit, configured to determine the parameters of the first spatial trajectory.
9. The information unlocking apparatus according to claim 8, wherein the first generating unit comprises:
a third receiving unit, configured to receive a drawing switch identifier for the motion parameters sent by the second device, wherein the drawing switch identifier is used to identify whether the motion parameters are valid parameters, and the drawing switch identifier is generated based on a target operation received by the second device;
a third determining unit, configured to determine the valid parameters among the motion parameters according to the drawing switch identifier;
and a second generating unit, configured to generate the third spatial trajectory according to the valid parameters.
10. The information unlocking apparatus according to claim 7, wherein the second spatial trajectory is an outline of a first three-dimensional figure, and
the apparatus further comprises: a first display unit, configured to display, on a display screen of the first device, an image of an outline of a second three-dimensional figure based on a target view angle after receiving the first input requesting to unlock the target information, wherein the second three-dimensional figure is a part of the first three-dimensional figure; and
a first determining unit comprising: a combining unit, configured to combine the first spatial trajectory with the outline of the second three-dimensional figure to obtain an outline of a third three-dimensional figure; and a fourth determining unit, configured to determine that the parameters of the first spatial trajectory match the second spatial trajectory in a case where it is determined that an error between the outline of the third three-dimensional figure and the outline of the first three-dimensional figure is within a second preset range.
11. An electronic device comprising a processor, a memory, and a program or instructions stored in the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the information unlocking method according to any one of claims 1 to 6.
CN202110345690.4A 2021-03-31 2021-03-31 Information unlocking method and electronic equipment Active CN113051538B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110345690.4A CN113051538B (en) 2021-03-31 2021-03-31 Information unlocking method and electronic equipment


Publications (2)

Publication Number Publication Date
CN113051538A true CN113051538A (en) 2021-06-29
CN113051538B CN113051538B (en) 2023-05-23

Family

ID=76516677

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110345690.4A Active CN113051538B (en) 2021-03-31 2021-03-31 Information unlocking method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113051538B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113570761A (en) * 2021-08-06 2021-10-29 广州小鹏汽车科技有限公司 Vehicle control method, vehicle control device, vehicle-mounted terminal and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104541282A (en) * 2014-05-14 2015-04-22 华为技术有限公司 Unlocking method, apparatus and device
CN106375542A (en) * 2015-07-22 2017-02-01 中兴通讯股份有限公司 Method and apparatus for controlling operation of terminal
CN106650392A (en) * 2016-11-11 2017-05-10 捷开通讯(深圳)有限公司 VR headset device and unlock method
CN107122640A (en) * 2016-02-25 2017-09-01 百度在线网络技术(北京)有限公司 Unlocking method and a device for mobile terminal
CN107704742A (en) * 2017-08-16 2018-02-16 捷开通讯(深圳)有限公司 Intelligent mobile terminal and its unlocking method, the device with store function
CN111176522A (en) * 2019-12-16 2020-05-19 维沃移动通信有限公司 Unlocking method and electronic equipment





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant