CN108491142B - Control method of mobile terminal, mobile terminal and storage medium - Google Patents

Publication number
CN108491142B
Authority
CN
China
Prior art keywords
mobile terminal
touch event
screen touch
front camera
interception
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810193462.8A
Other languages
Chinese (zh)
Other versions
CN108491142A (en)
Inventor
龙俊卫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810193462.8A
Publication of CN108491142A
Application granted
Publication of CN108491142B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 - Detection; Localisation; Normalisation
    • G06V 40/166 - Detection; Localisation; Normalisation using acquisition arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Telephone Function (AREA)

Abstract

The application belongs to the technical field of terminal control and provides a control method of a mobile terminal, the mobile terminal, and a computer-readable storage medium. The control method comprises the following steps: intercepting a screen touch event of the mobile terminal under a preset trigger mechanism and turning on the front camera; performing face detection on the preview image acquired by the front camera in real time; and, if a face image is detected in the preview image acquired by the front camera in real time, releasing the interception of the screen touch event. The application addresses the frequent erroneous-touch problem of current mobile terminals and can reduce wear on the physical keys of the mobile terminal.

Description

Control method of mobile terminal, mobile terminal and storage medium
Technical Field
The present application belongs to the technical field of terminal control, and in particular relates to a control method for a mobile terminal, a mobile terminal, and a computer-readable storage medium.
Background
With the rapid development of science and technology, mobile phone functions have become increasingly comprehensive: people read electronic books, browse web pages, shop, and socialize through their phones, and commonly use them to pass scattered periods of time such as commuting, queuing, or eating.
However, because such usage time is fragmented, users often interrupt the content they are viewing to handle other matters. If the user turns off the screen with the physical keys each time browsing is interrupted, the frequent operation can damage the keys; if the screen is left on after browsing is interrupted, erroneous touch operations may occur.
Disclosure of Invention
In view of this, embodiments of the present application provide a control method for a mobile terminal, a mobile terminal and a computer readable storage medium, so as to solve the problem that a current mobile terminal often has a wrong touch, and reduce damage to physical keys of the mobile terminal.
A first aspect of an embodiment of the present application provides a method for controlling a mobile terminal, including:
intercepting a screen touch event of the mobile terminal under a preset trigger mechanism, and starting a front camera;
performing face detection on the preview image acquired by the front camera in real time;
and if a face image is detected in the preview image acquired by the front camera in real time, the interception of the screen touch event is released.
A second aspect of an embodiment of the present application provides a mobile terminal, including:
the camera starting module is used for intercepting a screen touch event of the mobile terminal under a preset trigger mechanism and starting a front camera;
the detection module is used for carrying out face detection on the preview image acquired by the front camera in real time;
and the first interception releasing module is used for releasing the interception of the screen touch event if a human face image is detected in the preview image acquired by the front camera in real time.
A third aspect of an embodiment of the present application provides a mobile terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method provided in the first aspect of the embodiment of the present application when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by one or more processors, performs the steps of the method provided by the first aspect of embodiments of the present application.
A fifth aspect of embodiments of the present application provides a computer program product comprising a computer program that, when executed by one or more processors, performs the steps of the method provided by the first aspect of embodiments of the present application.
In the embodiments of the application, a screen touch event of the mobile terminal is intercepted under a preset trigger mechanism and the front camera is turned on; face detection is performed on the preview image acquired by the front camera in real time; and if a face image is detected in that preview image, the interception of the screen touch event is released. Because screen touch events are intercepted under the preset trigger mechanism, erroneous screen touches by the user are avoided. Because face detection is performed on the real-time preview image, the detection of a face image indicates that the user is still using the mobile terminal, at which point the interception is released. The camera thus further confirms the usage state of the mobile terminal, and the whole process requires no operation of the physical keys, reducing damage to them.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic implementation flowchart of a control method of a mobile terminal according to an embodiment of the present application;
fig. 2 is a schematic flow chart illustrating an implementation of a control method of a mobile terminal according to another embodiment of the present application;
fig. 3 is a schematic block diagram of a mobile terminal provided in an embodiment of the present application;
fig. 4 is a schematic block diagram of a mobile terminal according to another embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to a determination", or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [described condition or event]", or "in response to detecting [described condition or event]".
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Fig. 1 is a schematic implementation flowchart of a control method of a mobile terminal according to an embodiment of the present application, and as shown in the figure, the method may include the following steps:
and S101, intercepting a screen touch event of the mobile terminal under a preset trigger mechanism, and starting a front camera.
In this embodiment of the application, the preset trigger mechanism denotes a condition under which the user is likely to produce an erroneous screen touch event on the mobile terminal. For example, when the screen of the mobile terminal is detected to face downward and a face image is detected in the preview image currently acquired by the rear camera, the user's eyes are not directed at the screen, so any touch event the screen receives may be erroneous. Here, an erroneous screen touch event means that the mobile terminal receives a touch instruction although the user did not intend to operate it.
As another embodiment of the present application, the preset trigger mechanism further includes: the preset application of the mobile terminal being in a foreground running state while the monitored azimuth change amplitude of the mobile terminal is greater than a preset value.
In this embodiment of the application, the preset application may be a reading-type application, for example an electronic-book application, a social application, or a shopping application, and multiple preset applications may be configured. As long as any one of them runs in the foreground and the detected azimuth change amplitude of the mobile terminal exceeds the preset value, the preset trigger condition is met. When the user views a reading application, the user must look at the screen, so the mobile terminal is usually held in the hand and kept roughly steady in one position for a comfortable reading experience. A handheld terminal cannot be kept completely still, however, so its azimuth always changes slightly. A value (the preset value) can therefore be set to represent the azimuth change amplitude typical of reading; when the monitored amplitude exceeds this value, the user has probably stopped reading, for example because an urgent matter arises and the terminal is placed in a pocket. To avoid erroneous screen touch events, screen touch events of the mobile terminal can then be intercepted. In practical applications the terminal may instead simply be configured not to respond to screen touch events. In either case, clicks, swipes, and similar operations the user performs on the screen then produce no response from the mobile terminal.
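The two-part trigger condition above can be sketched as a small predicate. This is an illustrative model, not code from the patent; the class name, the set of "reading" packages, and the threshold value are all assumptions.

```java
import java.util.Set;

public class TriggerMonitor {
    private final Set<String> presetApps;  // packages treated as reading apps (assumed)
    private final double presetValue;      // azimuth-change threshold, in degrees (assumed)

    public TriggerMonitor(Set<String> presetApps, double presetValue) {
        this.presetApps = presetApps;
        this.presetValue = presetValue;
    }

    /** True when both trigger conditions of this embodiment hold:
     *  a preset application is in the foreground AND the monitored
     *  azimuth change amplitude exceeds the preset value. */
    public boolean shouldIntercept(String foregroundPackage, double azimuthChangeAmplitude) {
        return presetApps.contains(foregroundPackage)
                && azimuthChangeAmplitude > presetValue;
    }
}
```

A caller would evaluate this predicate each time a new azimuth change amplitude is computed, and start the interception (and the front camera) only when it returns true.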
And step S102, carrying out face detection on the preview image acquired by the front camera in real time.
In the embodiment of the present application, the preset trigger mechanism may be met even while the user is still reading, for example when the user switches from holding the mobile terminal in the left hand to holding it in the right hand. It is therefore necessary to further confirm whether the user is currently reading: the front camera of the mobile terminal can be turned on to acquire a preview image of the area in front of the terminal in real time, and face detection can be performed on that preview image. The face detection here only identifies whether the user is currently watching the screen of the mobile terminal; it is not used for authentication.
Face detection can be implemented as follows. One or more standard face templates are designed first. Whether a face exists in the preview image or captured picture acquired by the front camera is then judged from facial features, since a face has a characteristic structural distribution, such as the size, position, and spacing of facial contours like the irises, nose wings, and mouth corners. If a face exists, its position and size and the position of each main facial organ are further determined. An Adaboost learning algorithm based on these facial features computes the matching degree between the detected features and the standard template, and a threshold decides whether a face is present.
It should be noted that, owing to natural-condition constraints, the preview image or captured picture acquired by the front camera may not be suitable for direct face detection. Before detection it needs to be preprocessed, for example by noise filtering, light compensation, grayscale transformation, histogram equalization, normalization, geometric correction, filtering, and sharpening.
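One of the preprocessing steps listed above, histogram equalization, can be sketched for an 8-bit grayscale image stored as a flat int array. This is a generic textbook implementation offered for illustration; it is not taken from the patent.

```java
public class Preprocess {
    /** Returns a histogram-equalized copy of an 8-bit grayscale image (values 0..255). */
    public static int[] equalize(int[] pixels) {
        // Count occurrences of each gray level.
        int[] hist = new int[256];
        for (int p : pixels) hist[p]++;
        // Cumulative distribution function of the gray levels.
        int[] cdf = new int[256];
        int running = 0;
        for (int i = 0; i < 256; i++) { running += hist[i]; cdf[i] = running; }
        // Remap each level so the output histogram is approximately flat,
        // which stretches contrast for under- or over-exposed preview frames.
        int total = pixels.length;
        int[] out = new int[total];
        for (int i = 0; i < total; i++) {
            out[i] = (int) Math.round(255.0 * cdf[pixels[i]] / total);
        }
        return out;
    }
}
```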
And step S103, if a human face image is detected in the preview image acquired by the front camera in real time, the interception of the screen touch event is released.
In this embodiment of the application, if the matching degree between the preview image or captured picture acquired by the front camera and the standard template is greater than the threshold, a face image has been detected; if it is less than or equal to the threshold, no face image has been detected. A detected face image indicates that the user is currently reading, so the interception of the screen touch event can be released; sliding, clicking, and other operations the user performs on the touch screen then produce the corresponding responses from the mobile terminal.
In summary, this embodiment intercepts a screen touch event of the mobile terminal under a preset trigger mechanism and turns on the front camera, performs face detection on the preview image acquired by the front camera in real time, and releases the interception if a face image is detected in that preview image. Intercepting screen touch events under the preset trigger mechanism avoids erroneous screen touches by the user, while detection of a face image in the real-time preview indicates that the user is still using the mobile terminal, at which point the interception is released. The camera thus further confirms the usage state of the mobile terminal, and the whole process requires no operation of the physical keys, reducing damage to them.
Fig. 2 is a flowchart illustrating a control method of a mobile terminal according to another embodiment of the present application, where as shown in the figure, the method may include the following steps:
step S201, if the preset application of the mobile terminal is in a foreground running state and the azimuth change amplitude of the mobile terminal is monitored to be larger than a preset value, intercepting a screen touch event of the mobile terminal, starting a front camera and starting a timer.
In the embodiment of the application, the preset application being in a foreground running state means that the preset application is running in the foreground window and the mobile terminal is displaying its interface for interaction with the user.
The azimuth change amplitude of the mobile terminal is obtained as follows:

F_{j+1} = α_{j+1} - α_j

α_j = (θ_{(j-1)·n+1} + θ_{(j-1)·n+2} + … + θ_{j·n}) / n

where j denotes the j-th time period; F_{j+1} denotes the azimuth change amplitude of the mobile terminal in time period j+1; α_j denotes the mean of the azimuth angles obtained by the magnetic field sensor and the acceleration sensor of the mobile terminal during the j-th time period; n denotes the number of azimuth angles sampled by the magnetic field sensor and the acceleration sensor of the mobile terminal in each time period; and θ_i denotes the i-th azimuth angle obtained by sampling.
As an example, assume the time period is 1 minute, the magnetic field sensor and the acceleration sensor of the mobile terminal sample 6 azimuth angles in each time period (one every 10 seconds), so n = 6, and monitoring of the azimuth change amplitude starts at 12:00:00. Then:

The first time period is 12:00:00-12:01:00, with sampling points θ_1, θ_2, θ_3, θ_4, θ_5, θ_6.

The second time period is 12:01:00-12:02:00, with sampling points θ_7, θ_8, θ_9, θ_10, θ_11, θ_12.

The third time period is 12:02:00-12:03:00, with sampling points θ_13, θ_14, θ_15, θ_16, θ_17, θ_18.

According to the formula, the mean of the azimuth angles sampled in the third time period is

α_3 = (θ_13 + θ_14 + θ_15 + θ_16 + θ_17 + θ_18) / 6

Similarly, the mean of the azimuth angles sampled in the second time period is

α_2 = (θ_7 + θ_8 + θ_9 + θ_10 + θ_11 + θ_12) / 6

Subtracting the mean azimuth of the second time period from the mean azimuth of the third time period gives the azimuth change amplitude of the mobile terminal in the current (third) time period: F_3 = α_3 - α_2.
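The formulas and the worked example above can be coded directly. The class and method names below are illustrative; the arithmetic follows the patent's definitions of α_j and F_{j+1}.

```java
public class AzimuthMonitor {
    /** alpha_j: mean of the n azimuth samples (degrees) taken in one time period. */
    public static double periodMean(double[] samples) {
        double sum = 0;
        for (double theta : samples) sum += theta;
        return sum / samples.length;
    }

    /** F_{j+1} = alpha_{j+1} - alpha_j: change of the mean azimuth
     *  between two consecutive time periods. */
    public static double changeAmplitude(double[] periodJ, double[] periodJPlus1) {
        return periodMean(periodJPlus1) - periodMean(periodJ);
    }
}
```

In the running example, feeding the six samples of the second period and the six samples of the third period to `changeAmplitude` yields F_3, which is then compared against the preset value.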
It should be noted that, the starting time for monitoring the azimuth variation amplitude of the mobile terminal may be a time when the preset application starts to start, and when each time period ends (for example, a third time period ends), the azimuth variation amplitude of the mobile terminal in the current time period (the third time period) may be obtained according to the average value of the azimuth in the current time period (the third time period) and the previous time period (the second time period).
The intercepting of the screen touch event of the mobile terminal comprises:
and adding a frame layout in a system unlocking screen interface to prevent the transmission of a screen touch event of the mobile terminal.
Specifically, a layer of FrameLayout is added over the keyguard host view to serve as a mask, so that touch-screen gesture events are prevented from being delivered downward to the underlying system.
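The effect of the mask can be modeled framework-free as a filter that consumes touch events while it is active; in an actual Android implementation, the added FrameLayout's touch handler would return true to stop downward delivery. This sketch and its names are illustrative assumptions, not the patent's code.

```java
public class TouchMask {
    private boolean active;

    public void intercept() { active = true; }   // add the mask layer
    public void release()   { active = false; }  // remove the mask layer

    /** Returns true when the event is consumed (blocked) by the mask,
     *  mirroring a touch handler that returns true while masked. */
    public boolean dispatch(String touchEvent) {
        return active;  // while masked, every screen touch event is swallowed
    }
}
```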
Step S202, carrying out face detection on the preview image acquired by the front-facing camera in real time;
the step is consistent with the content of step S102, and the description of step S202 may be specifically referred to, which is not repeated herein.
Step S203, before the timing time of the timer is greater than a first preset time, if a human face image is detected in a preview image acquired by the front camera in real time, the interception of the screen touch event is removed.
In the embodiment of the application, a timer must also be started when the screen touch event of the mobile terminal is intercepted and the front camera is turned on. If a face image is detected in the preview image acquired by the front camera in real time before the timer exceeds a first preset time, the user is currently using the mobile terminal, and the interception of the screen touch event must be released so that the user can continue operating the terminal.
The timer is needed because, if the user does not use the mobile terminal for a long time, no face image appears in the front camera's preview, yet the camera would otherwise keep acquiring preview images and running face detection indefinitely until a face is finally detected. Setting a time value (the first preset time) avoids this large memory consumption.
It should be noted that, after the face image is detected, it indicates that the user is currently using the mobile terminal, and the front-facing camera may be closed, and meanwhile, whether the preset trigger mechanism is met or not is continuously monitored.
Step S204: if no face image is detected in the preview image acquired by the front camera in real time before the timing time of the timer exceeds the first preset time, the interception of the screen touch event is released when the timing time exceeds the first preset time.
That is, in this embodiment, when the first preset time elapses without any face image having been detected in the real-time preview, the interception of the screen touch event is released at that moment.
It should be noted that, because the face is not detected all the time, it may be that the user is not currently using the mobile terminal, at this time, the turned-on front-facing camera may be turned off, and whether the preset trigger mechanism is met is no longer monitored.
As the descriptions of steps S203 and S204 show, the interception of the screen touch event is released under two conditions: 1) a face image is detected in the real-time preview image before the first preset time, in which case the interception is released at the moment of detection; or 2) no face image appears in the real-time preview image before the first preset time, in which case the interception is released when the first preset time elapses.
For example, assume the first preset time is set to 2 minutes. If a face image is detected in the 1st minute, the interception of the screen touch event is released in the 1st minute; the front camera can then be turned off while monitoring of the preset trigger mechanism continues. If no face image is detected within the 2 minutes, the interception is released at the 2nd minute and the front camera is turned off, but the preset trigger mechanism is no longer monitored.
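The release timing of steps S203 and S204, including the 2-minute example above, reduces to a single decision. A hedged sketch with illustrative names and time units:

```java
public class ReleasePolicy {
    /**
     * @param faceDetectedAt seconds after interception at which a face image
     *                       was first detected, or -1 if none was detected
     * @param firstPreset    the first preset time, in seconds
     * @return seconds after interception at which the interception of the
     *         screen touch event is released
     */
    public static long releaseTime(long faceDetectedAt, long firstPreset) {
        if (faceDetectedAt >= 0 && faceDetectedAt < firstPreset) {
            return faceDetectedAt;  // S203: released as soon as the face is seen
        }
        return firstPreset;         // S204: released at the timeout otherwise
    }
}
```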
Step S205, if the user operation is not detected within a second preset time after the interception of the screen touch event is released, performing a screen-off or screen-locking operation on the mobile terminal.
In the embodiment of the application, after the interception of the screen touch event is released, the screen-off or screen-lock operation is performed on the mobile terminal whenever no user operation is detected within a second preset time, regardless of whether the user is currently using the terminal.
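Step S205 can likewise be sketched as an idle-timeout check; the names and time units here are assumptions for illustration.

```java
public class IdleWatcher {
    /**
     * @param lastOperationAt seconds since release at which the last user
     *                        operation occurred, or -1 if none occurred
     * @param now             seconds elapsed since release
     * @param secondPreset    the second preset time, in seconds
     * @return true when the mobile terminal should turn off or lock its screen
     */
    public static boolean shouldTurnOffScreen(long lastOperationAt, long now, long secondPreset) {
        // With no operation yet, idleness is measured from the release itself.
        long idleSince = (lastOperationAt >= 0) ? lastOperationAt : 0;
        return now - idleSince >= secondPreset;
    }
}
```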
The embodiment of the application further details how to determine the azimuth change amplitude of the mobile terminal on the basis of the embodiment shown in fig. 1, and also increases the limit of the first preset time, so that the real use state of the mobile terminal can be more accurately judged.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 3 is a schematic block diagram of a mobile terminal according to an embodiment of the present application, and only a portion related to the embodiment of the present application is shown for convenience of description.
The mobile terminal 3 may be a software unit, a hardware unit, or a combined software and hardware unit built into an existing mobile terminal such as a mobile phone or notebook; it may be integrated into such an existing terminal as an independent component, or it may exist as an independent mobile terminal.
The mobile terminal 3 includes:
the camera starting module 31 is configured to intercept a screen touch event of the mobile terminal under a preset trigger mechanism, and start a front camera;
the detection module 32 is used for performing face detection on the preview image acquired by the front camera in real time;
and the first interception releasing module 33 is configured to release interception of the screen touch event if a face image is detected in a preview image acquired by the front-facing camera in real time.
Optionally, the camera opening module 31 is further configured to:
if the preset application of the mobile terminal is in a foreground running state and the azimuth change amplitude of the mobile terminal is monitored to be larger than a preset value, intercepting a screen touch event of the mobile terminal and starting a front camera.
Optionally, the mobile terminal 3 further includes:
the timing module 34 is used for starting a timer after the front camera is started;
the second interception releasing module 35 is further configured to release interception of the screen touch event when the timing time of the timer is greater than the first preset time and if no face image is detected in the preview image acquired by the front-facing camera in real time, the timing time of the timer is greater than the first preset time.
Optionally, the first unblocking module 33 is further configured to:
before the timing time of the timer is longer than a first preset time, if a human face image is detected in a preview image acquired by the front-facing camera in real time, the interception of the screen touch event is removed.
Optionally, the mobile terminal 3 further includes:
and the screen saver module 36 is configured to, after the interception of the screen touch event is released, execute screen saving or screen locking operation on the mobile terminal if the operation of the user is not detected within a second preset time.
Optionally, the camera opening module 31 is further configured to:
adding a frame layout to the system unlock-screen interface to block the delivery of screen touch events of the mobile terminal.
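The patent's concrete mechanism is an Android-style frame layout overlaid on the unlock-screen interface so that touch events never reach the views beneath it. As a language-neutral illustration of that gating idea — the `TouchGate` class and its method names are invented for this sketch and are not part of the patent or of any platform API:

```python
class TouchGate:
    """Illustrative stand-in for the overlay frame layout: while
    `intercepting` is True, touch events are consumed here and never
    delivered to the underlying unlock-screen interface."""

    def __init__(self, deliver):
        self.deliver = deliver    # downstream handler (the real UI)
        self.intercepting = False

    def on_touch(self, event):
        if self.intercepting:
            return True           # consumed: delivery is blocked
        self.deliver(event)
        return False              # not consumed: event passes through

received = []
gate = TouchGate(deliver=received.append)
gate.intercepting = True          # preset trigger fired: block touches
gate.on_touch("tap-on-unlock")    # swallowed, never reaches the UI
gate.intercepting = False         # face detected: interception released
gate.on_touch("tap-on-icon")      # delivered normally
# received is now ["tap-on-icon"]
```

Returning "consumed" from the overlay is what makes releasing the interception as simple as clearing one flag, rather than removing and re-adding views.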
Optionally, the azimuth change amplitude of the mobile terminal is obtained through the following method:
F_{j+1} = α_{j+1} − α_j

α_j = (1/n) Σ_{i=1}^{n} θ_i

wherein j denotes the j-th time period; F_{j+1} denotes the azimuth change amplitude of the mobile terminal in time period j+1; α_j denotes the mean of the azimuth angles of the mobile terminal obtained by the magnetic field sensor and the acceleration sensor of the mobile terminal in the j-th time period; n denotes the number of azimuth angles sampled by the magnetic field sensor and the acceleration sensor of the mobile terminal in each time period; and θ_i denotes the i-th azimuth angle so sampled.
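Numerically, the amplitude is just the difference between the mean headings of two consecutive sampling periods. A minimal Python sketch of the formulas above; the function names are illustrative, and comparing the absolute value against the preset value is an assumption (the patent's formula itself is signed):

```python
def mean_azimuth(samples):
    """alpha_j: mean of the n azimuth angles theta_i (e.g. in degrees)
    sampled by the magnetic field and acceleration sensors in one period."""
    return sum(samples) / len(samples)

def azimuth_change_amplitude(prev_samples, next_samples):
    """F_{j+1} = alpha_{j+1} - alpha_j over two consecutive time periods."""
    return mean_azimuth(next_samples) - mean_azimuth(prev_samples)

def should_trigger(prev_samples, next_samples, preset_value):
    """Trigger check from the embodiment: intercept touches and open the
    front camera once the amplitude exceeds the preset value. Using the
    absolute value here is an assumption, not stated in the patent."""
    return abs(azimuth_change_amplitude(prev_samples, next_samples)) > preset_value

# Two consecutive periods of sampled azimuths (degrees):
# the mean shifts from 20 to 50, so F = 30.
print(azimuth_change_amplitude([10, 20, 30], [40, 50, 60]))  # 30.0
```

Averaging over n samples per period smooths out single-sample sensor noise, so the trigger responds to a sustained change in how the phone is pointed rather than to jitter.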
It will be apparent to those skilled in the art that, for convenience and simplicity of description, the division into the foregoing functional units and modules is merely illustrative; in practical applications, the above functions may be allocated to different functional units and modules as needed, that is, the internal structure of the mobile terminal may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for ease of distinguishing them from one another and are not intended to limit the protection scope of the present application. For the specific working processes of the units and modules in the above apparatus, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
Fig. 4 is a schematic block diagram of a mobile terminal according to another embodiment of the present application. As shown in fig. 4, the mobile terminal 4 of this embodiment includes: one or more processors 40, a memory 41, and a computer program 42 stored in the memory 41 and executable on the processors 40. When executing the computer program 42, the processor 40 implements the steps in each of the above embodiments of the control method of a mobile terminal, such as steps S101 to S103 shown in fig. 1; alternatively, the processor 40 implements the functions of the modules/units in the above mobile terminal embodiments, such as the functions of modules 31 to 33 shown in fig. 3.
Illustratively, the computer program 42 may be divided into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to carry out the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and these segments are used to describe the execution process of the computer program 42 in the mobile terminal 4. For example, the computer program 42 may be divided into a camera opening module, a detection module, and a first interception releasing module.
The camera opening module is configured to intercept a screen touch event of the mobile terminal under a preset trigger mechanism and start a front camera;
the detection module is configured to perform face detection on the preview image acquired by the front camera in real time;
the first interception releasing module is configured to release the interception of the screen touch event if a face image is detected in the preview image acquired by the front camera in real time.
For the other modules or units, reference may be made to the description of the embodiment shown in fig. 3, which is not repeated here.
The mobile terminal includes, but is not limited to, the processor 40 and the memory 41. Those skilled in the art will appreciate that fig. 4 is only one example of the mobile terminal 4 and does not constitute a limitation on it; the mobile terminal 4 may include more or fewer components than shown, may combine some components, or may have different components. For example, the mobile terminal may also include input devices, output devices, network access devices, buses, and the like.
The processor 40 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the mobile terminal 4, such as a hard disk or memory of the mobile terminal 4. The memory 41 may also be an external storage device of the mobile terminal 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the mobile terminal 4. Further, the memory 41 may include both an internal storage unit and an external storage device of the mobile terminal 4. The memory 41 is used to store the computer program as well as other programs and data required by the mobile terminal, and may also be used to temporarily store data that has been output or is to be output.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed mobile terminal and method may be implemented in other ways. For example, the above-described embodiments of the mobile terminal are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
If the integrated modules/units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the above embodiments may be implemented by a computer program; the computer program may be stored in a computer-readable storage medium, and when executed by a processor, it implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (8)

1. A control method of a mobile terminal, comprising:
intercepting a screen touch event of the mobile terminal under a preset trigger mechanism, and starting a front camera;
performing face detection on the preview image acquired by the front camera in real time;
if a face image is detected in a preview image acquired by the front camera in real time, the interception of the screen touch event is released;
wherein the intercepting a screen touch event of the mobile terminal under a preset trigger mechanism and starting a front camera comprises:
if the preset application of the mobile terminal is in a foreground running state and the azimuth change amplitude of the mobile terminal is monitored to be larger than a preset value, intercepting a screen touch event of the mobile terminal and starting a front camera; the azimuth change amplitude of the mobile terminal is obtained through the following modes:
F_{j+1} = α_{j+1} − α_j

α_j = (1/n) Σ_{i=1}^{n} θ_i

wherein j denotes the j-th time period; F_{j+1} denotes the azimuth change amplitude of the mobile terminal in time period j+1; α_j denotes the mean of the azimuth angles of the mobile terminal obtained by the magnetic field sensor and the acceleration sensor of the mobile terminal in the j-th time period; n denotes the number of azimuth angles sampled by the magnetic field sensor and the acceleration sensor of the mobile terminal in each time period; and θ_i denotes the i-th azimuth angle so sampled.
2. The method for controlling a mobile terminal according to claim 1, wherein after turning on a front camera, the method further comprises:
starting a timer;
before the timing time of the timer exceeds a first preset time, if no face image is detected in the preview image acquired by the front camera in real time, releasing the interception of the screen touch event when the timing time of the timer exceeds the first preset time.
3. The method for controlling a mobile terminal according to claim 2, wherein the releasing the interception of the screen touch event if the face image is detected in the preview image collected by the front-facing camera in real time comprises:
before the timing time of the timer exceeds a first preset time, if a face image is detected in the preview image acquired by the front camera in real time, releasing the interception of the screen touch event.
4. The control method of a mobile terminal according to claim 1, wherein after the interception of the screen touch event is released, the control method further comprises:
if no user operation is detected within a second preset time, performing a screen-off or screen-lock operation on the mobile terminal.
5. The method for controlling a mobile terminal according to any one of claims 1 to 4, wherein the intercepting a screen touch event of the mobile terminal comprises:
adding a frame layout to a system unlock-screen interface to block the delivery of a screen touch event of the mobile terminal.
6. A mobile terminal, comprising:
the camera opening module is configured to intercept a screen touch event of the mobile terminal under a preset trigger mechanism and open a front camera, and is specifically configured to: if a preset application of the mobile terminal is in a foreground running state and it is monitored that the azimuth change amplitude of the mobile terminal is larger than a preset value, intercept the screen touch event of the mobile terminal and start the front camera; the azimuth change amplitude of the mobile terminal is obtained as follows:
F_{j+1} = α_{j+1} − α_j

α_j = (1/n) Σ_{i=1}^{n} θ_i

wherein j denotes the j-th time period; F_{j+1} denotes the azimuth change amplitude of the mobile terminal in time period j+1; α_j denotes the mean of the azimuth angles of the mobile terminal obtained by the magnetic field sensor and the acceleration sensor of the mobile terminal in the j-th time period; n denotes the number of azimuth angles sampled by the magnetic field sensor and the acceleration sensor of the mobile terminal in each time period; and θ_i denotes the i-th azimuth angle so sampled;
the detection module is configured to perform face detection on the preview image acquired by the front camera in real time;
the first interception releasing module is configured to release the interception of the screen touch event if a face image is detected in the preview image acquired by the front camera in real time.
7. A mobile terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 2 to 5 when executing the computer program.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by one or more processors, implements the steps of the method according to any one of claims 1 to 5.
CN201810193462.8A 2018-03-09 2018-03-09 Control method of mobile terminal, mobile terminal and storage medium Expired - Fee Related CN108491142B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810193462.8A CN108491142B (en) 2018-03-09 2018-03-09 Control method of mobile terminal, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810193462.8A CN108491142B (en) 2018-03-09 2018-03-09 Control method of mobile terminal, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN108491142A CN108491142A (en) 2018-09-04
CN108491142B true CN108491142B (en) 2020-08-11

Family

ID=63338427

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810193462.8A Expired - Fee Related CN108491142B (en) 2018-03-09 2018-03-09 Control method of mobile terminal, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN108491142B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110647277A (en) * 2019-08-28 2020-01-03 维沃移动通信有限公司 Control method and terminal equipment
CN113138662A (en) * 2020-01-19 2021-07-20 珠海格力电器股份有限公司 Method and device for preventing mistaken touch of touch equipment, electronic equipment and readable storage medium
CN114578998A (en) * 2022-03-23 2022-06-03 深圳传音控股股份有限公司 Touch processing method, intelligent terminal and computer readable storage medium
CN117850614A (en) * 2022-09-30 2024-04-09 华为技术有限公司 Touch processing method and related equipment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012208795A (en) * 2011-03-30 2012-10-25 Ntt Docomo Inc Portable terminal and operation control method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8019390B2 (en) * 2009-06-17 2011-09-13 Pradeep Sindhu Statically oriented on-screen transluscent keyboard
CN103067600A (en) * 2013-01-14 2013-04-24 成都西可科技有限公司 Error touching preventing system and implementation method of smart phone
CN103412719A (en) * 2013-08-30 2013-11-27 深圳市中兴移动通信有限公司 Method and device for preventing touch screen terminal from misoperation
CN104238912B (en) * 2014-08-22 2018-02-23 小米科技有限责任公司 application control method and device
WO2016029449A1 (en) * 2014-08-29 2016-03-03 华为技术有限公司 Method and apparatus for preventing false touch of touchscreen
CN105630352A (en) * 2015-12-24 2016-06-01 天脉聚源(北京)科技有限公司 Mistaken touch prevention method and device
CN105635483B (en) * 2016-01-26 2019-06-25 Oppo广东移动通信有限公司 A kind of processing method, device and storage medium operating mobile terminal
CN106020510B (en) * 2016-05-17 2019-05-03 Oppo广东移动通信有限公司 The control method and device of terminal

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012208795A (en) * 2011-03-30 2012-10-25 Ntt Docomo Inc Portable terminal and operation control method

Also Published As

Publication number Publication date
CN108491142A (en) 2018-09-04

Similar Documents

Publication Publication Date Title
CN108491142B (en) Control method of mobile terminal, mobile terminal and storage medium
US10699103B2 (en) Living body detecting method and apparatus, device and storage medium
US20150074418A1 (en) Method and apparatus for outputting recognized error of sensor in electronic device
US10169637B2 (en) On-screen optical fingerprint capture for user authentication
Ijiri et al. Security management for mobile devices by face recognition
CN110457963B (en) Display control method, display control device, mobile terminal and computer-readable storage medium
CN111489290A (en) Face image super-resolution reconstruction method and device and terminal equipment
WO2020253495A1 (en) Screen lock control method, device, handheld terminal, and storage medium
WO2017161824A1 (en) Method and device for controlling terminal
CN113486377A (en) Image encryption method and device, electronic equipment and readable storage medium
CN111062248A (en) Image detection method, device, electronic equipment and medium
WO2022268023A1 (en) Fingerprint recognition method and apparatus, and electronic device and readable storage medium
US20240143711A1 (en) Screen unlocking method and apparatus, and electronic device
CN115661917A (en) Gesture recognition method and related product
CN112633218A (en) Face detection method and device, terminal equipment and computer readable storage medium
CN110766837A (en) Control method and device for passing equipment, machine readable medium and equipment
CN114943872A (en) Training method and device of target detection model, target detection method and device, medium and equipment
US11392789B2 (en) Fingerprint authentication using a synthetic enrollment image
KR20150029251A (en) Method for securing object of electronic device and the electronic device therefor
CN111079662A (en) Figure identification method and device, machine readable medium and equipment
CN110688035B (en) Photo album processing method, photo album processing device and mobile terminal
CN112150685A (en) Vehicle management method, system, machine readable medium and equipment
JP5891898B2 (en) Information processing apparatus, program, and information processing method
CN112565601B (en) Image processing method, image processing device, mobile terminal and storage medium
CN108227906B (en) Man-machine interaction method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong Province

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan 523860, Guangdong Province

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200811