CN112468643A - Control method of mobile terminal, mobile terminal and storage medium - Google Patents

Control method of mobile terminal, mobile terminal and storage medium

Info

Publication number
CN112468643A
Authority
CN
China
Prior art keywords
application
identification data
mobile terminal
data
icon
Prior art date
Legal status
Pending
Application number
CN202011326402.2A
Other languages
Chinese (zh)
Inventor
包永强
Current Assignee
Shenzhen Microphone Holdings Co Ltd
Shenzhen Transsion Holdings Co Ltd
Original Assignee
Shenzhen Microphone Holdings Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Microphone Holdings Co Ltd
Priority to CN202011326402.2A
Publication of CN112468643A
Current legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application discloses a control method of a mobile terminal, which includes the following steps: receiving identification data; determining an avatar application (a cloned instance of a main application) and/or a display position associated with the identification data; and starting the avatar application and/or displaying the avatar application at the display position. The present application also discloses a mobile terminal and a computer storage medium, which aim to improve the data security of the avatar application.

Description

Control method of mobile terminal, mobile terminal and storage medium
Technical Field
The present application relates to the field of mobile terminal technologies, and in particular, to a control method for a mobile terminal, a mobile terminal, and a storage medium.
Background
With the rapid development of internet technology, mobile terminals such as mobile phones and tablet computers have become indispensable information processing tools in people's life and work. As the functions of mobile terminals such as mobile phones and tablet computers grow increasingly rich, many mobile terminals support application cloning, i.e., creating an avatar application for a main application. However, in a conventional mobile terminal, the main application and its avatar application are started in the same manner, generally by tapping a desktop icon, which results in poor data security of the avatar application.
The foregoing description is provided for general background information and is not admitted to be prior art.
Disclosure of Invention
In view of the above technical problems, the present application provides a control method for a mobile terminal, a mobile terminal and a storage medium, so that a user can quickly start an application and enjoy its services through a simple and convenient operation.
In order to solve the above technical problem, the present application provides a method for controlling a mobile terminal, which is applied to the mobile terminal, and includes:
receiving identification data;
determining an avatar application and/or a display position associated with the identification data;
and starting the avatar application and/or displaying the avatar application at the display position.
Optionally, the display position includes at least one of the following interfaces of the mobile terminal: main interface, negative-one screen, lock-screen interface, right-one screen.
Optionally, when it is detected that a preset rule is met, an application icon of the avatar application is displayed on the lock-screen interface, so that the avatar application can be triggered through the application icon.
Optionally, an application icon of the avatar application is displayed in an operation interface of the mobile terminal, so that the avatar application can be triggered based on the application icon.
Optionally, the step of starting the avatar application and/or displaying the avatar application at the display position includes:
directly starting the avatar application, and displaying the running interface of the avatar application at the display position; or
displaying an application icon of the avatar application at the display position, and, when an application icon is selected, starting the avatar application corresponding to the selected application icon.
In order to solve the above technical problem, the present application provides a method for controlling a mobile terminal, which is applied to the mobile terminal, and includes:
receiving identification data;
and generating and/or starting the avatar application corresponding to the identification data, and/or hiding the display of the avatar application.
Optionally, the avatar application is configured to be triggered through an application icon of the corresponding main application, or through a function button of the main application.
Optionally, the step of generating and/or starting the avatar application corresponding to the identification data, and/or hiding the display of the avatar application includes:
determining the avatar application corresponding to the identification data;
and when the avatar application corresponding to the identification data is not installed in the mobile terminal, generating the avatar application corresponding to the identification data, and/or hiding the display of the avatar application.
Optionally, after the step of generating the avatar application corresponding to the identification data and/or hiding the display of the avatar application, the method further includes:
setting the identification data as start data of the avatar application, or setting the identification data as call-out data of the application icon of the avatar application.
Optionally, after the step of determining the avatar application corresponding to the identification data, the method further includes:
and when the avatar application corresponding to the identification data is installed in the mobile terminal, starting the avatar application corresponding to the identification data.
In order to solve the above technical problem, the present application provides a method for controlling a mobile terminal, which is applied to the mobile terminal, and includes:
setting identification data for generating and/or starting an application program and/or an avatar application;
and generating and/or starting the application program and/or the avatar application when the identification data is received; optionally, hiding the display of the application icon of the avatar application when the avatar application is generated and/or started based on the identification data, and/or displaying the application icon and/or a running page of the application program when the application program is generated and/or started based on the identification data.
Optionally, the avatar application is generated based on the identification data, and/or, after the application icon of the avatar application is hidden from display, the identification data is set as start data of the avatar application, or as call-out data of the application icon of the avatar application, so that the application icon of the avatar application is displayed at a preset position when the identification data is received again.
Optionally, after hiding the display of the application icon of the avatar application, the method further includes:
when the identification data is received, determining a display position associated with the identification data;
displaying the application icon of the avatar application at the display position; or
starting the avatar application, and displaying the running interface of the avatar application at the display position.
Optionally, the identification data includes at least first identification data and/or second identification data; optionally, the first identification data is used to generate and/or start the application program and/or the avatar application, and/or the second identification data is used to call out an icon of the avatar application.
Optionally, the identification data includes at least first identification data and/or second identification data; optionally, the first identification data is used to generate and/or start the application program and/or the avatar application, and/or to call out an icon of the avatar application, and/or the second identification data is used to generate and/or start the application program.
In order to solve the above technical problem, the present application provides a method for controlling a mobile terminal, which is applied to the mobile terminal, and includes:
receiving identification data;
and starting a folder or an application program associated with the identification data, wherein, optionally, the folder is set to display an application icon of the avatar application.
Optionally, the identification data comprises at least one of: fingerprint data, face data, positioning data, networking data, Bluetooth connection data, near field communication data and touch trajectories.
Optionally, the step of starting the folder or the application program associated with the identification data includes:
determining a target application program associated with the identification data;
and starting the target application program associated with the identification data.
Optionally, the step of determining the target application program associated with the identification data comprises at least one of:
acquiring an application program associated with the current position as the target application program;
acquiring an application program associated with a currently connected network as the target application program;
acquiring an application program associated with the currently connected Bluetooth device as the target application program;
and acquiring the application program associated with the currently received fingerprint as the target application program.
Optionally, when the application icon is triggered, the avatar application corresponding to the triggered application icon is started.
Optionally, the step of receiving identification data is performed when a preset control gesture is received.
Optionally, the step of receiving an identification data comprises at least one of:
displaying a touch track acquisition page, and acquiring a touch track as the identification data through the touch track acquisition page;
starting a fingerprint collector, and collecting fingerprints as the identification data;
starting a face data collector, and collecting face data as the identification data;
and acquiring terminal data as the identification data.
Optionally, the mobile terminal is a folding screen mobile terminal, and optionally, the identification data includes a screen folding angle of the folding screen mobile terminal.
In order to solve the above technical problem, the present application provides a method for controlling a mobile terminal, which is applied to the mobile terminal, and includes:
receiving identification data;
determining a target operation mode corresponding to the identification data;
and entering the target operation mode.
Optionally, the mobile terminal is provided with at least one operation mode, and the application program corresponding to each operation mode is different.
Optionally, after the step of entering the target operation mode, the method further includes:
determining an application program corresponding to the target operation mode;
and displaying the program entry of the application program corresponding to the target running mode.
Optionally, the identification data comprises at least one of user data and terminal data; optionally, the user data includes fingerprint data and/or face data, and optionally, the terminal data includes positioning data, networking data, bluetooth connection data, near field communication data, and/or a touch trajectory.
Optionally, the method for controlling the mobile terminal further includes:
and displaying a setting page so that a user can set the application program corresponding to each running mode through the setting page.
Optionally, the application entry of a main application and the application entry of the avatar application of that main application are associated with different operation modes, respectively.
Optionally, the step of receiving an identification data comprises:
and when receiving unlocking data, taking the unlocking data as the identification data, and optionally, receiving the unlocking data under a screen locking interface.
Optionally, the mobile terminal is provided with at least two kinds of unlocking data; or at least two different data of the same kind are set in the mobile terminal as unlocking data.
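By way of illustration only, the mapping from unlocking/identification data to an operation mode and its application entries described above could be sketched as follows; this is a minimal sketch under stated assumptions, and the mode names, PIN values and application lists are invented for the example, not values defined by this disclosure.

```kotlin
// Hypothetical sketch: unlocking data doubles as identification data and selects an
// operation mode; each mode exposes its own application entries. Names are illustrative.
enum class OperationMode { NORMAL, PRIVATE }

val modeByUnlockData = mapOf(
    "pin:1234" to OperationMode.NORMAL,
    "pin:9876" to OperationMode.PRIVATE
)

val entriesByMode = mapOf(
    OperationMode.NORMAL to listOf("WeChat", "QQ"),               // main-application entries
    OperationMode.PRIVATE to listOf("WeChat avatar", "QQ avatar") // avatar-application entries
)

fun onUnlock(unlockData: String) {
    val mode = modeByUnlockData[unlockData] ?: return // unknown unlocking data: do nothing
    println("enter $mode, show entries: ${entriesByMode[mode]}")
}

fun main() {
    onUnlock("pin:9876") // under these assumptions, enters PRIVATE and lists the avatar entries
}
```

Under these assumptions, entering one piece of unlocking data reveals only the main applications, while entering the other reveals only the avatar applications, which matches the idea of associating different application entries with different operation modes.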
The present application further provides a mobile terminal, including a memory, a processor, and a control program of the mobile terminal stored on the memory; when the control program of the mobile terminal is executed by the processor, the control method of the mobile terminal described above is implemented.
The present application also provides a computer storage medium having a computer program stored thereon, which, when executed by a processor, implements the steps of the method described above.
As described above, the control method of the mobile terminal of the present application is applied to a mobile terminal; when identification data is received, the method determines the avatar application and/or the display position associated with the identification data, and then starts the avatar application and/or displays the avatar application at the display position, thereby simplifying the start path of the application program. Meanwhile, starting the application through identification data improves the data security of the mobile terminal.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and, together with the description, serve to explain the principles of the application. In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below; other drawings can be obtained by those skilled in the art from these drawings without inventive effort.
Fig. 1 is a schematic hardware structure diagram of a mobile terminal implementing various embodiments of the present application;
fig. 2 is a communication network system architecture diagram according to an embodiment of the present application;
fig. 3 is a flowchart illustrating a control method of a mobile terminal according to a first embodiment;
fig. 4 is an interface diagram showing a control method of the mobile terminal according to the first embodiment;
fig. 5 is another interface diagram of a control method of the mobile terminal shown according to the first embodiment;
fig. 6 is a flowchart illustrating a control method of a mobile terminal according to a second embodiment;
fig. 7 is a flowchart illustrating a control method of a mobile terminal according to a third embodiment;
fig. 8 is a flowchart illustrating a control method of a mobile terminal according to a fourth embodiment;
fig. 9 is a flowchart illustrating a control method of a mobile terminal according to a fifth embodiment.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings. With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the recitation of an element preceded by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises that element. Further, elements, features, or components with the same name in different embodiments of this disclosure may have the same meaning or different meanings; their particular meaning should be determined by their interpretation in the specific embodiment or by further reference to the context of that embodiment.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope herein. The word "if" as used herein may be interpreted as "when" or "upon" or "in response to a determination", depending on the context. Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, items, species, and/or groups thereof. The terms "or", "and/or", "including at least one of the following", and the like, as used herein, are to be construed as inclusive, meaning any one or any combination. For example, "includes at least one of: A, B, C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C"; likewise, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A and B and C". An exception to this definition occurs only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
It should be understood that, although the steps in the flowcharts in the embodiments of the present application are shown in sequence as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or multiple stages, which are not necessarily performed at the same moment but may be performed at different moments, and whose execution order is not necessarily sequential; they may be performed in turns or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It should be noted that step numbers such as S11 and S12 are used herein for the purpose of more clearly and briefly describing the corresponding content, and do not constitute a substantial limitation on the sequence, and those skilled in the art may perform S12 first and then S11 in specific implementation, which should be within the scope of the present application.
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for convenience of description of the present application and have no specific meaning in themselves. Thus, "module", "component" and "unit" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the mobile terminal described in the present application may include mobile terminals such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and fixed terminals such as a Digital TV, a desktop computer, and the like.
The following description will be given taking a mobile terminal as an example, and it will be understood by those skilled in the art that the configuration according to the embodiment of the present application can be applied to a fixed type terminal in addition to elements particularly used for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present application, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
the radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call, and specifically, receive downlink information of a base station and then process the downlink information to the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000(Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division duplex Long Term Evolution), and TDD-LTE (Time Division duplex Long Term Evolution).
WiFi is a short-range wireless transmission technology; through the WiFi module 102, the mobile terminal can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband internet access. Although fig. 1 shows the WiFi module 102, it is understood that it is not an essential component of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 and output. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Optionally, the light sensor includes an ambient light sensor that may adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that may turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Alternatively, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 1071 (e.g., an operation performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive a corresponding connection device according to a predetermined program. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Optionally, the touch detection device detects a touch orientation of a user, detects a signal caused by a touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Optionally, other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like, and are not limited thereto.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area; optionally, the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, and the like), and so on, while the data storage area may store data created according to the use of the mobile phone (such as audio data, a phonebook, etc.). Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, optionally, the application processor mainly handles operating systems, user interfaces, application programs, etc., and the modem processor mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption through the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In order to facilitate understanding of the embodiments of the present application, a communication network system on which the mobile terminal of the present application is based is described below.
Referring to fig. 2, fig. 2 is an architecture diagram of a communication Network system according to an embodiment of the present disclosure, where the communication Network system is an LTE system of a universal mobile telecommunications technology, and the LTE system includes a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an IP service 204 of an operator, which are in communication connection in sequence.
Optionally, the UE201 may be the terminal 100 described above, and is not described herein again.
The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022, among others. Optionally, the eNodeB 2021 may be connected with the other eNodeBs 2022 through a backhaul (e.g., an X2 interface), the eNodeB 2021 is connected to the EPC 203, and the eNodeB 2021 can provide the UE 201 with access to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and the like. Optionally, the MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203, providing bearer and connection management. The HSS 2032 provides registers for managing functions such as the home location register (not shown) and holds subscriber-specific information about service characteristics, data rates, etc. All user data may be sent through the SGW 2034; the PGW 2035 may provide IP address assignment for the UE 201 and other functions; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, which selects and provides available policy and charging control decisions for a policy and charging enforcement function (not shown).
The IP services 204 may include the internet, intranets, IMS (IP Multimedia Subsystem), or other IP services, among others.
Although the LTE system is described as an example, it should be understood by those skilled in the art that the present application is not limited to the LTE system, but may also be applied to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems.
Based on the above mobile terminal hardware structure and communication network system, various embodiments of the present application are provided.
First embodiment
Referring to fig. 3, in a first embodiment, the method for controlling a mobile terminal includes the steps of:
step S11, receiving identification data;
step S12, determining the avatar application and/or display position associated with the identification data;
and step S13, starting the avatar application and/or displaying the avatar application at the display position.
The control method of the mobile terminal is applied to the mobile terminal.
Illustratively, the mobile terminal may be a mobile phone, a tablet computer, a notebook computer, an ultrabook, and/or a game console. The mobile terminal may be an intelligent mobile terminal that implements the corresponding functions based on a system such as Android, iOS, and/or Windows. The mobile terminal is also provided with an application loading function, so that it can install application programs, and with an avatar creation function, so that it can create a corresponding avatar application for all or some of its installed applications.
In some implementations, after an avatar application is created for an application program, both the main application and its avatar application are started in the same way, through an application icon arranged on the desktop. When many applications are loaded in the mobile terminal, a user who wants to start an application may need to switch desktop pages or open other storage windows to find the corresponding application entry (generally an application icon), so the start path of such start methods is cumbersome. Moreover, this start mode offers poor privacy: anyone using the phone, not only the terminal owner, can start the application simply by tapping its icon, which results in low data security.
To solve the above-mentioned drawbacks, the present embodiment provides a method for controlling a mobile terminal.
In this embodiment, the mobile terminal is provided with a user interface through which the user can perform human-computer interaction with the mobile terminal, so that the mobile terminal can receive identification data input by the user via the user interface. Optionally, the identification data is user-set input information that the mobile terminal can distinguish; for example, the identification data may be set as fingerprint data, voiceprint data, face data, and/or a preset touch trajectory. The identification data may also be terminal status data detected by the terminal; for example, the terminal status data may include the current time, the current location, and/or the current operation mode.
When the mobile terminal receives identification data, the avatar application and/or the display position associated with that identification data is determined.
Optionally, the association between the identification data and the avatar application may be set by the user in a customized manner, or may be defined by the system; this embodiment does not limit it. Thus, when identification data is received, the avatar application associated with it can be determined. And/or, the user may also customize the association between the display position and the identification data, or the system may define it; then, after identification data is received, the display position corresponding to it can be determined.
Optionally, the identification data may include at least fingerprint data, face data, positioning data, networking data, Bluetooth connection data, near field communication data, and a touch trajectory. The display position includes the main interface, the negative-one screen, the lock-screen interface and/or the right-one screen of the mobile terminal.
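To make the association concrete, the following minimal Kotlin sketch models the kinds of identification data and display positions listed above, and the lookup of step S12; all type and member names are hypothetical and only illustrate the idea, they are not part of the disclosure.

```kotlin
// Hypothetical data model for the identification data and display positions above.
sealed class IdentificationData {
    data class Fingerprint(val templateId: String) : IdentificationData()
    data class Face(val templateId: String) : IdentificationData()
    data class Location(val regionName: String) : IdentificationData()
    data class WifiNetwork(val ssid: String) : IdentificationData()
    data class BluetoothDevice(val name: String) : IdentificationData()
    data class NfcTag(val tagId: String) : IdentificationData()
    data class TouchTrajectory(val shape: String) : IdentificationData()
}

// Display positions named in this embodiment.
enum class DisplayPosition { MAIN_INTERFACE, NEGATIVE_ONE_SCREEN, LOCK_SCREEN, RIGHT_ONE_SCREEN }

// What a piece of identification data may be associated with (either field may be absent).
data class Association(val avatarPackage: String? = null, val position: DisplayPosition? = null)

// User- or system-defined association table; lookup() corresponds to step S12.
class AssociationTable {
    private val table = mutableMapOf<IdentificationData, Association>()
    fun associate(data: IdentificationData, association: Association) { table[data] = association }
    fun lookup(data: IdentificationData): Association? = table[data]
}
```

The table is deliberately agnostic about whether the user or the system filled it in, mirroring the paragraph above.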
In order to enable those skilled in the art to better understand the solutions claimed herein, the technical solutions recited in the claims of the present application are explained below through specific implementation examples in specific application scenarios. It should be understood that the following examples are only used to explain the present application and are not intended to limit the scope of the claims.
Example 1: a user may first enter a fingerprint in the mobile terminal, e.g., the right-hand thumb fingerprint, and set the right-hand thumb fingerprint to be associated with the WeChat avatar. The user may then enter another fingerprint, such as the right-hand index finger fingerprint, and set it to be associated with the QQ avatar. In a predetermined scenario, when the mobile terminal receives an input of the right-hand thumb fingerprint from the user, it may determine the avatar application associated with the currently received fingerprint. Optionally, the right-hand thumb fingerprint may also be set to be associated with the lock-screen interface.
Example 2: the mobile terminal may be provided with an identification data acquisition trigger condition. Optionally, the trigger condition may be set to trigger an identification data acquisition action when a preset control gesture is received, or to trigger it in a preset mode. In this example, the user may trigger a face data acquisition action and/or a voiceprint data acquisition action by pressing the power key three times in succession. After the face data acquisition action is triggered, the mobile terminal may start a camera device so as to collect face data through it. After the face data is collected, it is recognized, the pre-stored face data corresponding to the currently collected face data is determined according to the recognition result, and the avatar application and/or display position associated with that pre-stored face data is determined.
Example 3: the mobile terminal may also be configured to associate a specific network, specific Bluetooth connection data, and/or specific near field communication data, etc., so that such data can be used as identification data. For example, the mobile terminal presets a WiFi network named "TECNO_office_01" to be associated with the DingTalk avatar and/or the negative-one screen. When the mobile terminal detects that it is connected to the WiFi network named "TECNO_office_01", the avatar application associated with the identification data is determined to be the DingTalk avatar, and/or the associated display position is determined to be the negative-one screen. Optionally, the mobile terminal sets a Bluetooth device named "hipeds H2" to be associated with the QQ Music avatar and/or the main interface; when the mobile terminal detects that the Bluetooth device named "hipeds H2" is connected, the avatar application and/or display position associated with the identification data is determined to be the QQ Music avatar and/or the main interface, respectively.
Example 4: when the mobile terminal receives a specific touch trajectory, that touch trajectory may be used as the identification data. Referring to fig. 4, the touch trajectory "C" may be associated with the Facebook avatar and/or the right-one screen; then, when a control operation of drawing "C" on the screen is detected, the avatar application and/or display position associated with the identification data may be determined to be the Facebook avatar and/or the right-one screen, respectively.
Example 5: a preset image drawn by the user may also be received as the identification data. The graph shown in fig. 5 may be set to be associated with the Facebook avatar and/or the right-one screen; when the graph shown in fig. 5 drawn by the user is received, the avatar application and/or display position associated with the identification data is determined to be the Facebook avatar and/or the right-one screen.
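Under the same hedged assumptions, Examples 1 to 5 could be registered as a simple lookup table; the map keys and package names below are invented for the sketch and are not defined by the disclosure.

```kotlin
// Illustrative registration of the associations from Examples 1–5 (all identifiers assumed).
fun main() {
    val avatarByIdentification = mapOf(
        "fingerprint:right_thumb" to "com.tencent.mm.avatar",       // Example 1: WeChat avatar
        "fingerprint:right_index" to "com.tencent.mobileqq.avatar", // Example 1: QQ avatar
        "wifi:TECNO_office_01" to "com.alibaba.dingtalk.avatar",    // Example 3: DingTalk avatar
        "bluetooth:hipeds H2" to "com.tencent.qqmusic.avatar",      // Example 3: QQ Music avatar
        "gesture:C" to "com.facebook.katana.avatar"                 // Examples 4/5: Facebook avatar
    )
    val positionByIdentification = mapOf(
        "fingerprint:right_thumb" to "LOCK_SCREEN",
        "wifi:TECNO_office_01" to "NEGATIVE_ONE_SCREEN",
        "bluetooth:hipeds H2" to "MAIN_INTERFACE",
        "gesture:C" to "RIGHT_ONE_SCREEN"
    )

    // Steps S12/S13 for one received identification datum:
    val received = "wifi:TECNO_office_01"
    println("avatar = ${avatarByIdentification[received]}, position = ${positionByIdentification[received]}")
}
```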
Optionally, after the avatar application and/or the display position associated with the identification data is determined, the avatar application may be started and/or displayed at the display position.
Optionally, in one implementation, once the avatar application associated with the identification data is determined, the avatar application may be started directly, i.e. the mobile terminal starts running the avatar application. Optionally, as an alternative, the avatar application may run in the background after being started. As another implementation, a display position may also be determined according to the identification data, so that after the avatar application is started, its running interface is displayed at that display position.
In another implementation, after identification data is received, the display position associated with it can be determined. As one implementation, after the mobile terminal receives a start operation of the avatar application, the running interface of the avatar application is displayed at the determined display position. As another implementation, after the display position is determined, an application icon or a running interface of the avatar application associated with that display position may be displayed directly at the display position.
Example 1: after the display position is determined to be the lock-screen interface, when a start operation of an avatar application is received under the lock-screen interface, the running interface of that avatar application may be displayed on the lock-screen interface.
Example 2: after the display position is determined to be the lock-screen interface, an icon of the avatar application associated with the lock-screen interface may be displayed in the lock-screen interface, so that the user can trigger the corresponding avatar application from the lock-screen interface.
Optionally, in some application scenarios, the mobile terminal is configured to hide the application icon of a target avatar application. Under the lock-screen interface, when it is detected that a preset rule is met, the application icon of the avatar application is displayed on the lock-screen interface, so that the avatar application can be triggered through the application icon. For example, under the lock-screen interface, identification data associated with the lock-screen interface is received, and the preset rule is judged to be met. Or, under the lock-screen interface, a preset control action is received (pressing the screen for 5 seconds, tapping the screen three times, and the like), and the preset rule is judged to be met. Or, under the lock-screen interface, it is detected that the CPU occupancy is greater than a preset threshold, the preset rule is judged to be met, and the application icon corresponding to the phone manager application is displayed on the lock-screen interface.
Optionally, an application icon of the avatar application is displayed in an operation interface of the mobile terminal, so that the avatar application can be triggered based on the application icon. Optionally, the operation interface includes the main interface, the negative-one screen and/or the right-one screen, and other interfaces that can receive user control operations.
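The branching described in the preceding paragraphs (start directly, run in the background, show the running interface at a position, or only reveal an icon, plus the lock-screen preset rule) can be summarized in a short sketch; the helper functions below are hypothetical placeholders, not real system APIs.

```kotlin
// Sketch of step S13 and the lock-screen preset rule; hooks are hypothetical.
enum class Position { MAIN_INTERFACE, NEGATIVE_ONE_SCREEN, LOCK_SCREEN, RIGHT_ONE_SCREEN }

data class Resolution(val avatarPackage: String?, val position: Position?)

fun handleResolution(r: Resolution, startInBackground: Boolean = false) {
    val pkg = r.avatarPackage
    val pos = r.position
    when {
        pkg != null && pos == null -> {
            // Only an avatar is associated: start it directly, optionally in the background.
            if (startInBackground) runInBackground(pkg) else launchAvatar(pkg, showAt = null)
        }
        pkg != null && pos != null -> {
            // Both associated: start the avatar and show its running interface (or icon) there.
            launchAvatar(pkg, showAt = pos)
        }
        pos != null -> {
            // Only a position is associated: reveal the avatar icons bound to that position.
            showIconsAt(pos)
        }
    }
}

// Preset rule for revealing a hidden avatar icon on the lock screen (any condition suffices).
fun lockScreenRuleMet(idDataBoundToLockScreen: Boolean, presetGesture: Boolean, cpuLoad: Double, threshold: Double): Boolean =
    idDataBoundToLockScreen || presetGesture || cpuLoad > threshold

// Hypothetical hooks standing in for the real launch/display mechanisms.
fun launchAvatar(pkg: String, showAt: Position?) = println("launch $pkg, show at $showAt")
fun runInBackground(pkg: String) = println("launch $pkg in background")
fun showIconsAt(position: Position) = println("show avatar icons at $position")
```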
In the technical solution disclosed in this embodiment, identification data is received, and the avatar application and/or display position associated with the identification data is determined, so that the avatar application is started and/or displayed at the display position, thereby improving the data security of the avatar application.
Second embodiment
Referring to fig. 6, in a second embodiment, the method for controlling a mobile terminal includes the steps of:
step S21, receiving identification data;
and step S22, generating and/or starting the avatar application corresponding to the identification data, and hiding the display of the avatar application.
In this embodiment, the avatar application associated with the identification data may first be set in a customized manner. When identification data is received, the avatar application associated with it is determined and the avatar application is then generated. The display of the avatar application is hidden, that is, no start entry of the avatar application (such as an application icon and/or a notification message) is shown on the mobile terminal.
Optionally, as one implementation, after identification data is received and the avatar application corresponding to it is determined, it is first judged whether that avatar application is already installed in the mobile terminal. When the avatar application corresponding to the identification data is not installed in the mobile terminal, the avatar application is generated and its display is hidden, and the identification data is set as start data of the avatar application, or as call-out data of the application icon of the avatar application.
When the avatar application corresponding to the identification data is installed in the mobile terminal, the avatar application is started, or its application icon is called out, so that the user can start the avatar application based on the application icon.
Optionally, after the display of the avatar application is hidden, the avatar application is set to be triggered through the application icon of its corresponding main application, or through a function button of the main application.
Alternatively, the identification data may be set as start data of the avatar application, or as call-out data of the application icon of the avatar application. When the mobile terminal receives the identification data again, the avatar application is started, or the mobile terminal is controlled to display the icon of the avatar application at a preset position. Optionally, the preset position may be determined according to the identification data; for example, the identification data may be associated with a display position, and when the identification data is received, the associated display position is taken as the preset position.
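A compact sketch of this second-embodiment flow, with hypothetical hook functions standing in for the actual generation, hiding and starting operations, might look like this:

```kotlin
// Minimal sketch of the second embodiment; all names and hooks are hypothetical.
class AvatarManager(
    private val installedAvatars: MutableSet<String> = mutableSetOf(),
    private val startDataBindings: MutableMap<String, String> = mutableMapOf() // id data -> avatar
) {
    fun onIdentificationData(idData: String, avatarPackage: String) {
        if (avatarPackage !in installedAvatars) {
            // Generate the avatar, hide its launch entry, and bind the identification data
            // as its start (or icon call-out) data.
            generateAvatar(avatarPackage)
            hideLaunchEntry(avatarPackage)
            startDataBindings[idData] = avatarPackage
            installedAvatars += avatarPackage
        } else {
            // Avatar already installed: start it (or call out its icon) directly.
            startAvatar(avatarPackage)
        }
    }

    private fun generateAvatar(pkg: String) = println("generate avatar for $pkg")
    private fun hideLaunchEntry(pkg: String) = println("hide icon and notifications of $pkg")
    private fun startAvatar(pkg: String) = println("start $pkg")
}
```

On a second receipt of the same identification data, the binding recorded in this sketch is what lets the terminal either start the avatar or call out its icon at the preset position.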
In the technical solution disclosed in this embodiment, identification data is received, the avatar application corresponding to the identification data is generated, and the display of the avatar application is hidden. Since the avatar application must be generated based on the identification data and its display is hidden, the data security of the avatar application is improved.
Third embodiment
Referring to fig. 7, in a third embodiment, the method for controlling a mobile terminal includes the steps of:
step S31, setting a piece of identification data as the generation and/or starting of an application program and an individuation application;
step S32, when receiving the identification data, generating and/or starting an application program and a self-body-separated application, and optionally, when the self-body-separated application is generated and/or started based on the identification data, hiding and displaying an application icon of the self-body-separated application; and when the application program is generated and/or started based on the identification data, displaying the icon of the application program and the operation page of the application program.
In this embodiment, a user may set an identification data for generating and/or launching the application and the avatar application. Such that upon receipt of the identification data, the application program and the avatar application are generated and/or launched. Optionally, when the avatar application is generated and/or started based on the identification data, an application icon of the avatar application is hidden and displayed; when the application program is generated and/or started based on the identification data, displaying the icon of the application program and the operation page of the application program
Optionally, in the mobile terminal, a starting scheme of an installed application program may be preset, or an application program and/or an avatar application associated with the identification data may be generated when preset identification data is received. Optionally, when an application is set to be started based on the identification data, if the application is a main application, an application icon of the application is displayed in the terminal. So that the user can launch the application in two ways, based on the application icon or the identification data. When the application program is the body-divided application, if the body-divided application is set to be started through the identification data, the icon corresponding to the body-divided application is not displayed.
Optionally, after the avatar application is generated based on the identification data and its application icon is hidden from display, the identification data is set as start data of the avatar application, so that the avatar application can be started when the identification data is received again; or the identification data is set as call-out data of the application icon of the avatar application, so that the application icon of the avatar application is displayed at a preset position when the identification data is received again.
It is to be understood that the preset position may be determined based on the identification data, or may be set as a fixed display position; this embodiment is not limited in this respect. When the identification data is received, the display position associated with the identification data is determined, and then the application icon of the avatar application is displayed at that display position; or the avatar application is started and its running interface is displayed at that display position.
Example 1, after the WeChat avatar is generated, its start mode may be set to start based on the identification data. The WeChat avatar is then associated with a numeric password, and when the numeric password is received, the WeChat avatar is started. Optionally, the user may further set a preset control action that, when received, displays a password input interface for acquiring the numeric password. For example, upon receiving a double-click on the screen, a double-click on the WeChat running interface, a double-click on the WeChat application icon, and/or a long press on a blank area of the desktop, the password input interface for acquiring the numeric password is displayed.
Example 2, after the WeChat avatar is set to hide its application icon from display, a preset area and the negative screen may be associated with the WeChat avatar. The identification data may then be defined as location information. The mobile terminal monitors the current location to determine the location information, and displays the application icon of the WeChat avatar on the negative screen when the current position is within the preset area.
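Example 2 is essentially a geofence check. The sketch below models it with a plain haversine distance test; the coordinates, radius, and function names are illustrative assumptions, not values from the disclosure.

```kotlin
import kotlin.math.*

// Illustrative geofence check for Example 2: reveal the avatar icon on the
// negative screen only while the terminal is inside the preset area.
data class LatLng(val lat: Double, val lng: Double)
data class PresetArea(val center: LatLng, val radiusMeters: Double)

fun distanceMeters(a: LatLng, b: LatLng): Double {
    val r = 6_371_000.0 // mean Earth radius in meters
    val dLat = Math.toRadians(b.lat - a.lat)
    val dLng = Math.toRadians(b.lng - a.lng)
    val h = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(a.lat)) * cos(Math.toRadians(b.lat)) * sin(dLng / 2).pow(2)
    return 2 * r * asin(sqrt(h))
}

fun shouldShowAvatarIconOnNegativeScreen(current: LatLng, area: PresetArea): Boolean =
    distanceMeters(current, area.center) <= area.radiusMeters

fun main() {
    val presetArea = PresetArea(LatLng(22.5431, 114.0579), radiusMeters = 200.0)
    println(shouldShowAvatarIconOnNegativeScreen(LatLng(22.5433, 114.0581), presetArea)) // true
    println(shouldShowAvatarIconOnNegativeScreen(LatLng(22.6000, 114.2000), presetArea)) // false
}
```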
Optionally, in some embodiments, the identification data may include at least first identification data and second identification data, the first identification data being used to generate and/or launch the application program and/or the avatar application, and the second identification data being used to call out the icon of the avatar application.
Illustratively, the first identification data is set as the right thumb fingerprint and the second identification data is set as the right index finger fingerprint. When the right thumb fingerprint is acquired, the application program and/or the avatar application is generated and/or started; when the right index finger fingerprint is acquired, the icon of the avatar application is called out.
Optionally, in further embodiments, the identification data includes at least first identification data and second identification data, the first identification data being used to generate and/or launch the application program and/or the avatar application and to call out the icon of the avatar application, and the second identification data being used to generate and/or launch the application program.
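The division of labor between the first and second identification data can be summarized as a role table, as in the following illustrative sketch (the fingerprint labels and role names are assumptions):

```kotlin
// Illustrative sketch: first identification data generates/starts the application
// and/or avatar application; second identification data only calls out the avatar icon.
enum class Role { GENERATE_OR_START, CALL_OUT_ICON }

val identificationRoles = mapOf(
    "right_thumb_fingerprint" to Role.GENERATE_OR_START, // first identification data
    "right_index_fingerprint" to Role.CALL_OUT_ICON      // second identification data
)

fun handle(data: String): String = when (identificationRoles[data]) {
    Role.GENERATE_OR_START -> "generate and/or start the application / avatar application"
    Role.CALL_OUT_ICON -> "call out the icon of the avatar application"
    null -> "unrecognized identification data"
}

fun main() {
    println(handle("right_thumb_fingerprint"))
    println(handle("right_index_fingerprint"))
}
```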
In the technical solution disclosed in this embodiment, identification data is set for generating and/or starting an application program and/or an avatar application, and the application program and/or the avatar application is generated and/or started when the identification data is received. Optionally, when the avatar application is generated and/or started based on the identification data, the application icon of the avatar application is hidden from display; when the application program is generated and/or started based on the identification data, the icon of the application program and the operation page of the application program are displayed.
Fourth embodiment
Referring to fig. 8, in a fourth embodiment, the method for controlling a mobile terminal includes the steps of:
step S41, receiving identification data;
step S42, starting a folder or an application program associated with the identification data; optionally, the folder is set to display application icons of avatar applications.
In this embodiment, a folder associated with identification data is set in advance. After the identification data is received, the folder associated with it is opened. Optionally, the folder is set to display application icons of avatar applications.
Optionally, the mobile terminal is loaded with an application program. When the mobile terminal receives an avatar-application creation instruction, an avatar application of the application program corresponding to the instruction can be created.
For example, when a long-press operation on the WeChat icon is received on the desktop, a secondary menu containing a function option for creating an avatar application may be displayed on the desktop. When that function option is triggered, a WeChat avatar is created. Alternatively, when a preset control operation (such as sliding two fingers downward, knocking with a finger joint, or long-pressing a volume key) is received on the WeChat running interface, the WeChat avatar application is created.
As an implementation manner, after an avatar application of an application program is created, an entry of the avatar application is placed in a user-defined entry folder. The entry folder can be set, according to the user's custom setting, to be normally displayed or hidden from display. When it is set to be hidden, the user may associate the entry folder with identification data, and the folder in the hidden state is then opened, or controlled to exit the hidden state, when the identification data is received.
Optionally, the entry folder may be controlled to enter the hidden state again after the user locks the screen or exits the folder.
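The user-defined entry folder with a hidden state can be pictured as follows; this is a minimal sketch with invented names, assuming the folder re-enters the hidden state on screen lock or exit as described above.

```kotlin
// Illustrative model of the user-defined entry folder holding avatar-application
// entries; it can be hidden, and returns to the hidden state on lock or exit.
class EntryFolder(private val unlockIdentification: String) {
    private val entries = mutableListOf<String>()
    var hidden: Boolean = true
        private set

    fun addEntry(avatarApp: String) { entries.add(avatarApp) }

    // Opening the folder while hidden requires the associated identification data.
    fun open(identification: String): List<String>? =
        if (!hidden || identification == unlockIdentification) {
            hidden = false
            entries.toList()
        } else null

    // Lock screen / exit folder: return to the hidden state.
    fun onLockOrExit() { hidden = true }
}

fun main() {
    val folder = EntryFolder(unlockIdentification = "pattern:L-shape")
    folder.addEntry("WeChat avatar")
    println(folder.open("wrong-data"))      // null: folder stays hidden
    println(folder.open("pattern:L-shape")) // entries revealed
    folder.onLockOrExit()
    println(folder.open("wrong-data"))      // null again after locking
}
```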
As another implementation, after an avatar application of an application program is created, the newly created entry of the avatar application is placed in a system folder. The system folder is a hidden folder and can only be opened by entering preset identification data.
Optionally, when the mobile terminal is a folding-screen mobile terminal, the identification data may further include the screen folding angle. When the screen folding angle meets a preset angle condition, an application program can be started, or the entry of the corresponding application program is displayed.
Alternatively, the identification data may be directly associated with the avatar application of the application program, so that no corresponding entry needs to be created after the avatar application is created; the avatar application is started directly after the identification data is received.
In the technical solution disclosed in this embodiment, a new method for starting the avatar application is provided, which simplifies the start path of the avatar application. Meanwhile, starting the avatar application through identification data improves the data security of the mobile terminal.
Alternatively, an application program associated with the identification data may be set in advance, and the application program associated with the identification data is started after the identification data is received.
In order to enable those skilled in the art to better understand the scope of the claims herein, the technical solutions recited in the claims of the present application are explained below through specific implementation examples in specific application scenarios. It should be understood that the following examples are only used to explain the present application and are not intended to limit the scope of its claims.
Example 1, a user may first enter a fingerprint, for example the right thumb fingerprint, in the mobile terminal and associate it with the WeChat avatar, then enter another fingerprint, for example the right index finger fingerprint, and associate it with the QQ avatar. In a preset scene, when the mobile terminal receives an input of the right thumb fingerprint, the WeChat avatar is started or the entry of the WeChat avatar is displayed; when an input of the right index finger fingerprint is received, the QQ avatar is started or the entry of the QQ avatar is displayed. Optionally, the preset scene may be customized by the user, for example: under the desktop interface; under the running interface of the main application corresponding to the avatar application; under the lock-screen interface; or a state in which the current position of the terminal is at a preset position, and so on, which are not enumerated here. Setting different preset scenes gives the mobile terminal different behaviors. For example, when the preset scene is the state in which the current position of the terminal is at a preset position, the corresponding application is started according to the received fingerprint only in that scene: if the preset position is set as the position of the user's home, then only when the mobile terminal is located at the preset position and the input of the right index finger fingerprint is received is the QQ avatar started or its entry displayed; otherwise, the QQ avatar cannot be started. This further improves the security of the data.
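Example 1 combines a fingerprint lookup with a preset-scene gate. The sketch below reduces the scene check to a boolean predicate over an assumed location string; all identifiers are illustrative.

```kotlin
// Illustrative sketch of Example 1: a fingerprint maps to an avatar application,
// but the launch is honoured only when the preset scene (here, "terminal is at
// the preset position") is satisfied.
data class AvatarBinding(val avatarApp: String, val sceneSatisfied: () -> Boolean)

var currentLocation = "office"
fun isAtHome(): Boolean = currentLocation == "home"

val fingerprintBindings = mapOf(
    "right_thumb" to AvatarBinding("WeChat avatar", sceneSatisfied = { true }),      // no scene restriction
    "right_index" to AvatarBinding("QQ avatar", sceneSatisfied = { isAtHome() })     // only at the preset position
)

fun onFingerprint(finger: String): String {
    val binding = fingerprintBindings[finger] ?: return "unrecognized fingerprint"
    return if (binding.sceneSatisfied())
        "start ${binding.avatarApp} or display its entry"
    else
        "preset scene not satisfied: ${binding.avatarApp} is not started"
}

fun main() {
    println(onFingerprint("right_index")) // blocked away from the preset position
    currentLocation = "home"
    println(onFingerprint("right_index")) // allowed at the preset position
    println(onFingerprint("right_thumb")) // no scene restriction
}
```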
Example 2, the mobile terminal may be provided with an identification-data acquisition trigger condition. Optionally, the trigger condition may be set to trigger an identification-data acquisition action when a preset control gesture is received, or to trigger the acquisition action in a preset mode. In this example, the user may trigger a face-data acquisition action and/or a voiceprint-data acquisition action by pressing the power key three times in succession. After the face-data acquisition action is triggered, the mobile terminal starts a camera device to collect face data through it. After the face data is collected, it is recognized, the pre-stored face data corresponding to the currently acquired face data is determined according to the recognition result, and the application program associated with that pre-stored face data is determined. Optionally, the application program may include the main program and/or the avatar application of an application program loaded on the mobile terminal. The application program associated with the pre-stored face data is then started, and/or its entry is displayed.
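The flow of Example 2 can be read as a trigger-then-match pipeline: count power-key presses, acquire and recognize face data, then look up the associated applications. The sketch below stands in for recognition with a plain map lookup and omits the time window a real trigger would use; all names are assumptions.

```kotlin
// Illustrative sketch of Example 2: three power-key presses trigger face acquisition;
// the recognized face is matched against pre-stored face data, and the associated
// application programs (main and/or avatar) are started or have their entries shown.
val preStoredFaceAssociations = mapOf(
    "face_owner"  to listOf("WeChat (main)", "WeChat avatar"),
    "face_family" to listOf("YouTube")
)

class PowerKeyTrigger(private val requiredPresses: Int = 3) {
    private var presses = 0
    // A real implementation would also require the presses to fall within a short time window.
    fun onPowerKeyPressed(): Boolean {
        presses += 1
        val triggered = presses >= requiredPresses
        if (triggered) presses = 0
        return triggered // true: start the face-data acquisition action
    }
}

fun onFaceRecognized(faceId: String): String {
    val apps = preStoredFaceAssociations[faceId] ?: return "no pre-stored face data matched"
    return "start or display the entry of: ${apps.joinToString()}"
}

fun main() {
    val trigger = PowerKeyTrigger()
    repeat(3) { if (trigger.onPowerKeyPressed()) println("face acquisition started") }
    println(onFaceRecognized("face_owner"))
}
```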
Example 3, the mobile terminal may also be configured to associate applications with a specific network, specific Bluetooth connection data, and/or near field communication data, so that the specific network, the specific Bluetooth connection data, and/or the near field communication data can be used as identification data for starting an application or for displaying the entry of an application. For example, the mobile terminal presets the WiFi network named "TECNO_office_01" to be associated with DingTalk; when the mobile terminal detects that it is connected to the WiFi network named "TECNO_office_01", DingTalk is started, or the entry of DingTalk is displayed. Alternatively, the mobile terminal sets the Bluetooth device named "hipeds H2" to be associated with QQ Music; when the mobile terminal detects that the Bluetooth device named "hipeds H2" is connected, QQ Music is started, or the entry of QQ Music is displayed.
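Example 3 treats connectivity state as identification data. A minimal sketch, reusing the network and device names from the example but with invented handler names:

```kotlin
// Illustrative sketch of Example 3: a connected Wi-Fi network or Bluetooth device
// acts as identification data that starts an application or shows its entry.
val wifiAssociations = mapOf("TECNO_office_01" to "DingTalk")
val bluetoothAssociations = mapOf("hipeds H2" to "QQ Music")

fun onWifiConnected(ssid: String): String =
    wifiAssociations[ssid]?.let { "start $it or display its entry" }
        ?: "no application associated with network \"$ssid\""

fun onBluetoothConnected(deviceName: String): String =
    bluetoothAssociations[deviceName]?.let { "start $it or display its entry" }
        ?: "no application associated with device \"$deviceName\""

fun main() {
    println(onWifiConnected("TECNO_office_01"))
    println(onBluetoothConnected("hipeds H2"))
}
```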
Example 4, when the mobile terminal receives a specific touch trajectory, an application associated with that touch trajectory may be launched, or the entry of the application may be displayed. Referring to fig. 4, upon detecting a control operation of drawing a "c" on the screen, the Facebook avatar may be started, or the entry of the Facebook avatar may be displayed. Referring to fig. 5, a nine-square-grid pattern drawing interface may be displayed when a preset condition is satisfied; when a user-drawn pattern as shown in fig. 5 is received, the application associated with the pattern is launched, or the entry of that application is displayed.
Example 5, the user may further set an association between an area and an application program. The mobile terminal may acquire its positioning data in real time or periodically, determine the user's position from the positioning data, and, when the position falls within a preset area, start the application program associated with that preset area or display the entry of that application program.
Example 6, an interface may be associated with one piece of identification data through a user-defined setting or based on a system definition. When the mobile terminal receives the identification data, the interface associated with it is displayed. Optionally, the interface may be provided as an entry display interface used to display entries of application programs (generally configured as application icons, though other forms are possible; this embodiment is not limited in this respect). When an application entry in the entry display interface is triggered, the application program corresponding to that entry is started. In this example, different identification data may be associated with one or more applications, and only the entries of the applications associated with the currently received identification data are displayed in the entry display interface. It can also be understood that different identification data are associated with different interfaces in which different application entries are provided, so that different interfaces are displayed when different identification data are received.
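Example 6's entry display interface can be modeled as a filtered view of application entries keyed by the received identification data. The structure below is an illustrative assumption:

```kotlin
// Illustrative sketch of Example 6: each piece of identification data is associated
// with one or more application entries, and only the entries associated with the
// currently received identification data appear in the entry display interface.
data class AppEntry(val label: String)

val portalAssociations = mapOf(
    "pattern:Z" to listOf(AppEntry("WeChat avatar")),
    "pin:2468"  to listOf(AppEntry("QQ avatar"), AppEntry("Facebook avatar"))
)

// Build the entry display interface for the identification data just received.
fun entriesFor(identification: String): List<AppEntry> =
    portalAssociations[identification].orEmpty()

// Triggering an entry starts the corresponding application program.
fun onEntryTriggered(entry: AppEntry) = println("starting ${entry.label}")

fun main() {
    val entries = entriesFor("pin:2468")
    entries.forEach { println("show entry: ${it.label}") }
    onEntryTriggered(entries.first())
}
```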
In the technical solution disclosed in this embodiment, since the application program can be started directly according to the identification data, the start path of the application program is simplified. Meanwhile, starting the application through identification data improves the data security of the mobile terminal.
Fifth embodiment
Referring to fig. 9, in a fifth embodiment, the method for controlling a mobile terminal includes:
step S51, receiving identification data;
step S52, determining a target operation mode corresponding to the identification data;
and step S53, entering the target operation mode.
In this embodiment, the mobile terminal may receive the identification data, determine a target operation mode corresponding to the identification data after receiving the identification data, and then enter the target operation mode.
In one implementation scheme, a user can set a plurality of different unlocking modes on the lock-screen interface of the mobile terminal, and the unlocking data is then used as the identification data.
Example 1, the mobile terminal is provided with a plurality of unlocking schemes such as fingerprint unlocking, face unlocking, pattern unlocking and/or password unlocking. Optionally, fingerprint unlocking is associated with operation mode 1, face unlocking with operation mode 2, pattern unlocking with operation mode 3, and/or password unlocking with operation mode 4. After being unlocked through different unlocking modes, the mobile terminal enters different operation modes.
Example 2, the mobile terminal is provided with a plurality of unlocking schemes such as fingerprint unlocking, face unlocking, pattern unlocking and/or password unlocking. For fingerprint unlocking, a user may pre-register a number of fingerprints that can unlock the mobile terminal; for example, the mobile terminal may be unlocked by the right index finger fingerprint or the right thumb fingerprint, and different fingerprints are associated with different operation modes. After being unlocked through the right index finger fingerprint, the mobile terminal enters the first operation mode; after being unlocked through the right thumb fingerprint, the mobile terminal enters the second operation mode.
It will be appreciated that the user may also register a plurality of face data, a plurality of unlock patterns, and/or a plurality of unlock passwords available for unlocking. After unlocking through different face data, pattern data, or unlock passwords, different operation modes can be entered.
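Taken together, these unlock examples reduce to a lookup from a specific piece of unlock data to a target operation mode, as in this illustrative sketch (mode names and keys are assumptions):

```kotlin
// Illustrative sketch: each piece of unlock data (unlock method, or a specific
// registered fingerprint/face/pattern/password) is the identification data that
// selects the operation mode entered after unlocking.
enum class OperationMode { MODE_1, MODE_2, MODE_3, MODE_4 }

val unlockModeTable = mapOf(
    "fingerprint:right_index" to OperationMode.MODE_1,
    "fingerprint:right_thumb" to OperationMode.MODE_2,
    "face:owner"              to OperationMode.MODE_1,
    "pattern:Z"               to OperationMode.MODE_3,
    "password:135790"         to OperationMode.MODE_4
)

fun modeAfterUnlock(unlockData: String): OperationMode? = unlockModeTable[unlockData]

fun main() {
    println(modeAfterUnlock("fingerprint:right_thumb")) // MODE_2
    println(modeAfterUnlock("fingerprint:right_index")) // MODE_1
}
```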
In another implementation scheme, identification data can be collected during use of the mobile terminal, and after the identification data is received, the mobile terminal is controlled to switch from the current mode to the operation mode corresponding to the identification data.
Example 1, when the mobile terminal detects that its current positioning data falls within a preset area, the target operation mode corresponding to the preset area is determined. If the current operation mode is already the target operation mode, no response is made; otherwise, the terminal switches from the current mode to the target operation mode.
Example 2, when the networking data, Bluetooth connection data, and/or near field communication data of the mobile terminal change, the target operation mode associated with the current networking data, Bluetooth connection data, and/or near field communication data is obtained. If the current operation mode is already the target operation mode, no response is made; otherwise, the terminal switches from the current mode to the target operation mode.
Example 3, the mobile terminal may obtain the touch trajectory of a touch operation of the user, and when a preset touch trajectory is received, the target operation mode corresponding to that touch trajectory is obtained. If the current operation mode is already the target operation mode, no response is made; otherwise, the terminal switches from the current mode to the target operation mode.
Example 4, the user may customize a control action that triggers the display of the nine-square-grid pattern drawing page shown in fig. 5, for example a long press of a volume key or a double-click on the screen. When the control action set by the user is received, the nine-square-grid pattern drawing page is displayed and the pattern drawn by the user is received; the target operation mode associated with that pattern is then obtained. If the current operation mode is already the target operation mode, no response is made; otherwise, the terminal switches from the current mode to the target operation mode.
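The switching rule shared by Examples 1 to 4 above, including the "no response when already in the target mode" case, can be sketched as follows (names are illustrative):

```kotlin
// Illustrative sketch of the mode-switching rule: on receiving identification data,
// resolve the target operation mode; if the terminal already runs in that mode,
// make no response, otherwise switch to it.
class ModeController(private var currentMode: String) {
    fun onIdentificationData(targetMode: String?) {
        if (targetMode == null) return        // no operation mode associated with this data
        if (targetMode == currentMode) return // already in the target mode: no response
        println("switching from $currentMode to $targetMode")
        currentMode = targetMode
    }
}

fun main() {
    val controller = ModeController(currentMode = "main application mode")
    controller.onIdentificationData("avatar application mode") // switch
    controller.onIdentificationData("avatar application mode") // no response
}
```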
Optionally, in this embodiment, the application program corresponding to each of the operation modes is different.
Illustratively, the mobile terminal is provided with a main application mode and an avatar application mode. When the mobile terminal runs in the main application mode, the desktop displays only the entries of main applications and does not display the entries of avatar applications, and/or the main application mode may also restrict launching of avatar applications. When the mobile terminal runs in the avatar application mode, the desktop displays only the entries of avatar applications and does not display the entries of main applications, and/or launching of main applications may be restricted.
It should be noted that, when the mobile terminal operates in the first operation mode, the application corresponding to the first operation mode may receive and display new messages normally, while the application corresponding to the second operation mode may be configured to receive new messages but not display the message alert, to receive new messages and display only the message alert without allowing the new message to be viewed, or not to receive new messages at all. Optionally, the mobile terminal may further display a setting page so that the user can set the application programs corresponding to each operation mode through that page.
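The per-mode message handling described above could be captured as a policy attached to each operation mode; the policy values below are illustrative assumptions, not terms from the disclosure.

```kotlin
// Illustrative sketch: the applications bound to each operation mode carry a policy
// for handling incoming messages while the terminal runs in the first operation mode.
enum class MessagePolicy {
    RECEIVE_AND_DISPLAY,   // receive new messages and show them normally
    RECEIVE_WITHOUT_ALERT, // receive new messages, but suppress the message alert
    ALERT_ONLY_NO_CONTENT, // show that a message arrived, but not its content
    DO_NOT_RECEIVE         // do not receive new messages at all
}

val modePolicies = mapOf(
    "first operation mode"  to MessagePolicy.RECEIVE_AND_DISPLAY,
    "second operation mode" to MessagePolicy.RECEIVE_WITHOUT_ALERT
)

fun onNewMessage(appMode: String, app: String) {
    when (modePolicies[appMode] ?: MessagePolicy.DO_NOT_RECEIVE) {
        MessagePolicy.RECEIVE_AND_DISPLAY   -> println("$app: message received and displayed")
        MessagePolicy.RECEIVE_WITHOUT_ALERT -> println("$app: message stored, alert suppressed")
        MessagePolicy.ALERT_ONLY_NO_CONTENT -> println("$app: alert shown, content hidden")
        MessagePolicy.DO_NOT_RECEIVE        -> println("$app: message rejected")
    }
}

fun main() {
    onNewMessage("first operation mode", "WeChat (main)")
    onNewMessage("second operation mode", "WeChat avatar")
}
```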
In the technical solution disclosed in this embodiment, the mobile terminal can enter different operation modes through different identification data, which expands the scenarios in which the mobile terminal can be used. For example, in some African households, parents and children need to share one mobile phone. With a traditional mobile phone control scheme, the applications used by parents and children are shared, so privacy protection is poor. The scheme provided by this embodiment allows children and parents to use the mobile terminal completely independently, without interfering with each other.
The application also provides a mobile terminal device, the terminal device includes a memory and a processor, the memory stores a control program of the mobile terminal, and the control program of the mobile terminal implements the steps of the control method of the mobile terminal in any of the above embodiments when executed by the processor.
The present application further provides a computer storage medium, in which a control program of the mobile terminal is stored, and the control program of the mobile terminal implements the steps of the control method of the mobile terminal in any of the above embodiments when executed by the processor.
In the embodiments of the mobile terminal and the computer storage medium provided in the present application, all technical features of the embodiments of the control method of the mobile terminal are included, and the expanding and explaining contents of the specification are basically the same as those of the embodiments of the method, and are not described herein again.
Embodiments of the present application also provide a computer program product, which includes computer program code, when the computer program code runs on a computer, the computer is caused to execute the method in the above various possible embodiments.
Embodiments of the present application further provide a chip, which includes a memory and a processor, where the memory is used to store a computer program, and the processor is used to call and run the computer program from the memory, so that a device in which the chip is installed executes the method in the above various possible embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as within the scope of the present disclosure as long as there is no contradiction between the combinations of the technical features.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, a controlled terminal, or a network device) to execute the method of each embodiment of the present application.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are included in the scope of the present application.

Claims (18)

1. A control method of a mobile terminal is characterized in that the control method of the mobile terminal comprises the following steps:
receiving identification data;
determining an avatar application and/or a display position associated with the identification data;
and starting the avatar application and/or displaying the avatar application at the display position.
2. The method of claim 1, comprising at least one of:
the display position comprises at least one of the following interfaces of the mobile terminal: a main interface, a negative screen, a screen locking interface and a right screen;
under a screen locking interface, when a preset rule is met, displaying an application icon of the avatar application on the screen locking interface so as to trigger the avatar application through the application icon;
displaying an application icon of the avatar application in an operation interface of the mobile terminal so as to trigger the avatar application based on the application icon.
3. The method according to claim 1 or 2, wherein the step of starting the avatar application and/or displaying the avatar application at the display position comprises:
directly starting the avatar application, and displaying the running interface of the avatar application at the display position; or displaying the application icon of the avatar application at the display position, and, when the application icon is selected, starting the avatar application corresponding to the selected application icon.
4. A control method of a mobile terminal, characterized in that the control method of the mobile terminal comprises the following steps:
receiving identification data;
and generating and/or starting the avatar application corresponding to the identification data, and/or hiding the avatar application from display.
5. The method of claim 4, comprising at least one of:
the avatar application is set to be triggered through an application icon of a corresponding main application, or through a function button of the main application;
the step of generating and/or starting the avatar application corresponding to the identification data and/or hiding the avatar application from display comprises: determining the avatar application corresponding to the identification data, and generating the avatar application corresponding to the identification data and/or hiding the avatar application from display when the avatar application corresponding to the identification data is not installed in the mobile terminal.
6. The method of claim 5, further comprising at least one of:
after the step of generating the avatar application corresponding to the identification data and/or hiding the avatar application from display, the method further comprises: setting the identification data as start data of the avatar application, or setting the identification data as call-out data of an application icon of the avatar application;
after the step of determining the avatar application corresponding to the identification data, the method further comprises: starting the avatar application corresponding to the identification data when the avatar application corresponding to the identification data is installed in the mobile terminal.
7. A control method of a mobile terminal, characterized in that the control method of the mobile terminal comprises the following steps:
setting identification data for generating and/or starting an application program and/or an avatar application;
and generating and/or starting the application program and/or the avatar application when the identification data is received, wherein an application icon of the avatar application is hidden from display when the avatar application is generated and/or started based on the identification data, and/or the icon of the application program and/or an operation page of the application program is displayed when the application program is generated and/or started based on the identification data.
8. The method according to claim 7, wherein, after the avatar application is generated based on the identification data and/or the application icon of the avatar application is hidden from display, the identification data is set as start data of the avatar application, or the identification data is set as call-out data of the application icon of the avatar application, so that the application icon of the avatar application is displayed at a preset position when the identification data is received again.
9. The method of claim 7 or 8, comprising at least one of:
after hiding the application icon of the avatar application from display, the method further comprises: when the identification data is received, determining a display position associated with the identification data, and displaying the application icon of the avatar application at the display position, or starting the avatar application and displaying the operation interface of the avatar application at the display position;
the identification data comprises at least first identification data and/or second identification data, the first identification data being used to generate and/or launch the application program and/or the avatar application, and/or the second identification data being used to call out an icon of the avatar application;
the identification data comprises at least first identification data and/or second identification data, the first identification data being used to generate and/or launch the application program and/or the avatar application, and/or to call out an icon of the avatar application, and/or the second identification data being used to generate and/or launch the application program.
10. A control method of a mobile terminal, characterized in that the control method of the mobile terminal comprises the following steps:
receiving identification data;
and starting a folder or an application program associated with the identification data, wherein the folder is set to display an application icon of the avatar application.
11. The method of claim 10, comprising at least one of:
the identification data includes at least one of: fingerprint data, face data, positioning data, networking data, Bluetooth connection data, near field communication data and touch tracks;
executing the step of receiving identification data when a preset control gesture is received;
when the application icon is triggered, starting the avatar application corresponding to the triggered application icon;
the mobile terminal is a folding screen mobile terminal, and/or the identification data comprises a screen folding angle of the folding screen mobile terminal;
the step of starting the folder or the application program associated with the identification data comprises the following steps: determining a target application program associated with the identification data, and starting the target application program associated with the identification data.
12. The method of claim 11, wherein the step of determining the target application to which the identification data is associated comprises at least one of:
acquiring an application program associated with the current position as the target application program;
acquiring an application program associated with a currently connected network as the target application program;
acquiring an application program associated with the currently connected Bluetooth device as the target application program;
and acquiring the application program associated with the currently received fingerprint as the target application program.
13. A control method of a mobile terminal, characterized in that the control method of the mobile terminal comprises the following steps:
receiving identification data;
determining a target operation mode corresponding to the identification data;
and entering the target operation mode.
14. The method of claim 13, comprising at least one of:
the identification data comprises at least one of user data and terminal data;
the user data comprises fingerprint data and/or face data;
the terminal data comprises at least one of positioning data, networking data, Bluetooth connection data, near field communication data and touch tracks;
the mobile terminal is provided with at least one operation mode, and the application program corresponding to each operation mode is different;
after the step of entering the target operation mode, the method further comprises: and determining the application program corresponding to the target operation mode, and displaying the program entry of the application program corresponding to the target operation mode.
15. The method according to claim 13 or 14, wherein the control method of the mobile terminal further comprises:
displaying a setting page for a user to set the application program corresponding to each running mode through the setting page;
an application entry corresponding to the main application and an application entry of the avatar application of the main application are respectively associated with different operation modes;
the step of receiving an identification data comprises: and when receiving unlocking data, taking the unlocking data as the identification data.
16. The method according to claim 15, characterized in that the mobile terminal is provided with at least two unlocking data; or at least two different data of the same kind are set in the mobile terminal as unlocking data.
17. A mobile terminal, characterized in that the mobile terminal comprises a memory, a processor and a control program of the mobile terminal stored on the memory and operable on the processor, the control program of the mobile terminal, when executed by the processor, implementing the steps of the control method of the mobile terminal according to any one of claims 1 to 16.
18. A computer storage medium, characterized in that the computer storage medium has stored thereon a control program of a mobile terminal, which when executed by a processor implements the steps of the control method of the mobile terminal according to any one of claims 1 to 16.
CN202011326402.2A 2020-11-23 2020-11-23 Control method of mobile terminal, mobile terminal and storage medium Pending CN112468643A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011326402.2A CN112468643A (en) 2020-11-23 2020-11-23 Control method of mobile terminal, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011326402.2A CN112468643A (en) 2020-11-23 2020-11-23 Control method of mobile terminal, mobile terminal and storage medium

Publications (1)

Publication Number Publication Date
CN112468643A true CN112468643A (en) 2021-03-09

Family

ID=74799440

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011326402.2A Pending CN112468643A (en) 2020-11-23 2020-11-23 Control method of mobile terminal, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN112468643A (en)

Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1455333A (en) * 2002-04-30 2003-11-12 富士通株式会社 Environment setting device, environment setting programme storage medium, information processing device and environment setting method
US7702890B2 (en) * 2003-06-05 2010-04-20 Sony Corporation Information processing apparatus and program
CN103024152A (en) * 2012-11-28 2013-04-03 深圳市中兴移动通信有限公司 Method, device and intelligent terminal for switching between multipurpose modes
CN104717344A (en) * 2013-12-12 2015-06-17 联想(北京)有限公司 Application mode switching method and device for mobile terminal and mobile terminal
CN105120111A (en) * 2015-09-24 2015-12-02 联想(北京)有限公司 Application mode switching method and apparatus
CN105426728A (en) * 2015-12-21 2016-03-23 深圳市金立通信设备有限公司 Application starting method and terminal
CN105678141A (en) * 2015-12-30 2016-06-15 魅族科技(中国)有限公司 Information exhibiting method and device and terminal
CN106484468A (en) * 2016-09-23 2017-03-08 宇龙计算机通信科技(深圳)有限公司 The management method, managing device of application program and mobile terminal
CN106484482A (en) * 2016-10-10 2017-03-08 广东欧珀移动通信有限公司 A kind of application management method of attending to anything else opened in application, device and intelligent terminal more
CN106535314A (en) * 2017-01-13 2017-03-22 滁州昭阳电信通讯设备科技有限公司 Intelligent power-saving control method and mobile terminal
CN106951757A (en) * 2017-02-28 2017-07-14 宇龙计算机通信科技(深圳)有限公司 A kind of method and apparatus for operating application program
CN107566638A (en) * 2017-08-31 2018-01-09 维沃移动通信有限公司 The display control method and mobile terminal of a kind of application program
CN107944277A (en) * 2017-11-21 2018-04-20 广东欧珀移动通信有限公司 Using the control method of startup, device, storage medium and intelligent terminal
CN108038360A (en) * 2017-11-15 2018-05-15 维沃移动通信有限公司 The switching method and mobile terminal of a kind of operational mode
CN108845730A (en) * 2018-05-29 2018-11-20 维沃移动通信有限公司 A kind of object control method and terminal
CN108958623A (en) * 2018-06-22 2018-12-07 维沃移动通信有限公司 A kind of application program launching method and terminal device
CN109151153A (en) * 2017-06-19 2019-01-04 中兴通讯股份有限公司 A kind of application management method and device and terminal
CN109274815A (en) * 2018-08-22 2019-01-25 奇酷互联网络科技(深圳)有限公司 Program operation control method, device, readable storage medium storing program for executing and mobile terminal
CN110147186A (en) * 2019-04-12 2019-08-20 维沃移动通信有限公司 A kind of control method and terminal device of application
CN110210195A (en) * 2019-05-07 2019-09-06 珠海格力电器股份有限公司 Fingerprint operation control method and device, storage medium and mobile terminal
CN110826029A (en) * 2019-11-04 2020-02-21 深圳传音控股股份有限公司 Application protection method, control device and readable storage medium
CN110888724A (en) * 2019-11-20 2020-03-17 北京明略软件***有限公司 Method and device for setting personal differentiation, electronic equipment and readable storage medium
CN110941469A (en) * 2019-11-14 2020-03-31 维沃移动通信有限公司 Application body-splitting creating method and terminal equipment thereof

Similar Documents

Publication Publication Date Title
CN108287611B (en) Screen touch response method, terminal and computer storage medium
CN109800602B (en) Privacy protection method, mobile terminal and computer readable storage medium
CN107885448B (en) Control method for application touch operation, mobile terminal and readable storage medium
CN109146463B (en) Mobile payment method, mobile terminal and computer readable storage medium
CN108376239B (en) Face recognition method, mobile terminal and storage medium
CN112162870A (en) File processing operation method, mobile terminal and storage medium
CN112181233B (en) Message processing method, intelligent terminal and computer readable storage medium
CN107422956B (en) Mobile terminal operation response method, mobile terminal and readable storage medium
CN108108082B (en) Information processing method, terminal and computer storage medium
CN108012270B (en) Information processing method, equipment and computer readable storage medium
CN112230826B (en) Control method of mobile terminal, mobile terminal and storage medium
CN111427709A (en) Application program body-separating control method and device and computer readable storage medium
CN108810262B (en) Application configuration method, terminal and computer readable storage medium
CN109063444B (en) Mobile terminal screen unlocking method, mobile terminal and computer readable storage medium
CN108282608B (en) Multi-region focusing method, mobile terminal and computer readable storage medium
CN113515254A (en) Interface display method, terminal and storage medium
CN112468647A (en) Control method of mobile terminal, mobile terminal and storage medium
CN109711198B (en) Application management method, mobile terminal and storage medium
CN111949957A (en) Privacy protection method, device and storage medium
CN112306328B (en) Control method of mobile terminal, mobile terminal and storage medium
CN109151175B (en) Mobile terminal safety control method, mobile terminal and computer storage medium
CN112434283A (en) Control method of mobile terminal, mobile terminal and storage medium
CN109104208B (en) Card slot control method, terminal and computer readable storage medium
CN110399083B (en) Game space starting method, terminal and computer readable storage medium
CN113867586A (en) Icon display method, intelligent terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination