CN110691165A - Navigation operation method and electronic equipment - Google Patents

Navigation operation method and electronic equipment

Info

Publication number
CN110691165A
Authority
CN
China
Prior art keywords
user
screen
curved
interface
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910765440.9A
Other languages
Chinese (zh)
Inventor
彭玉卓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN110691165A
Priority to PCT/CN2020/085868 (published as WO2020221062A1)
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a navigation operation method and an electronic device, and relates to the field of electronic devices. It addresses the problems of inefficient navigation and poor human-computer interaction performance. After detecting an operation input by the user at a specific position of the touch screen, such as the arc area on the side of a curved screen, the electronic device displays a corresponding interface according to the operation. Different operations, different operation positions, or different degrees of the same operation cause different interfaces to be displayed.

Description

Navigation operation method and electronic equipment
The present application claims priority to the Chinese patent application No. 201910358348.0, filed with the national intellectual property office on April 30, 2019 and entitled "A Method of Operating with a Curved Screen", which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to the field of electronic devices, and in particular, to a navigation operation method and an electronic device.
Background
A mobile phone generally provides a navigation function that allows the user to quickly open a specific interface, such as the desktop (also called the home screen), the previous interface of the current interface, or a multitasking interface. Currently, the navigation function can be implemented in the following three ways.
Mode 1: as shown in fig. 1 (a), three physical keys are arranged on the mobile phone: a back key 101, a home key 102, and a recents key 103.
Mode 2: as shown in fig. 1 (b), a navigation bar 100 is displayed on the touch screen of the mobile phone. The navigation bar 100 includes three virtual keys: a back key 104, a home key 105, and a recents key 106. In mode 1 and mode 2, the user triggers the mobile phone to open the corresponding interface by tapping the corresponding key. Taking fig. 1 (b) as an example, when the user taps the back key 104, the mobile phone displays the previous interface of the current interface; when the user taps the home key 105, the mobile phone displays the desktop; and when the user taps the recents key 106, the mobile phone displays a multitasking interface.
Mode 3: gesture navigation. The user triggers the mobile phone to open the corresponding interface by inputting the corresponding gesture. For example, as shown in fig. 2, when the user's finger slides inward from the left (or right) edge of the touch screen (operation trace 201 in fig. 2), the mobile phone displays the previous interface of the current interface. When the user's finger slides up from the lower edge of the touch screen (operation trace 202 in fig. 2), the mobile phone displays the desktop. When the user's finger slides up from the lower edge of the touch screen and then pauses for a period of time (operation trace 203 in fig. 2), the mobile phone displays a multitasking interface.
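For readers less familiar with this kind of gesture recognition, the following Kotlin sketch shows one way the three prior-art gestures could be distinguished from a recorded swipe. The Swipe model, the edge width, and the hold threshold are illustrative assumptions, not values taken from the patent or from any particular system:

```kotlin
// Hypothetical model of a completed swipe, recorded by the touch framework.
data class Swipe(
    val startX: Float, val startY: Float,
    val endX: Float, val endY: Float,
    val holdMillisAtEnd: Long            // how long the finger paused before lifting
)

enum class NavAction { BACK, HOME, RECENTS, NONE }

fun recognize(s: Swipe, screenW: Float, screenH: Float,
              edgePx: Float = 30f, holdMs: Long = 300): NavAction = when {
    // Trace 201: swipe inward from the left or right edge -> previous interface
    (s.startX <= edgePx || s.startX >= screenW - edgePx) &&
        kotlin.math.abs(s.endX - s.startX) > 3 * edgePx -> NavAction.BACK
    // Traces 202/203: swipe up from the lower edge -> desktop,
    // or pause at the end of the swipe -> multitasking interface
    s.startY >= screenH - edgePx && s.startY - s.endY > 3 * edgePx ->
        if (s.holdMillisAtEnd >= holdMs) NavAction.RECENTS else NavAction.HOME
    else -> NavAction.NONE
}
```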
However, in modes 1 and 2 the key positions are fixed, and in mode 3 the finger travel distance is large when the user inputs a gesture. Both drawbacks make the navigation function inefficient and result in poor human-computer interaction performance.
Disclosure of Invention
Embodiments of the present application provide a navigation operation method and an electronic device, which solve the problems of inefficient navigation and poor human-computer interaction performance.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, an embodiment of the present application provides a navigation operation method applied to an electronic device whose touch screen is a curved screen with curved sides. With the curved screen displaying a first interface in the portrait state, the method may include: the electronic device receives a first operation of the user in the arc area on the side of the curved screen, and in response to the first operation displays a second interface, where the second interface is a previous-level interface of the first interface; or the electronic device receives a second operation of the user in the arc area on the side of the curved screen, and in response to the second operation displays the home screen; or the electronic device receives a third operation of the user in the arc area on the side of the curved screen, and in response to the third operation displays a multitasking interface, where the multitasking interface includes identifiers of applications running in the background of the electronic device or applications recently used by the user.
The method provided by this embodiment exploits the fact that the user's fingers naturally contact the touch screen when holding the electronic device. For example, the user performs different operations in the arc area on the side of the curved-screen electronic device to trigger it to display different interfaces. The navigation function can thus be performed more quickly and conveniently, which improves human-computer interaction efficiency and the user experience.
In one possible implementation, the method may further include: the electronic device enables the navigation mode upon receiving a fourth operation of the user, detecting a preset holding gesture, or detecting that a physical or virtual button arranged on the side of the curved screen is triggered. In the navigation mode, the navigation function is implemented using the fact that the user's fingers contact the arc area on the side of the curved screen while holding the device. Gating the function behind the navigation mode effectively prevents accidental operation and improves the user experience.
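The gating described here can be modeled very simply. The sketch below is an assumption-laden illustration, not the patent's implementation: the 500 ms long-press threshold and the onTrigger API are invented for the example.

```kotlin
// Minimal sketch: edge-area gestures are only treated as navigation while a
// "navigation mode" is active, which is what prevents accidental triggering.
class EdgeNavigationMode {
    var enabled = false
        private set

    // Called when a candidate trigger is observed: a long press in the arc
    // area, a preset holding gesture, or a button arranged on the curved side.
    fun onTrigger(longPressMillis: Long = 0, sideButtonPressed: Boolean = false) {
        if (sideButtonPressed || longPressMillis >= 500) enabled = true
    }

    // Only gestures arriving while the mode is enabled are dispatched as navigation.
    fun shouldHandleEdgeGesture(): Boolean = enabled
}
```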
In another possible implementation, the first operation, the second operation, and the third operation are different from one another; each of them is any one of the following operations: a first sliding operation, a second sliding operation, a double-tap operation, a single-tap operation, a press operation, and a long-press operation. The first sliding operation starts in the arc area on the side of the curved screen and slides toward the lower edge of the curved screen; the second sliding operation starts in the arc area on a first side of the curved screen and slides toward the second side of the curved screen.
In another possible implementation, the fourth operation is different from the first, second, and third operations, and is any one of the following operations: a sliding operation, a press operation, or a long-press operation. The fourth operation and the first operation (or the second operation, or the third operation) are continuous operations performed without the user's finger leaving the curved screen.
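As a minimal illustration of the first aspect under these definitions, the Kotlin sketch below maps three of the candidate operations to the three navigation results. The particular assignment is an assumption made for the example; the text only requires that the three operations differ.

```kotlin
// The six candidate operations named in the text.
enum class EdgeOperation {
    SLIDE_TOWARD_LOWER_EDGE,   // first sliding operation
    SLIDE_TOWARD_OTHER_SIDE,   // second sliding operation
    DOUBLE_TAP, SINGLE_TAP, PRESS, LONG_PRESS
}

enum class NavTarget { PREVIOUS_INTERFACE, HOME_SCREEN, MULTITASK_INTERFACE, NONE }

// One of many possible assignments of distinct operations to interfaces.
fun dispatchEdgeOperation(op: EdgeOperation): NavTarget = when (op) {
    EdgeOperation.SLIDE_TOWARD_LOWER_EDGE -> NavTarget.PREVIOUS_INTERFACE  // first operation
    EdgeOperation.DOUBLE_TAP              -> NavTarget.HOME_SCREEN         // second operation
    EdgeOperation.LONG_PRESS              -> NavTarget.MULTITASK_INTERFACE // third operation
    else                                  -> NavTarget.NONE
}
```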
In a second aspect, an embodiment of the present application provides a navigation operation method applied to an electronic device whose touch screen is a curved screen with curved sides. With the curved screen displaying a first interface in the portrait state, the method may include: the electronic device receives a first operation of the user in the arc area on the side of the curved screen; in response to the first operation, the electronic device displays an operation menu that includes at least a first button, a second button, and a third button; the electronic device receives an operation of the user on the first button, and in response displays a second interface, where the second interface is a previous-level interface of the first interface; or the electronic device receives an operation of the user on the second button, and in response displays the home screen; or the electronic device receives an operation of the user on the third button, and in response displays a multitasking interface that includes identifiers of applications running in the background of the electronic device or applications recently used by the user.
The method provided by this embodiment likewise exploits the fact that the user's fingers contact the touch screen while holding the electronic device. For example, an operation input in the arc area on the side of the curved-screen electronic device triggers it to display a navigation menu, and operations on different buttons in the menu trigger it to display different interfaces. The navigation function can thus be performed more quickly and conveniently, which improves human-computer interaction efficiency and the user experience.
In one possible implementation, the first button obtains focus when the electronic device displays the operation menu. Receiving the operation of the user on the first button includes: the electronic device receives a second operation of the user in the arc area on the side of the curved screen, where the second operation and the first operation are continuous operations performed without the user's finger leaving the curved screen. Alternatively, before the electronic device receives the operation of the user on the second button, the method further includes: the electronic device receives a third operation of the user in the arc area on the side of the curved screen and, in response, switches the focus of the operation menu to the second button; receiving the operation of the user on the second button then includes receiving the second operation of the user in the arc area on the side of the curved screen, where the third operation is continuous with the first operation and the second operation is continuous with the third operation, all without the user's finger leaving the curved screen. Alternatively, before the electronic device receives the operation of the user on the third button, the method further includes: the electronic device receives the third operation of the user in the arc area on the side of the curved screen and, in response, switches the focus of the operation menu to the third button; receiving the operation of the user on the third button then includes receiving the second operation of the user in the arc area on the side of the curved screen, again as continuous operations without the user's finger leaving the curved screen. In this way, the user switches the focus within the operation menu by performing the corresponding operations on the side of the curved screen, triggering the curved-screen electronic device to display different interfaces, which further improves human-computer interaction efficiency and the user experience.
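The following Kotlin sketch models this focus behavior under stated assumptions: the button names and the open/moveFocus/confirm API are invented for the example, and it is not the patent's implementation.

```kotlin
// Toy model of the second-aspect operation menu: one edge operation opens the
// menu with focus on the first button, a further continuous operation moves the
// focus, and a confirming operation activates the focused button.
class EdgeOperationMenu {
    private val buttons = listOf("back", "home", "recents")
    private var focusIndex = -1              // -1 means the menu is not shown

    fun open() { focusIndex = 0 }            // first operation: show menu, focus the first button
    fun moveFocus() {                        // third operation: move focus without lifting the finger
        if (focusIndex >= 0) focusIndex = (focusIndex + 1) % buttons.size
    }
    fun confirm(): String? =                 // second operation: activate the focused button
        if (focusIndex >= 0) buttons[focusIndex].also { focusIndex = -1 } else null
}

fun main() {
    val menu = EdgeOperationMenu()
    menu.open()
    menu.moveFocus()                         // focus moves from "back" to "home"
    println(menu.confirm())                  // prints "home" -> display the home screen
}
```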
In a third aspect, an embodiment of the present application provides a navigation operation method applied to an electronic device whose touch screen is a curved screen with curved sides. With the curved screen displaying a first interface in the portrait state, the method may include: the electronic device receives a first operation of the user at a first position of the arc area on the side of the curved screen, and in response to the first operation at the first position displays a second interface, where the second interface is a previous-level interface of the first interface; or the electronic device receives a second operation of the user at a second position of the arc area, and in response to the second operation at the second position displays the home screen; or the electronic device receives a third operation of the user at a third position of the arc area, and in response to the third operation at the third position displays a multitasking interface that includes identifiers of applications running in the background of the electronic device or applications recently used by the user.
The method provided by this embodiment again exploits the fact that the user's fingers contact the touch screen while holding the electronic device. For example, the user performs operations at different positions of the arc area on the side of the curved-screen electronic device to trigger it to display different interfaces. The navigation function can thus be performed more quickly and conveniently, which improves human-computer interaction efficiency and the user experience.
In one possible implementation, the method may further include: the electronic device enables the navigation mode upon receiving a fourth operation of the user at a fourth position of the arc area on the side of the curved screen, detecting a preset holding gesture, or detecting that a physical or virtual button arranged on the side of the curved screen is triggered. In the navigation mode, the navigation function is implemented using the fact that the user's fingers contact the arc area on the side of the curved screen while holding the device, which effectively prevents accidental operation and improves the user experience.
In another possible implementation, the fourth position is the same as the first position, the second position, or the third position, and the fourth operation is different from the first operation, the second operation, and the third operation.
In another possible implementation, the first operation, the second operation, and the third operation are all different; or they are all the same; or they are not all the same.
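A minimal sketch of this position-based dispatch follows, assuming the arc area is simply divided into three vertical bands; the bands and their ordering are assumptions for illustration, since the text does not fix the positions.

```kotlin
enum class EdgeTarget { PREVIOUS_INTERFACE, HOME_SCREEN, MULTITASK_INTERFACE }

// touchY is measured from the top of the arc area; thirds are an assumed layout.
fun dispatchByPosition(touchY: Float, edgeHeightPx: Float): EdgeTarget = when {
    touchY < edgeHeightPx / 3     -> EdgeTarget.PREVIOUS_INTERFACE   // first position (upper band)
    touchY < 2 * edgeHeightPx / 3 -> EdgeTarget.HOME_SCREEN          // second position (middle band)
    else                          -> EdgeTarget.MULTITASK_INTERFACE  // third position (lower band)
}
```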
In a fourth aspect, an embodiment of the present application provides a navigation operation method applied to an electronic device whose touch screen is a curved screen with curved sides. With the curved screen displaying a first interface in the portrait state, the method may include: the electronic device receives operations of different degrees from the user in the arc area on the side of the curved screen, and in response to the operations of different degrees displays different interfaces, where the different interfaces include a second interface, the home screen, and a multitasking interface, and the second interface is a previous-level interface of the first interface.
The method provided by this embodiment again exploits the fact that the user's fingers contact the touch screen while holding the electronic device. For example, the user performs operations of different degrees in the arc area on the side of the curved-screen electronic device to trigger it to display different interfaces. The navigation function can thus be performed more quickly and conveniently, which improves human-computer interaction efficiency and the user experience.
In one possible implementation, the operations of different degrees are: press operations with different pressure magnitudes, sliding operations with different sliding distances, or press operations with different press durations.
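A minimal sketch of degree-based dispatch is shown below, using press duration as the distinguishing degree; the thresholds are assumptions for illustration only, and pressure magnitude or sliding distance could be substituted with the same structure.

```kotlin
// One operation at the curved edge, distinguished only by how long it is held.
fun dispatchByDegree(pressDurationMs: Long): String = when {
    pressDurationMs < 300  -> "previous-level interface"    // short press
    pressDurationMs < 1000 -> "home screen"                  // medium press
    else                   -> "multitasking interface"       // long press
}
```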
Of course, in the landscape state, the electronic device may also implement the navigation function according to the method provided in the first, second, third, or fourth aspect. The specific implementation is the same as in the above aspects and is not repeated here.
In a fifth aspect, an embodiment of the present application provides an electronic device that may include a processor, a memory, and a touch screen. The touch screen is a curved screen with curved sides, and in the portrait state the curved screen displays a first interface. The touch screen and the memory are coupled to the processor, and the memory is configured to store computer program code including computer instructions that, when executed by the electronic device, cause the electronic device to perform the following operations: receiving a first operation of the user in the arc area on the side of the curved screen and, in response to the first operation, displaying a second interface, where the second interface is a previous-level interface of the first interface; or receiving a second operation of the user in the arc area on the side of the curved screen and, in response to the second operation, displaying the home screen; or receiving a third operation of the user in the arc area on the side of the curved screen and, in response to the third operation, displaying a multitasking interface that includes identifiers of applications running in the background of the electronic device or applications recently used by the user.
In one possible implementation, the computer instructions, when executed by the electronic device, further cause the electronic device to enable the navigation mode upon receiving a fourth operation of the user, detecting a preset holding gesture, or detecting that a physical or virtual button arranged on the side of the curved screen is triggered.
In another possible implementation, the first operation, the second operation, and the third operation are different from one another; each of them is any one of the following operations: a first sliding operation, a second sliding operation, a double-tap operation, a single-tap operation, a press operation, and a long-press operation. The first sliding operation starts in the arc area on the side of the curved screen and slides toward the lower edge of the curved screen; the second sliding operation starts in the arc area on a first side of the curved screen and slides toward the second side of the curved screen.
In another possible implementation, the fourth operation is different from the first, second, and third operations, and is any one of the following operations: a sliding operation, a press operation, or a long-press operation. The fourth operation and the first operation (or the second operation, or the third operation) are continuous operations performed without the user's finger leaving the curved screen.
In a sixth aspect, an embodiment of the present application provides an electronic device that may include a processor, a memory, and a touch screen. The touch screen is a curved screen with curved sides, and in the portrait state the curved screen displays a first interface. The touch screen and the memory are coupled to the processor, and the memory is configured to store computer program code including computer instructions that, when executed by the electronic device, cause the electronic device to perform the following operations: receiving a first operation of the user in the arc area on the side of the curved screen; in response to the first operation, displaying an operation menu that includes at least a first button, a second button, and a third button; receiving an operation of the user on the first button and, in response, displaying a second interface, where the second interface is a previous-level interface of the first interface; or receiving an operation of the user on the second button and, in response, displaying the home screen; or receiving an operation of the user on the third button and, in response, displaying a multitasking interface that includes identifiers of applications running in the background of the electronic device or applications recently used by the user.
In one possible implementation, the first button obtains focus while the operation menu is displayed. Receiving the operation of the user on the first button includes: receiving a second operation of the user in the arc area on the side of the curved screen, where the second operation and the first operation are continuous operations performed without the user's finger leaving the curved screen. Alternatively, the computer instructions, when executed by the electronic device, further cause the electronic device to: receive a third operation of the user in the arc area on the side of the curved screen and, in response to the third operation, switch the focus of the operation menu to the second button; receiving the operation of the user on the second button then includes receiving the second operation of the user in the arc area on the side of the curved screen, where the third operation is continuous with the first operation and the second operation is continuous with the third operation, without the user's finger leaving the curved screen. Alternatively, the computer instructions, when executed by the electronic device, further cause the electronic device to: receive the third operation of the user in the arc area on the side of the curved screen and, in response to the third operation, switch the focus of the operation menu to the third button; receiving the operation of the user on the third button then includes receiving the second operation of the user in the arc area on the side of the curved screen, again as continuous operations without the user's finger leaving the curved screen.
In a seventh aspect, an embodiment of the present application provides an electronic device that may include a processor, a memory, and a touch screen. The touch screen is a curved screen with curved sides, and in the portrait state the curved screen displays a first interface. The touch screen and the memory are coupled to the processor, and the memory is configured to store computer program code including computer instructions that, when executed by the electronic device, cause the electronic device to perform the following operations: receiving a first operation of the user at a first position of the arc area on the side of the curved screen and, in response to the first operation at the first position, displaying a second interface, where the second interface is a previous-level interface of the first interface; or receiving a second operation of the user at a second position of the arc area and, in response to the second operation at the second position, displaying the home screen; or receiving a third operation of the user at a third position of the arc area and, in response to the third operation at the third position, displaying a multitasking interface that includes identifiers of applications running in the background of the electronic device or applications recently used by the user.
In one possible implementation, the computer instructions, when executed by the electronic device, further cause the electronic device to enable the navigation mode upon receiving a fourth operation of the user at a fourth position of the arc area on the side of the curved screen, detecting a preset holding gesture, or detecting that a physical or virtual button arranged on the side of the curved screen is triggered.
In another possible implementation, the fourth position is the same as the first position, the second position, or the third position, and the fourth operation is different from the first operation, the second operation, and the third operation.
In another possible implementation, the first operation, the second operation, and the third operation are all different; or they are all the same; or they are not all the same.
In an eighth aspect, an embodiment of the present application provides an electronic device that may include a processor, a memory, and a touch screen. The touch screen is a curved screen with curved sides, and in the portrait state the curved screen displays a first interface. The touch screen and the memory are coupled to the processor, and the memory is configured to store computer program code including computer instructions that, when executed by the electronic device, cause the electronic device to perform the following operations: receiving operations of different degrees from the user in the arc area on the side of the curved screen and, in response to the operations of different degrees, displaying different interfaces, where the different interfaces include a second interface, the home screen, and a multitasking interface, and the second interface is a previous-level interface of the first interface.
In one possible implementation, the operations of different degrees are: press operations with different pressure magnitudes, sliding operations with different sliding distances, or press operations with different press durations.
In a ninth aspect, an embodiment of the present application provides a computer-readable storage medium including computer software instructions; when the computer software instructions are run on an electronic device, they cause the electronic device to perform the navigation operation method according to any one of the first aspect, the second aspect, the third aspect, or the fourth aspect, or any possible implementation thereof.
In a tenth aspect, an embodiment of the present application provides a computer program product that, when run on a computer, causes the computer to perform the navigation operation method according to any one of the first aspect, the second aspect, the third aspect, or the fourth aspect, or any possible implementation thereof.
It should be appreciated that the description of technical features, solutions, benefits, or similar language in this application does not imply that all of the features and advantages may be realized in any single embodiment. Rather, it is to be understood that the description of a feature or advantage is intended to include the specific features, aspects or advantages in at least one embodiment. Therefore, the descriptions of technical features, technical solutions or advantages in the present specification do not necessarily refer to the same embodiment. Furthermore, the technical features, technical solutions and advantages described in the present embodiments may also be combined in any suitable manner. One skilled in the relevant art will recognize that an embodiment may be practiced without one or more of the specific features, aspects, or advantages of a particular embodiment. In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments.
Drawings
FIG. 1 is a diagram illustrating an implementation example of a navigation function provided in the prior art;
FIG. 2 is a diagram illustrating another implementation of a navigation function provided by the prior art;
fig. 3 is a schematic product form diagram of a curved-surface-screen mobile phone according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 5 is a schematic view illustrating a curved-screen mobile phone provided in an embodiment of the present application being held by a user;
fig. 6 is a schematic interface diagram displayed on an electronic device according to an embodiment of the present disclosure;
fig. 7 is a schematic interface diagram displayed on another electronic device according to an embodiment of the present application;
FIG. 8 is a schematic interface diagram of an electronic device according to an embodiment of the present disclosure;
fig. 9 is a schematic interface diagram displayed on another electronic device according to an embodiment of the present application;
FIG. 10 is a schematic interface diagram of an electronic device according to an embodiment of the present disclosure;
fig. 11 is a schematic view of another curved-screen mobile phone provided in the embodiment of the present application being held by a user;
fig. 12 is a schematic view of a curved-screen mobile phone provided in the embodiment of the present application being held by a user;
fig. 13 is a schematic view of a curved-screen mobile phone provided in the embodiment of the present application being held by a user;
fig. 14 is a schematic interface diagram displayed on another electronic device according to an embodiment of the present application;
fig. 15 is a schematic interface diagram displayed on another electronic device according to an embodiment of the present application.
Detailed Description
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present embodiment, "a plurality" means two or more unless otherwise specified.
The embodiment of the application provides a navigation operation method which can be applied to electronic equipment with a touch screen. In some embodiments of the present application, after detecting an input operation of a user at a specific position of a touch screen, an electronic device may display a corresponding interface according to the operation. In some other embodiments of the present application, the electronic device may initiate the navigation mode upon detecting a specific operation or a specific holding gesture or an operation entered at a specific location by the user. In the navigation mode, the electronic device can display a corresponding interface according to the operation input by the user on the touch screen.
For example, the electronic device described in the embodiments of the present application may be any electronic device having a touch screen, such as a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device; the embodiments of the present application do not particularly limit the specific form of the device.
In addition, it should be noted that in the embodiments of the application, the touch screen of the electronic device may be a flat touch screen, a curved screen with curved sides, or a folding screen. For example, the electronic device may be the mobile phone with a curved screen shown in fig. 3, referred to as a curved-screen mobile phone for short. Fig. 3 (a) is a perspective view of the curved-screen mobile phone, and fig. 3 (b) is a front view of it. As shown in fig. 3 (a) and fig. 3 (b), the touch screen of the mobile phone is a curved screen whose left side 301 and right side 302 are curved.
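To make the notion of the "arc area" concrete, the following Kotlin sketch (not part of the patent) classifies a touch x-coordinate as falling in the left arc, the right arc, or the flat part of such a screen; the 60-pixel arc width and the pixel dimensions are assumed values for the example.

```kotlin
enum class EdgeRegion { LEFT_ARC, RIGHT_ARC, FLAT }

// Classify a touch by its horizontal position relative to the two curved sides.
fun classifyTouch(x: Float, screenWidthPx: Float, arcWidthPx: Float = 60f): EdgeRegion =
    when {
        x <= arcWidthPx                  -> EdgeRegion.LEFT_ARC
        x >= screenWidthPx - arcWidthPx  -> EdgeRegion.RIGHT_ARC
        else                             -> EdgeRegion.FLAT
    }

fun main() {
    println(classifyTouch(x = 15f, screenWidthPx = 1080f))   // LEFT_ARC
    println(classifyTouch(x = 1050f, screenWidthPx = 1080f)) // RIGHT_ARC
    println(classifyTouch(x = 540f, screenWidthPx = 1080f))  // FLAT
}
```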
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Please refer to fig. 4, which is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in fig. 4, the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the electronic device. In other embodiments, an electronic device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be a neural center and a command center of the electronic device. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, such that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement a touch function of the electronic device.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a display screen serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device, to transmit data between the electronic device and a peripheral device, or to connect a headset and play audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in this embodiment is only an exemplary illustration, and does not constitute a limitation on the structure of the electronic device. In other embodiments, the electronic device may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in an electronic device may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to electronic devices, including Wireless Local Area Networks (WLANs) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite Systems (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160 so that the electronic device can communicate with the network and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device implements the display function through the GPU, the display screen 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device may support one or more video codecs. In this way, the electronic device can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent cognition of electronic equipment, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. By executing the instructions stored in the internal memory 121, the processor 110 performs various functional applications and data processing of the electronic device. For example, in the embodiment of the present application, by executing instructions stored in the internal memory 121, the processor 110 may display a corresponding interface, such as a desktop, a previous interface of a current interface, or a multitasking interface, in response to an operation input by a user at a specific position of the touch screen. For another example, by executing instructions stored in the internal memory 121, the processor 110 may start the navigation mode upon receiving a specific operation by the user or an operation input at a specific position, or upon detecting a specific holding gesture of the user. In the navigation mode, if an operation input by a user on the touch screen is received, a corresponding interface, such as a desktop, a previous interface of a current interface, or a multitasking interface, is displayed in response to the operation. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area can store data (such as audio data and a phone book) created during use of the electronic device. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The electronic device may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called an "earpiece", is used to convert the audio electrical signal into a sound signal. When the electronic device answers a call or receives voice information, the user can listen to the voice by placing the receiver 170B close to the ear.
The microphone 170C, also referred to as a "mouthpiece", is used to convert sound signals into electrical signals. When making a call, sending a voice message, or needing to trigger the electronic device to perform some function through the voice assistant, the user may speak with the mouth close to the microphone 170C to input a sound signal into the microphone 170C. The electronic device may be provided with at least one microphone 170C. In other embodiments, the electronic device may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and the like.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. The capacitive pressure sensor may include at least two parallel plates made of a conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device detects the intensity of the touch operation by using the pressure sensor 180A. The electronic device may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that are applied to the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
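As an illustrative sketch of this threshold-based dispatch (the threshold value and function names below are assumptions, not part of this embodiment):

```kotlin
// Minimal sketch: dispatching different instructions for the same touch position
// depending on the detected touch operation intensity (hypothetical values).
const val FIRST_PRESSURE_THRESHOLD = 0.6f // assumed normalized pressure threshold

fun viewShortMessages() = println("instruction: view the short message")
fun createNewShortMessage() = println("instruction: create a new short message")

fun onShortMessageIconTouched(pressure: Float) {
    if (pressure < FIRST_PRESSURE_THRESHOLD) {
        viewShortMessages()       // intensity below the first pressure threshold
    } else {
        createNewShortMessage()   // intensity greater than or equal to the threshold
    }
}
```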
The gyro sensor 180B may be used to determine the motion pose of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyroscope sensor 180B detects a shake angle of the electronic device, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device is a flip phone, the electronic device may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon opening the flip cover can then be set according to the detected opening or closing state of the holster or flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the electronic device in various directions (typically along three axes). When the electronic device is at rest, the magnitude and direction of gravity can be detected. The acceleration sensor 180E can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure a distance. The electronic device may measure the distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device may use the distance sensor 180F to measure the distance to achieve fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device emits infrared light outward through the light-emitting diode, and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device; when insufficient reflected light is detected, the electronic device may determine that there is no object nearby. By using the proximity light sensor 180G, the electronic device can detect that the user is holding the electronic device close to the ear for a call, and automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The electronic device may adaptively adjust the brightness of the display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic equipment can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown of the electronic device caused by low temperature. In other embodiments, the electronic device boosts the output voltage of the battery 142 when the temperature is below a further threshold, to avoid an abnormal shutdown caused by low temperature.
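As an illustrative sketch of this temperature processing strategy (the threshold values are assumptions; only the ordering of the checks follows the description):

```kotlin
// Minimal sketch of the temperature handling strategy with hypothetical thresholds.
const val THROTTLE_THRESHOLD_C = 45   // above this, reduce processor performance
const val HEAT_BATTERY_BELOW_C = 0    // below this, heat the battery
const val BOOST_VOLTAGE_BELOW_C = -10 // below this, boost the battery output voltage

fun applyThermalPolicy(temperatureC: Int) {
    when {
        temperatureC > THROTTLE_THRESHOLD_C -> println("reduce performance of the nearby processor")
        temperatureC < BOOST_VOLTAGE_BELOW_C -> println("boost the output voltage of the battery")
        temperatureC < HEAT_BATTERY_BELOW_C -> println("heat the battery")
    }
}
```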
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device at a different position than the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device may receive a key input and generate a key signal input related to user settings and function control of the electronic device.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminders, receiving information, alarm clocks, and games) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the electronic device by being inserted into or pulled out of the SIM card interface 195. The electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and may also be compatible with external memory cards. The electronic device implements functions such as calling and data communication through interaction between the SIM card and the network. In some embodiments, the electronic device uses an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device and cannot be separated from the electronic device.
The methods in the following embodiments may be implemented in an electronic device having the above hardware structure. For convenience of description, the following takes the curved-screen mobile phone shown in fig. 3 as an example of the electronic device.
Generally, when a user holds a curved-screen mobile phone, the fingers of the user touch the arc areas on the sides of the curved screen. The following embodiments take as an example the case where the user holds the curved-screen mobile phone vertically with the right hand, with the curved-screen mobile phone in a portrait state. For example, as shown in fig. 5 (a), a user holds a curved-screen mobile phone with the right hand. As shown in fig. 5 (b), the thumb of the user's right hand makes contact with the arc area on the right side of the curved screen, and the contact point is referred to as contact point 1. The other fingers of the user's right hand can contact the arc area on the left side of the curved screen, and the number of contact points can be 1 to 4. In fig. 5, the other fingers of the user's right hand have 4 contact points with the arc area on the left side of the curved screen, namely contact point 2, contact point 3, contact point 4, and contact point 5. Of course, the other fingers of the user's right hand may also make no contact with the arc area on the left side of the curved screen.
In some embodiments of this application, the navigation function of the mobile phone can be implemented by using the characteristic that, when a user holds the curved-screen mobile phone, the fingers contact the arc area on the side of the curved screen.
In some embodiments, with reference to fig. 5, when the user holds the curved-screen mobile phone with the right hand, the thumb of the right hand may be used to perform a corresponding operation to implement a navigation function of the curved-screen mobile phone, that is, to trigger the curved-screen mobile phone to display a corresponding interface, such as a previous interface, a desktop, or a multitasking interface of the current interface. The multitasking interface may include an identifier of an application running in the background of the curved-screen mobile phone or an application used by the user most recently, such as an application icon and an application thumbnail.
When the navigation function is implemented, the interfaces displayed by the curved-screen mobile phone in response to different operations can be different. It should be noted that, in this embodiment, the interface displayed by the curved-screen mobile phone in response to different operations performed by the user may be predefined or pre-configured in the curved-screen mobile phone, or may be set by the user. That is to say, which interface each operation triggers the curved-screen mobile phone to display may be predefined or preconfigured, or may be set by the user. If predefined or pre-configured in the curved-screen mobile phone, the curved-screen mobile phone can display prompt information for prompting the user which interface the curved-screen mobile phone displays when different operations are performed. For example, the curved-screen mobile phone may display the prompt information when the mobile phone is started, or may display the prompt information when the user opens the corresponding navigation function. If set by the user, a setting interface can be provided for the user to set which interface the curved-screen mobile phone is triggered to display when each operation is performed.
As an example, in conjunction with fig. 5, when a user holds a curved screen mobile phone in the right hand, the thumb of the user's right hand may make contact with the arc area on the right side of the curved screen.
The user can perform operation 1 using the thumb of the right hand (this operation 1 may be the first operation in this application). When the curved-surface-screen mobile phone receives the operation 1 of the user, the curved-surface-screen mobile phone can display a corresponding interface as a response to the operation 1, for example, the curved-surface-screen mobile phone displays a previous interface of a current interface. The operation 1 may be a sliding operation 1 (the sliding operation 1 may be a first sliding operation in this application), where a starting point of the sliding operation 1 is a curved screen side, such as a right-side arc region, and a sliding direction is directed to a lower side of the curved screen.
For example, as shown in fig. 6 (a), the curved-screen mobile phone currently displays a first interface, such as a setting interface 601 of a wireless local area network. When the user wants to trigger the curved-screen mobile phone to display a second interface, such as the upper-level interface of the setting interface 601, as shown in (a) of fig. 6, when the user holds the curved-screen mobile phone in the right hand, operation 1 may be performed using the thumb of the right hand, that is, the user may perform a sliding operation in which the sliding direction is downward in the arc area on the right side of the curved screen using the thumb of the right hand (e.g., a sliding track 602 shown in (a) of fig. 6). In response to the operation 1, as shown in (b) of fig. 6, the curved-screen mobile phone may display a previous level interface of the setting interface 601, e.g., a home interface 603 of the setting application. That is, when the curved-screen mobile phone detects that the finger of the user slides downward in the arc area of the side (e.g., the right side) of the curved screen, the curved-screen mobile phone may display the previous interface of the current interface in response.
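As an illustrative sketch of recognizing operation 1 (the screen geometry, edge-region width, and slide-distance threshold below are assumed values rather than values from this embodiment):

```kotlin
// Minimal sketch: classifying a touch that starts in the right arc area and slides
// downward as "operation 1" (display the previous interface of the current interface).
data class TouchPoint(val x: Float, val y: Float)

const val SCREEN_WIDTH = 1080f      // assumed screen width in pixels
const val EDGE_REGION_WIDTH = 40f   // assumed width of the side arc area in pixels
const val MIN_SLIDE_DISTANCE = 100f // assumed minimum downward travel

fun startsInRightArcArea(down: TouchPoint) = down.x >= SCREEN_WIDTH - EDGE_REGION_WIDTH

fun isOperation1(down: TouchPoint, up: TouchPoint): Boolean {
    val startedOnRightEdge = startsInRightArcArea(down)
    val slidDownward = (up.y - down.y) >= MIN_SLIDE_DISTANCE
    return startedOnRightEdge && slidDownward
}

fun main() {
    val down = TouchPoint(x = 1070f, y = 600f)
    val up = TouchPoint(x = 1068f, y = 780f)
    if (isOperation1(down, up)) println("display the previous interface of the current interface")
}
```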
The user may also perform operation 2 using the thumb of the right hand (this operation 2 may be the second operation in this application). When the curved-screen mobile phone receives the operation 2 of the user, the curved-screen mobile phone may display a corresponding interface in response to the operation 2, such as a desktop (or a main screen) displayed by the curved-screen mobile phone. The above operation 2 may be a long press operation.
For example, as shown in fig. 7 (a), the curved screen mobile phone currently displays a first interface, such as a WeChat receipt and payment interface 701. When the user wants to trigger the curved screen mobile phone to display the desktop, as shown in fig. 7 (a), when the user holds the curved screen mobile phone with the right hand, the thumb of the right hand may be used to perform operation 2, that is, the user may perform a long-press operation in the arc area on the right side of the curved screen with the thumb of the right hand. In response to this operation 2, the curved-screen mobile phone may display a desktop 702 as shown in (b) of fig. 7. That is, when the curved-screen mobile phone detects a long-press operation of a finger of a user in an arc region of a side (e.g., a right side) of the curved screen, the curved-screen mobile phone may display the desktop in response.
The user may also perform operation 3 using the thumb of the right hand (this operation 3 may be the third operation in this application). When the curved-screen mobile phone receives operation 3 of the user, the curved-screen mobile phone can display a corresponding interface in response to operation 3, for example, the curved-screen mobile phone displays a multitasking interface. Operation 3 may be a sliding operation 2 (the sliding operation 2 may be a second sliding operation in this application), where the starting point of the sliding operation 2 is an arc area of a first side, such as the right side, of the curved screen, and the sliding direction points to a second side, such as the left side, of the curved screen.
For example, as shown in fig. 8 (a), the curved-screen mobile phone currently displays a first interface, such as the interface 801 of a music playing application. When the user wants to trigger the curved-screen mobile phone to display the multitasking interface, as shown in (a) of fig. 8, when the user holds the curved-screen mobile phone in the right hand, operation 3 may be performed using the thumb of the right hand, i.e., the user may perform, with the thumb of the right hand, a sliding operation that starts from the arc area on the right side of the curved screen and whose sliding direction points to the left side (e.g., the sliding trajectory 802 shown in (a) of fig. 8). In response to operation 3, the curved-screen mobile phone may display the multitasking interface 803 shown in (b) of fig. 8. That is, when the curved-screen mobile phone detects that the user's finger starts sliding toward the inside of the curved screen from the arc area on one side (for example, the right side) of the curved screen, the curved-screen mobile phone can display the multitasking interface in response.
It should be noted that operation 1, operation 2, and operation 3 may also be operations other than the sliding operations or the long-press operation described in the above examples, such as a double-click operation, which is not particularly limited in this embodiment.
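As an illustrative sketch of mapping operations 1 to 3 to the displayed interfaces (the enum names and the mapping container are assumptions; as noted above, the mapping may also be preconfigured or set by the user):

```kotlin
// Minimal sketch: mapping the classified edge gesture to the interface to display.
// The gesture classification itself (slide direction, long press, etc.) is assumed
// to be done elsewhere; the names below are illustrative.
enum class EdgeGesture { SLIDE_DOWN, LONG_PRESS, SLIDE_INWARD, DOUBLE_TAP }

enum class NavigationTarget { PREVIOUS_INTERFACE, DESKTOP, MULTITASKING_INTERFACE, NONE }

// The mapping may be predefined, preconfigured, or set by the user.
val defaultGestureMap = mapOf(
    EdgeGesture.SLIDE_DOWN to NavigationTarget.PREVIOUS_INTERFACE,      // operation 1
    EdgeGesture.LONG_PRESS to NavigationTarget.DESKTOP,                 // operation 2
    EdgeGesture.SLIDE_INWARD to NavigationTarget.MULTITASKING_INTERFACE // operation 3
)

fun dispatch(gesture: EdgeGesture): NavigationTarget =
    defaultGestureMap[gesture] ?: NavigationTarget.NONE
```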
In other embodiments, in order to prevent misoperation and improve the user experience, the curved-screen mobile phone may start the navigation mode according to an operation of the user. In the navigation mode, the navigation function of the mobile phone is implemented by using the characteristic that, when the user holds the curved-screen mobile phone, the fingers contact the arc area on the side of the curved screen.
As an example, in conjunction with fig. 5, when a user holds a curved screen mobile phone in the right hand, the thumb of the user's right hand may make contact with the arc area on the right side of the curved screen. The user can perform operation 4 using the thumb of the right hand (this operation 4 may be the fourth operation in this application). In response to this operation 4, the curved-screen handset may initiate a navigation mode. In the navigation mode, the user can use the thumb of the right hand to execute corresponding operation so as to realize the navigation function of the curved-surface-screen mobile phone, namely, the curved-surface-screen mobile phone is triggered to display a corresponding interface.
The operation 4 may be a slide operation, a single click operation, a double click operation, a long press operation, a press operation, or the like. It should be noted that, in this embodiment, the operation 4 for triggering the curved-screen mobile phone to start the navigation mode may be predefined or preconfigured, or may be set by the user. In addition, the operation 4 and the above-described operation 1 may be a continuous operation in which the user's finger does not leave the curved screen. This operation 4 and the above-described operation 2 may be a continuous operation in which the user's finger does not leave the curved screen. This operation 4 and the above-described operation 3 may be a continuous operation in which the user's finger does not leave the curved screen.
For example, referring to fig. 5, operation 4 is a pressing operation. When the user holds the curved-screen mobile phone with the right hand, if the user wants to use the navigation function of the curved-screen mobile phone, the user can perform operation 4 with the thumb of the right hand, that is, perform a pressing operation in the arc area on the right side of the curved screen. The curved-screen mobile phone may start the navigation mode in response to operation 4. With reference to fig. 6, if the user wants to trigger the curved-screen mobile phone to display the interface at the previous level of the current interface, as shown in (a) of fig. 6, the thumb of the user's right hand may stay on the curved screen and continue to perform operation 1; that is, after performing the pressing operation in the arc area on the right side of the curved screen with the thumb of the right hand, the user continues to perform a downward sliding operation, i.e., keeps the thumb in contact with the curved screen and moves it downward. In response to operation 1, the curved-screen mobile phone may display the home interface 603 of the settings application, as shown in (b) of fig. 6. With reference to fig. 7, if the user wants to trigger the curved-screen mobile phone to display the desktop, as shown in (a) of fig. 7, the thumb of the user's right hand may stay on the curved screen and continue to perform operation 2; that is, after performing the pressing operation in the arc area on the right side of the curved screen with the thumb of the right hand, the user continues to perform a long-press operation, i.e., keeps the thumb in contact with the curved screen and continues to press. In response to operation 2, the curved-screen mobile phone may display the desktop 702, as shown in (b) of fig. 7. With reference to fig. 8, if the user wants to trigger the curved-screen mobile phone to display the multitasking interface, as shown in (a) of fig. 8, the thumb of the user's right hand may stay on the curved screen and continue to perform operation 3; that is, after performing the pressing operation in the arc area on the right side of the curved screen with the thumb of the right hand, the user continues to perform a sliding operation whose direction points to the left side, i.e., keeps the thumb in contact with the curved screen and moves it toward the left. In response to operation 3, the curved-screen mobile phone may display the multitasking interface 803, as shown in (b) of fig. 8.
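As an illustrative sketch of this continuous-operation behavior (operation 4 followed by operation 1, 2, or 3 without the finger leaving the screen), with event and state names that are assumptions:

```kotlin
// Minimal sketch: operation 4 (a press in the right arc area) enters the navigation
// mode; the follow-up movement, without the finger leaving the screen, selects the interface.
enum class NavState { IDLE, NAVIGATION_MODE }

sealed class EdgeEvent {
    object PressInRightArcArea : EdgeEvent() // operation 4
    object SlideDown : EdgeEvent()           // continued operation 1
    object KeepPressing : EdgeEvent()        // continued operation 2
    object SlideTowardLeft : EdgeEvent()     // continued operation 3
    object FingerLifted : EdgeEvent()
}

class EdgeNavigator {
    private var state = NavState.IDLE

    fun onEvent(event: EdgeEvent): String = when {
        state == NavState.IDLE && event is EdgeEvent.PressInRightArcArea -> {
            state = NavState.NAVIGATION_MODE; "navigation mode started"
        }
        state == NavState.NAVIGATION_MODE && event is EdgeEvent.SlideDown ->
            finish("display the previous interface")
        state == NavState.NAVIGATION_MODE && event is EdgeEvent.KeepPressing ->
            finish("display the desktop")
        state == NavState.NAVIGATION_MODE && event is EdgeEvent.SlideTowardLeft ->
            finish("display the multitasking interface")
        event is EdgeEvent.FingerLifted -> finish("no action")
        else -> "ignored"
    }

    private fun finish(result: String): String { state = NavState.IDLE; return result }
}
```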
Of course, the manner of triggering the curved-screen mobile phone to start the navigation mode may be a manner other than receiving operation 4 performed by the user as described in the above embodiment. For example, the curved-screen mobile phone starts the navigation mode when a specific holding gesture of the user is detected. For example, the specific holding gesture may be: one finger of the user is on one side (e.g., the left side) of the curved-screen mobile phone, and the other four (or three, two, or one) fingers are on the other side (e.g., the right side) of the curved-screen mobile phone. As another example, the specific holding gesture may be: one finger of the user is on one side edge (such as the left side edge or the right side edge) of the curved-screen mobile phone, and the other four (or three, two, or one) fingers are on the back of the curved-screen mobile phone (for example, a sensor can be arranged on the back of the curved-screen mobile phone to detect contact between the user's fingers and the curved-screen mobile phone). For another example, the navigation mode is started when the curved-screen mobile phone detects that the user presses both sides of the curved-screen mobile phone at the same time. For another example, a physical or virtual button is arranged on the side of the curved-screen mobile phone, and the navigation mode is started when the curved-screen mobile phone detects that the user operates the button. In other embodiments, in order to prevent misoperation, the curved-screen mobile phone may respond to operation 1, operation 2, or operation 3 and display the corresponding interface only when the operation is received at a specific position of the curved-screen mobile phone. Taking the specific position as a specific area of the curved screen as an example, the edge of the specific area may be uneven, or a lamp may be disposed under the specific area, to prompt the user to perform an operation in the specific area to implement the navigation function.
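As an illustrative sketch of recognizing one of the specific holding gestures described above (the contact-point model and side classification are assumptions based on the description):

```kotlin
// Minimal sketch: one finger on one side edge, one to four fingers on the opposite
// side edge is treated as the specific holding gesture that starts the navigation mode.
enum class Side { LEFT_EDGE, RIGHT_EDGE, OTHER }

data class Contact(val side: Side)

fun isHoldingGesture(contacts: List<Contact>): Boolean {
    val left = contacts.count { it.side == Side.LEFT_EDGE }
    val right = contacts.count { it.side == Side.RIGHT_EDGE }
    // e.g. thumb on the right edge and one to four fingers on the left edge, or vice versa
    return (right == 1 && left in 1..4) || (left == 1 && right in 1..4)
}

fun maybeStartNavigationMode(contacts: List<Contact>) {
    if (isHoldingGesture(contacts)) println("holding gesture detected: start the navigation mode")
}
```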
In some other embodiments, in conjunction with fig. 5, when the user holds the curved-screen mobile phone with the right hand, the user may perform a corresponding operation with the right hand to trigger the curved-screen mobile phone to display the operation menu. The operation menu can comprise a button for triggering the curved screen mobile phone to realize the navigation function. The user can trigger the curved-screen mobile phone to display a corresponding interface, such as a previous interface, a desktop or a multi-task interface of the current interface, by operating the button in the operation menu.
As an example, in conjunction with fig. 5, when a user holds a curved screen mobile phone in the right hand, the thumb of the user's right hand may make contact with the arc area on the right side of the curved screen.
The user may perform operation 5 using the thumb of the right hand (this operation 5 may be the first operation in this application). When the curved-screen mobile phone receives the operation 5 of the user, the curved-screen mobile phone can display a corresponding operation menu as a response to the operation 5. The above-described operation 5 may be a slide operation, a click operation, a long-press operation, a double-click operation, a press operation, or the like. It should be noted that, in this embodiment, the operation 5 for triggering the curved-screen mobile phone to display the operation menu may be predefined or preconfigured, or may be set by the user.
For example, referring to fig. 9, operation 5 is taken as an example of the pressing operation. As shown in fig. 9 (a), the curved-screen mobile phone currently displays a first interface, such as a setting interface 901 of a wireless local area network. When the user wants to trigger the curved-screen mobile phone to display the operation menu, as shown in fig. 9 (a), when the user holds the curved-screen mobile phone with the right hand, the user may use the thumb of the right hand to perform the operation 5, that is, the user may use the thumb of the right hand to perform the pressing operation in the arc area on the right side of the curved-screen. In response to this operation 5, as shown in (b) in fig. 9, the curved-screen mobile phone may display an operation menu 902. For example, the operation menu 902 includes buttons for triggering the curved-screen mobile phone to implement a navigation function, such as a first button, e.g., "back" button 903, a second button, e.g., "back to the desktop" button 904, and a third button, e.g., "multitasking interface" button 905. The user operates the buttons in the operation menu 902, and can trigger the mobile phone with the curved screen to display the corresponding interface.
In some embodiments, a user may directly click a button in the operation menu 902 to trigger the curved-screen mobile phone to display a corresponding interface. For example, the user may click the "back" button 903 in the operation menu 902. In response to the click operation on the "back" button 903, the curved-screen mobile phone may display a second interface, such as the previous interface of the settings interface 901 of the wireless local area network. As another example, the user may click the "back to the desktop" button 904 in the operation menu 902. In response to the click operation on the "back to the desktop" button 904, the curved-screen mobile phone may display the desktop. As yet another example, the user may click the "multitasking interface" button 905 in the operation menu 902. In response to the click operation on the "multitasking interface" button 905, the curved-screen mobile phone may display the multitasking interface.
In other embodiments, the user may also continue to hold the holding gesture and perform the corresponding operation using the thumb of the right hand to trigger the curved-screen mobile phone to display the corresponding interface. For example, continuing with fig. 9, as shown in (b) of fig. 9, when the operation menu 902 is displayed on the curved-screen mobile phone, the first item of the operation menu 902, i.e., the "back" button 903, gets the focus.
If the user wants to trigger the curved-screen mobile phone to display the previous interface of the current interface, operation 6 (this operation 6 may be the second operation in this application) may be executed, for example, a long-press operation, to trigger the curved-screen mobile phone to display the previous interface of the current interface. Operations 6 and 5 may be consecutive operations in which the user's finger does not leave the curved screen. That is, after the user performs the above operation 5 using the thumb of the right hand, i.e., the pressing operation is performed in the arc region of the right side of the curved screen, the finger does not leave the curved screen, and the operation 6 is continuously performed, such as a long press operation, that is, after the user contacts the arc region of the right side of the curved screen using the thumb of the right hand, the contact with the curved screen is maintained and the pressing is continued. In response to the operation 6, the curved-screen mobile phone may display the previous interface of the current interface, that is, the previous interface of the setting interface 901 of the wireless lan.
If the user wants to trigger the curved-screen mobile phone to display the desktop or the multitasking interface, operation 7 (this operation 7 may be the third operation in this application), such as a sliding operation, may be performed so that the corresponding button gets the focus. Operation 7 and operation 5 may be a continuous operation in which the user's finger does not leave the curved screen. That is, after performing operation 5 with the thumb of the right hand, the user may continue to perform operation 7, such as a sliding operation, without the thumb leaving the curved screen; in other words, after pressing the arc area on the right side of the curved screen with the thumb of the right hand, the user keeps the thumb in contact with the curved screen and moves it. In this embodiment, as the user's finger slides, the focus moves across the buttons of the operation menu 902. For example, the slide track of the sliding operation is 906 as shown in (b) of fig. 9. As the user's finger slides, the focus may move from the first item, i.e., the "back" button 903, to the second item, i.e., the "back to the desktop" button 904, as shown in fig. 9 (c). If the user wants to trigger the curved-screen mobile phone to display the desktop, the above operation 6, such as a long-press operation, can be performed after the second item of the operation menu 902, i.e., the "back to the desktop" button 904, gets the focus. As shown in fig. 9 (d), the curved-screen mobile phone may display the desktop 907 in response to operation 6.
If the user wants to trigger the curved screen phone to display the multi-tasking interface, after performing operation 5 using the right thumb, the right thumb does not leave the curved screen, and operation 7, such as a sliding operation, continues until the focus moves from the first item, i.e., the "back" button 903, to the third item, i.e., the "multi-tasking interface" button 905. And performs the above-mentioned operation 6, such as a long press operation, after the third item of the operation menu 902, i.e., the "multitasking interface" button 905, obtains a focus. In response to this operation 6, the curved screen handset may display a multitasking interface.
The operation 6 and the operation 7 may be continuous operations in which the user's finger does not leave the curved screen.
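As an illustrative sketch of the focus movement in the operation menu 902 (the slide-distance step is an assumed value; the button list follows the example above):

```kotlin
// Minimal sketch: the focus advances across the menu buttons as the finger slides
// (operation 7); a long press (operation 6) activates the currently focused button.
val menuButtons = listOf("back", "back to the desktop", "multitasking interface")
const val FOCUS_STEP_PX = 120f // assumed slide distance needed to move the focus by one item

fun focusedIndex(slideDistance: Float): Int =
    (slideDistance / FOCUS_STEP_PX).toInt().coerceIn(0, menuButtons.size - 1)

fun onLongPress(slideDistance: Float): String =
    "activate \"" + menuButtons[focusedIndex(slideDistance)] + "\""

fun main() {
    println(onLongPress(0f))   // focus still on the first item: back
    println(onLongPress(150f)) // focus moved to the second item: back to the desktop
}
```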
After the user performs operation 5, if the user does not want to use the navigation function of the curved-screen mobile phone, the user may perform an operation to cancel the navigation operation, that is, the curved-screen mobile phone may cancel the display of the operation menu 902. For example, this operation may be another operation, such as a single-click operation, different from the above operations 6 and 7. For another example, in the scenario where operations 5 and 6, or operations 5, 7, and 6, are continuous operations in which the user's finger does not leave the curved screen, if the user directly lifts the finger off the curved screen after performing operation 5, the curved-screen mobile phone can cancel the navigation operation.
Of course, the manner of triggering the curved-screen mobile phone to display the operation menu may be other manners besides receiving the operation 5 executed by the user as described in the above embodiment. For example, a curved-screen mobile phone displays an operation menu when a user-specific holding gesture is detected. For the description of the specific holding gesture, reference may be made to the detailed description of the corresponding content in other embodiments of the present application, which is not repeated herein. For another example, when the curved-screen mobile phone detects that the user presses two sides of the curved-screen mobile phone at the same time, the operation menu is displayed. For another example, a physical or virtual button is arranged on the side of the mobile phone with the curved screen. And when the curved-surface screen mobile phone detects that the user operates the button, displaying an operation menu.
In addition, it should be noted that the buttons included in the operation menu described in the above embodiment are only an example. In other embodiments, the operation menu may further include other buttons, such as a button for triggering the curved-screen mobile phone to turn on the voice assistant, in addition to the above buttons for triggering the curved-screen mobile phone to implement the navigation function, such as the "back" button, the "back to the desktop" button, and the "multitasking interface" button. In still other embodiments, the operation menu may include identifiers of some applications in the curved-screen mobile phone, such as icons. These applications can be automatically added to the operation menu by the curved-screen mobile phone according to the user's usage frequency or recent usage time, or can be manually added to the operation menu by the user. In the case that the operation menu includes the identifiers of some applications in the curved-screen mobile phone, by the method in the embodiment corresponding to fig. 9, after the identifier of the corresponding application in the operation menu obtains the focus, the user may trigger the curved-screen mobile phone to open that application by performing operation 6.
In other embodiments of the present application, when a user holds a curved-screen mobile phone with a single hand, such as the right hand, different fingers of the user's right hand may contact different locations on the side of the curved screen. The user can trigger the mobile phone with the curved screen to display the corresponding interface by using different fingers to execute operation at the corresponding positions.
In some embodiments, with reference to fig. 5, when the user holds the curved-screen mobile phone with the right hand, the user may use different fingers of the right hand to perform operations at different positions to trigger the curved-screen mobile phone to display a corresponding interface, such as a previous interface of a current interface, a desktop, a multitasking interface, a payment interface, a scanning interface, or the like.
When the curved-screen mobile phone is triggered to display the corresponding interface, the interfaces displayed by the curved-screen mobile phone in response to operations at different positions can be different. It should be noted that, in this embodiment, the interface displayed by the curved-screen mobile phone in response to operations performed by the user's fingers at different positions may be predefined or pre-configured in the curved-screen mobile phone, or may be set by the user. That is to say, which interface an operation at each position triggers the curved-screen mobile phone to display may be predefined or preconfigured, or may be set by the user. If predefined or pre-configured in the curved-screen mobile phone, the curved-screen mobile phone can display prompt information for prompting the user which interface is displayed when operations are performed at different positions. For example, the curved-screen mobile phone may display the prompt information when the mobile phone is started, or may display the prompt information when the user opens the corresponding function. If set by the user, a setting interface can be provided for the user to set which interface the curved-screen mobile phone displays when an operation is performed at each position. For example, the user can set, according to the user's own usage habits, such as holding habits, which interfaces the curved-screen mobile phone displays when operations are performed at different positions.
As an example, referring to fig. 5, when the user holds the curved-screen mobile phone in the right hand, the thumb of the user's right hand may contact the arc area on the right side of the curved screen, such as at contact point 1. For example, the contact point between the index finger and the arc area on the left side of the curved screen is called contact point 5, the contact point between the middle finger and the arc area on the left side is called contact point 4, the contact point between the ring finger and the arc area on the left side is called contact point 3, and the contact point between the little finger and the arc area on the left side is called contact point 2. By performing an operation with a finger at the position of the corresponding contact point, the user can trigger the curved-screen mobile phone to display a corresponding interface. The operations performed by different fingers at the corresponding positions may be the same or different. In addition, the operation may be a single-click operation, a long-press operation, a sliding operation, a double-click operation, or the like.
Referring to fig. 10, the following takes as an example the case where different fingers perform the same operation at the corresponding positions, and all the operations are long-press operations. As shown in fig. 10 (a), the curved-screen mobile phone currently displays a setting interface 1001 of the wireless local area network.
While the user is holding the curved-screen phone in the right hand, a long press operation may be performed using the thumb of the right hand at the second location, such as contact point 1 above. As shown in fig. 10 (b), in response to the long press operation at the contact point 1, the curved-screen mobile phone may display a corresponding interface, such as a curved-screen mobile phone display desktop 1002.
The user may perform a long press operation using the index finger of the right hand at a first location, such as contact point 5 described above. As shown in (c) of fig. 10, in response to the long press operation at the contact point 5, the curved-screen mobile phone may display a corresponding interface, for example, the curved-screen mobile phone displays a previous interface to the current interface, for example, the main interface 1003 for setting an application.
The user may perform a long press operation at a third location, such as contact point 4 described above, using the middle finger of the right hand. As shown in fig. 10 (d), in response to the long press operation at contact 4, the curved screen handset may display a corresponding interface, such as the curved screen handset displaying a multitasking interface 1004.
The user can perform a long press operation at the above-described contact point 3 using the ring finger of the right hand. As shown in (e) of fig. 10, in response to the long press operation at the contact point 3, the curved-screen mobile phone may display a corresponding interface, such as a curved-screen mobile phone display payment interface 1005.
The user can perform a long-press operation at the above-described contact point 2 using the little finger of the right hand. As shown in fig. 10 (f), in response to the long-press operation at contact point 2, the curved-screen mobile phone may display a corresponding interface, such as the curved-screen mobile phone displaying a scan interface 1006.
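As an illustrative sketch of the contact-point-to-interface mapping in this example (the map structure and strings are assumptions; as noted, the mapping may also be set by the user):

```kotlin
// Minimal sketch: long presses at different contact points trigger different interfaces,
// following the example of fig. 10.
val contactPointActions = mapOf(
    1 to "display the desktop",               // thumb, right arc area
    5 to "display the previous interface",    // index finger, left arc area
    4 to "display the multitasking interface",// middle finger, left arc area
    3 to "display the payment interface",     // ring finger, left arc area
    2 to "display the scan interface"         // little finger, left arc area
)

fun onLongPressAtContactPoint(point: Int): String =
    contactPointActions[point] ?: "no action configured"
```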
In some other embodiments, to prevent the misoperation and improve the user experience, the curved-screen mobile phone may start the operation mode (or referred to as the navigation mode) according to the operation of the user. In the operation mode, the mobile phone with the curved screen is triggered to display a corresponding interface according to different operation positions executed by fingers of a user.
With continued reference to fig. 5, while the user is holding the curved-screen phone in the right hand, the user may perform an operation using a finger of the right hand. In response to this operation, the curved screen handset may initiate an operational mode. In the operation mode, the user can use different fingers of the right hand to perform operation at corresponding positions so as to trigger the curved-screen mobile phone to display corresponding interfaces.
This operation is different from the operation that triggers the curved-screen mobile phone to display the corresponding interface, and may be a sliding operation, a click operation, a double-click operation, a long-press operation, or the like. It should be noted that, in this embodiment, the operation for triggering the curved-screen mobile phone to start the operation mode may be predefined or preconfigured, or may be set by the user. In addition, the finger performing this operation may be the same as or different from the finger performing the operation that triggers the curved-screen mobile phone to display the corresponding interface. If the two fingers are the same, this operation and the operation that triggers the curved-screen mobile phone to display the corresponding interface may be a continuous operation in which the user's finger does not leave the curved screen.
For example, with reference to fig. 5, the operation for triggering the curved-screen mobile phone to start the operation mode is a single-click operation, and the operation for triggering the curved-screen mobile phone to display the corresponding interface is a long-press operation. When the user holds the curved-screen mobile phone in the right hand, a certain finger (such as the thumb) of the right hand can be used to perform a fourth operation, such as a click operation, at the fourth position, such as contact point 1. The curved-screen mobile phone can start the operation mode in response to the click operation. In this operation mode, in connection with the example shown in fig. 10, the user may perform an operation at the corresponding position using different fingers of the right hand to trigger the curved-screen mobile phone to display a corresponding interface. In addition, after the operation mode is started, the user can also trigger the curved-screen mobile phone to exit the operation mode by performing a corresponding operation. The operation for triggering the curved-screen mobile phone to exit the operation mode is different from the operation for triggering the curved-screen mobile phone to start the operation mode and from the operation for triggering display of the corresponding interface. For example, assume that the operation for starting the operation mode is a single-click operation, the operation for displaying the corresponding interface is a long-press operation, and the operation for exiting the operation mode is a sliding operation. When the user holds the curved-screen mobile phone with the right hand, after the user performs the click operation to trigger the curved-screen mobile phone to start the operation mode, the user can perform a sliding operation with a certain finger. After receiving the sliding operation, the curved-screen mobile phone can be triggered to exit the operation mode.
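As an illustrative sketch of entering and exiting the operation mode (the class, event, and action names are assumptions):

```kotlin
// Minimal sketch: a single click on the side edge enters the operation mode, a slide
// exits it, and long presses at contact points act only while the mode is active.
enum class Mode { NORMAL, OPERATION_MODE }

class OperationModeController(private val actions: Map<Int, String>) {
    var mode = Mode.NORMAL
        private set

    fun onSingleClickAtEdge() { mode = Mode.OPERATION_MODE } // start the operation mode

    fun onSlideAtEdge() { mode = Mode.NORMAL }               // exit the operation mode

    fun onLongPressAtContactPoint(point: Int): String? =
        if (mode == Mode.OPERATION_MODE) actions[point] else null
}
```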
Of course, the manner of triggering the curved-screen mobile phone to start the operation mode may be a manner other than receiving the operation of the user as described in the above embodiment. For example, the curved-screen mobile phone starts the operation mode when a specific holding gesture of the user is detected. For the description of the specific holding gesture, reference may be made to the detailed description of the corresponding content in other embodiments of this application, which is not repeated herein. For another example, the operation mode is started when the curved-screen mobile phone detects that the user presses both sides of the curved-screen mobile phone at the same time. For another example, a physical or virtual button is arranged on the side of the curved-screen mobile phone, and the operation mode is started when the curved-screen mobile phone detects that the user operates the button.
It should be noted that, in the above embodiments, when a user holds the mobile phone with a single hand (e.g., the right hand), the five fingers of the right hand and different positions of the side edge of the curved screen all have contact points. In other embodiments, only some of the five fingers of the user's right hand may have contact points with different locations on the side of the curved screen when the user holds the phone in his right hand. For example, in conjunction with fig. 11, as shown in (a) of fig. 11, the user holds a curved-screen mobile phone with the right hand. The thumb of the user's right hand makes contact with the arc area on the right side of the curved screen. The middle finger, ring finger and little finger of the right hand of the user are in contact with the radian area on the left side of the curved screen. As shown in fig. 11 (b), the contact point between the thumb and the arc region on the right side of the curved screen is referred to as contact point 1, and the contact points between the middle finger, the ring finger, and the little finger and the arc region on the left side of the curved screen are referred to as contact point 2, contact point 3, and contact point 4, respectively. For another example, referring to fig. 12, as shown in (a) of fig. 12, the user holds a curved-screen mobile phone with the right hand. The thumb of the user's right hand makes contact with the arc area on the right side of the curved screen. The middle finger and the ring finger of the right hand of the user are in contact with the radian area on the left side of the curved screen. As shown in fig. 12 (b), the contact point of the thumb with the arc area on the right side of the curved screen is referred to as contact point 1, and the contact points of the middle finger and the ring finger with the arc area on the left side of the curved screen are referred to as contact point 2 and contact point 3, respectively. For another example, in conjunction with fig. 13, as shown in (a) of fig. 13, the user holds a curved-screen mobile phone with the right hand. Only the thumb of the right hand of the user makes contact with the arc area on the right side of the curved screen. As shown in fig. 13 (b), the contact point of the thumb with the arc region of the right side of the curved screen is referred to as contact point 1. Similar to the above embodiment in conjunction with the description of the holding gesture shown in fig. 5, under the holding gesture shown in fig. 11, fig. 12 or fig. 13, the user may also trigger the curved-screen mobile phone to display the corresponding interface by performing an operation at the corresponding position using different fingers. The specific process is similar to that described in the above example, and is not described again here.
In other embodiments of the present application, when a user holds a curved-screen mobile phone with a single hand, the fingers of the user may contact the sides of the curved screen. The user can use a finger to perform operations of different degrees at the contact position, such as pressing operations with different pressures or sliding operations with different sliding distances, to trigger the curved-screen mobile phone to display a corresponding interface.
When the curved-screen mobile phone is triggered to display the corresponding interface, the interfaces displayed by the curved-screen mobile phone in response to operations of different degrees (such as pressing operations with different pressures, sliding operations with different sliding distances, or pressing operations with different pressing durations) can be different. It should be noted that, in this embodiment, the interface displayed by the curved-screen mobile phone in response to operations performed by the user to different degrees may be predefined or pre-configured in the curved-screen mobile phone, or may be set by the user. That is to say, which interface an operation of each degree triggers the curved-screen mobile phone to display may be predefined or preconfigured, or may be set by the user. If predefined or pre-configured in the curved-screen mobile phone, the curved-screen mobile phone can display prompt information for prompting the user which interface is displayed when operations of different degrees are performed. For example, the curved-screen mobile phone may display the prompt information when the mobile phone is started, or may display the prompt information when the user opens the settings to view the corresponding function. If set by the user, a setting interface can be provided for the user to set which interface the curved-screen mobile phone displays when operations of different degrees are performed.
As an example, the user uses a finger to perform pressing operations of different degrees at the position in contact with the side edge of the curved screen, to trigger the curved-screen mobile phone to display the corresponding interface. With reference to fig. 5, when the user holds the curved-screen mobile phone with the right hand, the thumb of the right hand may be used to perform pressing operations with different pressures at the position in contact with the right side of the curved screen, so as to trigger the curved-screen mobile phone to display a corresponding interface, such as a previous interface of the current interface, the desktop, or a multitask interface, thereby implementing the navigation function of the curved-screen mobile phone.
For example, as shown in (a) of fig. 14, the user may perform a pressing operation with a pressure of a first pressure value using the thumb of the right hand. When the curved-screen mobile phone receives the pressing operation of the user, the curved-screen mobile phone may display a corresponding interface in response to the pressing operation, for example, the curved-screen mobile phone displays a previous interface of the current interface.
For another example, as shown in (b) of fig. 14, the user may perform a pressing operation with a pressure of a second pressure value using the thumb of the right hand. When the curved-screen mobile phone receives the pressing operation of the user, the curved-screen mobile phone may display a corresponding interface in response to the pressing operation, for example, the curved-screen mobile phone displays the desktop. The second pressure value is greater than the first pressure value.
For another example, as shown in (c) of fig. 14, the user may perform a pressing operation with a pressure of a third pressure value using the thumb of the right hand. When the curved-screen mobile phone receives the pressing operation of the user, the curved-screen mobile phone may display a corresponding interface in response to the pressing operation, for example, the curved-screen mobile phone displays a multitask interface. The third pressure value is greater than the second pressure value.
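The three pressure levels of fig. 14 can be summarized by a simple threshold check. The following Kotlin sketch assumes normalized pressure readings in the range 0 to 1 and uses illustrative threshold values; the embodiment only requires that the third pressure value is greater than the second pressure value, which is greater than the first pressure value.

enum class PressNavigation { PREVIOUS_INTERFACE, HOME_SCREEN, MULTITASK_INTERFACE }

// Map a normalized pressure reading (0..1) to the interface it triggers.
fun navigationForPressure(pressure: Float): PressNavigation? = when {
    pressure >= 0.8f -> PressNavigation.MULTITASK_INTERFACE   // third pressure value
    pressure >= 0.5f -> PressNavigation.HOME_SCREEN            // second pressure value
    pressure >= 0.2f -> PressNavigation.PREVIOUS_INTERFACE     // first pressure value
    else -> null                                               // too light: treated as accidental contact
}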
In addition, when the curved-screen mobile phone receives operations of different degrees of the user, such as pressing operations with different pressures, the curved-screen mobile phone may display prompt information, where the prompt information is used to prompt the user about the pressure of the pressing operation. For example, as shown in fig. 14, the curved-screen mobile phone may display an arc line when receiving pressing operations with different pressures of the user. The larger the pressure value, the larger the area enclosed by the arc line and the edge of the touch screen. When the curved-screen mobile phone receives operations of different degrees of the user, such as pressing operations with different pressures, the curved-screen mobile phone may also give vibration feedback to the user. The greater the pressure value, the greater the vibration intensity.
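A minimal Kotlin sketch of this proportional feedback follows: the larger the pressure, the larger the prompt arc and the stronger the vibration. The scaling constants, the 0-255 amplitude range, and the type names are illustrative assumptions rather than the implementation of the embodiment.

data class PressFeedback(val arcRadiusPx: Float, val vibrationAmplitude: Int)

// Map a normalized pressure reading (0..1) to the size of the prompt arc and the
// vibration amplitude: both grow with the pressure of the pressing operation.
fun feedbackFor(pressure: Float, maxArcRadiusPx: Float = 300f): PressFeedback {
    val clamped = pressure.coerceIn(0f, 1f)
    return PressFeedback(
        arcRadiusPx = clamped * maxArcRadiusPx,
        vibrationAmplitude = (clamped * 255).toInt()
    )
}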
In other embodiments, to prevent misoperation and improve the use experience of the user, the curved-screen mobile phone may start a navigation mode according to an operation of the user. In the navigation mode, the curved-screen mobile phone can be triggered to display different interfaces according to the different degrees of the operations performed by the user's finger at the contact position. For example, the user may perform an operation using the thumb of the right hand while holding the curved-screen mobile phone with the right hand, and in response to this operation, the curved-screen mobile phone may start the navigation mode.
This operation is different from the operations that trigger the curved-screen mobile phone to display the corresponding interfaces, and may be, for example, a sliding operation, a single-click operation, or a double-click operation. It should be noted that, in this embodiment, the operation for triggering the curved-screen mobile phone to start the navigation mode may be predefined or preconfigured, or may be set by the user. In addition, after the navigation mode is started, the user may also trigger the curved-screen mobile phone to exit the navigation mode by performing a corresponding operation. The operation for triggering the curved-screen mobile phone to exit the navigation mode is different from both the operation for triggering the curved-screen mobile phone to start the navigation mode and the operations for triggering the curved-screen mobile phone to display the corresponding interfaces. For example, if the operation for triggering the curved-screen mobile phone to start the navigation mode is a single-click operation and the operations for triggering the curved-screen mobile phone to display the corresponding interfaces are pressing operations, the operation for triggering the curved-screen mobile phone to exit the navigation mode may be a sliding operation.
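The example mapping in this paragraph (a single-click starts the navigation mode, presses navigate while the mode is active, and a slide exits the mode) can be written as a small state holder. The following Kotlin sketch is illustrative only; the gesture names are simplified assumptions, and the actual operations may be preconfigured or set by the user as described above.

enum class EdgeGesture { CLICK, PRESS, SLIDE }

class NavigationModeController {
    var navigationModeOn = false
        private set

    // Returns true when the gesture should be forwarded to the interface-navigation logic,
    // i.e. only for presses received while the navigation mode is active.
    fun onEdgeGesture(gesture: EdgeGesture): Boolean = when (gesture) {
        EdgeGesture.CLICK -> { navigationModeOn = true; false }    // single-click: start the navigation mode
        EdgeGesture.SLIDE -> { navigationModeOn = false; false }   // slide: exit the navigation mode
        EdgeGesture.PRESS -> navigationModeOn                      // press: navigate only in the navigation mode
    }
}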
Of course, in addition to receiving the operation of the user described in the above embodiment, the curved-screen mobile phone may be triggered to start the navigation mode in other manners. For example, the curved-screen mobile phone starts the navigation mode when detecting a specific holding gesture of the user. For the description of the specific holding gesture, reference may be made to the detailed description of the corresponding content in other embodiments of the present application, and details are not repeated herein. For another example, the curved-screen mobile phone starts the navigation mode when detecting that the user presses both sides of the curved-screen mobile phone at the same time. For another example, a physical button or a virtual button is arranged on the side edge of the curved-screen mobile phone, and the curved-screen mobile phone starts the navigation mode when detecting that the user operates the button.
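As one illustration of the trigger in which the user presses both sides of the curved-screen mobile phone at the same time, the following Kotlin sketch checks whether at least one active touch lies in each side arc area. The EdgeTouch type and the function name are assumptions made for this example.

data class EdgeTouch(val onLeftArc: Boolean, val onRightArc: Boolean)

// The navigation mode is started only when the left arc area and the right arc area
// are pressed at the same time.
fun shouldStartNavigationMode(activeTouches: List<EdgeTouch>): Boolean {
    val leftPressed = activeTouches.any { it.onLeftArc }
    val rightPressed = activeTouches.any { it.onRightArc }
    return leftPressed && rightPressed
}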
In addition, the method provided in the embodiments of the present application is described in the above embodiments by taking an example in which the user holds the curved-screen mobile phone vertically, that is, the curved-screen mobile phone is in a vertical screen state. In other embodiments, when the user holds the curved-screen mobile phone horizontally, that is, when the curved-screen mobile phone is in a horizontal screen state, the methods of the above embodiments are also applicable, and the specific implementation is the same as that of the above embodiments. For example, when the user holds the curved-screen mobile phone horizontally and the curved-screen mobile phone is in the horizontal screen state, as shown in fig. 15, fingers of the left hand and/or the right hand of the user may be in contact with the arc areas on the sides of the curved screen. The thumb of the right hand may be used to perform a corresponding operation to implement the navigation function of the curved-screen mobile phone, that is, to trigger the curved-screen mobile phone to display a corresponding interface, such as a previous interface of the current interface, the desktop, or a multitask interface. As an example, with reference to fig. 15, when the user holds the curved-screen mobile phone horizontally, the thumb of the user's right hand may be in contact with the arc area on the left side of the curved screen. As shown in (a) of fig. 15, the curved-screen mobile phone currently displays a setting interface 1501 of the wireless local area network. The user may use the thumb of the right hand to perform an operation, such as operation 1 in the above embodiment, where operation 1 may be sliding operation 1, the starting point of sliding operation 1 is in an arc area on a side of the curved screen, such as the arc area on the left side, and the sliding direction points to the lower side edge of the curved screen. That is, the user may perform, with the thumb of the right hand, a sliding operation pointing to the lower side edge in the arc area on the left side of the curved screen (e.g., the sliding track 1502 shown in (a) of fig. 15). When the curved-screen mobile phone receives operation 1 of the user, in response to operation 1, the curved-screen mobile phone may display a corresponding interface, for example, the curved-screen mobile phone displays a previous interface of the current interface, that is, a previous interface of the setting interface 1501 of the wireless local area network, such as a main interface 1503 of the settings application shown in (b) of fig. 15. Of course, similar to the examples of fig. 7 and fig. 8, the user may also use the thumb of the right hand to perform a corresponding operation to trigger the curved-screen mobile phone to display the desktop or a multitask interface. In addition, similar to the examples of any one of fig. 9 to fig. 14, when the user holds the curved-screen mobile phone horizontally, that is, when the curved-screen mobile phone is in the horizontal screen state, the method in the corresponding embodiment may also be used to implement the corresponding function, and details are not described again in this embodiment.
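The landscape example above (a slide that starts in a side arc area and points to the lower side edge of the curved screen returns to the previous interface) can be sketched in Kotlin as follows, assuming screen coordinates in which y increases downward and an illustrative minimum sliding distance; both assumptions are made for this example only.

data class TouchSample(val x: Float, val y: Float)

// A slide that starts in a side arc area and moves toward the lower side edge of the
// screen (y grows downward) is interpreted as "back to the previous interface".
fun isBackSlide(
    start: TouchSample,
    end: TouchSample,
    startedInArcArea: Boolean,
    minSlideDistancePx: Float = 80f
): Boolean {
    val dy = end.y - start.y
    return startedInArcArea && dy >= minSlideDistancePx
}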
In addition, the above embodiments are described by taking a curved-screen mobile phone as an example of the electronic device. The methods of the above embodiments are also applicable to electronic devices whose touch screen is a flat touch screen, a foldable screen, or the like. The specific implementation process is similar to that in the above embodiments and is not described in detail here.
According to the method provided in the embodiments, the navigation function of the electronic device is implemented by utilizing the characteristic that the user's fingers are in contact with the touch screen of the electronic device when the user holds the electronic device, so that the navigation function can be implemented more quickly and conveniently, the human-computer interaction efficiency is improved, and the use experience of the user is improved.
Still other embodiments of the present application provide an electronic device, which may include a processor, a memory, and a touch screen. The touch screen may be a curved screen with arc-shaped side edges, and a first interface is displayed on the curved screen. The processor is coupled to the touch screen and the memory. The memory is configured to store computer program code, and the computer program code includes computer instructions that, when executed by the electronic device, cause the electronic device to perform the method in the above embodiments.
Further embodiments of the present application also provide a computer-readable storage medium, which may include computer software instructions, which, when run on an electronic device, cause the electronic device to perform the method of the above-described embodiment.
Further embodiments of the present application also provide a computer program product, which, when run on a computer, causes the computer to perform the method performed by the electronic device in the above embodiments.
Other embodiments of the present application further provide an apparatus having a function of implementing the behavior of the electronic device (for example, the curved-screen mobile phone) in the above embodiments. The function may be implemented by hardware, or may be implemented by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above function, for example, an input unit or module, a display unit or module, and an activation unit or module.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The foregoing storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (15)

1. A navigation operation method, applied to an electronic device, wherein a touch screen of the electronic device is a curved screen with arc-shaped side edges, a first interface is displayed on the curved screen in a vertical screen state, and the method comprises:
the electronic device receives a first operation of a user in an arc area on a side edge of the curved screen, and in response to the first operation, the electronic device displays a second interface, wherein the second interface is a previous interface of the first interface; or,
the electronic device receives a second operation of a user in an arc area on a side edge of the curved screen, and in response to the second operation, the electronic device displays a home screen; or,
the electronic device receives a third operation of a user in an arc area on a side edge of the curved screen, and in response to the third operation, the electronic device displays a multitask interface, wherein the multitask interface comprises an application running in the background of the electronic device or an identifier of an application recently used by the user.
2. The method of claim 1, further comprising:
the electronic device starts a navigation mode when receiving a fourth operation of a user, or when detecting a preset holding gesture, or when detecting that a physical button or a virtual button arranged on the side edge of the curved screen is triggered.
3. The method of claim 1 or 2, wherein the first operation, the second operation, and the third operation are all different;
the first operation, the second operation, and the third operation are each any one of the following: a first sliding operation, a second sliding operation, a double-click operation, a single-click operation, a pressing operation, or a long-press operation;
wherein the first sliding operation is a sliding operation whose starting point is in an arc area on a side edge of the curved screen and whose sliding direction points to a lower side edge of the curved screen, and the second sliding operation is a sliding operation whose starting point is in an arc area on a first side edge of the curved screen and whose sliding direction points to a second side edge of the curved screen.
4. The method of claim 2, wherein the fourth operation is different from the first operation, the second operation, and the third operation, and the fourth operation is any one of the following: a sliding operation, a pressing operation, or a long-press operation;
the fourth operation and the first operation are continuous operations during which the user's finger does not leave the curved screen;
the fourth operation and the second operation are continuous operations during which the user's finger does not leave the curved screen;
the fourth operation and the third operation are continuous operations during which the user's finger does not leave the curved screen.
5. A navigation operation method, applied to an electronic device, wherein a touch screen of the electronic device is a curved screen with arc-shaped side edges, a first interface is displayed on the curved screen in a vertical screen state, and the method comprises:
the electronic device receives a first operation of a user in an arc area on a side edge of the curved screen;
in response to the first operation, the electronic device displays an operation menu, wherein the operation menu comprises at least a first button, a second button, and a third button; and
the electronic device receives an operation of the user on the first button, and in response to the operation on the first button, the electronic device displays a second interface, wherein the second interface is a previous interface of the first interface; or, the electronic device receives an operation of the user on the second button, and in response to the operation on the second button, the electronic device displays a home screen; or, the electronic device receives an operation of the user on the third button, and in response to the operation on the third button, the electronic device displays a multitask interface, wherein the multitask interface comprises an application running in the background of the electronic device or an identifier of an application recently used by the user.
6. The method of claim 5,
wherein when the electronic device displays the operation menu, the first button obtains focus; and the electronic device receiving the operation of the user on the first button comprises: the electronic device receives a second operation of the user in the arc area on the side edge of the curved screen, wherein the second operation and the first operation are continuous operations during which the user's finger does not leave the curved screen; or,
before the electronic device receives the operation of the user on the second button, the method further comprises: the electronic device receives a third operation of the user in the arc area on the side edge of the curved screen, and in response to the third operation, the electronic device switches the focus of the operation menu to the second button; and the electronic device receiving the operation of the user on the second button comprises: the electronic device receives a second operation of the user in the arc area on the side edge of the curved screen, wherein the third operation and the first operation are continuous operations during which the user's finger does not leave the curved screen, and the second operation and the third operation are continuous operations during which the user's finger does not leave the curved screen; or,
before the electronic device receives the operation of the user on the third button, the method further comprises: the electronic device receives a third operation of the user in the arc area on the side edge of the curved screen, and in response to the third operation, the electronic device switches the focus of the operation menu to the third button; and the electronic device receiving the operation of the user on the third button comprises: the electronic device receives a second operation of the user in the arc area on the side edge of the curved screen, wherein the third operation and the first operation are continuous operations during which the user's finger does not leave the curved screen, and the second operation and the third operation are continuous operations during which the user's finger does not leave the curved screen.
7. A navigation operation method, applied to an electronic device, wherein a touch screen of the electronic device is a curved screen with arc-shaped side edges, a first interface is displayed on the curved screen in a vertical screen state, and the method comprises:
the electronic device receives a first operation of a user at a first position in an arc area on a side edge of the curved screen, and in response to the first operation at the first position, the electronic device displays a second interface, wherein the second interface is a previous interface of the first interface; or,
the electronic device receives a second operation of a user at a second position in the arc area on the side edge of the curved screen, and in response to the second operation at the second position, the electronic device displays a home screen; or,
the electronic device receives a third operation of a user at a third position in the arc area on the side edge of the curved screen, and in response to the third operation at the third position, the electronic device displays a multitask interface, wherein the multitask interface comprises an application running in the background of the electronic device or an identifier of an application recently used by the user.
8. The method of claim 7, further comprising:
the electronic device starts a navigation mode when receiving a fourth operation of a user at a fourth position in the arc area on the side edge of the curved screen, or when detecting a preset holding gesture, or when detecting that a physical button or a virtual button arranged on the side edge of the curved screen is triggered.
9. The method of claim 8,
the fourth position is the same as the first position, the second position, or the third position, and the fourth operation is different from the first operation, the second operation, and the third operation.
10. The method according to any one of claims 7 to 9,
the first operation, the second operation, and the third operation are all different; or the first operation, the second operation, and the third operation are all the same; or the first operation, the second operation, and the third operation are not all the same.
11. A navigation operation method, applied to an electronic device, wherein a touch screen of the electronic device is a curved screen with arc-shaped side edges, a first interface is displayed on the curved screen in a vertical screen state, and the method comprises:
the electronic device receives operations of different degrees performed by a user in an arc area on a side edge of the curved screen; and
in response to the operations of different degrees, the electronic device displays different interfaces;
wherein the different interfaces comprise a second interface, a home screen, and a multitask interface, and the second interface is a previous interface of the first interface.
12. The method of claim 11,
the operations of different degrees are pressing operations with different pressures, sliding operations with different sliding distances, or pressing operations with different pressing durations.
13. An electronic device, comprising: a processor, a memory, and a touch screen, wherein the touch screen is a curved screen with arc-shaped side edges, and a first interface is displayed on the curved screen in a vertical screen state; the processor is coupled to the touch screen and the memory; and the memory is configured to store computer program code, the computer program code comprising computer instructions that, when executed by the electronic device, cause the electronic device to perform the navigation operation method according to any one of claims 1 to 12.
14. A computer-readable storage medium, comprising: computer software instructions;
the computer software instructions, when run on an electronic device, cause the electronic device to perform the navigation operation method according to any one of claims 1 to 12.
15. A computer program product, which, when run on a computer, causes the computer to perform the navigation operation method according to any one of claims 1 to 12.
CN201910765440.9A 2019-04-30 2019-08-19 Navigation operation method and electronic equipment Pending CN110691165A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/085868 WO2020221062A1 (en) 2019-04-30 2020-04-21 Navigation operation method and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2019103583480 2019-04-30
CN201910358348 2019-04-30

Publications (1)

Publication Number Publication Date
CN110691165A true CN110691165A (en) 2020-01-14

Family

ID=69108326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910765440.9A Pending CN110691165A (en) 2019-04-30 2019-08-19 Navigation operation method and electronic equipment

Country Status (2)

Country Link
CN (1) CN110691165A (en)
WO (1) WO2020221062A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101515623B1 (en) * 2012-05-14 2015-04-28 삼성전자주식회사 Method and apparatus for operating functions of portable terminal having bended display
CN106325749A (en) * 2016-08-25 2017-01-11 维沃移动通信有限公司 Operation method of mobile terminal, and mobile terminal
CN106445116B (en) * 2016-08-31 2020-10-27 维沃移动通信有限公司 Method for calling out message notification bar and mobile terminal
CN106874046B (en) * 2017-01-22 2020-02-07 维沃移动通信有限公司 Application program operation method and mobile terminal
CN106980408A (en) * 2017-03-27 2017-07-25 珠海市魅族科技有限公司 touch control method and device
CN110691165A (en) * 2019-04-30 2020-01-14 华为技术有限公司 Navigation operation method and electronic equipment

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101155257B1 (en) * 2010-07-28 2012-06-13 엘지전자 주식회사 Method for providing user interface using flexible display and mobile terminal using this method
CN102232211A (en) * 2011-06-23 2011-11-02 华为终端有限公司 Handheld terminal device user interface automatic switching method and handheld terminal device
CN104156073A (en) * 2014-08-29 2014-11-19 深圳市中兴移动通信有限公司 Mobile terminal and operation method thereof
CN105824564A (en) * 2016-03-25 2016-08-03 乐视控股(北京)有限公司 Method for using function of return key and terminal
CN105867813A (en) * 2016-03-25 2016-08-17 乐视控股(北京)有限公司 Method for switching page and terminal
CN105867810A (en) * 2016-03-25 2016-08-17 乐视控股(北京)有限公司 Menu switching method and terminal
CN108874288A (en) * 2018-06-05 2018-11-23 Oppo广东移动通信有限公司 Application programe switch-over method, device, terminal and storage medium
CN108845752A (en) * 2018-06-27 2018-11-20 Oppo广东移动通信有限公司 touch operation method, device, storage medium and electronic equipment
CN109391736A (en) * 2018-09-28 2019-02-26 南昌努比亚技术有限公司 Method for controlling mobile terminal, mobile terminal and computer readable storage medium
CN109683741A (en) * 2018-12-19 2019-04-26 努比亚技术有限公司 Function triggering method, device and computer readable storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020221062A1 (en) * 2019-04-30 2020-11-05 华为技术有限公司 Navigation operation method and electronic device
CN111459379A (en) * 2020-03-30 2020-07-28 Oppo广东移动通信有限公司 Navigation control method and device, electronic equipment and computer readable storage medium
WO2022089180A1 (en) * 2020-10-31 2022-05-05 华为技术有限公司 Interaction method and terminal device
CN114527894A (en) * 2020-10-31 2022-05-24 华为终端有限公司 Interaction method and terminal equipment
CN114527894B (en) * 2020-10-31 2024-05-14 华为终端有限公司 Interaction method and terminal equipment

Also Published As

Publication number Publication date
WO2020221062A1 (en) 2020-11-05

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20200114