WO2018082269A1 - Menu display method and terminal - Google Patents

Menu display method and terminal

Info

Publication number
WO2018082269A1
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
screen
target
menu item
menu
Prior art date
Application number
PCT/CN2017/081481
Other languages
French (fr)
Chinese (zh)
Inventor
祁云飞
吴思举
Original Assignee
Huawei Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to CN201780006166.6A priority Critical patent/CN108885525A/en
Publication of WO2018082269A1 publication Critical patent/WO2018082269A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus

Definitions

  • the embodiments of the present invention relate to the field of communications technologies, and in particular, to a menu display method and a terminal.
  • Displaying different levels of menus according to user instructions is an important human-computer interaction mode of a terminal device.
  • The existing menu display mode usually displays different levels of menus layer by layer.
  • The interface has at least two levels. For example, when the content displayed on the screen needs to be switched from a first-level menu to a second-level menu, the first-level menu is covered by the page of the second-level menu.
  • A single swipe generally instructs the terminal to display a first-level menu or perform one interactive action, for example, selecting a function in a menu. To enter the next-level menu or select another action, the user must lift the finger and complete it with another interactive gesture.
  • the embodiment of the present application provides a menu display method and a terminal, which can realize display of a menu, selection of menu items, and the like without lifting a finger.
  • In a first aspect, an embodiment of the present application provides a menu display method, including: displaying, by a terminal, a user interface; and when the terminal detects a first gesture, displaying a main menu on the user interface, where the first gesture is a gesture in which a user contact swipes, within a target area, from the screen edge of the terminal toward the inside of the screen, the target area is a part of the screen, and the main menu includes at least one main menu item.
  • After the first gesture is detected, when the terminal detects a second gesture, at least one sub-menu item corresponding to a target main menu item is displayed on the user interface; the second gesture is an operation in which, after the first gesture, the user contact swipes to the area where the target main menu item is located.
  • The target main menu item is one of the at least one main menu item.
  • The first gesture and the second gesture are continuous operations, and the user contact does not leave the screen during the transition from the first gesture to the second gesture.
  • By detecting the user's continuous swipe, the embodiment of the present application can display a multi-level menu and complete multiple interactions in a single operation, without the finger being lifted; this is convenient, quick, and more efficient.
  • In an optional implementation, the method further includes: after the second gesture is detected, when the terminal detects a third gesture, executing a target function; the third gesture is an operation in which, after the second gesture, the user contact swipes to a target sub-menu item and leaves the screen in the area where the target sub-menu item is located; the target sub-menu item is one of the at least one sub-menu item, and the target function is the function corresponding to the target sub-menu item; the second gesture and the third gesture are continuous operations, and the user contact does not leave the screen during the transition from the second gesture to the third gesture.
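As an illustration only (not part of the patent text), the first/second/third gesture sequence described above can be modeled as a small state machine driven by contact-move and contact-lift events. The names, callbacks, and threshold below are hypothetical; this is a minimal Kotlin sketch assuming the platform delivers raw contact coordinates:

```kotlin
// Minimal sketch of the continuous-gesture flow: an edge swipe shows the main menu,
// sliding onto a main menu item shows its sub-menu, and lifting on a sub-menu item
// runs its function. All names and the threshold are illustrative assumptions.

enum class Phase { IDLE, MAIN_MENU_SHOWN, SUB_MENU_SHOWN }

data class Point(val x: Float, val y: Float)

class MenuGestureMachine(
    private val inTargetArea: (Point) -> Boolean,    // is the contact inside the target area?
    private val distanceFromEdge: (Point) -> Float,  // swipe distance from the starting screen edge
    private val mainItemAt: (Point) -> String?,      // main menu item under the contact, if any
    private val subItemAt: (Point) -> String?,       // sub-menu item under the contact, if any
    private val showThreshold: Float = 48f           // assumed threshold for showing the main menu
) {
    var phase: Phase = Phase.IDLE
        private set

    // Called while the finger stays on the screen (first and second gestures).
    fun onMove(p: Point) {
        when (phase) {
            Phase.IDLE ->
                // First gesture: contact swipes from the screen edge into the screen inside the target area.
                if (inTargetArea(p) && distanceFromEdge(p) >= showThreshold) {
                    println("show main menu")
                    phase = Phase.MAIN_MENU_SHOWN
                }
            Phase.MAIN_MENU_SHOWN ->
                // Second gesture: without lifting, the contact reaches a main menu item.
                mainItemAt(p)?.let { item ->
                    println("show sub-menu of $item")
                    phase = Phase.SUB_MENU_SHOWN
                }
            Phase.SUB_MENU_SHOWN -> Unit // switching between items is omitted in this sketch
        }
    }

    // Third gesture: the contact lifts while on a target sub-menu item, so its function runs.
    fun onLift(p: Point) {
        if (phase == Phase.SUB_MENU_SHOWN) subItemAt(p)?.let { println("execute function of $it") }
        phase = Phase.IDLE
    }
}
```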
  • the display of the multi-level menu and the selection of the menu items can be realized without lifting the finger, and the efficiency is higher.
  • In another optional implementation, the method further includes: after the second gesture is detected, if a fourth gesture is detected and the distance between the user contact and the edge of the screen is less than or equal to a threshold, closing the main menu displayed on the user interface.
  • The fourth gesture is a gesture in which the user contact swipes from the inside of the screen toward the edge of the screen; the second gesture and the fourth gesture are continuous operations, and the user contact does not leave the screen during the transition from the second gesture to the fourth gesture.
  • In yet another optional implementation, the target area includes any one or more of the following locations on the screen: the top of the screen, the bottom of the screen, the left side of the screen, and the right side of the screen.
  • the main menu is displayed within the target area.
  • By displaying the main menu in the target area, the effect of the user's operation is made clearer, and the user experience is better.
  • The main menu includes the following main menu items: an icon of a folder that includes at least one application icon, or an option icon representing a set of at least one function.
  • In a second aspect, an embodiment of the present application provides a menu display method, including: displaying, by a terminal, a user interface on a screen; when the terminal detects a first gesture, displaying a menu on the user interface, where the first gesture is a gesture in which a user contact swipes, within a target area, from the screen edge of the terminal toward the inside of the screen, the target area is a part of the screen, and the menu includes at least one menu item; and after the first gesture is detected, when the terminal detects a second gesture, executing a target function. The second gesture is an operation in which, after the first gesture, the user contact swipes to a target menu item and leaves the screen in the area where the target menu item is located; the target menu item is one of the at least one menu item, and the target function is the function corresponding to the target menu item. The first gesture and the second gesture are continuous operations, and the user contact does not leave the screen during the transition from the first gesture to the second gesture.
  • By detecting the user's continuous swipe, the embodiment of the present application can display the menu and select a menu item without the finger being lifted; a single operation achieves multiple interactions, which is convenient, quick, and more efficient.
  • In a third aspect, an embodiment of the present application provides a terminal, where the terminal has functions for implementing the terminal behavior in the methods of the foregoing first aspect and/or second aspect.
  • The functions may be implemented by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the functions described above.
  • In an optional implementation, an embodiment of the present application provides a terminal, including: a touch screen for displaying a user interface; and a processor and a memory, where the memory is used to store data and a program.
  • When the terminal runs, the processor executes the computer-executable instructions stored in the memory, so that the terminal performs the menu display method in the first aspect, the optional implementations of the first aspect, and/or the second aspect.
  • In a fourth aspect, an embodiment of the present application provides a computer readable storage medium, configured to store computer software instructions used by the foregoing terminal, including a program designed to perform the foregoing first aspect, its optional implementations, and/or the second aspect.
  • In a fifth aspect, an embodiment of the present application provides a computer program product, configured to store computer software instructions used by the foregoing terminal, including a program designed to perform the foregoing first aspect, its optional implementations, and/or the second aspect.
  • FIG. 1 is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of another mobile phone according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an interface provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of another interface provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of still another interface provided by an embodiment of the present application.
  • FIG. 18 is a schematic flowchart diagram of a menu display method according to an embodiment of the present application.
  • FIG. 19 is a schematic flowchart diagram of another menu display method according to an embodiment of the present application.
  • the embodiment of the present application provides a menu display method and a terminal.
  • the method can realize the display of the menu and the selection of the menu items by detecting the continuous stroke of the user without lifting the finger, and the interaction is more convenient and faster, and the efficiency is higher.
  • the terminal involved in the embodiment of the present application may include a mobile phone, a tablet computer, a personal digital assistant (PDA), a point of sales (POS), an in-vehicle computer, and other terminals capable of displaying a menu.
  • the terminal can include at least a memory, a touch screen, and a processor.
  • the memory can be used to store a software program.
  • The processor executes various functions of the terminal by running the software program stored in the memory, and the touch screen can be used to display information input by the user or information provided to the user, as well as the various menus of the terminal; it can also accept user input.
  • FIG. 1 is a schematic structural diagram of a mobile phone related to an embodiment of the present application.
  • A mobile phone 100 includes an RF (Radio Frequency) circuit 110, a memory 120, other input devices 130, a touch screen 140, a sensor 150, an audio circuit 160, an I/O subsystem 170, a processor 180, a power supply 190, and other components.
  • The structure shown in FIG. 1 does not constitute a limitation on the mobile phone, which may include more or fewer components than illustrated, combine some components, split some components, or arrange the components differently.
  • the components of the mobile phone 100 will be specifically described below with reference to FIG. 1 :
  • The RF circuit 110 can be used to receive and transmit signals during the sending and receiving of information or during a call. Specifically, downlink information received from a base station is delivered to the processor 180 for processing, and uplink data is sent to the base station.
  • RF circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like.
  • RF circuitry 110 can also communicate with the network and other devices via wireless communication.
  • The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
  • the memory 120 can be used to store software programs, and the processor 180 executes various functions of the mobile phone 100 by running software programs stored in the memory 120.
  • The memory 120 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system, an application required for at least one function (such as a sound playing function or an image playing function), and the like, and the storage data area may store data created according to the use of the mobile phone 100 (such as audio data and a phone book).
  • The memory 120 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • Other input devices 130 can be used to receive input numeric or character information, as well as generate key signal inputs related to user settings and function controls of the handset 100.
  • Other input devices 130 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons and switch buttons), a trackball, a mouse, a joystick, and a light mouse (a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by the touch screen).
  • Other input devices 130 are coupled to other input device controllers 171 of I/O subsystem 170 for signal interaction with processor 180 under the control of other device input controllers 171.
  • the touch screen 140 can be used to display information input by the user or information provided to the user as well as various menus of the mobile phone 100, and can also accept user input.
  • the specific touch screen 140 may include a display panel 141 and a touch panel 142.
  • the display panel 141 can be configured by using an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
  • The touch panel 142, also referred to as a touch screen or a touch-sensitive screen, can collect contact or non-contact operations by the user on or near it (for example, operations performed by the user on or near the touch panel 142 with any suitable object or accessory such as a finger or a stylus; the operations may also include somatosensory operations), including single-point control operations, multi-point control operations, and the like, and drives the corresponding connection device according to a preset program.
  • the touch panel 142 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the user's touch orientation and gesture, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into information that the processor can process, sends it to the processor 180, and can receive and execute commands from the processor 180.
  • the touch panel 142 can be implemented by using various types such as resistive, capacitive, infrared, and surface acoustic waves, and the touch panel 142 can be implemented by any technology developed in the future.
  • the touch panel 142 can cover the display panel 141, and the user can display the content according to the display panel 141 (the display content includes, but is not limited to, a soft keyboard, a virtual mouse, a virtual button, an icon, etc.) on the display panel 141. Operation is performed on or near the covered touch panel 142.
  • After detecting an operation on or near it, the touch panel 142 transmits the operation to the processor 180 through the I/O subsystem 170 to determine the user input, and the processor 180 then provides a corresponding visual output on the display panel 141 through the I/O subsystem 170 according to the user input.
  • Although the touch panel 142 and the display panel 141 are shown as two separate components in FIG. 1 to implement the input and output functions of the mobile phone 100, in some embodiments the touch panel 142 may be integrated with the display panel 141 to implement the input and output functions of the mobile phone 100.
  • the handset 100 can also include at least one type of sensor 150, such as a light sensor, motion sensor, and other sensors.
  • The light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor may adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the mobile phone 100 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes). When it is stationary, it can detect the magnitude and direction of gravity.
  • As a kind of motion sensor, it can be used for applications that recognize the posture of the mobile phone (such as switching between landscape and portrait, related games, and magnetometer attitude calibration) and for vibration-recognition related functions (such as a pedometer and tapping).
  • The mobile phone 100 can also be configured with a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and other sensors, which are not described here.
  • the audio circuit 160, the speaker 161, and the microphone 162 can provide an audio interface between the user and the handset 100.
  • The audio circuit 160 can transmit the electrical signal converted from received audio data to the speaker 161, which converts it into a sound signal for output; on the other hand, the microphone 162 converts a collected sound signal into an electrical signal, which is received by the audio circuit 160 and converted into audio data.
  • The audio data is then output to the RF circuit 110 to be sent to, for example, another mobile phone, or output to the memory 120 for further processing.
  • the I/O subsystem 170 is used to control external devices for input and output, and may include other device input controllers 171, sensor controllers 172, and display controllers 173.
  • one or more other input control device controllers 171 receive signals from other input devices 130 and/or send signals to other input devices 130.
  • Other input devices 130 may include physical buttons (press buttons, rocker buttons, etc.) , dial, swipe switch, joystick, click wheel, light mouse (light mouse is a touch-sensitive surface that does not display visual output, or an extension of a touch-sensitive surface formed by a touch screen). It is worth noting that other input control device controllers 171 can be connected to any one or more of the above devices.
  • Display controller 173 in I/O subsystem 170 receives signals from touch screen 140 and/or transmits signals to touch screen 140. After the touch screen 140 detects the user input, the display controller 173 converts the detected user input into an interaction with the user interface object displayed on the touch screen 140, that is, implements human-computer interaction.
  • Sensor controller 172 can receive signals from one or more sensors 150 and/or send signals to one or more sensors 150.
  • the processor 180 is the control center of the handset 100, connecting various portions of the entire handset with various interfaces and lines, by running or executing software programs and/or modules stored in the memory 120, and recalling data stored in the memory 120, The various functions and processing data of the mobile phone 100 are executed to perform overall monitoring of the mobile phone.
  • the processor 180 may include one or more processing units; preferably, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly processes an operating system, a user interface, an application, and the like.
  • the modem processor primarily handles wireless communications. It can be understood that the above modem processor may not be integrated into the processor 180.
  • the handset 100 also includes a power source 190 (such as a battery) that supplies power to the various components.
  • the power source can be logically coupled to the processor 180 via a power management system to manage functions such as charging, discharging, and power consumption through the power management system.
  • the mobile phone 100 may further include a camera, a Bluetooth module, and the like, and details are not described herein.
  • the modules stored in the memory 120 may include an operating system, a contact/motion module, a graphics module, an application, and the like.
  • The contact/motion module is used to detect the contact of an object or finger with the touch screen 140 or the click wheel, capture the speed (direction and magnitude) and acceleration (change in magnitude or direction) of the contact, and determine the type of contact event.
  • For example, there are a variety of contact event detection modules: finger-down, finger-dragging, finger-up, and finger-tap; gestures are sometimes combined with elements in the UI to implement operations such as pinching/depinching (finger squeeze/spread).
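Purely as an illustration (not the handset's actual module), a contact/motion module of this kind can be sketched as a classifier over raw contact events; the event types, slop distance, and tap time below are assumptions:

```kotlin
// Illustrative classification of raw contact events into finger-down / dragging /
// finger-up / tap, roughly in the spirit of the contact/motion module described above.
// The slop and time thresholds are arbitrary example values.

sealed class RawEvent {
    data class Down(val x: Float, val y: Float, val t: Long) : RawEvent()
    data class Move(val x: Float, val y: Float, val t: Long) : RawEvent()
    data class Up(val x: Float, val y: Float, val t: Long) : RawEvent()
}

class ContactMotionModule(private val tapSlop: Float = 10f, private val tapTimeMs: Long = 200) {
    private var downX = 0f
    private var downY = 0f
    private var downT = 0L
    private var dragging = false

    // Returns a label for the detected contact event.
    fun classify(e: RawEvent): String = when (e) {
        is RawEvent.Down -> {
            downX = e.x; downY = e.y; downT = e.t; dragging = false
            "finger-down"
        }
        is RawEvent.Move -> {
            val dx = e.x - downX
            val dy = e.y - downY
            if (dx * dx + dy * dy > tapSlop * tapSlop) dragging = true
            if (dragging) "dragging" else "finger-down (within slop)"
        }
        is RawEvent.Up ->
            if (!dragging && e.t - downT <= tapTimeMs) "tap" else "finger-up"
    }
}
```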
  • The graphics module is used to render and display graphics on the touch screen or another display, including graphics, icons, text, digital images, videos, and animations.
  • Applications can include contacts, phone, video conferencing, an e-mail client, instant messaging, personal sports, a camera, image management, a video player, a music player, a calendar, plug-ins (weather, stocks, calculator, clock, dictionary), custom plug-ins, search, notes, maps, online video, and more.
  • FIG. 2 is a schematic structural diagram of another mobile phone provided by an embodiment of the present application.
  • the handset 200 includes a body 1 and a touch screen 2 (eg, a touch screen 140 as shown in FIG. 1).
  • the touch screen 2 can integrate the touch panel and the display panel to realize the input and output functions of the mobile phone 200.
  • The user can use the finger 3 or the stylus 4 to perform operations such as clicking and swiping on the touch screen, and the touch panel can detect these operations; the touch screen may also be referred to as the screen.
  • the body 1 includes a camera 11, a photosensitive element 12, an earpiece 13, a physical key 14, a power key 15, a volume key 16, and the like.
  • the camera 11 can include a front camera and a rear camera.
  • The photosensitive element 12 is mainly used to sense the distance between the human body and the mobile phone. For example, when the user is on a call and the mobile phone is close to the ear, after the photosensitive element 12 detects the distance information, the mobile phone 200 can disable the input function of the touch screen, thereby preventing accidental touches.
  • the physical key 14 is generally a Home key, and may also be a Home key integrated with a fingerprint recognition module.
  • the physical key 14 may further include a return key, a menu key, and an exit key.
  • the physical key 14 may also be a touch key of a set position on the touch screen.
  • For the physical key 14, the power key 15, and the volume key 16, reference may be made to the description of the other input devices 130 in the embodiment shown in FIG. 1. It should be noted that in the embodiment of the present application the mobile phone may further include a microphone 17, a data interface 18, a Subscriber Identification Module (SIM) card interface (not shown), a headphone interface 19, and the like.
  • The mobile phone 200 shown in FIG. 2 is merely an example and is not limiting; it may include more or fewer components than illustrated, combine some components, split some components, or arrange the components differently.
  • the mobile phone 200 shown in FIG. 2 is taken as an example to describe the interaction process between the mobile phone 200 and the user in the specific implementation process of the embodiment of the present application.
  • FIG. 3 is a schematic diagram of a user interface provided by an embodiment of the present application. As shown in FIG. 3-6, after the mobile phone 200 is turned on or the screen is unlocked, the mobile phone 200 can display a user interface, and the user interface can display information such as an application icon and a status bar.
  • The status bar can be used to display one or more pieces of information such as the operator name 201 (for example, China Mobile or China Unicom), a Wireless Fidelity (Wi-Fi) status control identifier (for example, the Wi-Fi signal strength can be displayed), a mobile communication signal status control identifier (for example, the signal strength of the mobile communication can be displayed; for a mobile phone with multiple cards and multiple standby, the signal strengths of multiple mobile communications can be displayed), a battery status control identifier, and the local time.
  • When the mobile phone 200 detects the user's swipe operation, a main menu can be displayed, and the main menu can be displayed within the target area 230.
  • the user gesture 311 is a gesture in which the user contact is swiped from the screen edge of the mobile phone 200 to the inside of the screen in the target area 230.
  • the inside of the screen here refers to the inside of the surface of the touch screen.
  • The swipe from the edge of the screen to the inside of the screen may refer to a swipe from the edge of the screen where the target area is located toward the inside of the screen; the swipe may be in the direction from that edge toward another edge parallel to it.
  • For example, a swipe from the edge where qp is located toward the other edge parallel to that edge may be referred to as swiping from the edge of the screen toward the inside of the screen.
  • Correspondingly, movement of the user's contact from the inside of the screen toward the edge where the target area is located may be referred to as swiping from the inside of the screen to the edge of the screen.
  • the user contact 320 is tracked.
  • the process in which the main menu is dragged can be displayed in a dynamic form according to the movement of the user contact 320.
  • the handset 200 determines the distance of the user contact 320 from the edge of the screen during the swipe of the user gesture 311, and displays the main menu on the screen of the handset 200 when the distance reaches or exceeds the threshold.
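As a sketch only (the threshold value, the choice of edge, and the function names are assumptions, not the handset's implementation), the "display when the distance reaches or exceeds the threshold" check could look like this:

```kotlin
// Sketch: show the main menu once the contact, starting at the right screen edge,
// has moved far enough into the screen. The threshold is an assumed example value;
// the 480-pixel width comes from the example resolution given in the description.

const val SCREEN_WIDTH = 480f
const val SHOW_MENU_THRESHOLD = 60f

fun shouldShowMainMenu(contactX: Float): Boolean {
    val distanceFromRightEdge = SCREEN_WIDTH - contactX
    return distanceFromRightEdge >= SHOW_MENU_THRESHOLD
}

fun main() {
    println(shouldShowMainMenu(450f)) // false: only 30 px from the edge
    println(shouldShowMainMenu(400f)) // true: 80 px from the edge
}
```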
  • the main menu can be displayed floating on the original user interface.
  • the target area 230 is a part of the screen of the mobile phone 200, and the area is logically present and may not be displayed on the screen of the mobile phone 200.
  • The target area 230 may be the area on the right side of the screen of the mobile phone as shown in FIG. 3, the area on the upper side of the screen as shown in FIG. 4, or the areas on the lower side and the left side of the screen as shown in FIG. 5 and FIG. 6.
  • One handset may be provided with one or more target areas 230, and their settings may be determined by the user's selection. For example, a user accustomed to holding and operating the phone with the left hand can select the target area 230 shown in FIG. 6, and a user accustomed to operating with the other hand can select the target area 230 shown in FIG. 4 or FIG.
  • the mobile phone 200 can also set the size of the target area 230.
  • For example, the screen resolution of the mobile phone 200 is 480*800 pixels, so the maximum value of x is 480 and the maximum value of y is 800; the target area 230 may be the quadrilateral area enclosed by the points m (280, 250), n (280, 600), q (480, 250), and p (480, 600).
  • the mobile phone 200 can set the size of the area, for example, the length of the target area 230 in the y-axis direction can be adjusted, and the target area 230 can be set to move up or down in the y-axis direction as a whole.
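Using the example coordinates above (a 480*800 screen and the points m, n, q, p), a hit test for whether a contact falls inside the target area 230 can be sketched as follows; this illustrates only the geometry and is not the patent's code:

```kotlin
// Hit test for the example target area 230: the rectangle bounded by
// m(280, 250), n(280, 600), q(480, 250), and p(480, 600) on a 480*800 screen.

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

val targetArea230 = Rect(left = 280f, top = 250f, right = 480f, bottom = 600f)

fun main() {
    println(targetArea230.contains(470f, 300f)) // true: near the right edge, inside the area
    println(targetArea230.contains(100f, 300f)) // false: left of the area
}
```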
  • The area where a menu item is located may refer to the grid cell in which the menu item is displayed, where a grid cell is one of the small cells into which the menu display area of the user interface is divided, equally or by another rule, and each cell is used to display one menu item.
  • the user interface may display only the icon or logo of the menu item without displaying the square grid.
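As an illustration of "the area where a menu item is located" being such a grid cell, the mapping from a contact coordinate to a cell index can be sketched as below; the cell size, origin, and layout are assumptions:

```kotlin
// Sketch: the menu display area is divided into equal square cells, one per menu item.
// Given a contact position, return the index of the cell it falls in, or null if outside.

data class Grid(val originX: Float, val originY: Float, val cellSize: Float, val columns: Int, val rows: Int) {
    fun cellIndexAt(x: Float, y: Float): Int? {
        val col = ((x - originX) / cellSize).toInt()
        val row = ((y - originY) / cellSize).toInt()
        val inside = x >= originX && y >= originY && col < columns && row < rows
        return if (inside) row * columns + col else null
    }
}

fun main() {
    val menuGrid = Grid(originX = 280f, originY = 250f, cellSize = 100f, columns = 2, rows = 3)
    println(menuGrid.cellIndexAt(300f, 260f)) // 0: first cell
    println(menuGrid.cellIndexAt(390f, 460f)) // 5: column 1, row 2
}
```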
  • the main menu item may include a plurality of forms such as a single application icon, a folder, and a menu item including a sub menu.
  • Different types of main menu items correspond to different actions, which are further described below in conjunction with FIG. 9 and the following figures.
  • If the main menu item A corresponds to a sub-menu, then when it is detected that the contact of the gesture 312 shown in FIG. 8 swipes to the area where the main menu item A is located, the sub-menu of the main menu item A is displayed on the screen, as shown in FIG. 9.
  • Specifically, the main menu item A and the sub-menu corresponding to the main menu item A may be displayed correspondingly on the user interface.
  • the submenu of the main menu item may include: an icon of a folder including at least one application icon, an option icon representing a set of at least one function, and an application icon, etc., which are exemplified below.
  • the main menu includes setup options, mobile data switch options, folder options, and app store icon options.
  • the sub-menu corresponding to the setting option is displayed, and the sub-menu includes a Wireless Local Area Networks (WLAN) switch option, an automatic rotation option, and Bluetooth.
  • When the mobile phone 200 detects that the contact of the gesture 312 swipes to the area where a main menu item is located, if the main menu item corresponds to a function, the function is executed after the contact disappears from the screen of the mobile phone, that is, after the user contact leaves the screen.
  • the function can be a switch that launches an application, controls some devices or functions.
  • For example, if the contact of the gesture 312 swipes to the mobile data switch option of the main menu and then leaves the screen there (not shown), the corresponding function is executed, that is, the mobile data service is turned on or off.
  • When the mobile phone 200 detects the gesture 313 (shown in FIG. 9), that is, an operation in which the user's contact swipes to the area where the sub-menu item A2 is located, it performs a corresponding action according to the specific form of the sub-menu item A2.
  • For example, if the sub-menu item A2 corresponds to a sub-menu, then when the contact of the gesture 313 swipes to the area where the sub-menu item A2 is located, the sub-menu corresponding to the sub-menu item A2 is displayed on the screen, and so on.
  • the display of the submenu is not limited to the target area.
  • When this is detected, the mobile phone determines that the sub-menu item A2 is selected.
  • the user can move from the submenu item to the main menu item to select the main menu item.
  • the mobile phone detects that the contact of the gesture 314 is swiped to the area where the main menu item C is located, the mobile phone performs a corresponding action according to the specific form of the main menu item C.
  • If the main menu item C corresponds to a function, the function is executed after it is detected that the contact of the gesture 314 disappears from the screen of the mobile phone. If the main menu item C corresponds to a sub-menu, then when it is detected that the contact of the gesture 314 swipes to the area where the main menu item C is located, the sub-menu corresponding to the main menu item A is closed and the sub-menu corresponding to the main menu item C is displayed.
  • If the mobile phone detects that the distance between the contact of the gesture 315 and the edge of the screen is less than or equal to a preset distance, the menu displayed on the user interface may be closed. For example, as shown in FIG. 15, the mobile phone detects the distance between the contact of the gesture 315 and the edge of the mobile phone screen (for example, the side pq shown in FIG. 3), and when the distance is less than or equal to the preset distance, it closes the menu displayed on the user interface, which can include the main menu as well as the sub-menus.
  • When the menu is closed, it can also be closed level by level. For example, as shown in FIG. 16, if after the gesture's contact reaches the sub-menu item B2 the user wants to close the sub-menu corresponding to the main menu item B and then select the main menu item A, the user can swipe toward the area where the main menu item A is located. As shown in FIG. 17, when the mobile phone detects that the user contact leaves the area where the sub-menu corresponding to the main menu item B is located and does not point to the area where the main menu item B is located, it closes the displayed sub-menu corresponding to the main menu item B.
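The level-by-level closing behaviour described here can be illustrated with a small stack of open menu levels; the class, method names, and strings below are assumptions made only for this sketch:

```kotlin
// Sketch of level-by-level closing: each open sub-menu is pushed onto a stack, and
// leaving the area of the topmost sub-menu without pointing back at its parent item
// pops it, so only that level is closed while the rest of the menu stays open.

class MenuStack {
    private val open = ArrayDeque<String>() // e.g. ["sub-menu of B"]

    fun openSubMenu(parentItem: String) {
        open.addLast("sub-menu of $parentItem")
    }

    // Called when the contact leaves the topmost sub-menu's area.
    fun closeTopIfLeft(contactOverParent: Boolean) {
        if (!contactOverParent && open.isNotEmpty()) println("close ${open.removeLast()}")
    }
}

fun main() {
    val stack = MenuStack()
    stack.openSubMenu("B")      // swiping onto main menu item B shows its sub-menu
    stack.closeTopIfLeft(false) // contact heads toward item A: close the sub-menu of B only
}
```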
  • When the user contact moves with a motion trajectory pointing toward the center of the screen of the mobile phone (gesture 313) and the user contact is in the area where the main menu item B is located, the sub-menu corresponding to the main menu item B is displayed.
  • When the contact is swiped toward the main menu, the sub-menu corresponding to the main menu item B continues to be displayed.
  • When movement of the user contact toward the edge of the screen of the mobile phone is detected (gesture 315), the user contact leaves the main menu item.
  • the center of the screen of the mobile phone may refer to the central axis of the screen of the mobile phone.
  • the edge of the screen of the mobile phone refers to the edge of the screen parallel to the central axis.
  • FIG. 18 is a schematic flowchart diagram of a menu display method according to an embodiment of the present application. As shown in FIG. 18, the method specifically includes:
  • The user interface (for example, the user interface shown in FIGS. 3-6) can be displayed on a screen (for example, the touch screen 140 shown in FIG. 1).
  • the user can perform a swipe or click on the user interface.
  • For example, the user can perform a click operation on the user interface: clicking the icon of the camera application runs the camera application, which can implement functions such as taking photos and recording videos; or clicking the icon of a folder in the user interface displays the contents of the folder.
  • the main menu is displayed on the user interface.
  • the first gesture is a gesture in which the user contact is swiped from the screen edge of the terminal to the inside of the screen in the target area, the target area is a part of the screen, and the main menu includes at least one main menu item.
  • The terminal can detect the user's swipe gesture in the target area, and a swipe in the target area displays the menu (for example, the main menu and the sub-menus shown in FIGS. 8-17).
  • the menu can be categorized by level, each category can correspond to a main menu, and each subclass can correspond to a sub-menu.
  • A menu item of the main menu may take the form of an icon of a folder including at least one application icon, an option icon representing a set of at least one function (for example, a setting option icon), or a single application icon.
  • For example, the main menu categories shown in FIG. 10 may include a setting option, a mobile data switch option, a folder, and the like, and the sub-category corresponding to the setting option includes a WLAN switch option, an auto-rotation option, a Bluetooth switch option, a ringtone switch option, and more.
  • A main menu item in the menu may correspond to a sub-menu (for example, the folder icon or the setting option icon shown in FIG. 10) or to a function executed when the menu item is selected (for example, the single application icon of the app store shown in FIG. 10, or the mobile data switch option).
  • the user interface may display other content such as icons before the menu is displayed, and the menu displayed by the method may be displayed above the other contents in the form of a floating window.
  • the menu can be classified as shown in Table 1 below.
  • The menu displayed by swiping in the target area may include only the settings class shown in Table 1; the main menu items of the displayed menu may then include Device, Customization, and the like, and the sub-menu items include the sub-menus corresponding to those main menu items.
  • the application store described in Table 1 can be classified as an application category.
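Since Table 1 itself is not reproduced here, the following is only a guess at how such a level-based classification could be represented in code; the category names ("Settings", "Device", "Customization", "Store", "Applications") are taken from the surrounding text, and everything else is an assumption:

```kotlin
// Illustrative data model for a menu classified by level: a main menu item either
// groups sub-menu items or directly triggers a function when the contact lifts on it.

sealed class MenuItem(val name: String)
class FunctionItem(name: String, val run: () -> Unit) : MenuItem(name)
class GroupItem(name: String, val children: List<MenuItem>) : MenuItem(name)

val mainMenu: List<MenuItem> = listOf(
    GroupItem("Settings", listOf(
        GroupItem("Device", listOf(FunctionItem("Volume") { println("open volume UI") })),
        GroupItem("Customization", emptyList())
    )),
    FunctionItem("Store") { println("launch application store") },
    GroupItem("Applications", emptyList())
)
```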
  • the configured menu can be considered to be hidden on the edge side of the terminal screen.
  • When the terminal detects a swipe in the target area from the edge of the screen toward the inside of the screen, it can be understood that the user instructs, by dragging, to display the menu hidden at the edge of the screen.
  • the main menu is displayed in the target area, and the main menu is suspended and displayed on the other displayed content of the user interface.
  • the main menu includes the main menu items, "Settings”, “Store”, and "Applications”.
  • the terminal displays the interface as shown in FIG. 3, after the terminal detects the first gesture (gesture 311), the interface shown in FIG. 8 is displayed.
  • the target area includes any one or more of the locations described below on the screen: the top of the screen, the bottom of the screen, the left side of the screen, and the right side of the screen.
  • The second gesture is an operation in which, after the first gesture, the user contact swipes to the area where the target main menu item is located; the target main menu item is one of the at least one main menu item. In addition, the first gesture and the second gesture are continuous operations, and the user contact does not leave the screen during the transition from the first gesture to the second gesture.
  • For example, when the terminal displays the interface shown in FIG. 3, after the terminal detects the first gesture (gesture 311), if it detects that the user contact slides to the area where the main menu item A is located, the terminal displays the interface shown in FIG. 9, where the main menu item A is the target main menu item of the second gesture.
  • For another example, after the terminal displays the main menu and detects the first gesture (gesture 311), if the terminal detects that the user contact slides to the area where the setting option is located, the interface shown in FIG. 10 is displayed; the setting option here is the target main menu item of the second gesture.
  • For example, after the terminal detects the first gesture, if it detects that the user contact slides to the area where the main menu item "Store" is located and the user contact disappears in that area, the function corresponding to the "Store" is activated.
  • the disappearance of the user contact means that the terminal does not detect the user contact, that is, the user contact leaves the screen of the terminal.
  • the function corresponding to the menu item of the menu that has been displayed may also be triggered by a gesture of continuous sliding.
  • the method further includes
  • The third gesture is an operation in which, after the second gesture, the user contact swipes to the target sub-menu item and leaves the screen in the area where the target sub-menu item is located; the target sub-menu item is one of the at least one sub-menu item, and the target function is the function corresponding to the target sub-menu item. In addition, the second gesture and the third gesture are continuous operations, and the user contact does not leave the screen during the transition from the second gesture to the third gesture.
  • the second gesture is for the user contact to swipe to the area where the main menu item "Settings" is located.
  • When it is detected that the user contact swipes to the area where the sub-menu item "Device" is located, the second-level sub-menu corresponding to the sub-menu item "Device" can be displayed. If it is then detected that the user contact swipes to the area where a second-level sub-menu item (for example, volume, display, or storage) is located, and the user contact disappears in that area, the function corresponding to that second-level sub-menu item is activated.
  • For example, the terminal displays the interface shown in the figure, and the second gesture is that the user contact swipes to the area where the main menu item "Device" is located. After the second gesture is detected, if it is detected that the user contact swipes to the area where the sub-menu item "Volume" is located and the user contact disappears in that area, the function corresponding to that sub-menu item is activated; the function corresponding to the sub-menu item "Volume" may be to display a volume adjustment interface, through which the device volume can be adjusted.
  • the menu already displayed on the screen can also be turned off by a continuous sliding gesture.
  • the method further includes
  • The fourth gesture is a gesture in which the user contact swipes from the inside of the screen toward the edge of the screen; the second gesture and the fourth gesture are continuous operations, and the user contact does not leave the screen during the transition from the second gesture to the fourth gesture.
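A sketch of the fourth-gesture check (swiping back toward the edge closes the menu once the contact is within a threshold of that edge); the direction tracking, the threshold value, and the function name are assumed for illustration:

```kotlin
// Sketch: close the displayed menu when, during a swipe back toward the screen edge,
// the contact's distance from that edge drops to the threshold or below.
// The 480-pixel width comes from the example resolution; the threshold is assumed.

const val SCREEN_WIDTH_PX = 480f
const val CLOSE_THRESHOLD = 20f

fun shouldCloseMenu(previousX: Float, currentX: Float): Boolean {
    val movingTowardRightEdge = currentX > previousX
    val distanceFromRightEdge = SCREEN_WIDTH_PX - currentX
    return movingTowardRightEdge && distanceFromRightEdge <= CLOSE_THRESHOLD
}

fun main() {
    println(shouldCloseMenu(previousX = 400f, currentX = 470f)) // true: near the edge, moving outward
    println(shouldCloseMenu(previousX = 470f, currentX = 400f)) // false: moving inward
}
```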
  • For example, when the terminal displays the user interface shown in FIG. 14, if the gesture 315 is detected, the menu displayed on the user interface is closed, as shown in FIG.
  • The swipe from the edge of the screen to the inside of the screen may refer to a swipe from the edge of the screen where the target area is located toward the inside of the screen; the swipe may be in the direction from that edge toward another edge parallel to it.
  • The "function" in the embodiment of the present application may refer to a function implemented by running a software program, such as starting an application, displaying a setting interface, or toggling a switch.
  • The terms "target main menu item", "target sub-menu item", and "target function" are only used to distinguish them from other main menu items, sub-menu items, and functions; they are used for ease of understanding and do not constitute a limitation.
  • When it is detected that the user's contact is on one of the other main menu items, sub-menu items, or options corresponding to a function, that other main menu item, sub-menu item, or function may also be referred to as the target main menu item, the target sub-menu item, or the target function.
  • FIG. 19 is a schematic flowchart diagram of another menu display method according to an embodiment of the present application. As shown in FIG. 19, the method specifically includes:
  • the terminal displays a user interface.
  • the terminal detects the first gesture
  • the menu is displayed on the user interface.
  • the first gesture is a gesture in which the user contact is swiped from the screen edge of the terminal to the inside of the screen in the target area, the target area is a part of the screen, and the menu includes at least one menu item.
  • The at least one menu item displayed in the user interface includes icons capable of running an application or a function, such as a single application icon or a switch option, for example, the application store icon and the mobile data option shown in FIG. 10.
  • For S610-S620, reference may be made to the related description of S510-S520 in the embodiment described in connection with FIG. 18.
  • The second gesture is an operation in which, after the first gesture, the user contact swipes to the target menu item and leaves the screen in the area where the target menu item is located; the target menu item is one of the at least one menu item, and the target function is the function corresponding to the target menu item. The first gesture and the second gesture are continuous operations, and the user contact does not leave the screen during the transition from the first gesture to the second gesture.
  • The target menu item is, for example, a single application icon or an icon of another function.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A menu display method and terminal. A terminal displays a user interface (S510); the terminal displays a main menu on the user interface when a first gesture is detected, the first gesture being a gesture in which a user contact swipes, within a target area, from the edge of the terminal screen toward the center of the screen, and the target area being a part of the screen (S520); after the first gesture is detected, the terminal displays at least one sub-menu corresponding to a target main menu on the user interface when a second gesture is detected, the second gesture being an operation in which the user contact swipes to the area of the target main menu, and the target main menu being one of at least one main menu; the user contact does not leave the screen during the process of converting from the first gesture to the second gesture (S530). Thus, by detecting continuous swiping of a user, multilevel menus can be displayed without lifting the finger; multiple interactions are achieved by only one operation; the operation is quick and convenient; and the efficiency is high.

Description

Menu display method and terminal

Technical Field

The embodiments of the present application relate to the field of communications technologies, and in particular, to a menu display method and a terminal.

Background

With the continuous development of terminal devices, the way in which users obtain information and interact with it conveniently has become a key experience factor in human-computer interaction. Displaying different levels of menus according to user instructions is an important human-computer interaction mode of a terminal device. Existing menu display modes usually display different levels of menus layer by layer, and the interface has at least two levels; for example, when the content displayed on the screen needs to be switched from a first-level menu to a second-level menu, the first-level menu is covered by the page of the second-level menu.

A single swipe generally instructs the terminal to display a first-level menu or perform one interactive action, for example, selecting a function in a menu. To enter the next-level menu or select another action, the user must lift the finger and complete it with another interactive gesture.
发明内容Summary of the invention
本申请实施例提供了一种菜单显示方法及终端,无需抬起手指,即可实现菜单的显示以及菜单项的选择等等。The embodiment of the present application provides a menu display method and a terminal, which can realize display of a menu, selection of menu items, and the like without lifting a finger.
第一方面,本申请实施例提供了一种菜单显示方法,包括:终端显示用户界面;当终端检测到第一手势时,在用户界面显示主菜单,该第一手势为用户触点在目标区域内由终端的屏幕边缘向屏幕内部划动的手势,该目标区域为屏幕的一部分,该主菜单包括至少一个主菜单项;在检测到第一手势后,当终端检测到第二手势时,在用户界面显示目标主菜单项对应的至少一个子菜单项;该第二手势为:在第一手势之后,用户触点划动到目标主菜单项所在区域的操作;目标主菜单项为至少一个主菜单项中的一个;第一手势和第二手势为连续操作,在从所述第一手势转变为所述第二手势的过程中,用户触点未离开屏幕。本申请实施例通过检测用户的连续划动,且无需抬起手指,即可实现多级菜单的显示一次操作多次交互,方便快捷,效率更高。In a first aspect, the embodiment of the present application provides a menu display method, including: displaying a user interface by a terminal; and displaying a main menu on the user interface when the terminal detects the first gesture, the first gesture is a user contact in a target area. a gesture of swiping from the edge of the screen of the terminal to the inside of the screen, the target area being a part of the screen, the main menu including at least one main menu item; after detecting the first gesture, when the terminal detects the second gesture, Displaying at least one sub-menu item corresponding to the target main menu item in the user interface; the second gesture is: an operation of the user contact swiping to the area where the target main menu item is located after the first gesture; the target main menu item is at least One of the main menu items; the first gesture and the second gesture are continuous operations, and the user contact does not leave the screen during the transition from the first gesture to the second gesture. The embodiment of the present application can realize the continuous stroke of the user, and the display of the multi-level menu can be performed multiple times with one operation, which is convenient and quick, and has higher efficiency.
在一个可选地实现中,该方法还包括:在检测到第二手势后,当终端检测到第三手势时,执行目标功能;其中,第三手势为:在第二手势之后,用户触点划动到目标子菜单项,且用户触点在目标子菜单项所在的区域离开所述屏幕;该目标子菜单项为至少一个子菜单项中的一个,该目标功能为该目标子菜单项对应的功能;第二手势和第三手势为连续操作,在从所述第二手势转变为所述第三手势的过程中,用户触点未离开屏幕。通过本申请实施例,无需抬起手指,即可实现多级菜单的显示以及菜单项的选择,效率更高。In an optional implementation, the method further includes: after detecting the second gesture, when the terminal detects the third gesture, performing a target function; wherein the third gesture is: after the second gesture, the user The contact swipes to the target sub-menu item, and the user contact leaves the screen in an area where the target sub-menu item is located; the target sub-menu item is one of at least one sub-menu item, and the target function is the target sub-menu The function corresponding to the item; the second gesture and the third gesture are continuous operations, and the user contact does not leave the screen during the transition from the second gesture to the third gesture. Through the embodiment of the present application, the display of the multi-level menu and the selection of the menu items can be realized without lifting the finger, and the efficiency is higher.
在另一个可选地实现中,该方法还包括:在检测到第二手势后,若检测到第四手势,且用户触点距离屏幕边缘小于或等于阈值时,则关闭用户界面显示的主菜单,第四手势为 用户触点从屏幕内部向屏幕边缘划动的手势;第二手势和第四手势为连续操作,在从所述第二手势转变为所述第四手势的过程中,用户触点未离开屏幕。通过本申请实施例,实现了无需抬手,便可实现终端显示菜单以及关闭菜单的过程,一次操作多次交互,效率更高。In another optional implementation, the method further includes: after detecting the second gesture, if the fourth gesture is detected, and the user contact is less than or equal to a threshold from the edge of the screen, then the main user interface is closed. Menu, the fourth gesture is a gesture of the user contact swiping from the inside of the screen to the edge of the screen; the second gesture and the fourth gesture are continuous operations, and the user contact does not leave during the transition from the second gesture to the fourth gesture screen. Through the embodiment of the present application, the process of displaying the menu and closing the menu of the terminal can be realized without raising the hand, and the interaction is performed multiple times in one operation, and the efficiency is higher.
在又一个可选地实现中,该方法还包括:目标区域包括屏幕上如下描述的任意一个或多个位置:屏幕的顶端、屏幕的底端、屏幕的左方以及屏幕的右方。通过本申请实施例可以实现,可以提供更多可能的目标区域,能够适用不同的操作习惯,提升用户体验。In yet another alternative implementation, the method further includes the target area including any one or more of the locations described below on the screen: a top end of the screen, a bottom end of the screen, a left side of the screen, and a right side of the screen. The embodiment of the present application can be implemented, and more possible target areas can be provided, and different operating habits can be applied to improve the user experience.
在再一个可选地实现中,在目标区域内显示主菜单。通过在目标区域显示主菜单,能够更明确用户操作所达到的效果,用户体验更好。In yet another alternative implementation, the main menu is displayed within the target area. By displaying the main menu in the target area, the effect achieved by the user operation can be more clearly defined, and the user experience is better.
在再一个可选地实现中,主菜单包括以下主菜单项:包括至少一个应用图标的文件夹的图标,或,代表至少一个功能的集合的选项图标。In still another alternative implementation, the main menu includes the following main menu items: an icon including a folder of at least one application icon, or an option icon representing a collection of at least one function.
第二方面,本申请实施例提供了一种菜单显示方法,包括:终端在屏幕上显示用户界面;当终端检测到第一手势时,在用户界面显示菜单,第一手势为用户触点在目标区域内由终端的屏幕边缘向屏幕内部划动的手势,目标区域为屏幕的一部分,该菜单包括至少一个菜单项;在检测到第一手势后,当终端检测到第二手势时,执行目标功能;第二手势为:在第一手势之后,用户触点划动到目标菜单项,且用户触点在目标菜单项所在的区域离开屏幕;所述目标菜单项为至少一个菜单项中的一个,目标功能为所述目标菜单项对应的功能;第一手势和第二手势为连续操作,在从第一手势转变为第二手势的过程中,用户触点未离开所述屏幕。本申请实施例通过检测用户的连续划动,且无需抬起手指,即可实现菜单的显示以及菜单项的选择,一次操作多次交互,方便快捷,效率更高。In a second aspect, the embodiment of the present application provides a menu display method, including: displaying, by a terminal, a user interface on a screen; when the terminal detects the first gesture, displaying a menu on the user interface, the first gesture is a user contact at the target a gesture in the area that is swiped from the screen edge of the terminal to the inside of the screen, the target area is a part of the screen, the menu includes at least one menu item; after detecting the first gesture, when the terminal detects the second gesture, executing the target The second gesture is: after the first gesture, the user contact swipes to the target menu item, and the user contact leaves the screen in the area where the target menu item is located; the target menu item is in at least one menu item One, the target function is a function corresponding to the target menu item; the first gesture and the second gesture are continuous operations, and the user contact does not leave the screen during the transition from the first gesture to the second gesture. The embodiment of the present application can realize the display of the menu and the selection of the menu item by detecting the continuous stroke of the user without lifting the finger, and the interaction is convenient and fast, and the efficiency is higher.
第三方面,本发明实施例提供了一种终端,该终端具有实现上述第一方面和/或第二方面的方法实际中终端行为的功能。所述功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。所述硬件或软件包括一个或多个与上述功能相对应的模块。In a third aspect, an embodiment of the present invention provides a terminal, where the terminal has a function of implementing a terminal behavior in a method of the foregoing first aspect and/or the second aspect. The functions may be implemented by hardware or by corresponding software implemented by hardware. The hardware or software includes one or more modules corresponding to the functions described above.
在一个可选地实现中,本申请实施例提供了一种终端,包括:触摸屏,用于显示用户界面;处理器以及存储器,存储器用户存储数据和程序。当终端运行时,处理器执行存储器存储的计算机执行指令,以使输入设备执行如第一方面以及第一方面的各种可选方式中和/或第二方面中的菜单显示的方法。In an optional implementation, the embodiment of the present application provides a terminal, including: a touch screen for displaying a user interface; a processor and a memory, and the memory user storing data and a program. When the terminal is running, the processor executes a memory-stored computer-executable instruction to cause the input device to perform the method of the menu display in the first aspect and in various alternatives of the first aspect and/or in the second aspect.
According to a fourth aspect, an embodiment of the present application provides a computer-readable storage medium configured to store computer software instructions for use by the foregoing terminal, including a program designed to perform the first aspect and its optional implementations and/or the second aspect.
According to a fifth aspect, an embodiment of the present application provides a computer program product configured to store computer software instructions for use by the foregoing terminal, including a program designed to perform the first aspect and its optional implementations and/or the second aspect.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of another mobile phone according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an interface according to an embodiment of the present application;
FIG. 4 is a schematic diagram of another interface according to an embodiment of the present application;
FIG. 5 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 6 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 7 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 8 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 9 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 10 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 11 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 12 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 13 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 14 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 15 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 16 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 17 is a schematic diagram of still another interface according to an embodiment of the present application;
FIG. 18 is a schematic flowchart of a menu display method according to an embodiment of the present application;
FIG. 19 is a schematic flowchart of another menu display method according to an embodiment of the present application.
DETAILED DESCRIPTION
The embodiments of the present application provide a menu display method and a terminal. By detecting a continuous swipe of the user, the method allows a menu to be displayed and a menu item to be selected without lifting the finger, so that one operation carries out several interactions, which is convenient, fast, and efficient.
The terminal involved in the embodiments of the present application may include a mobile phone, a tablet computer, a personal digital assistant (Personal Digital Assistant, PDA), a point of sale terminal (Point of Sales, POS), an in-vehicle computer, and other terminals capable of displaying a menu. The terminal may include at least a memory, a touchscreen, and a processor. The memory may be configured to store software programs; the processor performs various functions of the terminal by running the software programs stored in the memory; and the touchscreen may be configured to display information entered by the user, information provided to the user, and various menus of the terminal, and may also accept user input.
下面以图1为例对终端的结构做进一步的介绍。图1示出的是与本申请实施例相关的一种手机结构示意图。参考图1,手机100包括、RF(Radio Frequency,射频)电路110、存储器120、其他输入设备130、触摸屏140、传感器150、音频电路160、I/O子***170、处理器180、以及电源190等部件。本领域技术人员可以理解,图1中示出的手机结构并不构成对手机的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。The structure of the terminal is further described below by taking FIG. 1 as an example. FIG. 1 is a schematic structural diagram of a mobile phone related to an embodiment of the present application. Referring to FIG. 1, a mobile phone 100 includes an RF (Radio Frequency) circuit 110, a memory 120, other input devices 130, a touch screen 140, a sensor 150, an audio circuit 160, an I/O subsystem 170, a processor 180, and a power supply 190. And other components. It will be understood by those skilled in the art that the structure of the mobile phone shown in FIG. 1 does not constitute a limitation to the mobile phone, and may include more or less components than those illustrated, or combine some components, or split some components, or Different parts are arranged.
下面结合图1对手机100的各个构成部件进行具体的介绍:The components of the mobile phone 100 will be specifically described below with reference to FIG. 1 :
RF电路110可用于收发信息或通话过程中,信号的接收和发送,特别地,将基站的下行信息接收后,给处理器180处理;另外,将设计上行的数据发送给基站。通常,RF电路包括但不限于天线、至少一个放大器、收发信机、耦合器、LNA(Low Noise Amplifier,低噪声放大器)、双工器等。此外,RF电路110还可以通过无线通信与网络和其他设备通信。所述无线通信可以使用任一通信标准或协议,包括但不限于GSM(Global System of Mobile communication,全球移动通讯***)、GPRS(General Packet Radio Service,通用分组无线服务)、CDMA(Code Division Multiple Access,码分多址)、WCDMA(Wideband Code Division Multiple Access,宽带码分多址)、 LTE(Long Term Evolution,长期演进)、电子邮件、SMS(Short Messaging Service,短消息服务)等。The RF circuit 110 can be used for transmitting and receiving information or during a call, and receiving and transmitting the signal. Specifically, after receiving the downlink information of the base station, the processor 180 processes the data. In addition, the uplink data is designed to be sent to the base station. Generally, RF circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, RF circuitry 110 can also communicate with the network and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System of Mobile communication), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access). , code division multiple access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
存储器120可用于存储软件程序,处理器180通过运行存储在存储器120的软件程序,从而执行手机100的各种功能。存储器120可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作***、至少一个功能所需的应用程序(比如声音播放功能、图象播放功能等)等;存储数据区可存储根据手机100的使用所维护的数据(比如音频数据、电话本等)等。此外,存储器120可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。The memory 120 can be used to store software programs, and the processor 180 executes various functions of the mobile phone 100 by running software programs stored in the memory 120. The memory 120 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may be stored. Data maintained according to the use of the mobile phone 100 (such as audio data, phone book, etc.). Moreover, memory 120 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
其他输入设备130可用于接收输入的数字或字符信息,以及产生与手机100的用户设置以及功能控制有关的键信号输入。具体地,其他输入设备130可包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆、光鼠(光鼠是不显示可视输出的触摸敏感表面,或者是由触摸屏形成的触摸敏感表面的延伸)等中的一种或多种。其他输入设备130与I/O子***170的其他输入设备控制器171相连接,在其他设备输入控制器171的控制下与处理器180进行信号交互。 Other input devices 130 can be used to receive input numeric or character information, as well as generate key signal inputs related to user settings and function controls of the handset 100. Specifically, other input devices 130 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackballs, mice, joysticks, and light mice (the light mouse is not sensitive to display visual output). One or more of a surface, or an extension of a touch sensitive surface formed by a touch screen. Other input devices 130 are coupled to other input device controllers 171 of I/O subsystem 170 for signal interaction with processor 180 under the control of other device input controllers 171.
触摸屏140可用于显示由用户输入的信息或提供给用户的信息以及手机100的各种菜单,还可以接受用户输入。具体的触摸屏140可包括显示面板141,以及触控面板142。其中显示面板141可以采用LCD(Liquid Crystal Display,液晶显示器)、OLED(Organic Light-Emitting Diode,有机发光二极管)等形式来配置显示面板141。触控面板142,也称为显示屏、触敏屏等,可收集用户在其上或附近的接触或者非接触操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板142上或在触控面板142附近的操作,也可以包括体感操作;该操作包括单点控制操作、多点控制操作等操作类型。),并根据预先设定的程序驱动相应的连接装置。可选的,触控面板142可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的手势,也就是触摸方位、姿势,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成处理器能够处理的信息,再送给处理器180,并能接收处理器180发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板142,也可以采用未来发展的任何技术实现触控面板142。进一步的,触控面板142可覆盖显示面板141,用户可以根据显示面板141显示的内容(该显示内容包括但不限于,软键盘、虚拟鼠标、虚拟按键、图标等等),在显示面板141上覆盖的触控面板142上或者附近进行操作,触控面板142检测到在其上或附近的操作后,通过I/O子***170传送给处理器180以确定用户输入,随后处理器180根据用户输入通过I/O子***170在显示面板141上提供相应的视觉输出。虽然在图1中,触控面板142与显示面板141是作为两个独立的部件来实现手机100的输入和输入功能,但是在某些实施例中,可以将触控面板142与显示面板141集成而实现手机100的输入和输出功能。The touch screen 140 can be used to display information input by the user or information provided to the user as well as various menus of the mobile phone 100, and can also accept user input. The specific touch screen 140 may include a display panel 141 and a touch panel 142. The display panel 141 can be configured by using an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. The touch panel 142, also referred to as a display screen, a touch sensitive screen, etc., can collect contact or non-contact operations on or near the user (eg, the user uses any suitable object or accessory such as a finger, a stylus, etc. on the touch panel 142). The operation on or near the touch panel 142 may also include a somatosensory operation; the operation includes a single point control operation, a multi-point control operation, and the like, and drives the corresponding connection device according to a preset program. Optionally, the touch panel 142 may include two parts: a touch detection device and a touch controller. Wherein, the touch detection device detects a gesture of the user, that is, touches an orientation, a posture, and detects a signal brought by the touch operation, and transmits a signal to the touch controller; the touch controller receives the touch information from the touch detection device, and converts the information The information that the processor can process is sent to the processor 180 and can receive commands from the processor 180 and execute them. In addition, the touch panel 142 can be implemented by using various types such as resistive, capacitive, infrared, and surface acoustic waves, and the touch panel 142 can be implemented by any technology developed in the future. Further, the touch panel 142 can cover the display panel 141, and the user can display the content according to the display panel 141 (the display content includes, but is not limited to, a soft keyboard, a virtual mouse, a virtual button, an icon, etc.) on the display panel 141. Operation is performed on or near the covered touch panel 142. After detecting the operation thereon or nearby, the touch panel 142 transmits to the processor 180 through the I/O subsystem 170 to determine user input, and then the processor 180 according to the user The input provides a corresponding visual output on display panel 141 via I/O subsystem 170. Although the touch panel 142 and the display panel 141 are used as two separate components to implement the input and input functions of the mobile phone 100 in FIG. 1, in some embodiments, the touch panel 142 may be integrated with the display panel 141. 
That is, the touch panel 142 and the display panel 141 may be integrated to implement the input and output functions of the mobile phone 100.
The mobile phone 100 may further include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor may adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 141 and/or the backlight when the mobile phone 100 is moved close to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when the phone is stationary, and can be used in applications that recognize the posture of the phone (such as switching between landscape and portrait orientation, related games, and magnetometer attitude calibration) and in vibration-recognition functions (such as a pedometer or tap detection). Other sensors that may also be configured on the mobile phone 100, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described here.
音频电路160、扬声器161,麦克风162可提供用户与手机100之间的音频接口。音频电路160可将接收到的音频数据转换后的信号,传输到扬声器161,由扬声器161转换为声音信号输出;另一方面,麦克风162将收集的声音信号转换为信号,由音频电路160接收后转换为音频数据,再将音频数据输出至RF电路108以发送给比如另一手机,或者将音频数据输出至存储器120以便进一步处理。The audio circuit 160, the speaker 161, and the microphone 162 can provide an audio interface between the user and the handset 100. The audio circuit 160 can transmit the converted audio data to the speaker 161 for conversion to the sound signal output by the speaker 161; on the other hand, the microphone 162 converts the collected sound signal into a signal, which is received by the audio circuit 160. The audio data is converted to audio data, which is then output to the RF circuit 108 for transmission to, for example, another mobile phone, or the audio data is output to the memory 120 for further processing.
I/O子***170用来控制输入输出的外部设备,可以包括其他设备输入控制器171、传感器控制器172、显示控制器173。可选的,一个或多个其他输入控制设备控制器171从其他输入设备130接收信号和/或者向其他输入设备130发送信号,其他输入设备130可以包括物理按钮(按压按钮、摇臂按钮等)、拨号盘、划动开关、操纵杆、点击滚轮、光鼠(光鼠是不显示可视输出的触摸敏感表面,或者是由触摸屏形成的触摸敏感表面的延伸)。值得说明的是,其他输入控制设备控制器171可以与任一个或者多个上述设备连接。所述I/O子***170中的显示控制器173从触摸屏140接收信号和/或者向触摸屏140发送信号。触摸屏140检测到用户输入后,显示控制器173将检测到的用户输入转换为与显示在触摸屏140上的用户界面对象的交互,即实现人机交互。传感器控制器172可以从一个或者多个传感器150接收信号和/或者向一个或者多个传感器150发送信号。The I/O subsystem 170 is used to control external devices for input and output, and may include other device input controllers 171, sensor controllers 172, and display controllers 173. Optionally, one or more other input control device controllers 171 receive signals from other input devices 130 and/or send signals to other input devices 130. Other input devices 130 may include physical buttons (press buttons, rocker buttons, etc.) , dial, swipe switch, joystick, click wheel, light mouse (light mouse is a touch-sensitive surface that does not display visual output, or an extension of a touch-sensitive surface formed by a touch screen). It is worth noting that other input control device controllers 171 can be connected to any one or more of the above devices. Display controller 173 in I/O subsystem 170 receives signals from touch screen 140 and/or transmits signals to touch screen 140. After the touch screen 140 detects the user input, the display controller 173 converts the detected user input into an interaction with the user interface object displayed on the touch screen 140, that is, implements human-computer interaction. Sensor controller 172 can receive signals from one or more sensors 150 and/or send signals to one or more sensors 150.
处理器180是手机100的控制中心,利用各种接口和线路连接整个手机的各个部分,通过运行或执行存储在存储器120内的软件程序和/或模块,以及调用存储在存储器120内的数据,执行手机100的各种功能和处理数据,从而对手机进行整体监控。可选的,处理器180可包括一个或多个处理单元;优选的,处理器180可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作***、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器180中。The processor 180 is the control center of the handset 100, connecting various portions of the entire handset with various interfaces and lines, by running or executing software programs and/or modules stored in the memory 120, and recalling data stored in the memory 120, The various functions and processing data of the mobile phone 100 are executed to perform overall monitoring of the mobile phone. Optionally, the processor 180 may include one or more processing units; preferably, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly processes an operating system, a user interface, an application, and the like. The modem processor primarily handles wireless communications. It can be understood that the above modem processor may not be integrated into the processor 180.
手机100还包括给各个部件供电的电源190(比如电池),优选的,电源可以通过电源管理***与处理器180逻辑相连,从而通过电源管理***实现管理充电、放电、以及功耗等功能。The handset 100 also includes a power source 190 (such as a battery) that supplies power to the various components. Preferably, the power source can be logically coupled to the processor 180 via a power management system to manage functions such as charging, discharging, and power consumption through the power management system.
尽管未示出,手机100还可以包括摄像头、蓝牙模块等,在此不再赘述。Although not shown, the mobile phone 100 may further include a camera, a Bluetooth module, and the like, and details are not described herein.
其中,存储器120存储的模块可以包括:操作***、接触/运动模块、图形模块以及应用程序等等。The modules stored in the memory 120 may include an operating system, a contact/motion module, a graphics module, an application, and the like.
The contact/motion module is configured to detect contact between an object or a finger and the touch screen 140 or a click wheel, to capture the speed (direction and magnitude) and acceleration (change in magnitude or direction) of the contact, and to determine the type of the contact event. For example, several contact-event detection modules are provided, such as finger-down, dragging, finger-up, and tap; sometimes a gesture is combined with an element of the UI to implement operations such as pinching/de-pinching (squeezing or spreading the fingers).
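As an illustration of the kind of classification such a contact/motion module performs, the following minimal Java sketch tracks a single contact and labels it as a tap or a drag from how far it moved before the finger was lifted. The class name, the simplified three-phase event model, and the 10-pixel tolerance are assumptions chosen for illustration and are not part of the described embodiments.

```java
/** Simplified, illustrative contact classifier (not the module described above). */
public class ContactClassifier {
    public enum Phase { DOWN, MOVE, UP }

    private static final float TAP_SLOP_PX = 10f; // assumed movement tolerance for a tap
    private float downX, downY;
    private float maxDistance;

    /** Feed one touch sample; returns "tap" or "drag" on UP, null otherwise. */
    public String onTouch(Phase phase, float x, float y) {
        switch (phase) {
            case DOWN:
                downX = x; downY = y; maxDistance = 0f;
                return null;
            case MOVE:
                maxDistance = Math.max(maxDistance, (float) Math.hypot(x - downX, y - downY));
                return null;
            case UP:
                return maxDistance <= TAP_SLOP_PX ? "tap" : "drag";
            default:
                return null;
        }
    }

    public static void main(String[] args) {
        ContactClassifier c = new ContactClassifier();
        c.onTouch(Phase.DOWN, 100f, 200f);
        c.onTouch(Phase.MOVE, 140f, 200f);
        System.out.println(c.onTouch(Phase.UP, 180f, 200f)); // prints "drag"
    }
}
```

A real module would additionally report speed and acceleration and feed gesture recognizers, as described above; this sketch only shows the basic down/move/up bookkeeping.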
The graphics module is configured to render and display graphics on the touch screen or another display, where the graphics include web pages, icons, digital images, videos, and animations.
应用程序可以包括联系人、电话、视频会议、电子邮件客户端、即时通信、个人运动、相机、图像管理、视频播放器、音乐播放器、日历、插件(天气、股票、计算器、时钟、词典)、自定义插件、搜索、笔记、地图以及在线视频等等。Applications can include contacts, phones, video conferencing, email clients, instant messaging, personal sports, cameras, image management, video players, music players, calendars, plugins (weather, stocks, calculators, clocks, dictionaries) ), custom plugins, search, notes, maps, online videos, and more.
图2示出的是与本申请实施例提供的另一手机结构示意图。参考图2,该手机200包括本体1以及触摸屏2(例如,如图1所示的触摸屏140)。触摸屏2可以采用触控面板与显示面板集成一体而实现手机200的输入和输出功能,用户可用手指3或手写笔4在触摸屏进行点击、划动等操作,触控面板可检测到这些操作,触摸屏也可称为屏幕。本体1包括摄像头11、感光元件12、听筒13、实体键14、电源键15和音量键16等等。其中摄像头11可包括前置摄像头和后置摄像头。感光元件12主要用于感测人体与手机的距离,例如,用户在打电话时,手机是紧挨耳边的,感光元件12检测到该距离信息后,手机200的触摸屏可关闭输入功能,这样可防止误触。实体键14一般为Home键,也可以是集成指纹识别模块的Home键,实体键14还可以包括返回键、菜单键以及退出键,另外,实体键14还可以是触摸屏上设定位置的触摸键,其中,实体键14、电源键15和音量键16具体可参照图1所示的实施例中其他输入设备130的描述。应该知道的是,本申请实施例还可包括话筒17、数据接口18、客户识别模块(Subscriber Identification Module,SIM)卡接口(图中未示出)以及耳机接口19等等。FIG. 2 is a schematic structural diagram of another mobile phone provided by an embodiment of the present application. Referring to FIG. 2, the handset 200 includes a body 1 and a touch screen 2 (eg, a touch screen 140 as shown in FIG. 1). The touch screen 2 can integrate the touch panel and the display panel to realize the input and output functions of the mobile phone 200. The user can use the finger 3 or the stylus 4 to perform operations such as clicking and swiping on the touch screen, and the touch panel can detect these operations, and the touch screen can be detected. Also known as the screen. The body 1 includes a camera 11, a photosensitive element 12, an earpiece 13, a physical key 14, a power key 15, a volume key 16, and the like. The camera 11 can include a front camera and a rear camera. The photosensitive element 12 is mainly used for sensing the distance between the human body and the mobile phone. For example, when the user is calling, the mobile phone is close to the ear, and after the photosensitive element 12 detects the distance information, the touch screen of the mobile phone 200 can turn off the input function, so that It can prevent accidental touch. The physical key 14 is generally a Home key, and may also be a Home key integrated with a fingerprint recognition module. The physical key 14 may further include a return key, a menu key, and an exit key. In addition, the physical key 14 may also be a touch key of a set position on the touch screen. The physical key 14, the power key 15 and the volume key 16 may be specifically referred to the description of the other input devices 130 in the embodiment shown in FIG. 1. It should be noted that the embodiment of the present application may further include a microphone 17, a data interface 18, a Subscriber Identification Module (SIM) card interface (not shown), a headphone interface 19, and the like.
应该知道的是图2所示的手机200仅为示例,并不构成限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。It should be understood that the mobile phone 200 shown in FIG. 2 is merely an example and is not limiting, and may include more or less components than those illustrated, or combine some components, or split certain components, or different components. Arrangement.
下面结合附图,以图2所示的手机200为例,对在本申请实施例的具体实施过程中,该手机200与用户的交互过程进行进一步地介绍。The mobile phone 200 shown in FIG. 2 is taken as an example to describe the interaction process between the mobile phone 200 and the user in the specific implementation process of the embodiment of the present application.
图3-图13为本申请实施例提供的用户界面示意图。如图3-图6所示,手机200在开机或屏幕解锁后,可显示用户界面,该用户界面可显示应用图标以及状态栏等信息。其中,状态栏可用于显示运营商名称201(例如,***、***等)、无线保真(Wireless-Fidelity,Wi-Fi)状态控制标识(例如,可显示Wi-Fi信号强度)、移动通信信号状态控制标识(例如,可显示移动通信的信号强度,对于多卡多待的手机,还可以还可显示多个移动通信的信号强度)、电池电量状态控制标识、本地时间在内的一个或多个信息等等。FIG. 3 is a schematic diagram of a user interface provided by an embodiment of the present application. As shown in FIG. 3-6, after the mobile phone 200 is turned on or the screen is unlocked, the mobile phone 200 can display a user interface, and the user interface can display information such as an application icon and a status bar. The status bar can be used to display the operator name 201 (for example, China Mobile, China Unicom, etc.), Wireless-Fidelity (Wi-Fi) status control identifier (for example, Wi-Fi signal strength can be displayed), and mobile The communication signal state control identifier (for example, the signal strength of the mobile communication can be displayed, and the signal strength of the plurality of mobile communications can also be displayed for the mobile phone with multiple cards and multiple standbys), the battery state control flag, and the local time. Or multiple information and so on.
如图3所示,手机200检测用户的划动操作。当手机200检测到用户手势311,可显示主菜单,该主菜单可在目标区域230内显示。其中,该用户手势311为用户触点在目标区域230内由手机200的屏幕边缘向屏幕内部划动的手势。其中,这里的屏幕内部是指触摸屏的表面的内部。As shown in FIG. 3, the mobile phone 200 detects a user's swipe operation. When the handset 200 detects the user gesture 311, a main menu can be displayed that can be displayed within the target area 230. The user gesture 311 is a gesture in which the user contact is swiped from the screen edge of the mobile phone 200 to the inside of the screen in the target area 230. Here, the inside of the screen here refers to the inside of the surface of the touch screen.
需要说明的是,由屏幕边缘向屏幕内部划动可以是指,由目标区域所在的屏幕边缘向屏幕内部的划动;该划动可以为由目标区域所在的边缘向与该边缘平行的另一边缘的方向的划动。例如,如图3所示,由qp所在的边缘向与该边缘平行的另一边缘的方向的划动皆可以称为由屏幕边缘向屏幕内部划动。It should be noted that the swipe from the edge of the screen to the inside of the screen may refer to a swipe from the edge of the screen where the target area is located to the inside of the screen; the swipe may be another edge parallel to the edge of the target area The direction of the edge is swiped. For example, as shown in FIG. 3, the swipe from the edge where qp is located to the other edge parallel to the edge may be referred to as swiping from the edge of the screen toward the inside of the screen.
相应的,用户触点从屏幕内部向目标区域所在的边缘的划动,皆可以称为由屏幕内部向屏幕边缘划动。Correspondingly, the user's contact from the inside of the screen to the edge of the target area can be referred to as swiping from the inside of the screen to the edge of the screen.
Specifically, as shown in FIG. 7, while the mobile phone 200 detects the swipe of user gesture 311, it tracks the user contact 320 and may display, in a dynamic form that follows the movement of the user contact 320, the process of the main menu being dragged out. Alternatively, the mobile phone 200 determines the distance between the user contact 320 and the screen edge during the swipe of user gesture 311, and displays the main menu on the screen of the mobile phone 200 when this distance reaches or exceeds a threshold. The main menu may be displayed floating over the original user interface.
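The two behaviours just described can be sketched in a few lines of Java. This is a minimal illustration only: the class and method names, the right-edge orientation (as in FIG. 3), and the threshold and menu-width parameters are assumptions, not part of the described embodiments.

```java
/** Illustrative helper for the first gesture: reveals the main menu as the contact moves in from the right edge. */
public class EdgeSwipeReveal {
    private final float screenWidthPx;      // e.g. 480 for the example screen discussed below
    private final float revealThresholdPx;  // assumed threshold, e.g. 60 px
    private final float menuWidthPx;        // assumed width of the sliding main menu panel
    private boolean menuShown;

    public EdgeSwipeReveal(float screenWidthPx, float revealThresholdPx, float menuWidthPx) {
        this.screenWidthPx = screenWidthPx;
        this.revealThresholdPx = revealThresholdPx;
        this.menuWidthPx = menuWidthPx;
    }

    /** Variant 1 (dynamic form): how much of the menu panel to show for the current contact position. */
    public float visibleMenuWidth(float contactX) {
        float distanceFromRightEdge = screenWidthPx - contactX;
        return Math.min(Math.max(distanceFromRightEdge, 0f), menuWidthPx);
    }

    /** Variant 2 (threshold form): returns true once the menu should be displayed in full. */
    public boolean onContactMoved(float contactX) {
        if (!menuShown && (screenWidthPx - contactX) >= revealThresholdPx) {
            menuShown = true;
        }
        return menuShown;
    }
}
```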
In the embodiments of the present application, the target area 230 is a partial region of the screen of the mobile phone 200; the region exists logically and need not be displayed on the screen of the mobile phone 200. For example, the target area 230 may be the region on the right side of the phone screen as shown in FIG. 3, the region on the upper side of the screen as shown in FIG. 4, or the regions on the lower side and the left side of the screen as shown in FIG. 5 and FIG. 6. In addition, one or more target areas 230 may be configured on a phone, and their placement may be determined by the user's choice. For example, a user who habitually holds the phone in the left hand may choose the target area 230 shown in FIG. 6, while a user who holds the phone in one hand and operates it with the other may choose the target area 230 shown in FIG. 4 or FIG. 5. The mobile phone 200 may also set the size of the target area 230.
For example, the screen of the mobile phone 200 has 480*800 pixels; with reference to FIG. 3, the maximum value of x is 480 and the maximum value of y is 800. The target area 230 may be the quadrilateral region enclosed by the points m(280, 250), n(280, 600), q(480, 250), and p(480, 600). The mobile phone 200 can set the size of this region: for example, it can adjust the length of the target area 230 in the y-axis direction, or shift the whole target area 230 up or down along the y axis.
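Using the example coordinates above, a hit test for the touch-down point of the first gesture could look like the following Java sketch. Representing the region as an axis-aligned rectangle and the class and method names are illustrative assumptions under the example values given in the text.

```java
/** Axis-aligned target area on the right edge of a 480x800 screen, per the example above. */
public class TargetArea {
    // m(280,250), n(280,600), q(480,250), p(480,600)
    private float left = 280f, top = 250f, right = 480f, bottom = 600f;

    /** True if the touch-down point lies inside the target area. */
    public boolean contains(float x, float y) {
        return x >= left && x <= right && y >= top && y <= bottom;
    }

    /** Shifts the whole area up or down along the y axis, as described above. */
    public void shiftVertically(float dy) {
        top += dy;
        bottom += dy;
    }

    /** Adjusts the height of the area in the y-axis direction while keeping its top edge fixed. */
    public void setHeight(float height) {
        bottom = top + height;
    }
}
```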
As shown in FIG. 8, after gesture 311 is detected, when the mobile phone 200 detects gesture 312, it performs a corresponding action according to the specific form of main menu item A, where gesture 312 is an operation in which the user contact swipes to the area where main menu item A is located.
The area where a menu item is located may be the cell that the menu item occupies, where a cell is one of several small grid squares into which the menu display region of the user interface is divided, either equally or by some other rule, each cell being used to display one menu item. In a specific application, the user interface may display only the icon or identifier of the menu item without displaying the grid.
A main menu item may take several forms, including a single application icon, a folder, and a menu item that has a submenu. Different forms of main menu item correspond to different actions, which are further described below with reference to FIG. 9 and the following figures. If main menu item A has a corresponding submenu, then when the contact of gesture 312 shown in FIG. 8 is detected swiping to the area where main menu item A is located, the submenu corresponding to main menu item A is displayed on the screen, as shown in FIG. 9. Specifically, main menu item A and its corresponding submenu may be displayed together on the user interface.
其中,主菜单项的子菜单可以包括:包含至少一个应用图标的文件夹的图标,代表至少一个功能的集合的选项图标以及应用图标等多种形式,下面进行举例说明。The submenu of the main menu item may include: an icon of a folder including at least one application icon, an option icon representing a set of at least one function, and an application icon, etc., which are exemplified below.
如图10所示,该主菜单包括设置选项、移动数据开关选项、文件夹选项以及应用商店图标选项。在手势312的触点划动到主菜单的设置选项所在的区域时,显示该设置选项对应的子菜单,该子菜单包括无线局域网(Wireless Local Area Networks,WLAN)开关选项、自动旋转选项、蓝牙开关选项、响铃开关选项、截屏选项以及飞行模式开关选项等等。As shown in Figure 10, the main menu includes setup options, mobile data switch options, folder options, and app store icon options. When the contact of the gesture 312 is swiped to the area where the setting option of the main menu is located, the sub-menu corresponding to the setting option is displayed, and the sub-menu includes a Wireless Local Area Networks (WLAN) switch option, an automatic rotation option, and Bluetooth. Switch options, ring switch options, screen capture options, flight mode switch options, and more.
如图11所示,在手势312的触点划动到主菜单的文件夹选项所在的区域时,显示该文件夹选项对应的子菜单,该子菜单的子菜单项为该文件夹选项包括的应用图标(图标A、图标B以及图标C)。As shown in FIG. 11, when the contact of the gesture 312 is swiped to the area where the folder option of the main menu is located, a sub-menu corresponding to the folder option is displayed, and the sub-menu item of the sub-menu is included in the folder option. Application icons (icon A, icon B, and icon C).
手机200在检测到手势312的用户触点划动到主菜单项所在的区域时,若主菜单项对应有功能,在触点从手机屏幕上消失后,即,用户触点离开手机屏幕,则执行该功能。例如,该功能可以是启动应用程序、控制一些设备或功能的开关。When the mobile phone 200 detects that the user's contact of the gesture 312 is swiped to the area where the main menu item is located, if the main menu item has a function, after the contact disappears from the screen of the mobile phone, that is, the user contact leaves the mobile phone screen, Perform this function. For example, the function can be a switch that launches an application, controls some devices or functions.
As shown in FIG. 12, when the contact of gesture 312 (not shown in the figure) swipes to the area where the mobile data switch option of the main menu is located, the corresponding function is executed once the contact disappears from the phone screen, that is, the mobile data service is turned on or off.
If main menu item A has a corresponding submenu, then after the contact of gesture 312 swipes to the area where main menu item A is located and the submenu corresponding to main menu item A is displayed on the screen, the mobile phone 200 detects gesture 313 directed at submenu item A2 (as shown in FIG. 9), where gesture 313 is an operation in which the user contact swipes to the area where submenu item A2 is located. When it is determined that the user contact has moved to the area where submenu item A2 is located, a corresponding action is performed according to the specific form of submenu item A2.
As shown in FIG. 13, if submenu item A2 has a corresponding submenu, then when the contact of gesture 313 swipes to the area where submenu item A2 is located, the submenu corresponding to submenu item A2 is displayed on the screen, and so on. The display of a submenu is not restricted to the target area.
In addition, when the user performs a swipe, a mistaken operation may occur, or the swipe may go past the intended position. For example, as shown in FIG. 13, the user intends to select main menu item C, but because of the swipe deviation the phone determines that submenu item A2 has been selected. In the embodiments of the present application, the user can swipe from a submenu item back to a main menu item in order to select that main menu item. As shown in FIG. 14, when the phone detects that the contact of gesture 314 has swiped to the area where main menu item C is located, it performs a corresponding action according to the specific form of main menu item C. For example, if main menu item C corresponds to a function, the function is executed after the contact of gesture 314 is detected to have disappeared from the phone screen. If main menu item C has a corresponding submenu, then when the contact of gesture 314 is detected swiping to the area where main menu item C is located, the submenu corresponding to main menu item A is closed and the submenu corresponding to main menu item C is displayed.
As shown in FIG. 15, if the phone detects that the contact of gesture 315, which swipes from the inside of the screen toward the screen edge, has travelled a distance along the x axis (as oriented in FIG. 3) that exceeds a threshold, it closes the menu displayed on the user interface. Alternatively, if the phone detects that the distance between the contact of gesture 315 and the screen edge is less than or equal to a preset distance, it may also close the menu displayed on the user interface. For example, as shown in FIG. 15, the phone measures the distance between the contact of gesture 315 and the edge of the phone screen (for example, side pq shown in FIG. 3); when this distance is less than or equal to the preset distance, it closes the menu displayed on the user interface, and that menu may include both the main menu and submenus.
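The two closing conditions just described (x-axis travel exceeding a threshold, or the contact coming within a preset distance of the edge) can be sketched as follows. The class name, the right-edge orientation, and the two thresholds are assumptions for illustration only.

```java
/** Decides whether an open menu should be closed while the contact slides back toward the right edge. */
public class MenuCloseDetector {
    private final float screenWidthPx;
    private final float travelThresholdPx;   // assumed x-axis travel needed to close (first condition)
    private final float edgeDistancePx;      // assumed preset distance to the edge (second condition)

    public MenuCloseDetector(float screenWidthPx, float travelThresholdPx, float edgeDistancePx) {
        this.screenWidthPx = screenWidthPx;
        this.travelThresholdPx = travelThresholdPx;
        this.edgeDistancePx = edgeDistancePx;
    }

    /** startX is where the outward swipe (gesture 315) began; x is the current contact position. */
    public boolean shouldCloseMenu(float startX, float x) {
        boolean travelledFarEnough = (x - startX) >= travelThresholdPx; // swipe toward the edge along x
        boolean closeToEdge = (screenWidthPx - x) <= edgeDistancePx;    // e.g. near side pq in FIG. 3
        return travelledFarEnough || closeToEdge;
    }
}
```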
In another embodiment of the present application, the menu can also be closed level by level. For example, as shown in FIG. 16, if the contact of the gesture is detected at submenu item B2 and the user wants to close the submenu corresponding to main menu item B and then select main menu item A, the user can swipe toward the area where main menu item A is located. As shown in FIG. 17, when the phone detects that the user contact has left the area where the submenu items of main menu item B are located and is not moving toward the area where main menu item B is located, it closes the displayed submenu of main menu item B. For example, while the user contact moves along a trajectory pointing toward the centre of the phone screen (gesture 313) and is in the area where main menu item B is located, the submenu corresponding to main menu item B is displayed; when the contact moves into the area of the submenu items of main menu item B, that submenu continues to be displayed; and when the phone detects the user contact moving along a trajectory toward the edge of the phone screen (gesture 315), leaving the areas of main menu item B and of its submenu and moving toward the area where main menu item A is located, it closes the submenu corresponding to main menu item B. It should be noted that the centre of the phone screen may refer to the central axis of the phone screen; in that case the edge of the phone screen refers to the screen edge parallel to that central axis.
图18为本申请实施例提供的一种菜单显示方法的流程示意图。如图18所示,该方法具体包括:FIG. 18 is a schematic flowchart diagram of a menu display method according to an embodiment of the present application. As shown in FIG. 18, the method specifically includes:
S510,显示用户界面。S510, displaying a user interface.
After the terminal is powered on or the screen is unlocked, the terminal may display a user interface (for example, the user interfaces shown in FIG. 3 to FIG. 6) on a screen (for example, the touch screen 140 shown in FIG. 1). The user can swipe or tap on this user interface in order to run an application displayed on it, or to give other instructions such as turning a page. With the user interface shown in FIG. 4, for example, the user can tap the icon of the camera application in the user interface to run the camera application, which provides functions such as taking photos and recording video; or the user can tap the icon of a folder in the user interface to display the detailed content of that folder.
S520,当终端检测到第一手势时,在用户界面显示主菜单。其中,第一手势为用户触点在目标区域内由终端的屏幕边缘向屏幕内部划动的手势,该目标区域为屏幕的一部分,该主菜单包括至少一个主菜单项。S520. When the terminal detects the first gesture, the main menu is displayed on the user interface. The first gesture is a gesture in which the user contact is swiped from the screen edge of the terminal to the inside of the screen in the target area, the target area is a part of the screen, and the main menu includes at least one main menu item.
终端在显示用户界面后,可检测用户在目标区域划动的手势,在该目标区域进行划动可显示菜单(例如,如图8-17所示的主菜单以及子菜单)。该菜单可按层级进行分类,每一大类可对应一个主菜单,每一小类可对应一个子菜单。主菜单的菜单项可以是包括至少一个应用图标的文件夹的图标,代表至少一个功能的集合的选项图标(例如,设置选项图标)以及单个应用图标等形式。例如,如图10所示的主菜单大类可包括设置选项、移动数据开关选项以及文件夹等等,而设置选项对应的小类包括WLAN开关选项、自动旋转选项、蓝牙开关选项以及响铃开关选项等等。After displaying the user interface, the terminal can detect the gesture of the user swiping in the target area, and swipe the target area to display the menu (for example, the main menu and the submenu as shown in FIG. 8-17). The menu can be categorized by level, each category can correspond to a main menu, and each subclass can correspond to a sub-menu. The menu item of the main menu may be an icon of a folder including at least one application icon, an option icon (for example, a setting option icon) representing a set of at least one function, and a form of a single application icon. For example, the main menu categories shown in FIG. 10 may include setting options, moving data switch options, folders, and the like, and the corresponding categories of setting options include WLAN switch options, auto-rotation options, Bluetooth switch options, and ring switches. Options and more.
该菜单中的主菜单项可对应有子菜单(例如,如图10所示的文件夹的图标或设置选项图标)或指示执行该菜单项对应的功能(例如,如图10所示的单个应用图标应用商店,或者移动数据开关选项)。其中,用户界面在显示菜单之前可能显示有图标等其他的内容,通过该方法显示的菜单可以以浮窗的形式显示在这些其他内容的上方。The main menu item in the menu may correspond to a sub-menu (for example, an icon of a folder as shown in FIG. 10 or a setting option icon) or a function corresponding to execution of the menu item (for example, a single application as shown in FIG. 10). Icon app store, or mobile data switch option). The user interface may display other content such as icons before the menu is displayed, and the menu displayed by the method may be displayed above the other contents in the form of a floating window.
Also as an example, the menu may be classified as shown in Table 1 below.
Table 1
(Table 1 is reproduced as an image in the original publication: PCTCN2017081481-appb-000001; it illustrates the menu classification described in the surrounding text.)
It should be understood that the main menus and submenus in FIG. 8 to FIG. 17 and in Table 1 are only examples, and other classifications are possible in practical applications. For example, a menu displayed by swiping in the target area may include only the settings category shown in Table 1; in that case, the main menu items of the displayed menu may include "Device", "Custom", and so on, and the submenu items are the submenus corresponding to those main menu items. As another example, the application store described in Table 1 could be classified under the applications category.
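One way to represent such a hierarchy is a simple tree in which each item either opens a submenu or carries a function to execute. The Java class below is an illustrative assumption (class name, labels, and use of Runnable for the associated function are all chosen for the example) and is not part of the claimed method.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

/** Illustrative menu tree: an item either opens a submenu or triggers an action. */
public class MenuItem {
    private final String label;
    private final Runnable action;           // null if the item only opens a submenu
    private final List<MenuItem> children = new ArrayList<>();

    public MenuItem(String label, Runnable action) {
        this.label = label;
        this.action = action;
    }

    public MenuItem addChild(MenuItem child) {
        children.add(child);
        return this;
    }

    public boolean hasSubmenu() { return !children.isEmpty(); }
    public List<MenuItem> submenu() { return Collections.unmodifiableList(children); }
    public String label() { return label; }

    /** Runs the associated function, if any (e.g. when the contact lifts on this item). */
    public void execute() {
        if (action != null) action.run();
    }

    public static void main(String[] args) {
        // A tiny tree following the classification discussed above (labels are examples).
        MenuItem settings = new MenuItem("Settings", null)
                .addChild(new MenuItem("Device", null)
                        .addChild(new MenuItem("Volume", () -> System.out.println("show volume UI"))))
                .addChild(new MenuItem("Custom", null));
        MenuItem store = new MenuItem("Store", () -> System.out.println("launch application store"));
        MenuItem main = new MenuItem("Main menu", null).addChild(settings).addChild(store);
        System.out.println(main.submenu().size() + " main menu items");
    }
}
```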
其中,配置好的菜单可认为是隐藏在终端屏幕的边缘一侧。在终端检测到在目标区域由屏幕边缘向屏幕内划动,则可理解为用户通过拖动的方式,指示显示屏幕边缘隐藏的菜单。Among them, the configured menu can be considered to be hidden on the edge side of the terminal screen. When the terminal detects that the target area is swiped from the edge of the screen to the screen, it can be understood that the user instructs to display the hidden menu edge of the screen by dragging.
例如,结合表1所示的菜单,在终端检测到第一手势后,则在目标区域显示主菜单,且所述主菜单悬浮显示在所述用户界面其他显示的内容之上。该主菜单包括主菜单项,即“设置”“商店”以及“应用程序”。For example, in combination with the menu shown in Table 1, after the terminal detects the first gesture, the main menu is displayed in the target area, and the main menu is suspended and displayed on the other displayed content of the user interface. The main menu includes the main menu items, "Settings", "Store", and "Applications".
还例如,终端在显示如图3所示的界面后,在终端检测到第一手势(手势311)后,显示如图8所示的界面。For example, after the terminal displays the interface as shown in FIG. 3, after the terminal detects the first gesture (gesture 311), the interface shown in FIG. 8 is displayed.
目标区域包括屏幕上如下所述的任意一个或多个位置:屏幕的顶端、屏幕的底端、屏幕的左方以及屏幕的右方。The target area includes any one or more of the locations described below on the screen: the top of the screen, the bottom of the screen, the left side of the screen, and the right side of the screen.
S530: After the first gesture is detected, when the terminal detects a second gesture, display on the user interface at least one submenu item corresponding to a target main menu item. The second gesture is an operation in which, after the first gesture, the user contact swipes to the area where the target main menu item is located; the target main menu item is one of the at least one main menu item. In addition, the first gesture and the second gesture are a continuous operation: during the transition from the first gesture to the second gesture, the user contact does not leave the screen.
In one example, with reference to Table 1, after the first gesture is detected, if the user contact is detected sliding to the area where a main menu item is located, for example to the area of the main menu item "Settings", the submenu items corresponding to "Settings" are displayed, that is, "Device", "Custom", and so on. Alternatively, after displaying the interface shown in FIG. 3 and detecting the first gesture (gesture 311), if the terminal detects the user contact sliding to the area where main menu item A is located, it displays the interface shown in FIG. 9; here main menu item A is the target main menu item of the second gesture. Or, after the main menu is displayed and the first gesture (gesture 311) has been detected, if the terminal detects the user contact sliding to the area where the settings option is located, it displays the interface shown in FIG. 10; here the settings option is the target main menu item of the second gesture.
In another example, with reference to Table 1, if, after detecting the first gesture, the terminal detects the user contact sliding to the area where the main menu item "Store" is located and the user contact disappears in that area, the function corresponding to "Store" is launched. Here, the disappearance of the user contact means that the terminal no longer detects the user contact, that is, the user contact has left the screen of the terminal.
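The dispatch of the second gesture can be sketched as follows: while the finger keeps sliding, the item whose on-screen cell currently contains the contact becomes the target main menu item, and the caller can then show its submenu (or arm its function for execution on lift). The rectangle representation, class names, and use of string labels are illustrative assumptions.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Tracks which menu item cell the sliding contact is currently over (second gesture). */
public class MenuHitTracker {
    /** Simple axis-aligned cell for one displayed menu item. */
    public static class Cell {
        final String itemLabel;
        final float left, top, right, bottom;
        Cell(String itemLabel, float left, float top, float right, float bottom) {
            this.itemLabel = itemLabel;
            this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        }
        boolean contains(float x, float y) {
            return x >= left && x <= right && y >= top && y <= bottom;
        }
    }

    private final Map<String, Cell> cells = new LinkedHashMap<>();
    private String currentTarget; // label of the main menu item the contact is over, if any

    public void addCell(Cell cell) { cells.put(cell.itemLabel, cell); }

    /**
     * Called for every MOVE sample after the main menu is shown.
     * Returns the label of a newly entered item (so its submenu can be displayed), or null.
     */
    public String onContactMoved(float x, float y) {
        for (Cell cell : cells.values()) {
            if (cell.contains(x, y)) {
                if (!cell.itemLabel.equals(currentTarget)) {
                    currentTarget = cell.itemLabel;
                    return currentTarget; // caller shows this item's submenu or arms its function
                }
                return null;
            }
        }
        return null;
    }

    public String currentTarget() { return currentTarget; }
}
```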
In another embodiment, the function corresponding to a menu item of an already displayed menu can also be triggered by the continuous sliding gesture. Specifically, the method further includes the following step.
S540: After the second gesture is detected, when the terminal detects a third gesture, execute a target function. The third gesture is an operation in which, after the second gesture, the user contact swipes to a target submenu item and leaves the screen in the area where the target submenu item is located; the target submenu item is one of the at least one submenu item, and the target function is the function corresponding to the target submenu item. In addition, the second gesture and the third gesture are a continuous operation: during the transition from the second gesture to the third gesture, the user contact does not leave the screen.
In one example, with reference to Table 1, the second gesture is the user contact swiping to the area where the main menu item "Settings" is located. After the second gesture is detected, if the user contact is detected swiping to the area where the submenu item "Device" is located, the second-level submenu corresponding to "Device" can be displayed. If the user contact is then detected swiping to the area where a second-level submenu item (for example, volume, display, or storage) is located, and the user contact disappears in that area, the function corresponding to that second-level submenu item is launched. Alternatively, after displaying the interface shown in FIG. 9 and detecting the second gesture, if the terminal detects the user contact sliding to the area where submenu item A2 is located and the user contact disappears in that area, it executes the function corresponding to submenu item A2. Or, after displaying the interface shown in FIG. 10 and detecting the second gesture, if the terminal detects the user contact sliding to the area of the WLAN switch option, the auto-rotate option, the Bluetooth switch option, or the ring switch option, and the user contact disappears in that area, it executes the corresponding function.
In another example, the second gesture is the user contact swiping to the area where the main menu item "Device" is located. After the second gesture is detected, if the user contact is detected swiping to the area where the submenu item "Volume" is located and the user contact disappears in the area where that submenu item is located, the function corresponding to that submenu item is launched; the function corresponding to the submenu item "Volume" may, for example, display a volume adjustment interface on which the volume can be adjusted.
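The lift-to-execute step (S540) then reduces to a small check on the final touch sample: if the contact ends inside a submenu item cell that carries a function, the function runs; otherwise nothing happens. The names, the array-based bounds, and the use of Runnable are illustrative assumptions under this simplified model.

```java
import java.util.HashMap;
import java.util.Map;

/** Executes the function of the submenu item under the contact when the finger is lifted (S540). */
public class LiftToExecute {
    /** Maps a submenu item label to its rectangular cell [left, top, right, bottom] and its function. */
    private final Map<String, float[]> cells = new HashMap<>();
    private final Map<String, Runnable> functions = new HashMap<>();

    public void register(String label, float[] cellBounds, Runnable function) {
        cells.put(label, cellBounds);
        functions.put(label, function);
    }

    /** Called on the UP sample; returns true if a target function was executed. */
    public boolean onContactLifted(float x, float y) {
        for (Map.Entry<String, float[]> e : cells.entrySet()) {
            float[] b = e.getValue();
            if (x >= b[0] && x <= b[2] && y >= b[1] && y <= b[3]) {
                Runnable function = functions.get(e.getKey());
                if (function != null) {
                    function.run();   // e.g. toggle a switch or launch an application
                    return true;
                }
            }
        }
        return false;
    }

    public static void main(String[] args) {
        LiftToExecute l = new LiftToExecute();
        l.register("Volume", new float[] {300f, 400f, 480f, 460f},
                () -> System.out.println("show volume adjustment interface"));
        System.out.println(l.onContactLifted(350f, 430f)); // runs the function, prints true
    }
}
```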
In another embodiment, a menu already displayed on the screen can also be closed by the continuous sliding gesture. Specifically, the method further includes the following step.
S550: After the second gesture is detected, if a fourth gesture is detected and the distance between the user contact and the screen edge is less than or equal to a threshold, close the main menu displayed on the user interface. The fourth gesture is a gesture in which the user contact swipes from the inside of the screen toward the screen edge. The second gesture and the fourth gesture are a continuous operation: during the transition from the second gesture to the fourth gesture, the user contact does not leave the screen.
For example, after displaying the user interface shown in FIG. 14, if the terminal detects gesture 315, it closes the menu displayed on the user interface, as shown in FIG. 15.
It should be noted that swiping from the screen edge toward the inside of the screen may mean swiping from the screen edge where the target area is located toward the inside of the screen; the swipe may be a swipe from the edge where the target area is located in the direction of the other edge parallel to that edge.
It should also be noted that a "function" in the embodiments of the present application may refer to something the terminal achieves by running a software program, such as launching an application, displaying a settings interface, or toggling a switch.
It should be understood that, in the embodiments of the present application, descriptions such as "target main menu item", "target submenu item", and "target function" are used only to distinguish these items from other main menu items, submenu items, and functions for ease of description, and do not constitute a limitation. When the user contact is detected in the area of another main menu item, submenu item, or function option, that other main menu item, submenu item, or function may equally be called the target main menu item, target submenu item, or target function.
图19为本申请实施例提供的另一种菜单显示方法的流程示意图。如图19所示,该方法具体包括:FIG. 19 is a schematic flowchart diagram of another menu display method according to an embodiment of the present application. As shown in FIG. 19, the method specifically includes:
S610,终端显示用户界面;S610. The terminal displays a user interface.
S620,当终端检测到第一手势时,在用户界面显示菜单。其中,第一手势为用户触点在目标区域内由所述终端的屏幕边缘向所述屏幕内部划动的手势,目标区域为屏幕的一部分,菜单包括至少一个菜单项。S620. When the terminal detects the first gesture, the menu is displayed on the user interface. The first gesture is a gesture in which the user contact is swiped from the screen edge of the terminal to the inside of the screen in the target area, the target area is a part of the screen, and the menu includes at least one menu item.
在用户界面显示的至少一个菜单项包括单个的应用程序图标或者开关选项等等能够运行应用程序或者功能的图标。例如,如图12所的应用商店图标和移动数据选项。At least one menu item displayed in the user interface includes a single application icon or a switch option, etc., an icon capable of running an application or function. For example, the application store icon and the mobile data option as shown in FIG.
For S610 and S620, refer to the related descriptions of S510 and S520 in the embodiment described with reference to FIG. 18.
S630: After the first gesture is detected, when the terminal detects a second gesture, execute a target function. The second gesture is an operation in which, after the first gesture, the user contact swipes to a target menu item and leaves the screen in the area where the target menu item is located; the target menu item is one of the at least one menu item, and the target function is the function corresponding to the target menu item. The first gesture and the second gesture are a continuous operation: during the transition from the first gesture to the second gesture, the user contact does not leave the screen.
Here, the target menu item is a single application icon or an icon of another function.
A person skilled in the art will further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of their functions. Whether these functions are performed in hardware or software depends on the specific application and the design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each particular application, but such implementation should not be considered beyond the scope of this application.
A person of ordinary skill in the art will understand that all or some of the steps of the methods in the foregoing embodiments may be completed by a program instructing a processor. The program may be stored in a computer-readable storage medium, where the storage medium is a non-transitory medium, for example, a random access memory, a read-only memory, a flash memory, a hard disk, a solid-state drive, a magnetic tape, a floppy disk, an optical disc, or any combination thereof.
The foregoing descriptions are merely preferred specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (20)

  1. A menu display method, comprising:
    displaying, by a terminal, a user interface on a screen;
    when the terminal detects a first gesture, displaying a main menu on the user interface, wherein the first gesture is a gesture in which a user contact swipes from a screen edge of the terminal toward the interior of the screen within a target area, the target area is a part of the screen, and the main menu comprises at least one main menu item; and
    after the first gesture is detected, when the terminal detects a second gesture, displaying, on the user interface, at least one sub-menu item corresponding to a target main menu item, wherein the second gesture is an operation in which, after the first gesture, the user contact swipes to an area where the target main menu item is located; the target main menu item is one of the at least one main menu item; the first gesture and the second gesture are continuous operations; and the user contact does not leave the screen during the transition from the first gesture to the second gesture.
  2. The method according to claim 1, further comprising:
    after the second gesture is detected, when the terminal detects a third gesture, executing a target function, wherein the third gesture is: after the second gesture, the user contact swipes to a target sub-menu item and leaves the screen in an area where the target sub-menu item is located; the target sub-menu item is one of the at least one sub-menu item; and the target function is a function corresponding to the target sub-menu item; and
    the third gesture is a continuous operation, and the user contact does not leave the screen during the transition from the second gesture to the third gesture.
  3. The method according to claim 1 or 2, further comprising:
    after the second gesture is detected, if a fourth gesture is detected and the distance between the user contact and the screen edge is less than or equal to a threshold, closing the main menu displayed on the user interface, wherein the fourth gesture is a gesture in which the user contact swipes from the interior of the screen toward the screen edge; and
    the fourth gesture is a continuous operation, and the user contact does not leave the screen during the transition from the second gesture to the fourth gesture.
  4. The method according to any one of claims 1 to 3, wherein the target area is located at any one or more of the following positions:
    the top of the screen, the bottom of the screen, the left side of the screen, and the right side of the screen.
  5. The method according to any one of claims 1 to 4, wherein the main menu is displayed in the target area, and the main menu is displayed floating on the user interface.
  6. The method according to any one of claims 1 to 5, wherein the main menu comprises the following main menu items:
    an icon of a folder comprising at least one application icon, or an option icon representing a collection of at least one function.
  7. A menu display method, comprising:
    displaying, by a terminal, a user interface on a screen;
    when the terminal detects a first gesture, displaying a menu on the user interface, wherein the first gesture is a gesture in which a user contact swipes from a screen edge of the terminal toward the interior of the screen within a target area, the target area is a part of the screen, and the menu comprises at least one menu item; and
    after the first gesture is detected, when the terminal detects a second gesture, executing a target function, wherein the second gesture is: after the first gesture, the user contact swipes to a target menu item and leaves the screen in an area where the target menu item is located; the target menu item is one of the at least one menu item; the target function is a function corresponding to the target menu item; the first gesture and the second gesture are continuous operations; and the user contact does not leave the screen during the transition from the first gesture to the second gesture.
  8. The method according to claim 7, wherein the target area is located at any one or more of the following positions:
    the top of the screen, the bottom of the screen, the left side of the screen, and the right side of the screen.
  9. The method according to claim 7 or 8, wherein the menu is displayed in the target area, and the menu is displayed floating on the user interface.
  10. A terminal, comprising:
    a display, configured to display a user interface; and
    a processor, configured to: when a first gesture is detected, display a main menu on the user interface, wherein the first gesture is a gesture in which a user contact swipes from a screen edge of the terminal toward the interior of the screen within a target area, the target area is a part of the screen, and the main menu comprises at least one main menu item;
    wherein the processor is further configured to: after the first gesture is detected, when a second gesture is detected, display, on the user interface, at least one sub-menu item corresponding to a target main menu item; the second gesture is an operation in which, after the first gesture, the user contact swipes to an area where the target main menu item is located; the target main menu item is one of the at least one main menu item; the first gesture and the second gesture are continuous operations; and the user contact does not leave the screen during the transition from the first gesture to the second gesture.
  11. The terminal according to claim 10, wherein the processor is further configured to: after the second gesture is detected, when a third gesture is detected, execute a target function; the third gesture is: after the second gesture, the user contact swipes to a target sub-menu item and leaves the screen in an area where the target sub-menu item is located; the target sub-menu item is one of the at least one sub-menu item; and the target function is a function corresponding to the target sub-menu item; and
    the third gesture is a continuous operation, and the user contact does not leave the screen during the transition from the second gesture to the third gesture.
  12. The terminal according to claim 10 or 11, wherein the processor is further configured to: after the second gesture is detected, if a fourth gesture is detected and the distance between the user contact and the screen edge is less than or equal to a threshold, close the main menu displayed on the user interface; the fourth gesture is a gesture in which the user contact swipes from the interior of the screen toward the screen edge; and
    the fourth gesture is a continuous operation, and the user contact does not leave the screen during the transition from the second gesture to the fourth gesture.
  13. The terminal according to any one of claims 10 to 12, wherein the target area is located at any one or more of the following positions:
    the top of the screen, the bottom of the screen, the left side of the screen, and the right side of the screen.
  14. The terminal according to any one of claims 10 to 13, wherein the main menu is displayed in the target area, and the main menu is displayed floating on the user interface.
  15. The terminal according to any one of claims 10 to 14, wherein the main menu comprises the following main menu items:
    an icon of a folder comprising at least one application icon, or an option icon representing a collection of at least one function.
  16. A terminal, comprising:
    a display, configured to display a user interface; and
    a processor, configured to: when a first gesture is detected, display a menu on the user interface, wherein the first gesture is a gesture in which a user contact swipes from a screen edge of the terminal toward the interior of the screen within a target area, the target area is a part of the screen, and the menu comprises at least one menu item;
    wherein the processor is further configured to: after the first gesture is detected, when a second gesture is detected, execute a target function; the second gesture is: after the first gesture, the user contact swipes to a target menu item and leaves the screen in an area where the target menu item is located; the target menu item is one of the at least one menu item; the target function is a function corresponding to the target menu item; the first gesture and the second gesture are continuous operations; and the user contact does not leave the screen during the transition from the first gesture to the second gesture.
  17. The terminal according to claim 16, wherein the target area is located at any one or more of the following positions:
    the top of the screen, the bottom of the screen, the left side of the screen, and the right side of the screen.
  18. The terminal according to claim 16 or 17, wherein the menu is displayed in the target area, and the menu is displayed floating on the user interface.
  19. A computer-readable storage medium comprising instructions that, when run on a computer, cause the computer to perform the method according to any one of claims 1 to 6.
  20. A computer-readable storage medium comprising instructions that, when run on a computer, cause the computer to perform the method according to any one of claims 7 to 9.
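As a further illustration of the dismissal behaviour recited in claims 3 and 12 (closing the main menu when the contact swipes back and comes within a threshold distance of the screen edge), the following Kotlin sketch computes the distance from the contact to the nearest screen edge and compares it with a threshold. The function names and the 30-pixel threshold are hypothetical and are used only for this example.

    // Illustrative check for the fourth gesture described in claims 3 and 12: once the
    // contact, swiping back toward the edge, is within `thresholdPx` of any screen edge,
    // the displayed menu is closed.
    fun distanceToNearestEdge(x: Float, y: Float, width: Float, height: Float): Float =
        minOf(minOf(x, y), minOf(width - x, height - y))

    fun shouldCloseMenu(x: Float, y: Float, width: Float, height: Float, thresholdPx: Float = 30f): Boolean =
        distanceToNearestEdge(x, y, width, height) <= thresholdPx

    fun main() {
        val width = 1080f
        val height = 1920f
        println(shouldCloseMenu(540f, 960f, width, height))   // false: contact is deep in the interior
        println(shouldCloseMenu(540f, 1900f, width, height))  // true: within 30 px of the bottom edge, close the menu
    }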

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201780006166.6A CN108885525A (en) 2016-11-04 2017-04-21 Menu display method and terminal

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201610962439 2016-11-04
CN201610962439.1 2016-11-04
CN201710199204.6 2017-03-29
CN201710199204 2017-03-29

Publications (1)

Publication Number Publication Date
WO2018082269A1

Family

ID=62076658

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/081481 WO2018082269A1 (en) 2016-11-04 2017-04-21 Menu display method and terminal

Country Status (2)

Country Link
CN (1) CN108885525A (en)
WO (1) WO2018082269A1 (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109947327B (en) * 2019-03-15 2022-10-04 努比亚技术有限公司 Interface viewing method, wearable device and computer-readable storage medium
CN113434068A (en) * 2021-05-28 2021-09-24 北京信和时代科技有限公司 Control method and device for suspension shortcut menu, electronic equipment and storage medium
CN114296599A (en) * 2021-12-02 2022-04-08 深圳市华胜软件技术有限公司 Interface interaction method, device, terminal and storage medium
CN114356177A (en) * 2021-12-31 2022-04-15 上海洛轲智能科技有限公司 Display method and device of vehicle-mounted system menu bar and electronic equipment
CN115202530B (en) * 2022-05-26 2024-04-09 当趣网络科技(杭州)有限公司 Gesture interaction method and system of user interface


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103049176A (en) * 2013-01-22 2013-04-17 网神信息技术(北京)股份有限公司 Method and device for displaying menus
CN105988668A (en) * 2015-02-27 2016-10-05 阿里巴巴集团控股有限公司 Menu selection method and apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102118476A (en) * 2011-03-10 2011-07-06 宇龙计算机通信科技(深圳)有限公司 Method for displaying menu of mobile phone and mobile phone
CN102736856A (en) * 2012-06-28 2012-10-17 宇龙计算机通信科技(深圳)有限公司 Method and device for selecting menu
CN104102441A (en) * 2013-04-09 2014-10-15 腾讯科技(深圳)有限公司 Menuitem executing method and device
CN103235688A (en) * 2013-04-17 2013-08-07 昆山富泰科电脑有限公司 Method and graphical user interface for processing messages rapidly in intelligent device notification bar
US20160092050A1 (en) * 2014-09-29 2016-03-31 Silent Circle, LLC Method, device, and computer program for generating an inverse sliding menu for graphical user interfaces

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109766043A (en) * 2018-12-29 2019-05-17 华为技术有限公司 The operating method and electronic equipment of electronic equipment
CN111949193A (en) * 2019-05-16 2020-11-17 腾讯科技(深圳)有限公司 Interface switching method, device, terminal and storage medium
CN111949193B (en) * 2019-05-16 2023-08-04 腾讯科技(深圳)有限公司 Interface switching method, device, terminal and storage medium
CN111061419B (en) * 2019-10-23 2023-03-03 华为技术有限公司 Application bar display method and electronic equipment
CN111061419A (en) * 2019-10-23 2020-04-24 华为技术有限公司 Application bar display method and electronic equipment
US11868605B2 (en) 2019-10-23 2024-01-09 Huawei Technologies Co., Ltd. Application bar display method and electronic device
EP4036699A4 (en) * 2019-10-23 2022-12-07 Huawei Technologies Co., Ltd. Application bar display method and electronic device
CN111831205A (en) * 2020-07-09 2020-10-27 Oppo广东移动通信有限公司 Device control method, device, storage medium and electronic device
JP7498352B2 2020-07-09 2024-06-11 オッポ広東移動通信有限公司 Device control method and device, storage medium, and electronic device
WO2022007544A1 (en) * 2020-07-09 2022-01-13 Oppo广东移动通信有限公司 Device control method and apparatus, and storage medium and electronic device
WO2022007541A1 (en) * 2020-07-09 2022-01-13 Oppo广东移动通信有限公司 Device control method and apparatus, storage medium, and electronic device
CN114661219A (en) * 2020-07-09 2022-06-24 Oppo广东移动通信有限公司 Device control method, device, storage medium and electronic device
CN114579009A (en) * 2020-11-30 2022-06-03 中移(苏州)软件技术有限公司 Method, device, equipment and storage medium for triggering menu items
CN113067934A (en) * 2021-03-15 2021-07-02 Oppo广东移动通信有限公司 Encrypted content decryption method and terminal equipment
CN113325987A (en) * 2021-06-15 2021-08-31 深圳地平线机器人科技有限公司 Method and device for guiding operation body to perform air-separating operation
CN115291788A (en) * 2022-06-30 2022-11-04 青岛海尔科技有限公司 Method and device for mobile terminal interaction, mobile terminal and storage medium
CN115291788B (en) * 2022-06-30 2023-11-17 青岛海尔科技有限公司 Method and device for mobile terminal interaction, mobile terminal and storage medium
CN117472220A (en) * 2023-09-15 2024-01-30 荣耀终端有限公司 Operation identification method and device

Also Published As

Publication number Publication date
CN108885525A (en) 2018-11-23

Similar Documents

Publication Publication Date Title
WO2018082269A1 (en) Menu display method and terminal
US11907013B2 (en) Continuity of applications across devices
CN108701001B (en) Method for displaying graphical user interface and electronic equipment
US10956028B2 (en) Portable device and method for providing user interface mode thereof
KR20210042071A (en) Foldable electronic apparatus and method for performing interfacing thereof
CN111240789B (en) Widget processing method and related device
RU2662690C2 (en) User apparatus object control device and method of management
TWI629636B (en) Method for controlling an electronic device, electronic device and non-transitory computer-readable storage medium
WO2019014859A1 (en) Multi-task operation method and electronic device
CN107193455B (en) Information processing method and mobile terminal
US10423264B2 (en) Screen enabling method and apparatus, and electronic device
CN110007996B (en) Application program management method and terminal
CN105518605A (en) Touch operation method and apparatus for terminal
CN108920069B (en) Touch operation method and device, mobile terminal and storage medium
WO2018133285A1 (en) Display method and terminal
WO2018112803A1 (en) Touch screen-based gesture recognition method and device
WO2018137276A1 (en) Method for processing data and mobile device
CN108700990B (en) Screen locking method, terminal and screen locking device
CN107003759B (en) Method for selecting text
EP3674867B1 (en) Human-computer interaction method and electronic device
WO2022063034A1 (en) Input interface display method and terminal
CN117931015A (en) Application program display method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17867612; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17867612; Country of ref document: EP; Kind code of ref document: A1)