WO2019056393A1 - Terminal interface display method and terminal - Google Patents

Terminal interface display method and terminal

Info

Publication number
WO2019056393A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
user
gesture
interface
high frequency
Prior art date
Application number
PCT/CN2017/103288
Other languages
English (en)
French (fr)
Inventor
朱金鹏
王魁
李想
林宗芳
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to PCT/CN2017/103288 priority Critical patent/WO2019056393A1/zh
Priority to US16/650,264 priority patent/US11307760B2/en
Priority to CN201780073653.4A priority patent/CN109997348B/zh
Publication of WO2019056393A1 publication Critical patent/WO2019056393A1/zh

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/25: Integrating or interfacing systems involving database management systems
    • G06F16/252: Integrating or interfacing systems involving database management systems between a Database Management System and a front-end application
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00: Arrangements for program control, e.g. control units
    • G06F9/06: Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44: Arrangements for executing specific programs
    • G06F9/451: Execution arrangements for user interfaces
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725: Cordless telephones

Definitions

  • The present application relates to the field of communications technologies, and in particular, to a terminal interface display method and a terminal.
  • The touch screens of touch-screen mobile phones are getting larger and larger, and touch-screen mobile phones are becoming more and more popular.
  • Because the touch screen of a touch-screen mobile phone is large, the user cannot conveniently operate the phone with one hand.
  • A sensor may be configured on the touch-screen mobile phone to identify whether the user is currently operating the phone with the left hand or the right hand, and the display interface of the phone may be updated according to the recognition result to facilitate the user's operation. For example, when it is recognized that the user is currently operating the phone with the right hand, the application icons in the phone can be displayed on the right side of the phone's touch screen.
  • The present application provides a terminal interface display method, which enables the user to operate high frequency application icons more conveniently and comfortably, and can improve the user experience.
  • The present application provides a terminal interface display method, where the method may include: determining, by the terminal in response to a first gesture input by the user on a first interface, a high frequency touch area on a first side of the terminal, where the first gesture is a gesture input by a finger on the user's first side, the high frequency touch area is a touch area on the terminal interface whose frequency or number of operations by the user is higher than a first threshold, and the first interface includes at least two application icons; and displaying, in the high frequency touch area on the first side, at least one high frequency application icon, where the at least one high frequency application icon is an application icon, among the at least two application icons, whose frequency or number of operations by the user is higher than a second threshold.
  • When the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal; when the finger on the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.
  • In this way, the high frequency application icons in the terminal can be displayed in the high frequency touch area on the first side, so that the user can operate them more conveniently and comfortably, which can improve the user experience.
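The selection rule above (icons whose operation frequency or count exceeds a second threshold) can be sketched in a few lines; the function name, the sample counts, and the threshold value are illustrative assumptions rather than details from the patent:

```python
def high_frequency_icons(op_counts, second_threshold):
    """Return the icons operated more than `second_threshold` times.

    `op_counts` maps an application icon's identifier to the number of
    times the user has operated it; dict insertion order is preserved,
    so icons come back in the order they were recorded.
    """
    return [icon for icon, count in op_counts.items() if count > second_threshold]

# Hypothetical usage statistics for four icons on the first interface.
icons = {"camera": 42, "mail": 3, "maps": 17, "notes": 1}
print(high_frequency_icons(icons, second_threshold=10))  # ['camera', 'maps']
```

In a real terminal the counts would come from usage statistics the system already records; the comparison against the second threshold is the only logic this passage specifies.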
  • The displaying, by the terminal, of at least one high frequency application icon in the high frequency touch area on the first side may include: displaying, by the terminal in the high frequency touch area on the first side, a folder icon containing the at least one high frequency application icon.
  • That is, the terminal may display, in the high frequency touch area on the first side, a folder icon containing all of the high frequency application icons.
  • This can solve the problem that the high frequency touch area on the first side is insufficient to display all of the terminal's high frequency application icons.
  • Moreover, by displaying the folder icon containing all the high frequency application icons in the high frequency touch area on the first side, the user can conveniently operate all the high frequency application icons in the terminal.
  • After the terminal displays, in the high frequency touch area on the first side, the folder icon containing the at least one high frequency application icon, the method of the present application further includes: displaying, in response to the user's input on the folder icon, a folder expansion window corresponding to the folder icon in the high frequency touch area on the first side, where the at least one high frequency application icon is displayed in the folder expansion window.
  • That is, in response to the user's input on the folder icon containing the high frequency application icons, the terminal may display the folder expansion window corresponding to the folder icon in the high frequency touch area on the first side, which makes it convenient for the user to operate all the high frequency application icons in the terminal.
  • The determining of the high frequency touch area on the first side of the terminal may include: determining, by the terminal according to the coordinates of at least one first-side finger sliding track in a first-side trajectory model, the high frequency touch area of the user's first-side finger on the terminal interface.
  • The first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model.
  • The right-hand trajectory model includes the coordinates of at least one right-hand sliding track.
  • The left-hand trajectory model includes the coordinates of at least one left-hand sliding track.
  • That is, the terminal may determine, according to the coordinates of the at least one first-side finger sliding track in the first-side trajectory model, whether the first gesture is a gesture input by the user's left-hand finger or right-hand finger (the first-side finger), and then determine the high frequency touch area of the user's first-side finger.
  • In this way, whether the first gesture was input with the user's left hand or right hand can be determined without adding hardware, which reduces the cost of determining which hand the user is using to operate the phone.
  • The determining, by the terminal in response to the first gesture input by the user on the first interface, of the high frequency touch area on the first side of the terminal may include: calculating, by the terminal in response to the first gesture, the tangent of the angle between the line connecting the start point and the end point of the sliding track of the first gesture and the x-axis or the y-axis of the coordinate system; and determining, by the terminal, the high frequency touch area on the first side when the tangent value falls within the value interval corresponding to the first side of the terminal and a preset proportion of the points in the sliding track of the first gesture are close to the first side of the terminal.
  • That is, by examining the value of the tangent of the angle between the start-to-end line of the sliding track of the first gesture and the x-axis or the y-axis, together with the distribution of the points in the sliding track, the terminal can judge whether the user is operating the phone with the left hand or the right hand, which avoids the high cost of adding extra hardware.
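The tangent-and-distribution test above can be sketched as follows. The value interval, the point ratio, and the collapsing of the per-side value intervals into a single magnitude interval are illustrative assumptions; the patent only requires that the tangent fall in a side-specific interval and that a preset proportion of track points lie near that side:

```python
def infer_side(track, screen_width, tan_interval=(0.5, 5.0), ratio=0.6):
    """Guess which hand produced a sliding track.

    `track` is a list of (x, y) touch points. The tangent of the angle
    between the start-to-end line and the x-axis must fall in a value
    interval, and at least `ratio` of the points must lie in the half
    of the screen nearest the inferred side. The interval and ratio
    values here are assumptions, not values from the patent.
    """
    (x0, y0), (x1, y1) = track[0], track[-1]
    if x1 == x0:
        return None  # vertical line: tangent undefined
    tan = abs((y1 - y0) / (x1 - x0))
    if not (tan_interval[0] <= tan <= tan_interval[1]):
        return None
    near_left = sum(1 for x, _ in track if x < screen_width / 2) / len(track)
    if near_left >= ratio:
        return "left"
    if near_left <= 1 - ratio:
        return "right"
    return None

# An arc hugging the left edge of a 360-pixel-wide screen.
print(infer_side([(90, 300), (80, 260), (60, 230), (40, 220)], 360))  # left
```

The chord from (90, 300) to (40, 220) has tangent magnitude 1.6, inside the interval, and every point lies in the left half, so the left hand is inferred.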
  • The determining, by the terminal in response to the first gesture input by the user on the first interface, of the high frequency touch area on the first side of the terminal includes: determining, by the terminal in response to the first gesture, the start point coordinates and the end point coordinates of the sliding track of the first gesture; searching, by the terminal, the left-hand trajectory model and the right-hand trajectory model for a first sliding track whose start point and end point coordinates are distributed on the terminal interface in a way that matches the start point and end point coordinates of the sliding track of the first gesture, where the left-hand trajectory model includes the coordinates of at least one left-hand sliding track and the right-hand trajectory model includes the coordinates of at least one right-hand sliding track; and determining, when the terminal finds the first sliding track in the first-side trajectory model, the high frequency touch area on the first side of the terminal, where the first-side trajectory model is the left-hand trajectory model or the right-hand trajectory model.
  • In the present application, that the distribution of the start point and end point coordinates of the first sliding track matches the start point and end point coordinates of the sliding track of the first gesture specifically means that the start point coordinates of the first sliding track are the same as those of the sliding track of the first gesture, and the end point coordinates of the first sliding track are the same as those of the sliding track of the first gesture.
  • Optionally, what the left-hand trajectory model or the right-hand trajectory model includes may be a range of values for the start point coordinates and a range of values for the end point coordinates of the first sliding track.
  • In this case, that the distribution of the start point and end point coordinates of the first sliding track on the terminal interface matches the start point and end point coordinates of the sliding track of the first gesture specifically means that the start point coordinates of the sliding track of the first gesture fall within the value range of the start point coordinates of the first sliding track, and the end point coordinates of the sliding track of the first gesture fall within the value range of the end point coordinates of the first sliding track.
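The range-based matching just described can be sketched as follows; the model contents, the range values, and the function names are hypothetical. Each model entry stores value ranges ((x_min, x_max), (y_min, y_max)) for a recorded track's start and end points:

```python
# Hypothetical trajectory models for a 360-pixel-wide screen.
LEFT_MODEL = [
    {"start": ((0, 120), (200, 400)), "end": ((0, 160), (100, 300))},
]
RIGHT_MODEL = [
    {"start": ((240, 360), (200, 400)), "end": ((200, 360), (100, 300))},
]

def _in_range(point, ranges):
    # True when the point lies inside the (x, y) value ranges.
    (x, y), ((xlo, xhi), (ylo, yhi)) = point, ranges
    return xlo <= x <= xhi and ylo <= y <= yhi

def match_side(start, end):
    """Return the side whose model contains a first sliding track whose
    start/end coordinate ranges cover the gesture's start/end points."""
    for side, model in (("left", LEFT_MODEL), ("right", RIGHT_MODEL)):
        for track in model:
            if _in_range(start, track["start"]) and _in_range(end, track["end"]):
                return side
    return None

print(match_side((100, 320), (40, 150)))  # left
```

When neither model covers the gesture's end points, no side is inferred and the terminal would fall back to its default layout.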
  • Before the terminal determines the high frequency touch area on the first side of the terminal in response to the first gesture input by the user on the first interface, the method of the present application further includes: determining, by the terminal in response to a fourth gesture input by the user on the terminal interface, that the fourth gesture is a gesture input by a finger on the user's first side, and saving the coordinates of the sliding track of the fourth gesture in the first-side trajectory model.
  • That is, the terminal can, without the user noticing, collect the coordinates of the sliding tracks of multiple gestures input by the user (which reflect the user's gesture habits on the terminal's touch screen) and save them in the first-side trajectory model, so that after receiving the first gesture it can compare the gesture against the coordinates of the sliding tracks in the left-hand and right-hand trajectory models to determine whether the first gesture was input with the user's left hand or right hand.
  • Before the terminal determines the high frequency touch area on the first side of the terminal in response to the first gesture input by the user on the first interface, the method of the present application further includes: displaying, by the terminal, a third interface, where the third interface includes first prompt information used to prompt the user to slide on the terminal interface with the first-side finger; counting, by the terminal in response to at least two third gestures input by the user on the third interface, the coordinates of the sliding tracks of the at least two third gestures, to obtain the coordinates of at least one first-side finger sliding track, where a third gesture is a gesture input by a finger on the user's first side; and saving, by the terminal, the coordinates of the at least one first-side finger sliding track in the first-side trajectory model.
  • In other words, the terminal can guide the user to input gestures with the left hand or with the right hand.
  • When the terminal specifically guides the user to input left-hand gestures, it can collect the coordinates of the sliding tracks of the left-hand gestures the user inputs under that guidance; when the terminal specifically guides the user to input right-hand gestures, it can collect the coordinates of the sliding tracks of the user's right-hand gestures. In this way, the accuracy of the sliding-track coordinates saved in the first-side trajectory model can be improved.
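The guided collection above can be sketched as a small accumulator. The class, its method names, and the minimum-sample rule (mapping the "at least two third gestures" above to a usable model) are all illustrative assumptions:

```python
class TrajectoryModels:
    """Accumulate sliding-track coordinates collected while the
    terminal guides the user to swipe with a known hand."""

    def __init__(self, min_tracks=2):
        # "At least two third gestures" becomes a minimum sample count
        # before a side's model is treated as usable.
        self.min_tracks = min_tracks
        self.models = {"left": [], "right": []}

    def record(self, side, track):
        # `track` is the list of (x, y) points sampled along one swipe;
        # its hand label is known because the swipe was made under guidance.
        self.models[side].append(list(track))

    def ready(self, side):
        # A side's model is usable once enough swipes were collected.
        return len(self.models[side]) >= self.min_tracks

models = TrajectoryModels()
models.record("left", [(90, 300), (60, 240), (40, 200)])
print(models.ready("left"))  # False: only one guided swipe so far
models.record("left", [(100, 320), (70, 250), (45, 210)])
print(models.ready("left"))  # True
```

Because the hand is known at collection time, every saved track is correctly labelled, which is the accuracy benefit the passage describes.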
  • Further, in response to a third gesture input by the user on the third interface, the terminal may determine whether the third gesture is a gesture input by the user's left-hand finger or right-hand finger, and then ask the user to confirm whether the terminal's judgment is correct; when the user confirms that the judgment is correct, the coordinates of the corresponding sliding track are saved.
  • The method of the present application may further include: displaying, by the terminal, the third interface; determining, by the terminal in response to a third gesture input by the user on the third interface, that the third gesture is a gesture input by a finger on the user's first side, and displaying a fourth interface, where the fourth interface includes prompt information asking the user to confirm whether the third gesture is a gesture input by the finger on the user's first side; and saving, by the terminal in response to a first input of the user on the fourth interface, the coordinates of the sliding track of the third gesture in the first-side trajectory model, where the first input is used to indicate that the third gesture is a gesture input by the finger on the user's first side.
  • In this way, the terminal can not only guide the user to input gestures with the left hand or the right hand; after the user inputs a gesture as instructed, the terminal can also apply a two-step confirmation to decide with which hand the user input the gesture. That is, the terminal may first determine whether the third gesture is a gesture input by the user's left-hand finger or right-hand finger, and then ask the user to confirm whether that judgment is correct; only when the user confirms it does the terminal save the coordinates of the corresponding sliding track. Through this two-step confirmation, the accuracy of the sliding-track coordinates saved in the first-side trajectory model can be improved.
  • The present application further provides a terminal interface display method, where the method includes: determining, by the terminal in response to a first gesture input by the user on a first interface, a high frequency touch area on a first side of the terminal, where the first gesture is a gesture input by a finger on the user's first side, and the high frequency touch area is a touch area on the terminal interface whose frequency or number of operations by the user is higher than a first threshold; displaying a first touch panel in the high frequency touch area on the first side, where the first touch panel is configured to operate the first interface in response to gestures input by the user; and displaying, in response to a second gesture input by the user on the first touch panel, a second interface, where the second interface includes an interface element that the terminal would display in response to a third gesture input by the user at the corresponding position of the first interface.
  • The terminal interface display method provided by the present application can recognize the gesture the user inputs on the touch screen to determine whether the user is operating the phone with the left hand or the right hand, which avoids the high cost of additional hardware.
  • Moreover, when the user operates the terminal with a first-side finger (such as the left hand or the right hand), the terminal can display on the terminal interface a first touch panel through which the terminal interface can be operated, so that the user can reach all content on the terminal interface from the first touch panel. In this way, the user can operate areas of the terminal interface that the first-side finger cannot reach, without affecting the user's visual and operating experience.
  • Displaying the first touch panel in the high frequency touch area on the first side further makes it convenient for the user to operate all content on the terminal interface through the first touch panel.
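One plausible reading of the first touch panel is a reduced-size proxy of the full interface, with panel touches scaled up to interface coordinates. The sketch below rests on that assumption; the linear mapping, the rectangle layout, and the names are not specified by the patent:

```python
def panel_to_interface(point, panel_rect, screen_size):
    """Map a touch point inside the first touch panel to the
    corresponding position on the full first interface, so that a
    gesture on the panel acts like the same gesture at the mapped
    position. `panel_rect` is (x, y, width, height).
    """
    px, py, pw, ph = panel_rect
    sx, sy = screen_size
    x, y = point
    # Normalise within the panel, then scale to the whole screen.
    return ((x - px) / pw * sx, (y - py) / ph * sy)

# A tap at the centre of a 180x320 panel anchored at (0, 320) on a
# 360x640 screen maps to the centre of the whole interface.
print(panel_to_interface((90, 480), (0, 320, 180, 320), (360, 640)))  # (180.0, 320.0)
```

Under this mapping, the second gesture on the panel reproduces the effect of the third gesture at the corresponding position of the first interface, as the passage describes.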
  • For the method by which the terminal determines the high frequency touch area on the first side of the terminal in response to the first gesture input by the user on the first interface, refer to the related descriptions in the foregoing possible designs of the first aspect; details are not described herein again.
  • Before the terminal determines the high frequency touch area on the first side of the terminal in response to the first gesture input by the user on the first interface, the method of the present application further includes: determining, by the terminal in response to a fourth gesture input by the user on the terminal interface, that the fourth gesture is a gesture input by a finger on the user's first side, and saving the coordinates of the sliding track of the fourth gesture in the first-side trajectory model.
  • For details, refer to the descriptions in the possible designs of the first aspect of the present application.
  • The terminal can guide the user to input gestures with the left hand or the right hand, and save the coordinates of the sliding tracks of the gestures the user inputs.
  • For how the terminal specifically guides the user to input left-hand or right-hand gestures and saves the coordinates of the sliding tracks of those gestures, and for the corresponding effect analysis, refer to the descriptions in the possible designs of the first aspect of the present application; details are not repeated here.
  • The present application further provides a terminal interface display method, where the method includes: determining, by the terminal in response to a first gesture input by the user on a first interface, that the first gesture is a gesture input by a finger on the user's first side, where the first interface includes a first interface element, and the first interface element includes a navigation bar icon and/or a dock bar icon; and moving, by the terminal, the first interface element to a display area near the first side of the terminal for display.
  • That is, in response to the gesture input by the finger on the user's first side, the dock bar icon and/or the navigation bar icon in the terminal interface can be moved to a display area near the first side of the terminal for display.
  • In this way, the user can operate the dock bar icon and/or the navigation bar icon more conveniently and comfortably, which improves the user experience.
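Moving the dock bar or navigation bar icons toward one side can be sketched as re-anchoring their x-positions to a screen edge; the margin value, the geometry, and the function name are illustrative assumptions:

```python
def align_to_side(icon_xs, icon_width, side, screen_width, margin=8):
    """Re-anchor a row of dock-bar icon x-positions to one screen edge.

    Spacing between icons is preserved; only the anchor moves. This is
    one hypothetical realisation of moving the first interface element
    to a display area near the first side.
    """
    # Total horizontal extent of the icon row.
    span = max(icon_xs) + icon_width - min(icon_xs)
    if side == "left":
        shift = margin - min(icon_xs)
    else:
        shift = screen_width - margin - span - min(icon_xs)
    return [x + shift for x in icon_xs]

# Three 40-pixel icons centred on a 360-pixel screen, moved right.
print(align_to_side([60, 140, 220], 40, "right", 360))  # [152, 232, 312]
```

After the shift, the rightmost icon's edge sits exactly one margin from the right side, where a right-hand thumb can reach it.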
  • The present application further provides a terminal, where the terminal includes: an input unit, a determining unit, and a display unit.
  • The input unit is configured to receive a first gesture input by the user on the first interface, where the first gesture is a gesture input by a finger on the user's first side.
  • The determining unit is configured to determine, in response to the first gesture input by the user on the first interface and received by the input unit, the high frequency touch area on the first side of the terminal, where the high frequency touch area is a touch area on the terminal interface whose frequency or number of operations by the user is higher than the first threshold, and the first interface includes at least two application icons.
  • The display unit is configured to display at least one high frequency application icon in the high frequency touch area on the first side determined by the determining unit, where the at least one high frequency application icon is an application icon, among the at least two application icons, whose frequency or number of operations by the user is higher than the second threshold.
  • The display unit is specifically configured to display, in the high frequency touch area on the first side, a folder icon containing the at least one high frequency application icon.
  • The input unit is further configured to: after the display unit displays, in the high frequency touch area on the first side, the folder icon containing the at least one high frequency application icon, receive the user's input on the folder icon.
  • The display unit is further configured to display, in response to the user's input on the folder icon, a folder expansion window corresponding to the folder icon in the high frequency touch area on the first side, where the at least one high frequency application icon is displayed in the folder expansion window.
  • The determining unit is specifically configured to determine, according to the coordinates of at least one first-side finger sliding track in the first-side trajectory model, the high frequency touch area on the first side of the terminal.
  • The first-side trajectory model is the left-hand trajectory model or the right-hand trajectory model.
  • The right-hand trajectory model includes the coordinates of at least one right-hand sliding track.
  • The left-hand trajectory model includes the coordinates of at least one left-hand sliding track.
  • The determining unit is specifically configured to: calculate the tangent of the angle between the line connecting the start point and the end point of the sliding track of the first gesture and the x-axis or the y-axis of the coordinate system; and determine the high frequency touch area on the first side when the tangent value falls within the value interval corresponding to the first side of the terminal and a preset proportion of the points in the sliding track of the first gesture are close to the first side of the terminal.
  • The determining unit is specifically configured to: determine the start point coordinates and the end point coordinates of the sliding track of the first gesture; search the left-hand trajectory model and the right-hand trajectory model for a first sliding track whose start point and end point coordinates are distributed on the terminal interface in a way that matches the start point and end point coordinates of the sliding track of the first gesture, where the left-hand trajectory model includes the coordinates of at least one left-hand sliding track and the right-hand trajectory model includes the coordinates of at least one right-hand sliding track; and determine, when the first sliding track is found in the first-side trajectory model, the high frequency touch area on the first side of the terminal, where the first-side trajectory model is the left-hand trajectory model or the right-hand trajectory model.
  • The display unit is further configured to display, before the determining unit determines the high frequency touch area on the first side of the terminal, a third interface, where the third interface includes the first prompt information, and the first prompt information is used to prompt the user to slide on the terminal interface with the first-side finger.
  • The input unit is further configured to receive at least two third gestures input by the user on the third interface.
  • The terminal further includes: a statistics unit and a storage unit.
  • The statistics unit is configured to count, in response to the at least two third gestures input by the user on the third interface and received by the input unit, the coordinates of the sliding tracks of the at least two third gestures, to obtain the coordinates of at least one first-side finger sliding track, where a third gesture is a gesture input by a finger on the user's first side.
  • The storage unit is configured to save the coordinates of the at least one first-side finger sliding track in the first-side trajectory model.
  • The present application further provides a terminal, where the terminal includes: an input unit, a determining unit, and a display unit.
  • The input unit is configured to receive a first gesture input by the user on the first interface.
  • The determining unit is configured to determine, in response to the first gesture input by the user on the first interface and received by the input unit, the high frequency touch area on the first side of the terminal, where the first gesture is a gesture input by a finger on the user's first side, and the high frequency touch area is a touch area on the terminal interface whose frequency or number of operations by the user is higher than the first threshold.
  • the display unit is further configured to display the first touch panel on the first-level high-frequency touch area displayed by the display unit, where the first touch panel is configured to operate the first interface in response to a gesture input by a user.
  • the input unit is further configured to receive a second gesture of the input of the user in the first touch panel displayed by the display unit.
  • the display unit is further configured to display a second interface in response to the second gesture input by the user in the first touch panel received by the input unit, where the second interface includes the terminal in response to the user being in the foregoing
  • the interface element displayed by the third gesture input by the corresponding position of the first interface when the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal, and when the finger of the first side is the right-hand finger, the first side of the terminal is the right side of the terminal.
  • the method for determining the high frequency touch area of the first side of the terminal may be referred to the related description of the determining unit in the foregoing possible design method of the fourth aspect. This application will not be repeated here.
  • the input unit is further configured to receive a fourth gesture input by the user on the terminal interface before the determining unit determines the high frequency touch area on the first side of the terminal; the determining unit is further configured to determine, in response to the fourth gesture input by the user on the terminal interface, that the fourth gesture is a gesture input by a finger on the first side of the user; and a storage unit is configured to save the coordinates of the sliding track of the fourth gesture in a first side trajectory model.
  • the application provides a terminal, where the terminal includes: an input unit, a determining unit, and a display unit.
  • the input unit is configured to receive a first gesture input by the user on the first interface.
  • the determining unit is configured to determine, in response to the first gesture input by the user on the first interface, that the first gesture is a gesture input by a finger on the first side of the user, where the first interface includes a first interface element, and the first interface element includes a navigation bar icon and/or a dock bar icon;
  • the display unit is configured to move the first interface element to a display area close to the first side of the terminal for display.
  • the application provides a terminal, where the terminal includes: a processor, a memory, and a touch screen, where the memory and the touch screen are coupled to the processor, the memory is configured to store computer program code, and the computer program code includes computer instructions.
  • when the processor executes the computer instructions, the terminal performs the following operations: the touch screen is configured to display a first interface, where the first interface includes at least two application icons.
  • the processor is configured to determine a high frequency touch area on the first side of the terminal in response to a first gesture input by the user on the first interface displayed on the touch screen, where the first gesture is a gesture input by a finger on the first side of the user, and the high frequency touch area is a touch area on the terminal interface where the frequency or number of times of operation by the user is higher than a first threshold.
  • the touch screen is further configured to display at least one high frequency application icon in the high frequency touch area on the first side determined by the processor, where the at least one high frequency application icon is an application icon, among the at least two application icons, whose frequency or number of times of operation by the user is higher than a second threshold.
  • the touch screen is specifically configured to: display a folder icon including the at least one high frequency application icon in the high frequency touch area on the first side.
  • the processor is further configured to: after a folder icon including the at least one high frequency application icon is displayed in the high frequency touch area on the first side, receive the user's input on the folder icon displayed on the touch screen.
  • the touch screen is further configured to display, in response to the user's input on the folder icon, a folder expansion window corresponding to the folder icon in the high frequency touch area on the first side, where the folder expansion window displays the at least one high frequency application icon.
  • the processor is specifically configured to: determine the high frequency touch area on the first side of the terminal according to the coordinates of at least one first-side finger sliding track in a first side trajectory model, where the first side trajectory model is a left-hand trajectory model or a right-hand trajectory model, the left-hand trajectory model includes the coordinates of at least one left-hand sliding track, and the right-hand trajectory model includes the coordinates of at least one right-hand sliding track.
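The determination above reads the high-frequency touch area off stored track coordinates. A minimal illustrative sketch in Python follows; the bounding-box heuristic, function names, and the `margin` padding are assumptions for illustration, not the patent's actual implementation:

```python
def high_frequency_area(track_points, margin=40):
    """Return a rectangle (left, top, right, bottom) covering the stored
    first-side finger sliding-track coordinates, padded by a margin.

    track_points: list of (x, y) coordinates taken from the side
    trajectory model. The margin value is a made-up parameter.
    """
    xs = [x for x, _ in track_points]
    ys = [y for _, y in track_points]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)
```

Any region statistic over the stored coordinates (density clustering, heat map, etc.) could stand in for the bounding box; the point is only that the area is derived from previously recorded first-side tracks.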
  • the application provides a terminal, where the terminal includes: a processor, a memory, and a touch screen, where the memory and the touch screen are coupled to the processor, the memory is configured to store computer program code, and the computer program code includes computer instructions.
  • when the processor executes the computer instructions, the terminal performs the following operations: the touch screen is configured to display a first interface.
  • the processor is configured to determine a high frequency touch area on the first side of the terminal in response to a first gesture input by the user on the first interface displayed on the touch screen, where the first gesture is a gesture input by a finger on the first side of the user, and the high frequency touch area is a touch area on the terminal interface where the frequency or number of times of operation by the user is higher than a first threshold.
  • the touch screen is further configured to display the first touch panel in the high frequency touch area on the first side determined by the processor, where the first touch panel is configured to operate the first interface in response to a gesture input by a user.
  • the processor is further configured to receive a second gesture input by the user in the first touch panel displayed on the touch screen.
  • the touch screen is further configured to display a second interface in response to the second gesture input by the user in the first touch panel, where the second interface includes an interface element displayed by the terminal in response to a third gesture input by the user at a corresponding position of the first interface. When the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal; when the finger on the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.
  • the processor is specifically configured to: determine the high frequency touch area on the first side of the terminal according to the coordinates of at least one first-side finger sliding track in a first side trajectory model, where the first side trajectory model is a left-hand trajectory model or a right-hand trajectory model, the left-hand trajectory model includes the coordinates of at least one left-hand sliding track, and the right-hand trajectory model includes the coordinates of at least one right-hand sliding track.
  • the processor is specifically configured to: calculate the tangent of the line connecting the start point and the end point of the sliding track of the first gesture, that is, the angle between that line and the x-axis or the y-axis of the coordinate axes; and when the tangent value belongs to the value interval corresponding to the first side of the terminal and a preset proportion of the sliding track of the first gesture is close to the first side of the terminal, determine the high frequency touch area on that first side.
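The angle-plus-proportion test above can be pictured roughly as follows. This is a sketch under assumed thresholds: the patent does not give numeric value intervals or a concrete "preset proportion", so the interval, the 0.8 proportion, and the side predicate are all illustrative assumptions:

```python
import math

def slide_angle(track):
    """Angle, in degrees, between the line connecting the start point and
    the end point of a sliding track and the x-axis."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    return abs(math.degrees(math.atan2(y1 - y0, x1 - x0)))

def is_first_side(track, angle_interval, near_side, proportion=0.8):
    """True when the chord angle falls inside the value interval assumed
    for the first side AND at least `proportion` of the track points lie
    in the screen region near that side (both thresholds are assumed)."""
    lo, hi = angle_interval
    if not (lo <= slide_angle(track) <= hi):
        return False
    near = sum(1 for p in track if near_side(p)) / len(track)
    return near >= proportion
```

For example, a thumb arc hugging the right half of a 1080-pixel-wide screen could be tested with `near_side=lambda p: p[0] > 540` and an assumed interval such as `(45.0, 80.0)`.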
  • the processor is specifically configured to: determine the start point coordinates and the end point coordinates of the sliding track of the first gesture; search the left-hand trajectory model and the right-hand trajectory model for a first sliding track whose start point coordinates and end point coordinates, as distributed on the terminal interface, match the start point coordinates and the end point coordinates of the sliding track of the first gesture, where the left-hand trajectory model includes the coordinates of at least one left-hand sliding track and the right-hand trajectory model includes the coordinates of at least one right-hand sliding track; and when the first sliding track is found in the first side trajectory model, determine the high frequency touch area on the first side of the terminal, where the first side trajectory model is the left-hand trajectory model or the right-hand trajectory model.
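The model lookup described above can be pictured as a nearest-endpoint match. The pixel tolerance, function name, and list-of-points representation below are assumptions chosen for illustration; the patent only requires that start and end coordinates "match":

```python
import math

def find_matching_track(gesture, model, tolerance=80):
    """Return the first stored sliding track whose start and end point
    coordinates both lie within `tolerance` pixels of the gesture's start
    and end point coordinates, or None when the model has no match.

    gesture: [(x0, y0), ..., (xn, yn)] sliding-track coordinates.
    model: list of stored tracks (a left-hand or right-hand trajectory model).
    """
    def close(a, b):
        return math.dist(a, b) <= tolerance  # Euclidean distance

    for track in model:
        if close(track[0], gesture[0]) and close(track[-1], gesture[-1]):
            return track
    return None
```

Running the same lookup against both the left-hand and right-hand models, and seeing which one yields a hit, would then identify the first side.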
  • the touch screen is further configured to display a third interface before the processor determines the high frequency touch area on the first side of the terminal, where the third interface includes first prompt information, and the first prompt information is used to prompt the user to slide on the terminal interface with a finger on the first side.
  • the processor is further configured to receive at least two third gestures input by the user in the third interface displayed on the touch screen, and, in response to the at least two third gestures, collect statistics on the coordinates of the sliding tracks of the at least two third gestures to obtain the coordinates of at least one first-side finger sliding track, where each third gesture is a gesture input by a finger on the first side of the user; and the memory is further configured to store the coordinates of the at least one first-side finger sliding track in the first side trajectory model.
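The calibration flow above (prompt the user, collect at least two first-side gestures, store their track coordinates) might be held in a structure like the following. The class and its member names are hypothetical, introduced only to make the flow concrete:

```python
class SideTrajectoryModel:
    """Hypothetical container for a first-side trajectory model: the
    terminal prompts the user to slide with the first-side finger,
    records the coordinates of each sliding track, and only uses the
    model once at least `min_samples` gestures (two, per the
    description above) have been stored."""

    def __init__(self, min_samples=2):
        self.min_samples = min_samples
        self.tracks = []

    def record(self, track):
        # Save the coordinates of one third-gesture sliding track.
        self.tracks.append(list(track))

    def ready(self):
        # The claims collect at least two third gestures before use.
        return len(self.tracks) >= self.min_samples
```

One such model per hand (left and right) would then back the lookups described in the earlier aspects.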
  • the processor is further configured to: before determining, in response to the first gesture input by the user on the first interface, whether the first gesture is a gesture input by the user's left hand or right hand, determine, in response to a fourth gesture input by the user on the terminal interface, that the fourth gesture is a gesture input by a finger on the first side of the user; and the memory is further configured to save the coordinates of the sliding track of the fourth gesture in the first side trajectory model.
  • the application provides a terminal, where the terminal includes: a processor, a memory, and a touch screen, where the memory and the touch screen are coupled to the processor, the memory is configured to store computer program code, and the computer program code includes computer instructions.
  • when the processor executes the computer instructions, the terminal performs the following operations: the processor is configured to receive a first gesture input by the user on a first interface and determine, in response to the first gesture, that the first gesture is a gesture input by a finger on the first side of the user, where the first interface includes a first interface element, and the first interface element includes a navigation bar icon and/or a dock bar icon; and the touch screen is configured to move the first interface element to a display area close to the first side of the terminal for display.
  • the present application provides a graphical user interface (GUI), where the graphical user interface is stored in a terminal, the terminal includes a touch screen, a memory, and a processor, and the processor is configured to execute one or more computer programs stored in the memory.
  • the graphical user interface comprising: a first GUI displayed on the touch screen, the first GUI comprising at least two application icons.
  • in response to a first gesture input in the first GUI, a second GUI is displayed, where the high frequency touch area on the first side of the second GUI includes at least one high frequency application icon, the first gesture is a gesture input by a finger on the first side of the user, the high frequency touch area is a touch area on the second GUI where the frequency or number of times of operation by the user is higher than a first threshold, and the at least one high frequency application icon is an application icon, among the at least two application icons, whose frequency or number of times of operation by the user is higher than a second threshold.
  • the second GUI includes a folder icon, and the folder icon includes the at least one high frequency application icon; the GUI further includes: a third GUI displayed in response to an input on the folder icon in the second GUI, where the third GUI includes a folder expansion window corresponding to the folder icon, and the folder expansion window displays the at least one high frequency application icon.
  • the application provides a graphical user interface (GUI), where the graphical user interface is stored in a terminal, the terminal includes a touch screen, a memory, and a processor, and the processor is configured to execute one or more computer programs stored in the memory. The graphical user interface includes: a first GUI on the touch screen; and a second GUI displayed in response to a first gesture input in the first GUI, where the high frequency touch area on the first side of the second GUI includes a first touch panel, the first touch panel is configured to operate the first GUI in response to a gesture input by the user, the first gesture is a gesture input by a finger on the first side of the user, and the high frequency touch area is a touch area on the second GUI where the frequency or number of times of operation by the user is higher than a first threshold.
  • the GUI further includes: a fourth GUI displayed on the touch screen, where the fourth GUI includes first prompt information, and the first prompt information is used to prompt the user to slide on the fourth GUI with a finger on the first side.
  • the present application provides a graphical user interface (GUI), where the graphical user interface is stored in a terminal, the terminal includes a touch screen, a memory, and a processor, and the processor is configured to execute one or more computer programs stored in the memory.
  • the graphical user interface includes: a first GUI displayed on the touch screen, where the first GUI includes a first interface element, and the first interface element includes a navigation bar icon and/or a dock bar icon; and a second GUI displayed in response to a first gesture input in the first GUI, where the first interface element is included in a display area close to the first side of the second GUI.
  • the application provides a computer storage medium, where the computer storage medium includes computer instructions that, when run on a terminal, cause the terminal to perform the display method of the terminal interface described in any possible design method of the first aspect, the second aspect, or the third aspect.
  • the present application provides a computer program product that, when run on a computer, causes the computer to perform the display method of the terminal interface described in any possible design method of the first aspect, the second aspect, or the third aspect of the present application.
  • the terminals according to the fourth aspect to the ninth aspect and their possible design methods, the GUIs of the tenth to twelfth aspects, the computer storage medium of the thirteenth aspect, and the computer program product of the fourteenth aspect are all used to perform the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods, and details are not described herein again.
  • FIG. 1 is a schematic diagram 1 of an example of a terminal interface of a mobile phone provided by the present application
  • FIG. 2 is a schematic structural diagram of hardware of a mobile phone provided by the present application.
  • FIG. 3 is a flowchart 1 of a display method of a terminal interface provided by the present application.
  • FIG. 4 is a second schematic diagram of a terminal interface of a mobile phone provided by the present application.
  • FIG. 5 is a schematic diagram of a mapping example between a first interface and a first touch panel provided by the present application
  • 6A is a third schematic diagram of a terminal interface of a mobile phone provided by the present application.
  • 6B is a fourth schematic diagram of a terminal interface of a mobile phone provided by the present application.
  • FIG. 7 is a schematic diagram 5 of a terminal interface of a mobile phone provided by the present application.
  • FIG. 8 is a schematic diagram 6 of an example of a terminal interface of a mobile phone provided by the present application.
  • FIG. 9 is a second flowchart of a display method of a terminal interface provided by the present application.
  • FIG. 10 is a schematic diagram of an example of a touch point in a coordinate axis and a coordinate axis on a mobile phone provided by the present application;
  • FIG. 11 is a flowchart 3 of a display method of a terminal interface provided by the present application.
  • FIG. 12 is a schematic diagram of an example of a trajectory database provided by the present application.
  • FIG. 13 is a schematic diagram of an example of a network architecture applied to a method for displaying a terminal interface according to the present application.
  • FIG. 14 is a flowchart 4 of a display method of a terminal interface provided by the present application.
  • FIG. 15 is a schematic diagram 7 of an example of a terminal interface of a mobile phone provided by the present application.
  • 16 is a schematic diagram 8 of an example of a terminal interface of a mobile phone provided by the present application.
  • 17 is a flowchart 5 of a display method of a terminal interface provided by the present application.
  • FIG. 18 is a schematic diagram 1 of an example of a sliding track provided by the present application.
  • FIG. 19 is a second schematic diagram of a sliding track provided by the present application.
  • 20 is a schematic diagram of an example of a high frequency touch area provided by the present application.
  • 21 is a flowchart 6 of a display method of a terminal interface provided by the present application.
  • 22A is a schematic diagram 9 of an example of a terminal interface of a mobile phone provided by the present application.
  • 22B is a flowchart 7 of a display method of a terminal interface provided by the present application.
  • 22C is a schematic diagram 10 of an example of a terminal interface of a mobile phone provided by the present application.
  • 22D is a schematic diagram 11 of an example of a terminal interface of a mobile phone provided by the present application.
  • FIG. 23 is a flowchart 8 of a display method of a terminal interface provided by the present application.
  • 24 is a schematic diagram 12 of an example of a terminal interface of a mobile phone provided by the present application.
  • 25 is a schematic diagram 13 of an example of a terminal interface of a mobile phone provided by the present application.
  • 26 is a schematic diagram of an example of a terminal interface of a mobile phone provided by the present application.
  • FIG. 27 is a schematic diagram of an example of a terminal interface of a mobile phone provided by the present application.
  • FIG. 28 is a schematic diagram 16 of an example of a terminal interface of a mobile phone provided by the present application.
  • 29 is a schematic structural diagram 1 of a terminal structure provided by the present application.
  • FIG. 30 is a second schematic structural diagram of a terminal provided by the present application.
  • the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "a plurality of" means two or more unless otherwise stated.
  • when the user uses a touch screen mobile phone whose touch screen is large, the user cannot conveniently operate the mobile phone with one hand.
  • when the user holds the mobile phone 100 with the left hand, the user's left hand cannot touch and operate an application icon displayed at the upper right of the touch screen of the mobile phone 100, such as the "China Merchants Bank" icon 01;
  • the user's right hand cannot touch and operate the application icon displayed on the upper left of the touch screen of the mobile phone 100, such as the "photo" icon 02.
  • the display method and terminal of the terminal interface provided by this application can recognize the gesture input by the user on the touch screen to determine whether the user is operating the mobile phone with the left hand or the right hand, thereby avoiding the high cost caused by additional hardware.
  • the terminal may display, close to the first side of the terminal, a touch area that can be used to operate the terminal interface when the user operates the terminal with a finger on the first side (such as the left hand or the right hand), so that the user can operate all the content on the terminal interface within that touch area.
  • in this way, the user can operate the areas of the terminal interface that the first-side finger cannot reach, without affecting the user's visual and operation experience.
  • for example, the user can operate, in the touch area shown in (b) of FIG., the "photo" icon 02 that the hand cannot touch.
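One way to picture the touch-area behavior described above is a simple coordinate scaling from the small panel to the full interface: a gesture inside the panel is mapped to the corresponding position on the whole screen. The function below is a purely illustrative Python sketch; its names and all of the geometry (panel rectangle, screen size) are assumptions, not the patent's implementation:

```python
def map_panel_to_interface(panel_point, panel_rect, screen_size):
    """Scale a point inside the touch panel to the full interface.

    panel_point: (x, y) touch coordinate inside the panel.
    panel_rect:  (left, top, width, height) of the panel on screen.
    screen_size: (width, height) of the full interface.
    """
    px, py = panel_point
    left, top, pw, ph = panel_rect
    sw, sh = screen_size
    # Normalize the touch within the panel, then scale to screen coordinates.
    return ((px - left) / pw * sw, (py - top) / ph * sh)
```

Under this mapping, a tap near a corner of a small panel anchored at the bottom of the screen lands near the corresponding corner of the full interface, which is how a left-thumb-reachable area could drive, for example, an icon at the upper right.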
  • the execution body of the display method of the terminal interface provided by the present application may be a display device of the terminal interface, and the display device of the terminal interface may be the mobile phone 100 shown in FIG. 1 or FIG. 2 .
  • the display device of the terminal interface may also be a central processing unit (CPU) of the terminal, or a control module in the terminal for executing the display method of the terminal interface.
  • the following describes the display method of the terminal interface provided by the embodiments of the present application.
  • the terminal in the present application may be a mobile phone (such as the mobile phone 100 shown in FIG. 2), a tablet computer, a personal computer (PC), a personal digital assistant (PDA), a smart watch, a netbook, a wearable electronic device, or any other device that can install an application and display an application icon; the specific form of the device is not particularly limited in the present application.
  • the mobile phone 100 is used as an example of the terminal.
  • the mobile phone 100 may specifically include: a processor 101, a radio frequency (RF) circuit 102, a memory 103, a touch screen 104, a Bluetooth device 105, one or more sensors 106, a WiFi device 107, a positioning device 108, an audio circuit 109, a peripheral interface 110, and the like.
  • These components can communicate over one or more communication buses or signal lines (not shown in Figure 2).
  • the hardware structure shown in FIG. 2 does not constitute a limitation to the mobile phone, and the mobile phone 100 may include more or less components than those illustrated, or some components may be combined, or different component arrangements.
  • the processor 101 is the control center of the mobile phone 100: it connects the various parts of the mobile phone 100 by using various interfaces and lines, and performs the various functions of the mobile phone 100 and processes data by running or executing applications stored in the memory 103 and calling data stored in the memory 103.
  • the processor 101 may include one or more processing units; for example, the processor 101 may be a Kirin 960 chip manufactured by Huawei Technologies Co., Ltd.
  • the processor 101 may further include a fingerprint verification chip for verifying the collected fingerprint.
  • the radio frequency circuit 102 can be used to receive and transmit wireless signals during transmission or reception of information or calls.
  • the radio frequency circuit 102 can receive downlink data from a base station and then deliver it to the processor 101 for processing; in addition, it can send uplink data to the base station.
  • radio frequency circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency circuit 102 can also communicate with other devices through wireless communication.
  • the wireless communication can use any communication standard or protocol, including but not limited to global mobile communication systems, general packet radio services, code division multiple access, wideband code division multiple access, long term evolution, email, short message service, and the like.
  • the memory 103 is used to store applications and data, and the processor 101 executes various functions and data processing of the mobile phone 100 by running applications and data stored in the memory 103.
  • the memory 103 mainly includes a program storage area and a data storage area, where the program storage area can store an operating system and the applications required for at least one function (such as a sound playing function or an image playing function), and the data storage area can store data created according to the use of the mobile phone 100 (such as audio data or a phone book).
  • the memory 103 may include a high speed random access memory (RAM), and may also include a non-volatile memory such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the memory 103 can store various operating systems, for example, the operating system developed by Apple Inc., the operating system developed by Google Inc., and the like.
  • the above memory 103 may be independent and connected to the processor 101 via the above communication bus; the memory 103 may also be integrated with the processor 101.
  • the touch screen 104 may specifically include a touch panel 104-1 and a display 104-2.
  • the touch panel 104-1 can collect touch events performed by the user on or near the mobile phone 100 (for example, operations performed by the user on or near the touch panel 104-1 with a finger, a stylus, or any other suitable object), and send the collected touch information to another component (for example, the processor 101).
  • a touch event performed by the user near the touch panel 104-1 may be referred to as a hovering touch; a hovering touch means that the user does not need to directly touch the touchpad to select, move, or drag a target (for example, an icon), and only needs to be located near the device to perform the desired function.
  • the touch panel 104-1 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • the display (also referred to as a display screen) 104-2 can be used to display information entered by the user or information provided to the user, as well as the various menus of the mobile phone 100.
  • the display 104-2 can be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the touchpad 104-1 can be overlaid on the display 104-2, and when the touchpad 104-1 detects a touch event on or near it, it is transmitted to the processor 101 to determine the type of touch event, and then the processor 101 may provide a corresponding visual output on display 104-2 depending on the type of touch event.
  • although in FIG. 2 the touchpad 104-1 and the display 104-2 are shown as two separate components implementing the input and output functions of the mobile phone 100, in some embodiments the touchpad 104-1 may be integrated with the display 104-2 to implement the input and output functions of the mobile phone 100. It should be understood that the touch screen 104 is formed by stacking multiple layers of material; only the touch panel (layer) and the display (layer) are shown in the embodiment of the present application, and the other layers are not described here.
  • the touch panel 104-1 may be disposed on the front surface of the mobile phone 100 in the form of a full panel, and the display 104-2 may also be disposed on the front surface of the mobile phone 100 in the form of a full panel, so that the front of the mobile phone can have a bezel-less structure.
  • the mobile phone 100 can also have a fingerprint recognition function.
  • the fingerprint collection device 112 can be configured on the back of the mobile phone 100 (for example, below the rear camera) or on the front of the mobile phone 100 (for example, below the touch screen 104).
  • the fingerprint collection device 112 can be configured in the touch screen 104 to implement the fingerprint recognition function, that is, the fingerprint collection device 112 can be integrated with the touch screen 104 to implement the fingerprint recognition function of the mobile phone 100.
  • the fingerprint capture device 112 is disposed in the touch screen 104 and may be part of the touch screen 104 or may be otherwise disposed in the touch screen 104.
  • the main component of the fingerprint collection device 112 in the embodiment of the present application is a fingerprint sensor, which can employ any type of sensing technology, including but not limited to optical, capacitive, piezoelectric or ultrasonic sensing technologies.
  • the mobile phone 100 may also include a Bluetooth device 105 for enabling data exchange between the handset 100 and other short-range devices (eg, mobile phones, smart watches, etc.).
  • the Bluetooth device in the embodiment of the present application may be an integrated circuit or a Bluetooth chip or the like.
  • the handset 100 can also include at least one type of sensor 106, such as a light sensor, motion sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display of the touch screen 104 according to the brightness of the ambient light, and the proximity sensor may turn off the power of the display when the mobile phone 100 moves to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes). When it is stationary, it can detect the magnitude and direction of gravity.
  • the mobile phone 100 can also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described here.
  • the WiFi device 107 is configured to provide the mobile phone 100 with network access complying with the WiFi-related standard protocol, and the mobile phone 100 can access the WiFi access point through the WiFi device 107, thereby helping the user to send and receive emails, browse web pages, and access streaming media. It provides users with wireless broadband Internet access.
  • the WiFi device 107 can also function as a WiFi wireless access point, and can provide WiFi network access for other devices.
  • the positioning device 108 is configured to provide a geographic location for the mobile phone 100. It can be understood that the positioning device 108 can specifically be a receiver of a positioning system such as the Global Positioning System (GPS), the Beidou satellite navigation system, or the Russian GLONASS system. After receiving the geographic location transmitted by the positioning system, the positioning device 108 sends the information to the processor 101 for processing, or sends it to the memory 103 for storage. In some other embodiments, the positioning device 108 can also be a receiver of an Assisted Global Positioning System (AGPS), which assists the positioning device 108 in performing ranging and positioning services by acting as an assistance server: the assistance server provides positioning assistance over a wireless communication network in communication with the positioning device 108 (that is, the GPS receiver) of a device such as the mobile phone 100.
  • the positioning device 108 can also use WiFi access point based positioning technology. Since each WiFi access point has a globally unique Media Access Control (MAC) address, the device can scan and collect the broadcast signals of surrounding WiFi access points when WiFi is turned on, and can therefore obtain the MAC addresses broadcast by the WiFi access points. The device sends the data capable of identifying the WiFi access points (such as the MAC addresses) to a location server through the wireless communication network; the location server retrieves the geographic location of each WiFi access point, calculates the geographic location of the device with reference to the strength of the WiFi broadcast signals, and sends it to the positioning device 108 of the device.
  • the data such as the MAC address
  • the location server retrieves the geographical location of each WiFi access point and combines
  • the strength of the WiFi broadcast signal is calculated, and the geographic location of the device is calculated and sent to the location device 108 of the device.
  • the audio circuit 109, the speaker 113, and the microphone 114 can provide an audio interface between the user and the handset 100.
  • On one hand, the audio circuit 109 can convert received audio data into an electrical signal and transmit it to the speaker 113, which converts it into a sound signal for output; on the other hand, the microphone 114 converts a collected sound signal into an electrical signal, which the audio circuit 109 receives and converts into audio data, and then outputs the audio data to the RF circuit 102 for transmission to, for example, another mobile phone, or outputs the audio data to the memory 103 for further processing.
  • The peripheral interface 110 is used to provide various interfaces for external input/output devices (such as a keyboard, a mouse, an external display, an external memory, and a subscriber identity module card). For example, it is connected to a mouse through a Universal Serial Bus (USB) interface, and is connected to a Subscriber Identification Module (SIM) card provided by the service provider through metal contacts in the card slot of the subscriber identity module. The peripheral interface 110 can be used to couple the external input/output peripherals described above to the processor 101 and the memory 103.
  • the mobile phone 100 can communicate with other devices in the device group through the peripheral interface 110.
  • For example, the peripheral interface 110 can receive display data sent by other devices for display; no restrictions are imposed on this.
  • The mobile phone 100 may further include a power supply device 111 (such as a battery and a power management chip) that supplies power to the various components. The battery may be logically connected to the processor 101 through the power management chip, so that functions such as charging, discharging, and power consumption management are managed through the power supply device 111.
  • the mobile phone 100 may further include a camera (front camera and/or rear camera), a flash, a micro projection device, a near field communication (NFC) device, and the like, and details are not described herein.
  • The present application provides a display method for a terminal interface; the display method includes S301-S303:
  • S301. The terminal determines a high-frequency touch area on the first side of the terminal in response to a first gesture input by the user on the first interface.
  • the finger on the first side of the user is the left hand finger or the right hand finger of the user.
  • Specifically, in response to the first gesture input by the user on the first interface, the terminal may first determine whether the first gesture is a gesture input by the user's left hand or by the user's right hand; when determining that the first gesture is a gesture input by the user's left hand, the terminal determines the high-frequency touch area on the left side of the terminal; when determining that the first gesture is a gesture input by the user's right hand, the terminal determines the high-frequency touch area on the right side of the terminal.
  • That is, the above S301 can be replaced with S301a-S301b, so that the display method of the terminal interface includes S301a-S301b, S302, and S303:
  • S301a. The terminal determines, in response to the first gesture input by the user on the first interface, that the first gesture is a gesture input by the finger of the first side of the user.
  • For example, the mobile phone 100 can display the first interface 401 shown in (a) of FIG. 4.
  • the first interface displayed by the terminal in the present application includes, but is not limited to, the display desktop 401 including the application icon shown in (a) of FIG. 4 .
  • the first interface may also be any display interface of any application in the terminal.
  • The first gesture may be a sliding track input by the user in any area of the first interface.
  • the sliding trajectory 402 input by the user in the first interface 401 may be a sliding trajectory corresponding to the first gesture.
  • For example, the terminal may calculate the tangent value of the angle between the line connecting the start point and the end point of the sliding track of the first gesture and the x-axis or the y-axis of the coordinate axes, and determine, according to the value interval into which the tangent value falls and the distribution of the points of the sliding track of the first gesture on the terminal interface, whether the first gesture is a gesture input by the user's left hand or by the user's right hand.
  • Alternatively, the terminal may determine the start point coordinates and end point coordinates of the sliding track of the first gesture, and then search the pre-saved left-hand trajectory model and right-hand trajectory model for a first sliding track whose start point coordinates and end point coordinates match, in their distribution on the terminal interface, those of the sliding track of the first gesture. If the terminal finds the first sliding track in the first side trajectory model (such as the left-hand trajectory model), it may determine that the first gesture is a gesture input by the finger of the first side of the user (such as the left hand).
  • S301b The terminal determines a high frequency touch area on the first side of the terminal.
  • The high-frequency touch area is a touch area on the terminal interface where the frequency or number of times of the user's operations is higher than a first threshold.
  • the high frequency touch area on the first side may be a high frequency touch area on the left side of the terminal or a high frequency touch area on the right side.
  • For example, the terminal may acquire the sliding tracks of left-hand gestures (i.e., gestures input by the left-hand fingers) and right-hand gestures (i.e., gestures input by the right-hand fingers) input by the user on the terminal interface; count the points of the sliding tracks of the left-hand gestures (referred to as left track points) and determine the area of the terminal interface where the left track points are densely distributed as the high-frequency touch area on the left side; and count the points of the sliding tracks of the right-hand gestures (referred to as right track points) and determine the area of the terminal interface where the right track points are densely distributed as the high-frequency touch area on the right side. An area where track points are densely distributed is an area of the terminal interface where the density of track points is higher than a certain threshold.
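The density-based determination described above can be sketched as follows. This is a minimal illustration, not the application's actual implementation: the fixed grid-cell size, the threshold value, and the function name `high_frequency_cells` are all assumptions.

```python
# Sketch: determine a high-frequency touch area from recorded track points.
# Assumptions (not from the application): the screen is divided into a
# fixed grid, and a cell belongs to the high-frequency area when the
# number of track points falling into it exceeds a threshold.
from collections import Counter

def high_frequency_cells(track_points, cell=100, threshold=3):
    """track_points: list of (x, y) touch coordinates in pixels.
    Returns the set of (col, row) grid cells whose point count
    exceeds the threshold."""
    counts = Counter((x // cell, y // cell) for x, y in track_points)
    return {c for c, n in counts.items() if n > threshold}

# Example: points clustered near the lower right of a 1080x1920 screen.
points = [(1000, 1750), (1010, 1760), (1020, 1770), (1005, 1755), (60, 60)]
print(high_frequency_cells(points))  # -> {(10, 17)}
```

The single stray point near the upper left never reaches the threshold, so only the lower-right cell survives; the same counting can be run separately over left track points and right track points to obtain the two side areas.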
  • S302. The terminal displays a first touch panel in the high-frequency touch area on the first side, where the first touch panel is configured to operate the first interface in response to gestures input by the user.
  • When the finger of the first side of the user is the left-hand finger, the first side of the terminal is the left side of the terminal; when the finger of the first side of the user is the right-hand finger, the first side of the terminal is the right side of the terminal.
  • For example, if the mobile phone 100 determines that the gesture corresponding to the sliding track 402 (i.e., the first gesture) is a gesture input by the user's right-hand finger, then, as shown in (b) of FIG. 4, the mobile phone 100 can display the first touch panel 403 in the high-frequency touch area on the right side of the first interface 401.
  • The left side of the terminal refers to the part of the touch screen of the terminal, divided into two parts along the vertical center line (parallel to the left and right side frames of the mobile phone), that is close to the left side frame of the mobile phone; the right side of the terminal refers to the part of the touch screen, divided along the same vertical center line, that is close to the right side frame of the mobile phone.
  • Specifically, the mobile phone 100 can display the first touch panel 403 at the lower right of its touch screen. Correspondingly, when the first side is the left side, the first touch panel may be displayed at the lower left of the touch screen of the terminal.
  • S303. The terminal displays a second interface in response to a second gesture input by the user on the first touch panel, where the second interface includes the interface elements that the terminal would display in response to the user inputting a third gesture at the corresponding position of the first interface.
  • The first touch panel is configured to operate the first interface in response to gestures input by the user; that is, the user's operation on the first touch panel may be mapped to the same operation on the first interface. In other words, the touch points on the first touch panel may be mapped one by one to the touch points at the corresponding positions of the first interface. For example, as shown in FIG. 5, the touch point a on the first touch panel 403 may be mapped to the touch point A in the first interface 401, the touch point b on the first touch panel 403 may be mapped to the touch point B in the first interface 401, and the touch point c on the first touch panel 403 may be mapped to the touch point C in the first interface 401.
  • Suppose the terminal would display a terminal interface X in response to the user's click operation on the touch point A in the first interface 401. Then, when the user clicks the touch point a on the first touch panel 403, the terminal may display, in response to that click operation, a terminal interface Y including all the interface elements of the terminal interface X. The first touch panel 403 can also be included in the terminal interface Y; certainly, the first touch panel 403 may instead not be included in the terminal interface Y, in which case the terminal interface Y is completely the same as the terminal interface X.
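The one-to-one mapping from touch points on the first touch panel to touch points on the first interface can be sketched as a linear coordinate transform. The application does not specify the mapping formula, so the rectangle representation and the scaling rule below are assumptions for illustration only.

```python
# Sketch: map a touch point inside the first touch panel to the
# corresponding point of the full first interface. Rectangles are
# (left, top, width, height); the linear scaling is an assumption.
def map_touch(point, panel, screen):
    px, py = point
    pl, pt, pw, ph = panel
    sl, st, sw, sh = screen
    # Scale the position relative to the panel up to the full screen.
    return (sl + (px - pl) * sw / pw, st + (py - pt) * sh / ph)

# Example: a 270x480 panel at the lower right of a 1080x1920 screen.
screen = (0, 0, 1080, 1920)
panel = (810, 1440, 270, 480)
print(map_touch((810, 1440), panel, screen))   # -> (0.0, 0.0)
print(map_touch((945, 1680), panel, screen))   # -> (540.0, 960.0)
```

With this transform, point a at the panel's top-left corresponds to point A at the screen's top-left, and the panel's center corresponds to the screen's center, which matches the one-by-one mapping of touch points described above.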
  • For example, the touch point a on the first touch panel 403 can be mapped to the touch point A where the icon of the "Photos" application in the first interface 401 is located. Then, as shown in (a) of FIG. 6A, when the user clicks the touch point a on the first touch panel 403, the mobile phone 100 can open the "Photos" application and display the photo list interface 601 and the first touch panel 403 shown in (b) of FIG. 6A.
  • the first touch panel 403 in (b) of FIG. 6A is optional, and the first touch panel 403 may not be displayed in the mobile phone 100 in (b) of FIG. 6A.
  • Optionally, when the user clicks the touch point a on the first touch panel 403, the mobile phone 100 can display a cursor 602 at the position of the icon of the "Photos" application. Moreover, when the user clicks the touch point a on the first touch panel 403 and the icon of the "Photos" application is thereby clicked, the mobile phone 100 can also display the icon of the "Photos" application in the dynamic display mode used when that icon is clicked.
  • Optionally, the first touch panel may also display some operable interface elements of the first interface, such as a "back" button and a "share" button.
  • the first touch panel 403 may further include a "return” button 604 of the photo list interface 601.
  • the mobile phone 100 can display the display interface shown in (b) of FIG. 6B.
  • the first touch panel 403 shown in (b) of FIG. 6B may further include a "return camera” button 605 compared to the first touch panel 403 shown in (b) of FIG. 6A.
  • the mobile phone 100 can activate the camera in response to the user's click operation on the "return camera” button 605.
  • The terminal can determine the size and shape of the first touch panel according to the user's habit of using the terminal. For example, taking the user holding the mobile phone 100 with the right hand as an example, as shown in (a) of FIG. 7, the mobile phone 100 can count, when the user holds the phone with the right hand, the farthest distance L1 that the user's right thumb can reach from the right side frame of the mobile phone 100, and the farthest distance L2 that the right thumb can reach from the lower frame of the mobile phone 100. Then, the mobile phone 100 can determine, according to the sizes of L1 and L2, the first touch panel 403 to be displayed when the first gesture is a gesture input by the user's right-hand finger, as shown in (b) of FIG. 7.
  • the first touch panel in the present application includes, but is not limited to, the first touch panel 403 shown in (b) of FIG. 7 .
  • Alternatively, the mobile phone 100 can count, when the user holds the phone with the right hand, the farthest distance L1 that the user's right thumb can reach from the right side frame of the mobile phone 100, and the farthest distance L2 that the right thumb can reach from the lower frame of the mobile phone 100.
  • the handset 100 can then determine the sector curve 801 as shown in (a) of FIG. 8 based on the sizes of L1 and L2.
  • the mobile phone 100 may display the first touch panel 802 corresponding to the shape of the sector curve 801 as shown in (b) of FIG. 8 .
  • It should be noted that FIG. 7 and FIG. 8 show, by way of example, only two possible examples of the first touch panel in the present application. The size and shape of the first touch panel include, but are not limited to, those shown in FIG. 7 and FIG. 8.
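The sector-shaped panel of FIG. 8 can be approximated by a quarter ellipse anchored at the lower-right corner with semi-axes L1 and L2. The application only states that the sector curve 801 is determined from L1 and L2; modeling it as an ellipse, and all the concrete numbers below, are assumptions for illustration.

```python
# Sketch: decide whether a touch point lies inside a sector-like first
# touch panel anchored at the lower-right corner of the screen.
# Assumption (not from the application): the sector curve 801 is modeled
# as a quarter ellipse with semi-axes L1 (horizontal thumb reach) and
# L2 (vertical thumb reach).
def in_right_hand_panel(x, y, screen_w, screen_h, l1, l2):
    dx = screen_w - x   # horizontal distance from the right edge
    dy = screen_h - y   # vertical distance from the bottom edge
    if dx < 0 or dy < 0:
        return False
    return (dx / l1) ** 2 + (dy / l2) ** 2 <= 1.0

# Example on a 1080x1920 screen with thumb reaches L1=600, L2=700.
print(in_right_hand_panel(1000, 1900, 1080, 1920, 600, 700))  # True: near the corner
print(in_right_hand_panel(100, 100, 1080, 1920, 600, 700))    # False: out of reach
```

A left-hand variant would mirror the same test against the lower-left corner, using the left thumb's measured reaches.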
  • The display method of the terminal interface provided by the present application can recognize the gesture input by the user on the touch screen to determine whether the user operates the mobile phone with the left or the right hand, thereby avoiding the high cost of adding extra hardware.
  • Moreover, when the user operates the terminal with the finger of the first side (such as the left hand or the right hand), the terminal may display on the terminal interface the first touch panel that can be used to operate the terminal interface, so that the user can operate all content on the terminal interface through the first touch panel. In this way, the user can operate areas of the terminal interface that the finger of the first side cannot reach, without affecting the user's visual and operating experience. In addition, displaying the first touch panel in the high-frequency touch area on the first side further facilitates the user operating all content on the terminal interface through the first touch panel.
  • In some embodiments, the terminal may calculate the tangent value of the angle between the line connecting the start point and the end point of the sliding track of the first gesture and the x-axis or the y-axis of the coordinate axes, and determine, according to the value interval into which the tangent value falls and the distribution of the points of the sliding track of the first gesture on the terminal interface, whether the first gesture is a gesture input by the user's left hand or by the user's right hand.
  • In this case, the above S301a in FIG. 3 may be replaced with S901-S902:
  • S901. The terminal calculates, in response to the first gesture input by the user on the first interface, the tangent value of the angle between the line connecting the start point and the end point of the sliding track of the first gesture and the x-axis or the y-axis of the coordinate axes.
  • For example, suppose the mobile phone 100 receives the sliding track 1001 corresponding to the first gesture input by the user on its touch screen, where the start point of the sliding track 1001 is point D and the end point is point E. The coordinates of point D are D(x1, y1), and the coordinates of point E are E(x2, y2). The line connecting the start point and the end point of the sliding track 1001 is the line segment DE, and the angle between the line segment DE and the x-axis is α, as shown in (b) of FIG. 10.
  • Specifically, the terminal can count, for gestures input by the finger on the left side of the user, the tangent values of the angle between the line connecting the start point and the end point of each sliding track and the x-axis or the y-axis, and determine the value interval corresponding to the left side; similarly, it can count the tangent values for gestures input by the finger on the right side of the user, and determine the value interval corresponding to the right side.
  • S902. The terminal can further determine whether a preset proportion of the points of the sliding track of the first gesture is close to the right side of the terminal, or whether a preset proportion of the points is close to the left side of the terminal. For example, as shown in (a) of FIG. 10, all points of the sliding track 1001 of the first gesture are distributed in the right display area of the mobile phone 100; therefore, the mobile phone 100 can determine that the first gesture is a gesture input by the user's right-hand finger.
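The combined check of S901-S902 can be sketched as follows. The tangent-value interval and the 0.8 proportion are hypothetical values, since the application leaves both to per-user statistics; the function name and return labels are likewise invented for illustration.

```python
# Sketch: classify a sliding track as left- or right-hand input using
# (a) the tangent of the angle between the start-end line and the x-axis
# and (b) the proportion of track points lying in one half of the screen.
# The interval boundaries and the 0.8 proportion are assumed values.
import math

def classify_hand(track, screen_w, right_tan=(0.5, 5.0), ratio=0.8):
    (x1, y1), (x2, y2) = track[0], track[-1]
    tan_a = (y2 - y1) / (x2 - x1) if x2 != x1 else math.inf
    right_points = sum(1 for x, _ in track if x > screen_w / 2)
    frac_right = right_points / len(track)
    # Right-hand tracks: tangent within the statistically determined
    # right-side interval AND most points in the right half.
    if right_tan[0] <= abs(tan_a) <= right_tan[1] and frac_right >= ratio:
        return "right"
    if frac_right <= 1 - ratio:
        return "left"
    return "unknown"

# Example: a track through the right half of a 1080-pixel-wide screen.
track = [(700, 1500), (750, 1400), (800, 1300), (850, 1200)]
print(classify_hand(track, 1080))  # -> right
```

The tangent here is abs((1200 - 1500) / (850 - 700)) = 2, which falls in the assumed right-side interval, and all points lie in the right half, so both conditions of S901-S902 are met.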
  • In summary, the terminal may determine, according to the tangent value of the angle between the line connecting the start and end points of the sliding track of the first gesture and the x-axis or the y-axis, together with the distribution of the points of the sliding track, whether the user operates the mobile phone with the left or the right hand, thereby avoiding the high cost of adding extra hardware.
  • In other embodiments, the terminal may determine the start point coordinates and end point coordinates of the sliding track of the first gesture, and then search the pre-saved left-hand trajectory model and right-hand trajectory model for a first sliding track whose start point coordinates and end point coordinates match those of the sliding track of the first gesture. If the terminal finds the first sliding track in the left-hand trajectory model, it may determine that the first gesture is a gesture input by the user's left hand; if the terminal finds the first sliding track in the right-hand trajectory model, it may determine that the first gesture is a gesture input by the user's right hand.
  • In this case, the above S301a in FIG. 3 may be replaced with S1101-S1103:
  • S1101. The terminal determines, in response to the first gesture input by the user on the first interface, the start point coordinates and the end point coordinates of the sliding track of the first gesture.
  • the method for determining the start point coordinate and the end point coordinate of the sliding track of the first gesture in response to the first gesture input by the user in the first interface may be referred to the related description of the present application, which is not described herein again.
  • S1102. The terminal searches the left-hand trajectory model and the right-hand trajectory model for a first sliding track whose start point coordinates and end point coordinates, in their distribution on the terminal interface, match the start point coordinates and end point coordinates of the sliding track of the first gesture.
  • the left-hand trajectory model includes coordinates of at least one left-hand sliding trajectory
  • the right-hand trajectory model includes coordinates of at least one right-hand sliding trajectory
  • Specifically, that the distribution of the start point coordinates and end point coordinates of the first sliding track on the terminal interface matches the start point coordinates and end point coordinates of the sliding track of the first gesture may mean that the start point coordinates of the first sliding track are the same as the start point coordinates of the sliding track of the first gesture, and the end point coordinates of the first sliding track are the same as the end point coordinates of the sliding track of the first gesture.
  • Alternatively, the match may mean that the start point coordinates of the sliding track of the first gesture fall within the value range of the start point coordinates of the first sliding track, and the end point coordinates of the sliding track of the first gesture fall within the value range of the end point coordinates of the first sliding track.
  • a trajectory database 1201 as shown in FIG. 12 may be maintained in the terminal of the present application, and the trajectory database 1201 may include a left-hand trajectory model 1202 and a right-hand trajectory model 1203.
  • The left-hand trajectory model 1202 includes the value ranges of the start point coordinates and of the end point coordinates of at least two left-hand tracks, and the right-hand trajectory model 1203 includes the value ranges of the start point coordinates and of the end point coordinates of at least two right-hand tracks.
  • the left-hand trajectory model 1202 includes: a range of values of x in the coordinates of the starting point of the left-hand trajectory 1 [a1, b1], a range of values of y in the coordinates of the starting point of the left-hand trajectory 1 [c1, d1], and a left-hand trajectory
  • the right-hand trajectory model 1203 includes: a range of values of x in the coordinates of the starting point of the right-hand trajectory 1 [j1, k1], a range of values of y in the coordinates of the starting point of the right-hand trajectory 1 [o1, o1], a right-hand trajectory
  • The terminal can search, from the right-hand trajectory model 1203 and the left-hand trajectory model 1202 shown in FIG. 12, for a first sliding track, i.e., a left-hand track or a right-hand track whose start point coordinate ranges and end point coordinate ranges contain x1, y1, x2, and y2, respectively. For example, if the first sliding track is found in the right-hand trajectory model 1203, the terminal can determine that the first gesture is a gesture input by the user's right-hand finger.
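The range lookup of S1102 can be sketched as follows. The model contents here are invented numbers standing in for the value ranges [a1, b1], [c1, d1], ... of FIG. 12; only the containment test itself reflects the described matching rule.

```python
# Sketch: search the left- and right-hand trajectory models for a saved
# track whose start/end coordinate value ranges contain the observed
# start point (x1, y1) and end point (x2, y2). Each saved track is
# represented by four (lo, hi) ranges; the example ranges are invented.
def find_hand(models, start, end):
    (x1, y1), (x2, y2) = start, end
    for hand, tracks in models.items():
        for rx1, ry1, rx2, ry2 in tracks:
            if (rx1[0] <= x1 <= rx1[1] and ry1[0] <= y1 <= ry1[1] and
                    rx2[0] <= x2 <= rx2[1] and ry2[0] <= y2 <= ry2[1]):
                return hand
    return None  # no matching first sliding track in either model

models = {
    "left":  [((0, 300), (1200, 1900), (100, 500), (800, 1400))],
    "right": [((700, 1080), (1200, 1900), (500, 1000), (800, 1400))],
}
print(find_hand(models, (900, 1500), (700, 1000)))  # -> right
```

A gesture starting at (900, 1500) and ending at (700, 1000) falls inside every range of the saved right-hand track, so the lookup reports a right-hand input, mirroring the example above.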
  • the foregoing trajectory database may also be included in the cloud server.
  • For example, the cloud server can maintain a trajectory database for each user terminal.
  • a trajectory database 1201 and a trajectory database 1320 may be included in the cloud server.
  • the left-hand trajectory model and the right-hand trajectory model of the mobile phone 100 are saved in the trajectory database 1201, and the left-hand trajectory model and the right-hand trajectory model of the mobile phone 1310 are saved in the trajectory database 1320.
  • In this case, the mobile phone 100 may send the start point coordinates and end point coordinates of the first gesture to the cloud server, and the cloud server searches the left-hand trajectory model and the right-hand trajectory model of the trajectory database 1201 for the first sliding track and returns the search result to the mobile phone 100.
  • the coordinate range of the sliding track saved in the left-hand trajectory model and the right-hand trajectory model may be obtained by the terminal counting the coordinates of the sliding trajectory of the plurality of gestures input by the user.
  • Moreover, the terminal may update the left-hand trajectory model and the right-hand trajectory model according to the coordinates of the sliding tracks collected recently (such as within one month). If the left-hand trajectory model and the right-hand trajectory model of the terminal are saved in the cloud server, the terminal may report the coordinates of the recently collected sliding tracks (such as within one month) to the cloud server, so that the cloud server can update the left-hand trajectory model and the right-hand trajectory model of the terminal.
  • In the present application, the terminal can count the coordinates of the sliding tracks of multiple gestures input by the user without the user's awareness.
  • the method of the present application may further include S1301:
  • S1301. The terminal determines, in response to a fourth gesture input by the user on the terminal interface, that the fourth gesture is a gesture input by the finger of the first side of the user, and saves the coordinates of the sliding track of the fourth gesture in the first side trajectory model.
  • In this way, the terminal can count, without the user's awareness, the coordinates of the sliding tracks of multiple gestures input by the user (which correspond to the user's gesture habits on the touch screen of the terminal), and save the coordinates of the sliding tracks in the first side trajectory model, so that after receiving the first gesture input by the user, the terminal can compare it with the coordinates of the sliding tracks in the left-hand trajectory model and the right-hand trajectory model to determine whether the first gesture is a gesture input by the user's left hand or by the user's right hand.
  • In other embodiments, the terminal can specifically guide the user to input gestures with the left hand or with the right hand. When the terminal guides the user to input a left-hand gesture, it can collect the coordinates of the sliding track of the left-hand gesture input by the user according to the terminal's guidance; when the terminal guides the user to input a right-hand gesture, it can collect the coordinates of the sliding track of the right-hand gesture input by the user according to the terminal's guidance. In this way, the accuracy of the coordinates of the sliding tracks saved in the first side trajectory model can be improved.
  • Specifically, the method of the present application may further include S1401-S1403:
  • S1401. The terminal displays a third interface, where the third interface includes first prompt information used to prompt the user to slide on the terminal interface with the finger of the first side.
  • For example, as shown in FIG. 15, the mobile phone 100 may display a third interface 1501 that includes the first prompt message 1502: "Please follow your habit of operating the mobile phone with your right hand, and input a sliding track on the touch screen with your right hand."
  • the first prompt information in the present application includes but is not limited to the first prompt information 1502 shown in FIG. 15 .
  • the terminal is a mobile phone
  • the mobile phone may display the third interface after the mobile phone is turned on, or after the mobile phone turns on the one-hand mode.
  • the above one-hand mode can be divided into a left-hand mode and a right-hand mode.
  • The left-hand mode refers to a display mode in which, when the user holds the mobile phone with the left hand, the mobile phone controls the interface elements displayed on its touch screen to be displayed toward the left side of the mobile phone, so as to facilitate the user operating the mobile phone with the left hand. The right-hand mode refers to a display mode in which, when the user holds the mobile phone with the right hand, the mobile phone controls the interface elements displayed on its touch screen to be displayed toward the right side of the mobile phone, so as to facilitate the user operating the mobile phone with the right hand.
  • S1402. The terminal performs statistics on the coordinates of the sliding tracks of at least two third gestures input by the user on the third interface, and obtains the coordinates of at least one first side finger sliding track, where a third gesture is a gesture input by the finger of the first side of the user.
  • the mobile phone 100 may receive a third gesture input by the user at the third interface 1501 (ie, a gesture corresponding to the sliding track 1601 ).
  • Specifically, the terminal may receive multiple third gestures input by the user on the third interface, and then perform statistics on the coordinates of the sliding tracks of the multiple third gestures, that is, classify the sliding tracks of the multiple third gestures to obtain the coordinates of one or more first side finger sliding tracks.
  • S1403. The terminal saves the coordinates of the at least one first side finger sliding track in the first side trajectory model.
  • Specifically, the terminal may directly save the coordinates of the at least one first side finger sliding track in the first side trajectory model.
  • In other embodiments, the terminal may, in response to the third gesture input by the user on the third interface, first determine whether the third gesture is a gesture input by the user's left finger or by the right finger, and then ask the user to confirm whether the terminal's judgment is correct; when the user confirms that the judgment is correct, the terminal saves the coordinates of the corresponding sliding track.
  • the above S1401-S1403 can be replaced by S1601-S1603:
  • S1601. The terminal displays a third interface, where the third interface includes first prompt information used to prompt the user to slide on the terminal interface with the finger of the first side.
  • S1602. The terminal determines, in response to the third gesture input by the user on the third interface, that the third gesture is a gesture input by the finger of the first side of the user, and displays a fourth interface, where the fourth interface includes prompt information for confirming whether the third gesture is a gesture input by the finger of the first side of the user.
  • For example, the mobile phone 100 may display the fourth interface 1602 shown in (b) of FIG. 16. The fourth interface 1602 includes prompt information for confirming whether the third gesture is a gesture input by the finger of the first side of the user; for example, the fourth interface 1602 includes the prompt message "Please confirm whether you just input the sliding track with your right hand."
  • S1603. The terminal saves the coordinates of the sliding track of the third gesture in the first side trajectory model in response to a first input of the user on the fourth interface, where the first input is used to indicate that the third gesture is a gesture input by the finger of the first side of the user.
  • The first input of the user on the fourth interface may be the user's click operation on the "Yes" option in the fourth interface 1602 as shown in (b) of FIG. 16.
  • the mobile phone 100 can save the coordinates of the sliding trajectory of the third gesture in the first side trajectory model.
  • the method of the present application may further include S1604-S1605:
  • S1604: The terminal collects statistics on the coordinates of the sliding tracks of the at least two third gestures saved within the preset time, to obtain the coordinates of at least one sliding track of the finger on the first side, where the third gesture is a gesture input by the finger on the first side of the user;
  • S1605: The terminal saves the coordinates of the at least one sliding track of the finger on the first side in the first side trajectory model.
  • in this way, the terminal can not only guide the user to input a gesture with the left hand or with the right hand, but also, after the user inputs the gesture according to the terminal's instruction, use a two-step confirmation process to determine which side's finger input the gesture. That is, the terminal may first determine whether the third gesture was input by the user's left-hand finger or right-hand finger, and then ask the user to confirm whether this judgment is correct; when the user confirms that the judgment is correct, the terminal saves the coordinates of the corresponding sliding track. This two-step confirmation process improves the accuracy of the sliding track coordinates saved in the first side trajectory model.
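  • the two-step confirmation described above can be sketched as follows. This is a minimal illustration only; `classify_side` (the terminal's own left/right classifier), `confirm` (the "Yes"/"No" prompt), and the dictionary-of-lists model are hypothetical names, not part of the patent:

```python
def save_confirmed_track(track, classify_side, confirm, side_models):
    """Save a sliding track only after the two-step check:
    1) the terminal guesses which hand entered the gesture;
    2) the user confirms the guess via the fourth interface."""
    guessed_side = classify_side(track)          # step 1: terminal's own judgment
    if confirm(guessed_side):                    # step 2: user taps "Yes"
        side_models[guessed_side].append(track)  # keep coordinates in that side's model
        return guessed_side
    return None                                  # discard tracks the user rejects

# Hypothetical usage: a track the classifier calls "right" and the user confirms.
models = {"left": [], "right": []}
track = [(60, 10), (55, 30), (45, 50)]
side = save_confirmed_track(track, lambda t: "right", lambda s: True, models)
```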
  • the terminal may determine, according to the coordinates of the finger sliding tracks in the first side trajectory model, the high frequency touch area of the finger on the first side of the user on the terminal interface, and then determine that area as the high frequency touch area on the first side of the terminal.
  • the high frequency touch area is a touch area on the terminal interface that is operated by the user at a frequency or number of times higher than a preset threshold.
  • in this case, S301b shown in FIG. 11 may be replaced with S1701:
  • S1701: The terminal determines, according to the coordinates of the at least one sliding track of the finger on the first side in the first side trajectory model, the high frequency touch area of the finger on the first side of the user on the terminal interface.
  • taking the determination of the high-frequency touch area on the left side of the terminal as an example, the terminal may divide the left display area of its touch screen (for example, the left display area of the mobile phone 100 shown in (a) of FIG. 10) into at least two display areas, count the number of times each of the at least two display areas is operated by the user, and determine the display area operated more times than a preset threshold as the left high-frequency touch area.
  • optionally, the left display area of the terminal includes, but is not limited to, the left display area of the mobile phone 100 shown in (a) of FIG. 10.
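  • the counting step above can be sketched as follows; the horizontal-band grid, the 100-pixel band height, and the threshold value are invented for the example and are not prescribed by the patent:

```python
from collections import Counter

def high_frequency_bands(touch_points, band_height, threshold):
    """Divide a side display area into horizontal bands of `band_height`
    pixels, count how many touches fall in each band, and keep the bands
    operated by the user more times than `threshold`."""
    counts = Counter(y // band_height for _, y in touch_points)
    return {band: n for band, n in counts.items() if n > threshold}

# Hypothetical touches (x, y) in the left display area of the touch screen.
touches = [(20, 410), (25, 430), (30, 445), (22, 470), (28, 90)]
hot = high_frequency_bands(touches, band_height=100, threshold=2)  # band 4 is hot
```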
  • the terminal may analyze the coordinate distribution of the left-hand trajectories in the left-hand trajectory model 1202 shown in FIG. 12 and determine the area where the left-hand trajectories are densely distributed as the left-side high-frequency touch area; similarly, it may analyze the coordinate distribution of the right-hand trajectories in the right-hand trajectory model 1203 shown in FIG. 12 and determine the area where the right-hand trajectories are densely distributed as the right-side high-frequency touch area.
  • the terminal may select two high-frequency right-hand trajectories from the right-hand trajectory model 1203 shown in FIG. 12, where the two high-frequency right-hand trajectories are the two trajectories ranked highest by the number or frequency of times they are touched or triggered by the user; then, the terminal can determine the overlapping area of the two high-frequency right-hand trajectories; finally, the overlapping area of the two high-frequency right-hand trajectories is determined as the high frequency touch area on the right side of the terminal.
  • for example, the sliding track 1801 shown in (a) of FIG. 18 and the sliding track 1802 shown in (b) of FIG. 18 are two high-frequency right-hand tracks of the mobile phone 100; the terminal can determine the overlapping area 1803 of the sliding track 1801 and the sliding track 1802, shown in (c) of FIG. 18, as the high frequency touch area on the right side of the terminal.
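  • one simple way to approximate the overlapping area of two sliding tracks is to intersect their bounding rectangles. This is only an illustrative sketch under that assumption; the patent itself describes the overlap geometrically via FIG. 18 and FIG. 19 rather than by a formula, and the track coordinates below are invented:

```python
def bbox(track):
    """Axis-aligned bounding box (xmin, ymin, xmax, ymax) of a track."""
    xs = [x for x, _ in track]
    ys = [y for _, y in track]
    return min(xs), min(ys), max(xs), max(ys)

def overlap_region(track_a, track_b):
    """Intersection of the two bounding boxes, or None if they do not meet."""
    ax0, ay0, ax1, ay1 = bbox(track_a)
    bx0, by0, bx1, by1 = bbox(track_b)
    x0, y0 = max(ax0, bx0), max(ay0, by0)
    x1, y1 = min(ax1, bx1), min(ay1, by1)
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

# Two hypothetical high-frequency right-hand tracks.
track_1801 = [(50, 100), (70, 140), (90, 180)]
track_1802 = [(60, 120), (80, 160), (100, 200)]
region = overlap_region(track_1801, track_1802)
```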
  • take, as an example, the sliding trajectory 1801 shown in (a) of FIG. 18 and the sliding trajectory 1802 shown in (b) of FIG. 18 as two high-frequency right-hand trajectories of the mobile phone 100.
  • the terminal can determine the straight line OC connecting the origin O of the sector shown in (a) of FIG. 19 and the point C (a point on the line connecting the start point A(x1, y1) and the end point B(x2, y2) of the sliding track 1801), and its intersection D(x3, y3) with the sliding track 1801; similarly, the terminal can determine the straight line OG connecting the origin O of the sector shown in (b) of FIG. 19 and the point G, and its intersection H(x6, y6) with the sliding track 1802.
  • based on the point D(x3, y3) and the point H(x6, y6), the terminal can then determine the overlapping region 1901 of the sliding trajectory 1801 and the sliding trajectory 1802, which is the high-frequency touch area on the right side of the terminal.
  • the high frequency touch area in the present application may also be a display area of a fixed shape including the overlapping area.
  • as shown in (a) of FIG. 20, the high frequency touch area may be a rectangular display area 2001 including the overlap area 1803; or, as shown in (b) of FIG. 20, the high frequency touch area may be a circular display area 2002 including the overlap area 1803.
  • the display method of the terminal interface provided by the present application can recognize the gesture input by the user on the touch screen to determine whether the user is operating the mobile phone with the left hand or the right hand, avoiding the high cost caused by additional hardware.
  • when determining that the user is operating with the finger on the first side (of the left hand or the right hand), the terminal may display, close to the first side of the terminal, a touch area that can be used to operate the terminal interface, so as to facilitate operation.
  • the user can operate all the content on the terminal interface in this touch area. In this way, the user can operate the areas of the terminal interface that the finger on the first side cannot reach, without affecting the user's visual and operating experience.
  • the present application provides a display method of a terminal interface. As shown in FIG. 21, the display method of the terminal interface includes S2101-S2102:
  • S2101: In response to the first gesture input by the user on the first interface, the terminal determines the high-frequency touch area on the first side of the terminal, where the first gesture is a gesture input by the finger on the first side of the user, and the high-frequency touch area is a touch area on the terminal interface that is operated by the user at a frequency or number of times higher than the first threshold.
  • the first interface includes at least two application icons.
  • S2102: The terminal displays at least one high frequency application icon in the high frequency touch area on the first side, where the at least one high frequency application icon is an application icon, among the at least two application icons, that is operated by the user at a frequency or number of times higher than the second threshold.
  • for example, the "Alipay" application icon 2201 and the "WeChat" application icon 2202 shown in (a) of FIG. 22A are high frequency application icons of the mobile phone 100, that is, application icons operated by the user more times than a preset threshold.
  • suppose the display area 2001 shown in (a) of FIG. 22A is the high-frequency touch area on the first side; then, as shown in (b) of FIG. 22A, the mobile phone 100 can display the high-frequency application icons, the "Alipay" application icon 2201 and the "WeChat" application icon 2202, in the high-frequency touch area 2001.
  • the present application provides a display method of a terminal interface.
  • after the terminal determines that the user is operating it with the finger on the first side, the high frequency application icons in the terminal can be displayed in the high frequency touch area on the first side, so that the user can operate the high frequency application icons more conveniently and comfortably, improving the user experience.
  • the high frequency touch area of the first side may not be sufficient to display all the high frequency application icons.
  • in this case, the terminal may sort the high-frequency application icons in descending order of the frequency or number of times they are operated by the user, and display the top N icons in the high-frequency touch area on the first side.
  • N is the number of application icons that can be displayed in the high frequency touch area on the first side.
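  • the top-N selection described above can be sketched as follows; the icon names, operation counts, and the value N = 2 are invented for the example:

```python
def top_n_icons(usage_counts, n):
    """Sort application icons by how often the user operates them
    (descending) and keep only the N that fit in the high frequency
    touch area on the first side."""
    ranked = sorted(usage_counts, key=usage_counts.get, reverse=True)
    return ranked[:n]

# Hypothetical per-icon operation counts; the touch area fits N = 2 icons.
counts = {"Alipay": 42, "WeChat": 57, "Photo": 13, "Clock": 4}
shown = top_n_icons(counts, n=2)
```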
  • the terminal may further display a folder icon including all high frequency application icons in the high frequency touch area on the first side.
  • S2102 shown in FIG. 21 can be replaced with S2102a:
  • S2102a: The terminal displays, in the high frequency touch area on the first side, a folder icon including the at least one high frequency application icon.
  • for example, the "Alipay" application icon 2201, the "WeChat" application icon 2202, and the "Photo" application icon 2203 shown in (a) of FIG. 22C are the high frequency application icons of the mobile phone 100, that is, application icons that are operated by the user more times than a preset threshold.
  • the display area 2001 shown in (a) of FIG. 22C is the high-frequency touch area on the first side.
  • the mobile phone 100 can display, in the high frequency touch area 2001, a folder icon 2204 including the "Alipay" application icon 2201, the "WeChat" application icon 2202, and the "Photo" application icon 2203.
  • the terminal may display a folder icon including all high frequency application icons in the high frequency touch area on the first side.
  • this can solve the problem that the high frequency touch area on the first side is insufficient to display all the high frequency application icons of the terminal.
  • by displaying a folder icon including all the high frequency application icons in the high frequency touch area on the first side, the user can conveniently operate all the high frequency application icons in the terminal.
  • the method of the present application may further include S2102b: in response to the user's input on the folder icon, the terminal displays, in the high frequency touch area on the first side, a folder expansion window corresponding to the folder icon, where the at least one high frequency application icon is displayed in the folder expansion window.
  • for example, in response to the user's click operation on the folder icon 2204, the mobile phone 100 can display, in the high frequency touch area 2001 on the right side, the folder expansion window 2205 corresponding to the folder icon 2204.
  • the folder expansion window 2205 includes the "Alipay" application icon 2201, the "WeChat" application icon 2202, and the "Photo" application icon 2203.
  • in response to the user's input on the folder icon including the high frequency application icons, the terminal may display the corresponding folder expansion window in the high frequency touch area on the first side, which makes it convenient for the user to operate all the high frequency application icons in the terminal.
  • the present application provides a display method of a terminal interface, as shown in FIG. 23, the display method of the terminal interface includes S2301-S2302:
  • S2301: The terminal determines, in response to the first gesture input by the user in the first interface, that the first gesture is a gesture input by the finger on the first side of the user, where the first interface includes a first interface element, the first interface element includes a navigation bar icon and/or a dock bar icon, and the finger on the first side is a finger of the user's left hand or right hand.
  • the navigation bar in the present application is a shortcut button bar at the bottom of the mobile phone screen, and generally appears in the form of a virtual button at the bottom of the mobile phone screen.
  • the navigation bar includes three buttons by default: the Back button, the Home button, and the Recent button, where the Back button is used to return to the previous interface, the Home button is used to return to the desktop, and the Recent button is used to display recently used applications.
  • the navigation bar 2402 includes a Back key 2403, a Home key 2404, and a Recent key 2405.
  • the dock bar (Dock bar) in this application is part of a window that fills the entire mobile phone screen, or of an interactive interface (Activity, an application area for displaying application icons) suspended over another window.
  • the dock bar is located below the Activity and above the navigation bar.
  • the dock bar and the navigation bar belong to two window levels, and the dock bar is located at the lower level of the navigation bar.
  • the dock bar 2401 of the mobile phone 100 includes the following dock bar icons: "WeChat” application icon, "dial” icon, "contact” icon, and "short message” icon.
  • S2302: The terminal moves the first interface element to a display area near the first side of the terminal.
  • for example, the mobile phone 100 may determine that the gesture input by the user is a gesture input by the user's right-hand finger; then, the mobile phone 100 may move the first interface element (such as the dock bar icon) to the display area near the right side of the terminal, that is, display the interface shown in (b) of FIG. 24.
  • the first interface element may include a dock bar icon and a navigation bar icon.
  • for example, the mobile phone 100 may determine that the gesture input by the user is a gesture input by the user's right-hand finger; then, the mobile phone 100 may move the dock bar icon and the navigation bar icon to the display area near the right side of the terminal, that is, display the interface shown in FIG. 25.
  • the present application provides a display method of a terminal interface.
  • after the terminal determines that the user is operating it with the finger on the first side, the dock bar icon and/or the navigation bar icon in the terminal interface may be moved to the display area on the first side of the terminal, so that the user can operate the dock bar icon and/or the navigation bar icon more conveniently and comfortably, improving the user experience.
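  • moving the dock bar or navigation bar icons toward the detected side can be sketched with a toy layout model; the pixel widths and screen size below are invented for the example and are not taken from the patent:

```python
def align_icons(icon_widths, screen_width, side):
    """Return the x-offset of each icon so the whole row hugs the given
    side ('left' or 'right') of the screen, preserving icon order."""
    total = sum(icon_widths)
    x = 0 if side == "left" else screen_width - total
    offsets = []
    for w in icon_widths:
        offsets.append(x)
        x += w
    return offsets

# Three hypothetical dock bar icons, 40 px wide each, on a 360 px wide screen.
right_offsets = align_icons([40, 40, 40], screen_width=360, side="right")
left_offsets = align_icons([40, 40, 40], screen_width=360, side="left")
```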
  • in some scenarios, the terminal needs to display a prompt window, a popup button, or a floating button.
  • in general, the terminal displays such a prompt window, popup button, or floating button in the center of the terminal interface; however, a button or window displayed in the center may not be convenient for the user to operate.
  • the terminal in the present application may display a button or window to be displayed in a display area close to the first side of the terminal after determining that the user operates the terminal using the finger on the first side.
  • the terminal may further display a button or a window to be displayed in the high frequency touch area on the first side after determining that the user operates the terminal by using the finger on the first side.
  • for example, the display interface shown in FIG. 26 can be displayed, in which the prompt window 1602 is displayed near the lower right of the mobile phone 100.
  • when the user operates the terminal with one hand, other fingers or parts of the hand may touch the touch screen of the terminal at the same time, causing a false touch on the touch screen.
  • for example, suppose the user holds the mobile phone 100 in the right hand. When the user's thumb 2704 clicks the "Settings" icon 2703 on the touch screen to control the mobile phone 100 to display the settings interface, the user's right ring finger 2702 may touch the "Photo" icon 2701 on the touch screen; that is, the user's right ring finger accidentally touches the touch screen of the mobile phone 100.
  • the terminal can recognize that the user operates the mobile phone with the right hand.
  • the terminal can ignore or block the user's input to the left icon, and only respond to the user's input to the right icon.
  • in this case, the mobile phone 100 can ignore the false touch of the right ring finger on the "Photo" icon 2701 and respond only to the click operation of the thumb 2704 on the "Settings" icon 2703, displaying the settings interface shown in (b) of FIG. 27.
  • for another example, suppose the user holds the mobile phone 100 in the right hand. When the user's thumb 2802 clicks the "Settings" icon 2801 on the touch screen to control the mobile phone 100 to display the settings interface, the root 2804 of the thumb 2802 may touch the "Short Message" icon 2803 on the touch screen; that is, the root 2804 of the thumb 2802 accidentally touches the touch screen of the mobile phone 100.
  • the terminal can recognize that the user operates the mobile phone with the right hand.
  • the terminal can ignore or block the user's input to the Launcher or the navigation bar, and only respond to the user's input to the upper half of the Activity.
  • in this case, the mobile phone 100 can ignore the false touch of the root 2804 of the user's thumb 2802 on the "Short Message" icon 2803 and respond only to the click operation of the thumb 2802 on the "Settings" icon 2801, displaying the settings interface shown in (b) of FIG. 28.
  • in some embodiments, a left-hand false touch model and a right-hand false touch model may be pre-stored in the terminal. The left-hand false touch model includes at least one left-hand anti-mistouch rule, which is used to indicate how the terminal should selectively respond to at least two inputs if, while the user holds the mobile phone in the left hand, at least two inputs by the user in different areas of the touch screen are detected at the same time.
  • similarly, the right-hand false touch model includes at least one right-hand anti-mistouch rule, which is used to indicate how the terminal should selectively respond to at least two inputs if, while the user holds the mobile phone in the right hand, at least two inputs by the user in different areas of the touch screen are detected at the same time.
  • with the display method of the terminal interface provided by the present application, after the terminal determines that the user is operating it with the finger on the first side (left hand or right hand), if the terminal simultaneously detects at least two inputs by the user in different areas of the touch screen, the terminal may selectively respond to the at least two inputs according to the anti-mistouch rules in the first side false touch model (i.e., the left-hand false touch model or the right-hand false touch model), preventing the terminal from displaying, in response to a false touch on the touch screen, an interface that does not correspond to the operation the user intended.
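  • the selective response described above can be sketched as a rule lookup. The rule format below is an assumption for illustration only, since the patent leaves the exact structure of the left-hand and right-hand false touch models open; the region names and actions are invented:

```python
def respond(inputs, holding_side):
    """Given simultaneous touch inputs and the hand holding the phone,
    keep only the inputs an anti-mistouch rule considers intentional.
    Each input is (region, action); the rules below are invented examples
    mirroring the text (holding right: ignore the left half, the dock bar,
    and the navigation bar)."""
    ignored = {
        "right": {"left_half", "dock_bar", "navigation_bar"},
        "left": {"right_half", "dock_bar", "navigation_bar"},
    }[holding_side]
    return [(region, action) for region, action in inputs
            if region not in ignored]

# Thumb taps "Settings" on the right half; ring finger grazes the left half.
simultaneous = [("right_half", "open_settings"), ("left_half", "open_photo")]
accepted = respond(simultaneous, holding_side="right")
```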
  • in order to implement the above functions, the above terminal and the like include corresponding hardware structures and/or software modules for performing each function.
  • a person skilled in the art will readily appreciate that, in combination with the examples and algorithm steps described in the embodiments disclosed herein, the embodiments of the present invention can be implemented in hardware or in a combination of hardware and computer software. Whether a function is implemented in hardware or in computer software driving hardware depends on the specific application and design constraints of the solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the embodiments of the present invention.
  • the embodiment of the present application may perform the division of the function modules on the terminal or the like according to the foregoing method example.
  • each function module may be divided according to each function, or two or more functions may be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of the module in the embodiment of the present invention is schematic, and is only a logical function division, and the actual implementation may have another division manner.
  • FIG. 29 is a schematic diagram showing a possible structure of the terminal involved in the foregoing embodiments, where the terminal 2900 includes: an input unit 2901, a determining unit 2902, and a display unit 2903.
  • the input unit 2901 is configured to support the terminal in performing the foregoing method: receiving the first gesture described in S301, S301a, S2101, and S2301; receiving the second gesture in S303; receiving the third gesture in S1402 and S1602; receiving the first input described in S1603 and S2302; and/or other processes for the techniques described herein.
  • the determining unit 2902 is configured to support the terminal in performing the foregoing method: determining the high-frequency touch area in S301, S301b, S2101, and S2301; determining in S301a and S1602 that the first gesture is a gesture input by the finger on the first side of the user; performing S901-S902, S1101-S1103, S1301, and S1701; and/or other processes for the techniques described herein.
  • the display unit 2903 is configured to support the terminal in performing the method in the embodiments: displaying the first interface in S301, S301a, and S302; displaying the second interface in S303; displaying the third interface in S1401 and S1601; displaying the fourth interface in S1602; performing S2102 and S2102a; and/or other processes for the techniques described herein.
  • the foregoing terminal 2900 may further include: a statistics unit and a storage unit.
  • the statistical unit is configured to support the terminal to perform coordinates of the statistical sliding track in S1402 in the method embodiment, S1604, and/or other processes for the techniques described herein.
  • the storage unit is configured to support the terminal to perform S1403, S1605, and/or other processes for the techniques described herein in the method embodiments.
  • the terminal 2900 includes, but is not limited to, the unit modules enumerated above.
  • the terminal 2900 may further include a communication unit for communicating with other terminals.
  • the specific functions that the foregoing functional units can implement include, but are not limited to, the functions corresponding to the method steps described in the foregoing examples. For other descriptions of the units of the terminal 2900, reference may be made to the detailed description of the corresponding method steps; details are not described here again.
  • the above determining unit 2902 and the statistical unit and the like may be integrated in one processing module.
  • the communication unit may be an RF circuit of the terminal, a WiFi module or a Bluetooth module, and the storage unit may be a storage module of the terminal.
  • the above display unit may be a display module such as a touch screen.
  • FIG. 30 is a schematic diagram showing a possible structure of a terminal involved in the above embodiment.
  • the terminal 3000 includes a processing module 3001, a storage module 3002, a display module 3003, and a communication module 3004.
  • the processing module 3001 is configured to control and manage the actions of the terminal.
  • the display module 3003 is configured to display an image generated by the processing module 3001.
  • the storage module 3002 is configured to save program codes and data of the terminal.
  • the communication module 3004 is for communicating with other terminals. For example, the communication module 3004 is configured to perform voice communication with other terminals, and receive or send an avatar to other terminals.
  • the processing module 3001 may be a processor or a controller, for example, a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure.
  • the processor may also be a combination of computing functions, for example, including one or more microprocessor combinations, a combination of a DSP and a microprocessor, and the like.
  • the communication module 3004 can be a transceiver, a transceiver circuit, a communication interface, or the like.
  • the storage module 3002 can be a memory.
  • the processing module 3001 is a processor (such as the processor 101 shown in FIG. 2)
  • the communication module 3004 is an RF circuit (such as the RF circuit 102 shown in FIG. 2)
  • the storage module 3002 is a memory (such as the memory 103 shown in FIG. 2)
  • the display module 3003 is a touch screen (including the touch panel 104-1 and the display 104-2 shown in FIG. 2)
  • the terminal provided by the present application may be the mobile phone 100 shown in FIG. 2.
  • the communication module 3004 may include not only an RF circuit but also a WiFi module and a Bluetooth module. Communication modules such as RF circuits, WiFi modules, and Bluetooth modules can be collectively referred to as communication interfaces. Wherein, the above processor, communication interface, touch screen and memory can be coupled together by a bus.
  • the embodiment of the present application further provides a graphical user interface (GUI) stored in a terminal, where the terminal includes a touch screen, a memory, a processor, and a communication interface, and the processor is configured to execute one or more computer programs stored in the memory. The graphical user interface includes: a first GUI; and a second GUI displayed in response to a first gesture input on the first GUI, where the high-frequency touch area on the first side of the second GUI includes a first touch panel configured to operate the first GUI in response to a gesture input by the user, the first gesture is a gesture input by the finger on the first side of the user, and the high-frequency touch area is a touch area on the second GUI that is operated by the user at a frequency higher than the first threshold.
  • the GUI further includes: a fourth GUI, where the fourth GUI includes first prompt information, and the first prompt information is used to prompt the user to slide on the fourth GUI with the finger on the first side.
  • an embodiment of the present application further provides a graphical user interface (GUI) stored in a terminal, where the terminal includes a touch screen, a memory, a processor, and a communication interface, and the processor is configured to execute one or more computer programs stored in the memory. The graphical user interface includes: a first GUI, where the first GUI includes at least two application icons.
  • the GUI further includes: a second GUI displayed in response to a first gesture input on the first GUI, where the high frequency touch area on the first side of the second GUI includes at least one high frequency application icon; the first gesture is a gesture input by the finger on the first side of the user; the high frequency touch area is a touch area on the second GUI that is operated by the user at a frequency higher than a first threshold; and the at least one high frequency application icon is an application icon, among the at least two application icons, that is operated by the user at a frequency or number of times higher than a second threshold.
  • the second GUI includes a folder icon
  • the folder icon includes the at least one high frequency application icon.
  • the GUI further includes: a third GUI displayed in response to an input on the folder icon in the second GUI, where the third GUI includes a folder expansion window corresponding to the folder icon, and the folder expansion window displays the at least one high frequency application icon.
  • the embodiment of the present application further provides a graphical user interface (GUI) stored in a terminal, where the terminal includes a touch screen, a memory, a processor, and a communication interface, and the processor is configured to execute one or more computer programs stored in the memory. The graphical user interface includes: a first GUI, where the first GUI includes a first interface element, and the first interface element includes a navigation bar icon and/or a dock bar icon; and a second GUI displayed in response to a first gesture input on the first GUI, where the first interface element is included in the display area on the first side of the second GUI.
  • the present application also provides a computer storage medium having computer program code stored therein; when a processor executes the computer program code, the terminal performs the related method steps in any one of FIG. 3, FIG. 9, FIG. 11, FIG. 14, FIG. 17, FIG. 21, FIG. 22B, and FIG. 23 to implement the display method of the terminal interface in the above embodiments.
  • the present application also provides a computer program product that, when run on a computer, causes the computer to perform the related method steps in any one of FIG. 3, FIG. 9, FIG. 11, FIG. 14, FIG. 17, FIG. 21, FIG. 22B, and FIG. 23 to implement the display method of the terminal interface in the above embodiments.
  • the terminal 2900, the terminal 3000, the computer storage medium, and the computer program product provided by this application are all used to perform the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, and details are not described here again.
  • the disclosed system, apparatus, and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of the modules or units is only a logical function division; in actual implementation there may be another division manner. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium.
  • the part of the technical solution of the present application that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the various embodiments of the present application.
  • the foregoing storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.


Abstract

This application provides a display method for a terminal interface and a terminal, relating to the field of communications technology. It enables a user to operate regions of the terminal interface that the fingers of the user's first side cannot reach, without degrading the user's visual and operating experience. A specific solution includes: in response to a first gesture input by the user on a first interface, the terminal determines a high-frequency touch region on a first side of the terminal, where the first gesture is a gesture input by a finger on the user's first side, the high-frequency touch region is a touch region of the terminal interface that is operated by the user at a frequency or a number of times higher than a first threshold, and the first interface includes at least two application icons; the terminal displays, in the high-frequency touch region on the first side, at least one high-frequency application icon, the at least one high-frequency application icon being an application icon, among the at least two application icons, that is operated by the user at a frequency or a number of times higher than a second threshold.

Description

Display Method for a Terminal Interface, and Terminal

Technical Field

This application relates to the field of communications technology, and in particular to a display method for a terminal interface and a terminal.

Background

With the development of electronic technology, the touchscreens of touchscreen mobile phones are becoming larger and larger, and such phones are increasingly widespread. However, when the touchscreen of a phone is large, the user cannot conveniently operate the phone with one hand.

The prior art addresses this problem as follows. A sensor is installed in the phone to identify whether the user is currently operating the phone with the left hand or the right hand; the display interface of the phone is then updated according to the identification result to facilitate operation. For example, when it is identified that the user is currently operating the phone with the right hand, the application icons on the phone may be displayed on the right side of the touchscreen.

The problem is that the prior art requires additional hardware (such as a sensor) in the phone to identify left-hand/right-hand operation, which is costly. Moreover, displaying the application icons on the right side of the touchscreen inevitably shrinks the icons or the gaps between them, degrading the user's visual and operating experience.
Summary

This application provides a display method for a terminal interface that allows the user to operate high-frequency application icons more conveniently and comfortably, thereby improving the user experience.

In a first aspect, this application provides a display method for a terminal interface. The method may include: in response to a first gesture input by a user on a first interface, the terminal determines a high-frequency touch region on a first side of the terminal, where the first gesture is a gesture input by a finger on the user's first side, the high-frequency touch region is a touch region of the terminal interface that is operated by the user at a frequency or a number of times higher than a first threshold, and the first interface includes at least two application icons; and the terminal displays, in the high-frequency touch region on the first side, at least one high-frequency application icon, the at least one high-frequency application icon being an application icon, among the at least two application icons, that is operated by the user at a frequency or a number of times higher than a second threshold. When the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal; when the finger on the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.

In this application, when the terminal determines that the user is operating it with a finger on the first side, it can display the terminal's high-frequency application icons in the high-frequency touch region on that side, so that the user can operate those icons more conveniently and comfortably, improving the user experience.
In one possible design, displaying the at least one high-frequency application icon in the high-frequency touch region on the first side includes: displaying, in that region, a folder icon containing the at least one high-frequency application icon.

In this application, the terminal may display, in the high-frequency touch region on the first side, a folder icon containing all the high-frequency application icons. This solves the problem that the terminal has many high-frequency application icons but the high-frequency touch region on the first side is not large enough to display them all. Moreover, displaying such a folder icon in that region makes it convenient for the user to operate all the high-frequency application icons on the terminal.

In another possible design, after the terminal displays the folder icon containing the at least one high-frequency application icon in the high-frequency touch region on the first side, the method further includes: in response to the user's input on the folder icon, displaying, in the high-frequency touch region on the first side, a folder expansion window corresponding to the folder icon, the at least one high-frequency application icon being displayed in the folder expansion window.

In this application, the terminal may also, in response to the user's input on the folder icon containing the high-frequency application icons, display the corresponding folder expansion window in the high-frequency touch region on the first side, making it convenient for the user to operate all the high-frequency application icons on the terminal.
In another possible design, determining the high-frequency touch region on the first side of the terminal includes: determining, according to the coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model, the high-frequency touch region of the user's first-side finger on the terminal interface. The first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model; the right-hand trajectory model includes the coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model includes the coordinates of at least one left-hand sliding trajectory.

In this application, the terminal may, according to the coordinates of at least one first-side finger sliding trajectory in the first-side trajectory model, determine whether the first gesture was input by the user's left-hand or right-hand finger (a finger on the first side), and then determine the high-frequency touch region of the first-side finger on the terminal interface. Determining whether the first gesture was input by the left hand or the right hand from the trajectory coordinates in the trajectory model requires no additional hardware, reducing the cost of detecting left-hand or right-hand operation of the phone.
In another possible design, determining the high-frequency touch region on the first side of the terminal in response to the first gesture input by the user on the first interface may include: in response to the first gesture, the terminal computes the tangent of the angle between the x-axis or y-axis of the coordinate system and the line connecting the start point and the end point of the sliding trajectory of the first gesture; when the tangent falls within the value interval corresponding to the first side of the terminal, and a preset proportion of the points of the sliding trajectory of the first gesture are near the first side of the terminal, the terminal determines the high-frequency touch region on the first side.

In this application, the terminal can determine whether the user is operating the phone with the left hand or the right hand by checking the value interval of the tangent of the angle between the x-axis or y-axis and the line connecting the start and end points of the first gesture's sliding trajectory, together with the distribution of the trajectory points; this avoids the higher cost of adding extra hardware.
In another possible design, determining the high-frequency touch region on the first side of the terminal in response to the first gesture includes: in response to the first gesture input on the first interface, the terminal determines the start coordinates and end coordinates of the sliding trajectory of the first gesture; the terminal searches the left-hand trajectory model and the right-hand trajectory model for a first sliding trajectory whose start and end coordinates are distributed on the terminal interface in a manner matching the start and end coordinates of the sliding trajectory of the first gesture, the left-hand trajectory model including the coordinates of at least one left-hand sliding trajectory and the right-hand trajectory model including the coordinates of at least one right-hand sliding trajectory; and, when the terminal finds the first sliding trajectory in the first-side trajectory model, it determines the high-frequency touch region on the first side of the terminal, the first-side trajectory model being the left-hand trajectory model or the right-hand trajectory model.

In this application, "the start and end coordinates of the first sliding trajectory are distributed on the terminal interface in a manner matching the start and end coordinates of the sliding trajectory of the first gesture" specifically means: the start coordinates of the first sliding trajectory are the same as the start coordinates of the sliding trajectory of the first gesture, and the end coordinates of the first sliding trajectory are the same as the end coordinates of the sliding trajectory of the first gesture. Alternatively, the left-hand or right-hand trajectory model may store value ranges of the start coordinates and end coordinates of the first sliding trajectory, in which case the matching specifically means: the start coordinates of the first gesture fall within the value range of the start coordinates of the first sliding trajectory, and the end coordinates of the sliding trajectory of the first gesture fall within the value range of the end coordinates of the first sliding trajectory.
In another possible design, before the terminal determines the high-frequency touch region on the first side in response to the first gesture input on the first interface, the method further includes: in response to a fourth gesture input by the user on the terminal interface, the terminal determines that the fourth gesture was input by a finger on the user's first side and saves the coordinates of the sliding trajectory of the fourth gesture in the first-side trajectory model.

In this application, the terminal can, without the user being aware of it, collect statistics on the coordinates of the sliding trajectories of multiple gestures input by the user (which reflect the user's gesture habits on the terminal's touchscreen) and save the trajectory coordinates in the first-side trajectory model, so that after subsequently receiving the first gesture it can compare the trajectory coordinates in the left-hand and right-hand trajectory models to determine whether the first gesture was input by the user's left hand or right hand.
In another possible design, before the terminal determines the high-frequency touch region on the first side in response to the first gesture input on the first interface, the method further includes: the terminal displays a third interface that includes first prompt information prompting the user to slide on the terminal interface with a finger on the first side; in response to at least two third gestures input by the user on the third interface, the terminal collects statistics on the coordinates of the sliding trajectories of the at least two third gestures to obtain the coordinates of at least one first-side finger sliding trajectory, the third gestures being gestures input by a finger on the user's first side; and the terminal saves the coordinates of the at least one first-side finger sliding trajectory in the first-side trajectory model.

In this application, the terminal can explicitly guide the user to input gestures with the left hand or with the right hand. After guiding the user to input left-hand gestures, it can collect the trajectory coordinates of the left-hand gestures the user inputs in response; after guiding the user to input right-hand gestures, it can collect the trajectory coordinates of the right-hand gestures the user inputs in response. This improves the accuracy of the trajectory coordinates saved in the first-side trajectory model.
In another possible design, to further improve the accuracy of the trajectory coordinates saved in the first-side trajectory model, the terminal may also, in response to a third gesture input on the third interface, first determine whether the third gesture was input by the user's left-side or right-side finger, and then ask the user to confirm whether the terminal's determination is correct; only after the user confirms that it is correct does the terminal save the corresponding trajectory coordinates. Specifically, before the terminal determines the high-frequency touch region on the first side in response to the first gesture input on the first interface, the method further includes: the terminal displays the third interface; in response to a third gesture input by the user on the third interface, the terminal determines that the third gesture was input by a finger on the user's first side and displays a fourth interface that includes prompt information for confirming whether the third gesture was input by a finger on the user's first side; and, in response to a first input by the user on the fourth interface, the terminal saves the coordinates of the sliding trajectory of the third gesture in the first-side trajectory model, the first input indicating that the third gesture was input by a finger on the user's first side.

In this application, the terminal can not only explicitly guide the user to input gestures with the left or right hand; after the user inputs gestures as instructed, the terminal can also use a two-stage determination process to establish which side's finger input the gesture. That is, the terminal first determines whether the third gesture was input by the user's left-side or right-side finger, then asks the user to confirm whether that determination is correct, and only saves the corresponding trajectory coordinates after the user confirms. This two-stage process improves the accuracy of the trajectory coordinates saved in the first-side trajectory model.
In a second aspect, this application provides a display method for a terminal interface. The method includes: in response to a first gesture input by a user on a first interface, the terminal determines a high-frequency touch region on a first side of the terminal, where the first gesture is a gesture input by a finger on the user's first side and the high-frequency touch region is a touch region of the terminal interface that is operated by the user at a frequency or a number of times higher than a first threshold; the terminal displays a first touch panel in the high-frequency touch region on the first side, the first touch panel being used to operate the first interface in response to gestures input by the user; and, in response to a second gesture input by the user on the first touch panel, the terminal displays a second interface, the second interface including the interface elements that the terminal would display in response to a third gesture input by the user at the corresponding position of the first interface. When the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal; when the finger on the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.

The display method provided by this application can recognize the gestures the user inputs on the touchscreen to determine whether the user is operating the phone with the left hand or the right hand, thereby avoiding the higher cost of adding extra hardware. Furthermore, when the terminal recognizes that the user is operating it with a finger on the first side (for example, the left hand or the right hand), it can display on the terminal interface a first touch panel that can be used to operate the terminal interface, so that the user can operate all the content on the terminal interface from that panel. In this way, without degrading the user's visual and operating experience, the user can operate regions of the terminal interface that the first-side finger cannot reach. Moreover, displaying the first touch panel in the high-frequency touch region on the first side makes it even more convenient for the user to operate all the content on the terminal interface from the panel.

It should be noted that, in the second aspect of this application, for the method of "the terminal determines the high-frequency touch region on the first side of the terminal in response to the first gesture input by the user on the first interface", reference may be made to the relevant descriptions of the possible designs of the first aspect, which are not repeated here.
In one possible design, before the terminal determines the high-frequency touch region on the first side in response to the first gesture input on the first interface, the method further includes: in response to a fourth gesture input by the user on the terminal interface, the terminal determines that the fourth gesture was input by a finger on the user's first side and saves the coordinates of the sliding trajectory of the fourth gesture in the first-side trajectory model. For the specific manner and benefit analysis of responding to the fourth gesture and saving its trajectory coordinates, reference may be made to the detailed descriptions of the possible designs of the first aspect, which are not repeated here.

In another possible design, the terminal may explicitly guide the user to input gestures with the left hand or with the right hand, and save the coordinates of the sliding trajectories of the gestures input by the user. For the specific manner and benefit analysis, reference may be made to the detailed descriptions of the possible designs of the first aspect, which are not repeated here.
In a third aspect, this application provides a display method for a terminal interface. The method includes: in response to a first gesture input by a user on a first interface, the terminal determines that the first gesture was input by a finger on the user's first side, the first interface including a first interface element that includes a navigation-bar icon and/or a dock-bar icon; and the terminal moves the first interface element to a display region near the first side of the terminal for display.

In this application, when the terminal determines that the user is operating it with a finger on the first side, it can move the dock-bar icons and/or navigation-bar icons of the terminal interface to a display region near the first side of the terminal, so that the user can operate them more conveniently and comfortably, improving the user experience.
In a fourth aspect, this application provides a terminal that includes an input unit, a determining unit, and a display unit. The input unit is configured to receive a first gesture input by a user on a first interface, the first gesture being a gesture input by a finger on the user's first side. The determining unit is configured to determine, in response to the first gesture received by the input unit, a high-frequency touch region on a first side of the terminal, the high-frequency touch region being a touch region of the terminal interface that is operated by the user at a frequency or a number of times higher than a first threshold, and the first interface including at least two application icons. The display unit is configured to display, in the high-frequency touch region on the first side determined by the determining unit, at least one high-frequency application icon, the at least one high-frequency application icon being an application icon, among the at least two application icons, that is operated by the user at a frequency or a number of times higher than a second threshold. When the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal; when the finger on the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.

In one possible design, the display unit is specifically configured to display, in the high-frequency touch region on the first side, a folder icon containing the at least one high-frequency application icon.

In another possible design, the input unit is further configured to receive the user's input on the folder icon after the display unit displays, in the high-frequency touch region on the first side, the folder icon containing the at least one high-frequency application icon. The display unit is further configured to display, in response to the user's input on the folder icon, a folder expansion window corresponding to the folder icon in the high-frequency touch region on the first side, the at least one high-frequency application icon being displayed in the folder expansion window.

In another possible design, the determining unit is specifically configured to determine the high-frequency touch region on the first side of the terminal according to the coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model. The first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model; the right-hand trajectory model includes the coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model includes the coordinates of at least one left-hand sliding trajectory.

In another possible design, the determining unit is specifically configured to: compute the tangent of the angle between the x-axis or y-axis of the coordinate system and the line connecting the start point and the end point of the sliding trajectory of the first gesture; and determine the high-frequency touch region on the first side when the tangent falls within the value interval corresponding to the first side of the terminal and a preset proportion of the points of the sliding trajectory of the first gesture are near the first side of the terminal.

In another possible design, the determining unit is specifically configured to: determine the start and end coordinates of the sliding trajectory of the first gesture; search the left-hand trajectory model and the right-hand trajectory model for a first sliding trajectory whose start and end coordinates are distributed on the terminal interface in a manner matching those of the first gesture, the left-hand trajectory model including the coordinates of at least one left-hand sliding trajectory and the right-hand trajectory model including the coordinates of at least one right-hand sliding trajectory; and determine the high-frequency touch region on the first side of the terminal when the first sliding trajectory is found in the first-side trajectory model, the first-side trajectory model being the left-hand trajectory model or the right-hand trajectory model.

In another possible design, the display unit is further configured to display, before the determining unit determines the high-frequency touch region on the first side of the terminal, a third interface that includes first prompt information prompting the user to slide on the terminal interface with a finger on the first side. The input unit is further configured to receive at least two third gestures input by the user on the third interface. In this design, the terminal further includes a statistics unit and a storage unit. The statistics unit is configured to collect, in response to the at least two third gestures received by the input unit, statistics on the coordinates of the sliding trajectories of the at least two third gestures to obtain the coordinates of at least one first-side finger sliding trajectory, the third gestures being gestures input by a finger on the user's first side. The storage unit is configured to save the coordinates of the at least one first-side finger sliding trajectory in the first-side trajectory model.
In a fifth aspect, this application provides a terminal that includes an input unit, a determining unit, and a display unit. The input unit is configured to receive a first gesture input by a user on a first interface. The determining unit is configured to determine, in response to the first gesture received by the input unit, a high-frequency touch region on a first side of the terminal, the first gesture being a gesture input by a finger on the user's first side, and the high-frequency touch region being a touch region of the terminal interface that is operated by the user at a frequency or a number of times higher than a first threshold. The display unit is configured to display a first touch panel in the high-frequency touch region on the first side, the first touch panel being used to operate the first interface in response to gestures input by the user. The input unit is further configured to receive a second gesture input by the user on the first touch panel displayed by the display unit. The display unit is further configured to display, in response to the second gesture received by the input unit, a second interface that includes the interface elements the terminal would display in response to a third gesture input by the user at the corresponding position of the first interface. When the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal; when the finger on the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.

It should be noted that, in the fifth aspect of this application, for the method by which the determining unit determines the high-frequency touch region on the first side of the terminal, reference may be made to the relevant descriptions of the determining unit in the possible designs of the fourth aspect, which are not repeated here.

In one possible design, the input unit is further configured to receive, before the determining unit determines the high-frequency touch region on the first side of the terminal, a fourth gesture input by the user on the terminal interface; the determining unit is further configured to determine, in response to the fourth gesture, that the fourth gesture was input by a finger on the user's first side; and a storage unit is configured to save the coordinates of the sliding trajectory of the fourth gesture in the first-side trajectory model.
In a sixth aspect, this application provides a terminal that includes an input unit, a determining unit, and a display unit. The input unit is configured to receive a first gesture input by a user on a first interface. The determining unit is configured to determine, in response to the first gesture, that the first gesture was input by a finger on the user's first side, the first interface including a first interface element that includes a navigation-bar icon and/or a dock-bar icon. The display unit is configured to move the first interface element to a display region near the first side of the terminal for display.
In a seventh aspect, this application provides a terminal that includes a processor, a memory, and a touchscreen, the memory and the touchscreen being coupled to the processor. The memory is configured to store computer program code that includes computer instructions; when the processor executes the computer instructions, the terminal performs the following operations. The touchscreen is configured to display a first interface that includes at least two application icons. The processor is configured to determine, in response to a first gesture input by a user on the first interface displayed on the touchscreen, a high-frequency touch region on a first side of the terminal, the first gesture being a gesture input by a finger on the user's first side, and the high-frequency touch region being a touch region of the terminal interface that is operated by the user at a frequency or a number of times higher than a first threshold. The touchscreen is further configured to display, in the high-frequency touch region on the first side determined by the processor, at least one high-frequency application icon, the at least one high-frequency application icon being an application icon, among the at least two application icons, that is operated by the user at a frequency or a number of times higher than a second threshold. When the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal; when the finger on the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.

In one possible design, the touchscreen is specifically configured to display, in the high-frequency touch region on the first side, a folder icon containing the at least one high-frequency application icon.

In one possible design, the processor is further configured to receive, after the folder icon containing the at least one high-frequency application icon is displayed in the high-frequency touch region on the first side, the user's input on the folder icon displayed on the touchscreen. The touchscreen is further configured to display, in response to the user's input on the folder icon, a folder expansion window corresponding to the folder icon in the high-frequency touch region on the first side, the at least one high-frequency application icon being displayed in the folder expansion window.

In one possible design, the processor is specifically configured to determine the high-frequency touch region on the first side of the terminal according to the coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model. The first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model; the right-hand trajectory model includes the coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model includes the coordinates of at least one left-hand sliding trajectory.
In an eighth aspect, this application provides a terminal that includes a processor, a memory, and a touchscreen, the memory and the touchscreen being coupled to the processor. The memory is configured to store computer program code that includes computer instructions; when the processor executes the computer instructions, the terminal performs the following operations. The touchscreen is configured to display a first interface. The processor is configured to determine, in response to a first gesture input by a user on the first interface displayed on the touchscreen, a high-frequency touch region on a first side of the terminal, the first gesture being a gesture input by a finger on the user's first side, and the high-frequency touch region being a touch region of the terminal interface that is operated by the user at a frequency or a number of times higher than a first threshold. The touchscreen is further configured to display a first touch panel in the high-frequency touch region on the first side determined by the processor, the first touch panel being used to operate the first interface in response to gestures input by the user. The processor is further configured to receive a second gesture input by the user on the first touch panel displayed on the touchscreen. The touchscreen is further configured to display, in response to the second gesture input on the first touch panel, a second interface that includes the interface elements the terminal would display in response to a third gesture input by the user at the corresponding position of the first interface. When the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal; when the finger on the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.

In one possible design, the processor is specifically configured to determine the high-frequency touch region on the first side of the terminal according to the coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model. The first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model; the right-hand trajectory model includes the coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model includes the coordinates of at least one left-hand sliding trajectory.

In one possible design, the processor is specifically configured to: compute the tangent of the angle between the x-axis or y-axis of the coordinate system and the line connecting the start point and the end point of the sliding trajectory of the first gesture; and determine the high-frequency touch region on the first side when the tangent falls within the value interval corresponding to the first side of the terminal and a preset proportion of the points of the sliding trajectory of the first gesture are near the first side of the terminal.

In another possible design, the processor is specifically configured to: determine the start and end coordinates of the sliding trajectory of the first gesture; search the left-hand trajectory model and the right-hand trajectory model for a first sliding trajectory whose start and end coordinates are distributed on the terminal interface in a manner matching those of the first gesture, the left-hand trajectory model including the coordinates of at least one left-hand sliding trajectory and the right-hand trajectory model including the coordinates of at least one right-hand sliding trajectory; and determine the high-frequency touch region on the first side of the terminal when the first sliding trajectory is found in the first-side trajectory model, the first-side trajectory model being the left-hand trajectory model or the right-hand trajectory model.

In another possible design, the touchscreen is further configured to display, before the processor determines the high-frequency touch region on the first side of the terminal, a third interface that includes first prompt information prompting the user to slide on the terminal interface with a finger on the first side. The processor is further configured to: receive at least two third gestures input by the user on the third interface displayed on the touchscreen; and, in response to the at least two third gestures, collect statistics on the coordinates of their sliding trajectories to obtain the coordinates of at least one first-side finger sliding trajectory, the third gestures being gestures input by a finger on the user's first side. The memory is further configured to save the coordinates of the at least one first-side finger sliding trajectory in the first-side trajectory model.

In another possible design, the processor is further configured to determine, before determining in response to the first gesture input on the first interface whether the first gesture was input by the user's left hand or right hand, in response to a fourth gesture input by the user on the terminal interface, that the fourth gesture was input by a finger on the user's first side. The memory is further configured to save the coordinates of the sliding trajectory of the fourth gesture in the first-side trajectory model.
In a ninth aspect, this application provides a terminal that includes a processor, a memory, and a touchscreen, the memory and the touchscreen being coupled to the processor. The memory is configured to store computer program code that includes computer instructions; when the processor executes the computer instructions, the terminal performs the following operations. The processor is configured to receive a first gesture input by a user on a first interface and, in response, determine that the first gesture was input by a finger on the user's first side, the first interface including a first interface element that includes a navigation-bar icon and/or a dock-bar icon. The touchscreen is configured to move the first interface element to a display region near the first side of the terminal for display.
In a tenth aspect, this application provides a graphical user interface (GUI) stored in a terminal that includes a touchscreen, a memory, and a processor, the processor being configured to execute one or more computer programs stored in the memory. The GUI includes: a first GUI displayed on the touchscreen, the first GUI including at least two application icons; and, in response to a first gesture input on the first GUI, a second GUI is displayed, the high-frequency touch region on a first side of the second GUI including at least one high-frequency application icon, the first gesture being a gesture input by a finger on the user's first side, the high-frequency touch region being a touch region of the second GUI that is operated by the user at a frequency or a number of times higher than a first threshold, and the at least one high-frequency application icon being an application icon, among the at least two application icons, that is operated by the user at a frequency or a number of times higher than a second threshold.

In one possible design, the second GUI includes a folder icon, and the folder icon contains the at least one high-frequency application icon.

In one possible design, the GUI further includes: in response to an input on the folder icon in the second GUI, a third GUI is displayed, the third GUI including a folder expansion window corresponding to the folder icon, the at least one high-frequency application icon being displayed in the folder expansion window.
In an eleventh aspect, this application provides a graphical user interface (GUI) stored in a terminal that includes a touchscreen, a memory, and a processor, the processor being configured to execute one or more computer programs stored in the memory. The GUI includes: a first GUI displayed on the touchscreen; in response to a first gesture input on the first GUI, a second GUI is displayed, the high-frequency touch region on a first side of the second GUI including a first touch panel, the first touch panel being used to operate the first GUI in response to gestures input by the user, the first gesture being a gesture input by a finger on the user's first side, and the high-frequency touch region being a touch region of the second GUI that is operated by the user at a frequency or a number of times higher than a first threshold; and, in response to a second gesture input on the first touch panel in the second GUI, a third GUI is displayed, the third GUI including the interface elements the terminal would display in response to a third gesture input by the user at the corresponding position of the first GUI.

In one possible design, the GUI further includes: a fourth GUI displayed on the touchscreen, the fourth GUI including first prompt information prompting the user to slide on the fourth GUI with a finger on the first side.

In a twelfth aspect, this application provides a graphical user interface (GUI) stored in a terminal that includes a touchscreen, a memory, and a processor, the processor being configured to execute one or more computer programs stored in the memory. The GUI includes: a first GUI displayed on the touchscreen, the first GUI including a first interface element that includes a navigation-bar icon and/or a dock-bar icon; and, in response to a first gesture input on the first GUI, a second GUI is displayed, the display region on a first side of the second GUI including the first interface element.
In a thirteenth aspect, this application provides a computer storage medium that includes computer instructions which, when run on a terminal, cause the terminal to perform the display method of the terminal interface described in the first, second, or third aspect of this application or any of their possible designs.

In a fourteenth aspect, this application provides a computer program product which, when run on a computer, causes the computer to perform the display method of the terminal interface described in the first, second, or third aspect of this application or any of their possible designs.

It can be understood that the terminals described in the fourth to ninth aspects and their possible designs, the GUIs described in the tenth to twelfth aspects, the computer storage medium described in the thirteenth aspect, and the computer program product described in the fourteenth aspect are all used to perform the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which are not repeated here.
Brief Description of the Drawings

Fig. 1 is a first schematic diagram of an example terminal interface of a phone according to this application;
Fig. 2 is a schematic diagram of the hardware structure of a phone according to this application;
Fig. 3 is a first flowchart of a display method for a terminal interface according to this application;
Fig. 4 is a second schematic diagram of an example terminal interface of a phone according to this application;
Fig. 5 is a schematic diagram of an example mapping between a first interface and a first touch panel according to this application;
Fig. 6A is a third schematic diagram of an example terminal interface of a phone according to this application;
Fig. 6B is a fourth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 7 is a fifth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 8 is a sixth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 9 is a second flowchart of a display method for a terminal interface according to this application;
Fig. 10 is a schematic diagram of example coordinate axes on a phone and touch points on those axes according to this application;
Fig. 11 is a third flowchart of a display method for a terminal interface according to this application;
Fig. 12 is a schematic diagram of an example trajectory database according to this application;
Fig. 13 is a schematic diagram of an example network architecture to which a display method for a terminal interface according to this application is applied;
Fig. 14 is a fourth flowchart of a display method for a terminal interface according to this application;
Fig. 15 is a seventh schematic diagram of an example terminal interface of a phone according to this application;
Fig. 16 is an eighth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 17 is a fifth flowchart of a display method for a terminal interface according to this application;
Fig. 18 is a first schematic diagram of example sliding trajectories according to this application;
Fig. 19 is a second schematic diagram of example sliding trajectories according to this application;
Fig. 20 is a schematic diagram of an example high-frequency touch region according to this application;
Fig. 21 is a sixth flowchart of a display method for a terminal interface according to this application;
Fig. 22A is a ninth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 22B is a seventh flowchart of a display method for a terminal interface according to this application;
Fig. 22C is a tenth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 22D is an eleventh schematic diagram of an example terminal interface of a phone according to this application;
Fig. 23 is an eighth flowchart of a display method for a terminal interface according to this application;
Fig. 24 is a twelfth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 25 is a thirteenth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 26 is a fourteenth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 27 is a fifteenth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 28 is a sixteenth schematic diagram of an example terminal interface of a phone according to this application;
Fig. 29 is a first schematic structural diagram of a terminal according to this application;
Fig. 30 is a second schematic structural diagram of a terminal according to this application.
Detailed Description

In the following, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly specifying the number of the indicated technical features. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of this application, unless otherwise stated, "multiple" means two or more.

When a user uses a touchscreen phone with a large touchscreen, the user cannot conveniently operate the phone with one hand. For example, as shown in (a) of Fig. 1, when the user holds the phone 100 in the left hand, the user's left hand cannot reach and operate an application icon displayed at the upper right of the touchscreen of the phone 100, such as the "China Merchants Bank" icon 01; as shown in (b) of Fig. 1, when the user holds the phone 100 in the right hand, the user's right hand cannot reach and operate an application icon displayed at the upper left of the touchscreen, such as the "Photos" icon 02.

The display method for a terminal interface and the terminal provided by this application can recognize the gestures the user inputs on the touchscreen to determine whether the user is operating the phone with the left hand or the right hand, thereby avoiding the higher cost of adding extra hardware. Furthermore, when the terminal recognizes that the user is operating it with a finger on the first side (for example, the left hand or the right hand), it can display, near the first side of the terminal, a touch region that can be used to operate the terminal interface, so that the user can operate all the content on the terminal interface from that region. In this way, without degrading the user's visual and operating experience, the user can operate regions of the terminal interface that the first-side finger cannot reach. For example, in that touch region the user can operate the "Photos" icon 02 that, as shown in (b) of Fig. 1, the right hand cannot reach.

The display method for a terminal interface provided by this application may be performed by a display apparatus of a terminal interface, which may be the phone 100 shown in Fig. 1 or Fig. 2. The display apparatus may also be the central processing unit (CPU) of the terminal, or a control module in the terminal for performing the display method. The embodiments of the present invention describe the display method provided by the embodiments with the terminal as the executing entity.

By way of example, the terminal in this application may be a mobile phone on which applications can be installed and application icons can be displayed (such as the phone 100 shown in Fig. 2), a tablet computer, a personal computer (PC), a personal digital assistant (PDA), a smart watch, a netbook, a wearable electronic device, or the like; this application places no particular restriction on the specific form of the device.
As shown in Fig. 2, taking the phone 100 as an example of the terminal, the phone 100 may specifically include components such as a processor 101, a radio frequency (RF) circuit 102, a memory 103, a touchscreen 104, a Bluetooth apparatus 105, one or more sensors 106, a wireless fidelity (WiFi) apparatus 107, a positioning apparatus 108, an audio circuit 109, a peripheral interface 110, and a power supply apparatus 111. These components may communicate through one or more communication buses or signal lines (not shown in Fig. 2). Those skilled in the art will understand that the hardware structure shown in Fig. 2 does not limit the phone; the phone 100 may include more or fewer components than shown, combine certain components, or use a different arrangement of components.

The components of the phone 100 are described in detail below with reference to Fig. 2:

The processor 101 is the control center of the phone 100. It connects the various parts of the phone using various interfaces and lines, and performs the various functions of the phone 100 and processes data by running or executing applications stored in the memory 103 and invoking data stored in the memory 103. In some embodiments, the processor 101 may include one or more processing units; for example, the processor 101 may be a Kirin 960 chip manufactured by Huawei Technologies Co., Ltd. In some embodiments of this application, the processor 101 may further include a fingerprint verification chip for verifying collected fingerprints.

The RF circuit 102 may be used to receive and send wireless signals while sending and receiving information or during a call. In particular, the RF circuit 102 may receive downlink data from a base station and deliver it to the processor 101 for processing, and send uplink data to the base station. The RF circuit generally includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the RF circuit 102 may also communicate with other devices by wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to the global system for mobile communications, general packet radio service, code division multiple access, wideband code division multiple access, long term evolution, e-mail, and the short message service.
The memory 103 is configured to store applications and data; the processor 101 performs the various functions of the phone 100 and processes data by running the applications and data stored in the memory 103. The memory 103 mainly includes a program storage area and a data storage area: the program storage area may store the operating system and the applications required by at least one function (such as a sound-playing function and an image-playing function); the data storage area may store data created during use of the phone 100 (such as audio data and a phone book). In addition, the memory 103 may include high-speed random access memory (RAM), and may also include non-volatile memory such as a magnetic disk storage device, a flash memory device, or another solid-state storage device. The memory 103 may store various operating systems, for example the iOS operating system developed by Apple Inc. and the Android operating system developed by Google Inc. The memory 103 may be standalone and connected to the processor 101 through the aforementioned communication bus, or may be integrated with the processor 101.
The touchscreen 104 may specifically include a touchpad 104-1 and a display 104-2.

The touchpad 104-1 can collect touch events performed on or near it by the user of the phone 100 (for example, operations performed by the user on or near the touchpad 104-1 using a finger, a stylus, or any other suitable object) and send the collected touch information to another component (such as the processor 101). A touch event near the touchpad 104-1 may be called floating touch; floating touch means that the user does not need to directly contact the touchpad to select, move, or drag a target (such as an icon), but only needs to be near the device to perform the desired function. In addition, the touchpad 104-1 may be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave.

The display (also called a display screen) 104-2 may be used to display information input by the user or provided to the user, and the various menus of the phone 100. The display 104-2 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The touchpad 104-1 may cover the display 104-2; when the touchpad 104-1 detects a touch event on or near it, it passes the event to the processor 101 to determine the type of the touch event, and the processor 101 can then provide corresponding visual output on the display 104-2 according to the type of the touch event. Although in Fig. 2 the touchpad 104-1 and the display 104-2 are two separate components implementing the input and output functions of the phone 100, in some embodiments the touchpad 104-1 and the display 104-2 may be integrated to implement the input and output functions of the phone 100. It can be understood that the touchscreen 104 is formed by stacking multiple layers of material; only the touchpad (layer) and the display (layer) are shown in the embodiments of this application, and the other layers are not described here. In addition, the touchpad 104-1 may be configured on the front of the phone 100 in the form of a full panel, and the display 104-2 may also be configured on the front of the phone 100 in the form of a full panel, so that a bezel-free structure can be achieved on the front of the phone.

In addition, the phone 100 may have a fingerprint recognition function. For example, a fingerprint reader 112 may be configured on the back of the phone 100 (for example, below the rear camera), or a fingerprint reader 112 may be configured on the front of the phone 100 (for example, below the touchscreen 104). As another example, a fingerprint collection device 112 may be configured in the touchscreen 104 to implement the fingerprint recognition function; that is, the fingerprint collection device 112 may be integrated with the touchscreen 104 to implement the fingerprint recognition function of the phone 100. In that case, the fingerprint collection device 112 is configured in the touchscreen 104 and may be part of the touchscreen 104 or configured in the touchscreen 104 in another manner. The main component of the fingerprint collection device 112 in the embodiments of this application is a fingerprint sensor, which may use any type of sensing technology, including but not limited to optical, capacitive, piezoelectric, or ultrasonic sensing technology.
The phone 100 may further include a Bluetooth apparatus 105 for data exchange between the phone 100 and other short-range devices (such as phones and smart watches). The Bluetooth apparatus in the embodiments of this application may be an integrated circuit, a Bluetooth chip, or the like.

The phone 100 may further include at least one sensor 106, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display of the touchscreen 104 according to the ambient light, and the proximity sensor can turn off the power of the display when the phone 100 is moved to the ear. As one kind of motion sensor, an accelerometer can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used in applications that recognize phone posture (such as landscape/portrait switching, related games, and magnetometer posture calibration) and in vibration-recognition-related functions (such as a pedometer and tapping). Other sensors that may also be configured in the phone 100, such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, are not described here.

The WiFi apparatus 107 is configured to provide the phone 100 with network access complying with WiFi-related standard protocols. The phone 100 may connect to a WiFi access point through the WiFi apparatus 107, helping the user send and receive e-mail, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access. In some other embodiments, the WiFi apparatus 107 may also serve as a WiFi wireless access point and provide WiFi network access for other devices.

The positioning apparatus 108 is configured to provide a geographic location for the phone 100. It can be understood that the positioning apparatus 108 may specifically be a receiver of a positioning system such as the Global Positioning System (GPS), the BeiDou navigation satellite system, or the Russian GLONASS. After receiving the geographic location sent by the positioning system, the positioning apparatus 108 sends the information to the processor 101 for processing or to the memory 103 for storage. In some other embodiments, the positioning apparatus 108 may also be a receiver of the Assisted Global Positioning System (AGPS); the AGPS system, acting as an assistance server, assists the positioning apparatus 108 in completing ranging and positioning services, and in this case the assistance positioning server communicates through a wireless communication network with the positioning apparatus 108 (the GPS receiver) of a device such as the phone 100 to provide positioning assistance. In some other embodiments, the positioning apparatus 108 may also use positioning technology based on WiFi access points. Since every WiFi access point has a globally unique media access control (MAC) address, the device can scan and collect the broadcast signals of surrounding WiFi access points while WiFi is enabled, and thus obtain the MAC addresses broadcast by the access points; the device sends the data identifying the access points (such as the MAC addresses) to a location server through the wireless communication network, and the location server retrieves the geographic location of each WiFi access point, computes the device's geographic location in combination with the strength of the WiFi broadcast signals, and sends it to the positioning apparatus 108 of the device.
The audio circuit 109, a speaker 113, and a microphone 114 may provide an audio interface between the user and the phone 100. The audio circuit 109 may transmit an electrical signal converted from received audio data to the speaker 113, which converts it into a sound signal for output; on the other hand, the microphone 114 converts collected sound signals into electrical signals, which the audio circuit 109 receives and converts into audio data, and then outputs the audio data to the RF circuit 102 to be sent to, for example, another phone, or outputs the audio data to the memory 103 for further processing.

The peripheral interface 110 is configured to provide various interfaces for external input/output devices (such as a keyboard, a mouse, an external display, external memory, and a subscriber identity module card), for example connecting to a mouse through a universal serial bus (USB) interface, or connecting, through the metal contacts of the card slot, to a subscriber identity module (SIM) card provided by a telecom operator. The peripheral interface 110 may be used to couple the aforementioned external input/output peripherals to the processor 101 and the memory 103.

In the embodiments of the present invention, the phone 100 may communicate with other devices in a device group through the peripheral interface 110; for example, it may receive, through the peripheral interface 110, display data sent by other devices for display. The embodiments of the present invention place no restriction on this.

The phone 100 may further include a power supply apparatus 111 (such as a battery and a power management chip) supplying power to the components. The battery may be logically connected to the processor 101 through the power management chip, so that functions such as charge management, discharge management, and power consumption management are implemented through the power supply apparatus 111.

Although not shown in Fig. 2, the phone 100 may further include a camera (a front camera and/or a rear camera), a flash, a micro projection apparatus, a near field communication (NFC) apparatus, and the like, which are not described here.

The methods in the following embodiments can all be implemented in the phone 100 having the foregoing hardware structure.
This application provides a display method for a terminal interface, including S301–S303:

S301. In response to a first gesture input by a user on a first interface, the terminal determines a high-frequency touch region on a first side of the terminal.

The finger on the user's first side is a finger of the user's left hand or right hand.

In this application, in response to the first gesture input by the user on the first interface, the terminal may first determine whether the first gesture was input by the user's left hand or right hand; when it determines that the first gesture was input by the left hand, it determines the high-frequency touch region on the left side of the terminal; when it determines that the first gesture was input by the right hand, it determines the high-frequency touch region on the right side of the terminal. Specifically, S301 may be replaced by S301a–S301b. As shown in Fig. 3, the display method includes S301a–S301b, S302, and S303:
S301a. In response to the first gesture input by the user on the first interface, the terminal determines that the first gesture was input by a finger on the user's first side.

For example, assuming the terminal is the phone 100, the phone 100 may display the first interface 401 shown in (a) of Fig. 4. Of course, the first interface displayed by the terminal in this application includes but is not limited to the home screen 401 with application icons shown in (a) of Fig. 4; for example, the first interface may also be any display interface of any application on the terminal.

The first gesture may be a sliding trajectory input in any region of the first interface. For example, as shown in (a) of Fig. 4, the sliding trajectory 402 input by the user on the first interface 401 may be the sliding trajectory corresponding to the first gesture.

In one implementation, the terminal may compute the tangent of the angle between the x-axis or y-axis of the coordinate system and the line connecting the start and end points of the sliding trajectory of the first gesture, and determine whether the first gesture was input by the user's left hand or right hand according to the value range of the tangent and the distribution of the trajectory points on the terminal interface.

In another implementation, the terminal may determine the start and end coordinates of the sliding trajectory of the first gesture, and then search the pre-saved left-hand and right-hand trajectory models for a first sliding trajectory whose start and end coordinates are distributed on the terminal interface in a manner matching those of the first gesture. If the terminal finds the first sliding trajectory in the first-side trajectory model (for example, the left-hand trajectory model), it can determine that the first gesture was input by a finger on the user's first side (for example, the left hand).

S301b. The terminal determines the high-frequency touch region on the first side of the terminal.

The high-frequency touch region is a touch region of the terminal interface that is operated by the user at a frequency or a number of times higher than a first threshold. The high-frequency touch region on the first side may be the high-frequency touch region on the left side of the terminal or the high-frequency touch region on the right side.

For example, the terminal may obtain the sliding trajectories of left-hand gestures (gestures input by left-hand fingers) and right-hand gestures (gestures input by right-hand fingers) input by the user on the terminal interface; count the distribution, on the terminal interface, of the points of the left-hand gestures' sliding trajectories (left-side trajectory points for short) and determine the regions where left-side trajectory points are densely distributed as the left-side high-frequency touch region; and count the distribution of the points of the right-hand gestures' sliding trajectories (right-side trajectory points for short) and determine the regions where right-side trajectory points are densely distributed as the right-side high-frequency touch region. A region with many trajectory points is a region of the terminal interface where the density of trajectory points is higher than a certain threshold.
S302. The terminal displays a first touch panel in the high-frequency touch region on the first side, the first touch panel being used to operate the first interface in response to gestures input by the user.

When the finger on the first side is a left-hand finger, the first side of the terminal is the left side of the terminal; when the finger on the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.

By way of example, as shown in (a) of Fig. 4, assuming the phone 100 determines that the gesture corresponding to the sliding trajectory 402 (the first gesture) was input by the user's right-hand finger, then, as shown in (b) of Fig. 4, the phone 100 may display the first touch panel 403 in the right-side high-frequency touch region of the first interface 401.

It should be noted that, in this application, the left side of the terminal refers to the half of the touchscreen near the left frame of the phone after the touchscreen is divided in two along its vertical centerline (the centerline parallel to the left and right frames of the phone); the right side of the terminal refers to the half near the right frame after the same division.

Generally, as shown in (a) of Fig. 4, when the user holds the phone in the right hand, the user's fingers usually grip the lower right of the phone; therefore, when the first side of the terminal is its right side, the first side may specifically point to the lower right of the terminal's touchscreen. As shown in (b) of Fig. 4, the phone 100 may display the first touch panel 403 at the lower right of its touchscreen. Likewise, when the first side of the terminal is its left side, the first side may specifically point to the lower left of the touchscreen.
S303. In response to a second gesture input by the user on the first touch panel, the terminal displays a second interface, the second interface including the interface elements that the terminal would display in response to a third gesture input by the user at the corresponding position of the first interface.

The first touch panel is used to operate the first interface in response to gestures input by the user; that is, the user's operation on the first touch panel can be mapped to the same operation by the user on the first interface. In other words, the touch points on the first touch panel can be mapped one-to-one to touch points at the corresponding positions of the first interface. For example, as shown in Fig. 5, touch point a on the first touch panel 403 may be mapped to touch point A on the first interface 401, touch point b on the first touch panel 403 may be mapped to touch point B on the first interface 401, and touch point c on the first touch panel 403 may be mapped to touch point C on the first interface 401.
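The one-to-one mapping described above can be sketched as a simple proportional coordinate transform. This is a minimal illustration only: the rectangle representation, the function name, and the linear scaling are assumptions for the sketch, not details taken from this application.

```python
def panel_to_screen(px, py, panel, screen):
    """Map a touch point (px, py) inside the small touch panel to the
    corresponding point on the full screen by linear scaling.

    panel and screen are (origin_x, origin_y, width, height) rectangles.
    """
    ox, oy, pw, ph = panel
    sx, sy, sw, sh = screen
    # Relative position of the touch inside the panel, each in [0, 1]
    rx = (px - ox) / pw
    ry = (py - oy) / ph
    # The same relative position on the full screen
    return (sx + rx * sw, sy + ry * sh)

# A panel occupying the lower-right quarter of a 1080x1920 screen
screen = (0, 0, 1080, 1920)
panel = (540, 1280, 540, 640)
print(panel_to_screen(540, 1280, panel, screen))   # (0.0, 0.0): panel origin -> screen origin
print(panel_to_screen(1080, 1920, panel, screen))  # (1080.0, 1920.0): panel corner -> screen corner
```

With such a transform, a tap at point a on the panel can be dispatched as a tap at the mapped point A of the full interface.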
For example, suppose that when the user taps touch point A on the first interface 401, the terminal displays terminal interface X in response. Then, when the user taps touch point a on the first touch panel 403, the terminal may, in response to that tap, display a terminal interface Y that includes all the interface elements of terminal interface X. The difference is that terminal interface Y may also include the first touch panel 403. Of course, terminal interface Y may also omit the first touch panel 403, in which case terminal interface Y is identical to terminal interface X.

For example, as shown in (a) of Fig. 6A, suppose touch point a on the first touch panel 403 can be mapped to touch point A where the icon of the "Photos" application is located on the first interface 401. Then, as shown in (a) of Fig. 6A, when the user's finger taps touch point a on the first touch panel 403, the phone 100 may open the "Photos" application and display the photo list interface 601 and the first touch panel 403 shown in (b) of Fig. 6A. Of course, the first touch panel 403 in (b) of Fig. 6A is optional; the phone 100 may omit it there.

It should be emphasized that, to prevent the user from being unable to tell, when operating the first interface through the first touch panel, which touch point on the first interface corresponds to the touch point their finger contacts on the first touch panel, when the user's finger touches any touch point on the first touch panel, the terminal displays a cursor at the position of the touch point on the first interface to which that touch point is mapped. The cursor can move as the user's finger moves on the first touch panel.

For example, as shown in (a) of Fig. 6A, when the user's finger taps touch point a on the first touch panel 403, the phone 100 may display a cursor 602 at the location of the "Photos" application icon. Moreover, when the user's finger taps touch point a on the first touch panel 403, the phone 100 may display the "Photos" icon in the dynamic manner the icon presents when it is tapped.

Optionally, to facilitate the user's operation, the first touch panel may also display some operable interface elements of the first interface, such as a "Back" button and a "Share" button. For example, as shown in (a) of Fig. 6B, when the first interface displays the photo interface 603, the first touch panel 403 may also include the "Back" button 604 of the photo list interface 601. After the user taps the "Back" button 604, the phone 100 may display the interface shown in (b) of Fig. 6B. Compared with the first touch panel 403 shown in (b) of Fig. 6A, the first touch panel 403 shown in (b) of Fig. 6B may further include a "Back to Camera" button 605; after the user taps the "Back to Camera" button 605, the phone 100 may start the camera in response.

The terminal may determine the size and shape of the first touch panel according to the user's habits of using the terminal. For example, taking the user holding the phone 100 in the right hand, as shown in (a) of Fig. 7, the phone 100 may measure the farthest distance L1 from the right frame of the phone 100 that the user's right thumb can reach when the user holds the phone in the right hand, and the farthest distance L2 from the bottom frame that the right thumb can reach. The phone 100 may then determine, according to the magnitudes of L1 and L2, the first touch panel 403 shown in (b) of Fig. 7 to be displayed when the first gesture was input by the user's right-hand finger.

The first touch panel in this application includes but is not limited to the first touch panel 403 shown in (b) of Fig. 7. To better match the user's usage habits, the phone 100 may measure L1 and L2 as above and then determine, according to their magnitudes, the fan-shaped curve 801 shown in (a) of Fig. 8; when the first gesture was input by the user's right-hand finger, the phone 100 may display the first touch panel 802, corresponding in shape to the fan-shaped curve 801, shown in (b) of Fig. 8.

Figs. 7 and 8 give, by way of example, only two possible instances of the first touch panel in this application; the size and shape of the first touch panel include but are not limited to those shown in Figs. 7 and 8.

The display method provided by this application can recognize the gestures the user inputs on the touchscreen to determine whether the user is operating the phone with the left hand or the right hand, avoiding the higher cost of adding extra hardware. Moreover, when the terminal recognizes that the user is operating it with a finger on the first side (for example, the left hand or the right hand), it can display on the terminal interface a first touch panel usable to operate the terminal interface, so that the user can operate all the content on the terminal interface from that panel. Thus, without degrading the user's visual and operating experience, the user can operate regions of the terminal interface that the first-side finger cannot reach; and displaying the first touch panel in the high-frequency touch region on the first side makes it even more convenient for the user to operate all the content on the terminal interface from the panel.
In one possible design, the terminal may compute the tangent of the angle between the x-axis or y-axis of the coordinate system and the line connecting the start and end points of the sliding trajectory of the first gesture, and determine, from the value range of the tangent and the distribution of the trajectory points on the terminal interface, whether the first gesture was input by the user's left hand or right hand. Specifically, in this possible design, S301a may be replaced by S901–S902. For example, as shown in Fig. 9, S301a of Fig. 3 may be replaced by S901–S902:

S901. In response to the first gesture input by the user on the first interface, the terminal computes the tangent of the angle between the x-axis or y-axis of the coordinate system and the line connecting the start and end points of the sliding trajectory of the first gesture.

By way of example, as shown in (a) of Fig. 10, suppose the phone 100 receives the sliding trajectory 1001 corresponding to the first gesture input by the user on its touchscreen; the start point of trajectory 1001 is point D and the end point is point E. As shown in (b) of Fig. 10, the coordinates of point D are D(x1, y1), the coordinates of point E are E(x2, y2), and the line connecting the start and end points of trajectory 1001 is segment EF. The angle between segment EF and the x-axis is then the angle α shown in (b) of Fig. 10, with tan α = (y2 − y1)/(x2 − x1).
S902. When the tangent falls within the value interval corresponding to the first side of the terminal, and a preset proportion of the points of the sliding trajectory of the first gesture are near the first side of the terminal, the terminal determines that the first gesture was input by a finger on the user's first side.

The terminal may collect statistics on the tangents of the angles between the x-axis or y-axis and the lines connecting the start and end points of the sliding trajectories of gestures input by the user's left-side finger, to determine the value interval corresponding to the left side; and collect statistics on the tangents for gestures input by the user's right-side finger, to determine the value interval corresponding to the right side.

As shown in (b) of Fig. 10, suppose the value interval corresponding to the right side is [p, q]. Then, when tan α ∈ [p, q], the terminal may check whether a preset proportion of the points of the first gesture's sliding trajectory are near the right side of the terminal; when tan α ∉ [p, q], the terminal may check whether a preset proportion of the points of the first gesture's sliding trajectory are near the left side of the terminal. Suppose tan α ∈ [p, q] and, as shown in (a) of Fig. 10, all the points of the first gesture's sliding trajectory 1001 are distributed in the right-side display region of the phone 100; the phone 100 can therefore determine that the first gesture was input by the user's right-side finger.
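The two checks in S901 and S902 (first the tangent of the start–end chord, then the distribution of the trajectory points) can be sketched as follows. The interval bounds, the preset proportion of 0.8, the 1080-pixel screen width, and the function name are illustrative assumptions, not values taken from this application:

```python
def classify_side(trajectory, right_interval=(0.5, 2.0), ratio=0.8, screen_width=1080):
    """Guess which hand produced a swipe from its trajectory points.

    trajectory: list of (x, y) touch points in input order.
    Returns 'right', 'left', or None when neither test is conclusive.
    """
    (x1, y1), (x2, y2) = trajectory[0], trajectory[-1]
    if x2 == x1:  # vertical chord: the tangent is undefined
        return None
    # Tangent of the angle between the start-end chord and the x-axis
    tangent = abs((y2 - y1) / (x2 - x1))
    p, q = right_interval
    if not (p <= tangent <= q):
        return None
    # Check where the trajectory points lie relative to the screen midline
    mid = screen_width / 2
    right_points = sum(1 for x, _ in trajectory if x >= mid)
    if right_points >= ratio * len(trajectory):
        return 'right'
    if (len(trajectory) - right_points) >= ratio * len(trajectory):
        return 'left'
    return None

# A swipe whose points all lie on the right half of a 1080-px-wide screen
swipe = [(900, 1500), (850, 1400), (780, 1300), (700, 1200)]
print(classify_side(swipe))  # 'right'
```

In a real implementation the interval [p, q] would itself be learned from the user's recorded left- and right-hand swipes, as the text describes.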
In this application, the terminal can determine whether the user is operating the phone with the left hand or the right hand by checking the value interval of the tangent of the angle between the x-axis or y-axis and the line connecting the start and end points of the first gesture's sliding trajectory, together with the distribution of the points of the first gesture's sliding trajectory; this avoids the higher cost of adding extra hardware.
In another possible design, the terminal may determine the start and end coordinates of the sliding trajectory of the first gesture, and then search the pre-saved left-hand and right-hand trajectory models for a first sliding trajectory whose start and end coordinates are distributed on the terminal interface in a manner matching those of the first gesture. If the terminal finds the first sliding trajectory in the left-hand trajectory model, it can determine that the first gesture was input by the user's left hand; if it finds the first sliding trajectory in the right-hand trajectory model, it can determine that the first gesture was input by the user's right hand. Specifically, in this possible design, S301a may be replaced by S1101–S1103. For example, as shown in Fig. 11, S301a of Fig. 3 may be replaced by S1101–S1103:

S1101. In response to the first gesture input by the user on the first interface, the terminal determines the start and end coordinates of the sliding trajectory of the first gesture.

For the method of determining the start and end coordinates of the sliding trajectory of the first gesture, reference may be made to the foregoing related descriptions of this application, which are not repeated here.

S1102. The terminal searches the left-hand trajectory model and the right-hand trajectory model for a first sliding trajectory whose start and end coordinates are distributed on the terminal interface in a manner matching the start and end coordinates of the sliding trajectory of the first gesture.

The left-hand trajectory model includes the coordinates of at least one left-hand sliding trajectory, and the right-hand trajectory model includes the coordinates of at least one right-hand sliding trajectory.

It should be noted that, in this application, the matching specifically means: the start coordinates of the first sliding trajectory are the same as the start coordinates of the sliding trajectory of the first gesture, and the end coordinates of the first sliding trajectory are the same as the end coordinates of the sliding trajectory of the first gesture.

Alternatively, the left-hand or right-hand trajectory model stores value ranges of the start coordinates and end coordinates of the first sliding trajectory, in which case the matching specifically means: the start coordinates of the first gesture fall within the value range of the start coordinates of the first sliding trajectory, and the end coordinates of the sliding trajectory of the first gesture fall within the value range of the end coordinates of the first sliding trajectory.

For example, the terminal of this application may maintain a trajectory database 1201 as shown in Fig. 12, which may include a left-hand trajectory model 1202 and a right-hand trajectory model 1203. The left-hand trajectory model 1202 includes the value ranges of the start and end coordinates of at least two left-hand trajectories, and the right-hand trajectory model 1203 includes the value ranges of the start and end coordinates of at least two right-hand trajectories.

As shown in Fig. 12, the left-hand trajectory model 1202 includes: for left-hand trajectory 1, the range [a1, b1] of x in the start coordinates, the range [c1, d1] of y in the start coordinates, the range [e1, f1] of x in the end coordinates, and the range [g1, h1] of y in the end coordinates; for left-hand trajectory 2, the range [a2, b2] of x in the start coordinates, the range [c2, d2] of y in the start coordinates, the range [e2, f2] of x in the end coordinates, and the range [g2, h2] of y in the end coordinates; ...; for left-hand trajectory m, the range [am, bm] of x in the start coordinates, the range [cm, dm] of y in the start coordinates, the range [em, fm] of x in the end coordinates, and the range [gm, hm] of y in the end coordinates.

As shown in Fig. 12, the right-hand trajectory model 1203 includes: for right-hand trajectory 1, the range [j1, k1] of x in the start coordinates, the range [o1, o1] of y in the start coordinates, the range [r1, s1] of x in the end coordinates, and the range [w1, v1] of y in the end coordinates; for right-hand trajectory 2, the range [j2, k2] of x in the start coordinates, the range [o2, o2] of y in the start coordinates, the range [r2, s2] of x in the end coordinates, and the range [w2, v2] of y in the end coordinates; ...; for right-hand trajectory n, the range [jn, kn] of x in the start coordinates, the range [on, on] of y in the start coordinates, the range [rn, sn] of x in the end coordinates, and the range [wn, vn] of y in the end coordinates.
S1103. When the terminal finds the first sliding trajectory in the first-side trajectory model, it determines that the first gesture was input by a finger on the user's first side, the first-side trajectory model being the left-hand trajectory model or the right-hand trajectory model.

Suppose the start coordinates of the first gesture's sliding trajectory are D(x1, y1) and the end coordinates are E(x2, y2). The terminal may search the right-hand trajectory model 1203 and left-hand trajectory model 1202 shown in Fig. 12 for a first sliding trajectory, that is, a left-hand or right-hand trajectory whose start- and end-coordinate ranges contain x1, y1, x2, and y2 respectively. For example, suppose x1 ∈ [j2, k2], y1 ∈ [o2, o2], x2 ∈ [r2, s2], and y2 ∈ [w2, v2]; that is, x1, y1, x2, and y2 fall within the start- and end-coordinate ranges of right-hand trajectory 2 in the right-hand trajectory model 1203. The terminal can then determine that the first gesture was input by the user's right-hand finger.
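The range-matching lookup of S1101–S1103 can be sketched with a small in-memory stand-in for the trajectory database of Fig. 12. The concrete coordinate ranges and function names below are invented for illustration only:

```python
# Each stored trajectory is four ranges: (x_start, y_start, x_end, y_end)
LEFT_MODEL = [((0, 200), (900, 1200), (100, 400), (400, 700))]
RIGHT_MODEL = [((880, 1080), (900, 1200), (600, 900), (400, 700))]

def in_range(v, rng):
    lo, hi = rng
    return lo <= v <= hi

def match_model(model, start, end):
    """Return True if some stored trajectory's ranges contain the gesture's
    start and end coordinates."""
    for rx1, ry1, rx2, ry2 in model:
        if (in_range(start[0], rx1) and in_range(start[1], ry1)
                and in_range(end[0], rx2) and in_range(end[1], ry2)):
            return True
    return False

def classify(start, end):
    """Decide which hand a gesture belongs to by searching both models."""
    if match_model(RIGHT_MODEL, start, end):
        return 'right'
    if match_model(LEFT_MODEL, start, end):
        return 'left'
    return None

print(classify((950, 1000), (700, 500)))  # 'right'
```

A production version would load the per-user ranges from the trajectory database (locally or from the cloud server described below) rather than hard-coding them.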
Optionally, the trajectory database may also be kept in a cloud server. Since different users have different habits of using their terminals, the cloud server may maintain one trajectory database for each user terminal. As shown in Fig. 13, the cloud server may include a trajectory database 1201 and a trajectory database 1320: database 1201 stores the left-hand and right-hand trajectory models of the phone 100, and database 1320 stores the left-hand and right-hand trajectory models of the phone 1310. Taking the phone 100 as an example, after receiving the first gesture input by the user, the phone 100 may send the start and end coordinates of the first gesture to the cloud server; the cloud server searches the left-hand and right-hand trajectory models of database 1201 for the first sliding trajectory and returns the search result to the phone 100.

The coordinate ranges of the sliding trajectories saved in the left-hand and right-hand trajectory models may be obtained by the terminal from statistics on the coordinates of the sliding trajectories of multiple gestures input by the user.

Optionally, since the user of a terminal may change, or a user's habits of using the phone may change over time, the terminal may update its left-hand and right-hand trajectory models according to the trajectory coordinates it has collected recently (for example, within one month). Alternatively, when the terminal's left-hand and right-hand trajectory models are stored in the cloud server, the terminal may report its recently collected (for example, within one month) trajectory coordinates to the cloud server so that the cloud server can update the terminal's left-hand and right-hand trajectory models.
In one possible design, the terminal may collect statistics on the coordinates of the sliding trajectories of multiple gestures input by the user without the user being aware of it. Specifically, before S301a or S1102, the method of this application may further include S1301:

S1301. In response to a fourth gesture input by the user on the terminal interface, the terminal determines that the fourth gesture was input by a finger on the user's first side and saves the coordinates of the sliding trajectory of the fourth gesture in the first-side trajectory model.

For the specific method by which "the terminal determines that the fourth gesture was input by a finger on the user's first side" in S1301, reference may be made to the detailed description of S301 in this application, which is not repeated here.

In this application, the terminal can, without the user's awareness, collect statistics on the coordinates of the sliding trajectories of multiple gestures input by the user (which reflect the user's gesture habits on the terminal's touchscreen) and save the trajectory coordinates in the first-side trajectory model, so that after subsequently receiving the first gesture it can compare the trajectory coordinates in the left-hand and right-hand trajectory models to determine whether the first gesture was input by the user's left hand or right hand.

In another possible design, the terminal may explicitly guide the user to input gestures with the left hand or with the right hand. After guiding the user to input left-hand gestures, it can collect the trajectory coordinates of the left-hand gestures the user inputs according to the guidance; after guiding the user to input right-hand gestures, it can collect the trajectory coordinates of the right-hand gestures the user inputs according to the guidance. This improves the accuracy of the trajectory coordinates saved in the first-side trajectory model.
Specifically, in this possible design, before S301a or S1102 the method of this application may further include S1401. For example, as shown in Fig. 14, before S1102 of Fig. 11 the method may further include S1401–S1403:

S1401. The terminal displays a third interface that includes first prompt information, the first prompt information prompting the user to slide on the terminal interface with a finger on the first side.

By way of example, as shown in Fig. 15, the phone 100 may display a third interface 1501 that includes the first prompt information "Following your habit of operating the phone with the right hand, please input a sliding trajectory on the touchscreen with your right hand" 1502. The first prompt information in this application includes but is not limited to the prompt 1502 shown in Fig. 15.

Optionally, taking a phone as the terminal, the phone may display the third interface after it is powered on or after its one-handed mode is enabled. One-handed mode may be divided into a left-hand mode and a right-hand mode. Left-hand mode is a display mode in which, when the user holds the phone in the left hand, the phone moves the interface elements on the touchscreen toward the left side of the phone to facilitate left-hand operation; right-hand mode is a display mode in which, when the user holds the phone in the right hand, the phone moves the interface elements toward the right side of the phone to facilitate right-hand operation.

S1402. In response to at least two third gestures input by the user on the third interface, the terminal collects statistics on the coordinates of the sliding trajectories of the at least two third gestures to obtain the coordinates of at least one first-side finger sliding trajectory, the third gestures being gestures input by a finger on the user's first side.

By way of example, as shown in (a) of Fig. 16, the phone 100 may receive a third gesture (the gesture corresponding to sliding trajectory 1601) input by the user on the third interface 1501. The terminal may receive multiple third gestures on the third interface and then collect statistics on the coordinates of their sliding trajectories, that is, classify the sliding trajectories of these third gestures, to obtain the coordinates of one or more first-side finger sliding trajectories.

S1403. The terminal saves the coordinates of the at least one first-side finger sliding trajectory in the first-side trajectory model.

In this application, the terminal may directly save the coordinates of the at least one first-side finger sliding trajectory in the first-side trajectory model. Of course, to further improve the accuracy of the trajectory coordinates saved in the first-side trajectory model, the terminal may also, in response to a third gesture input on the third interface, first determine whether the third gesture was input by the user's left-side or right-side finger, then ask the user to confirm whether the terminal's determination is correct, and save the corresponding trajectory coordinates only after the user confirms that the determination is correct. Specifically, S1401–S1403 may be replaced by S1601–S1603:
S1601. The terminal displays a third interface that includes first prompt information, the first prompt information prompting the user to slide on the terminal interface with a finger on the first side.

For the third interface in S1601, reference may be made to the detailed introduction of the third interface in S1401, which is not repeated here.

S1602. In response to a third gesture input by the user on the third interface, the terminal determines that the third gesture was input by a finger on the user's first side and displays a fourth interface, the fourth interface including prompt information for confirming whether the third gesture was input by a finger on the user's first side.

For the specific method by which "the terminal determines that the third gesture was input by a finger on the user's first side" in S1602, reference may be made to the detailed description of S301 in this application, which is not repeated here.

By way of example, as shown in (a) of Fig. 16, after the user inputs sliding trajectory 1601 on the third interface, the phone 100 may display the fourth interface 1602 shown in (b) of Fig. 16, which includes prompt information for confirming whether the third gesture was input by a finger on the user's first side. As shown in (b) of Fig. 16, the fourth interface 1602 includes the prompt "Please confirm: did you just input the sliding trajectory with your right hand?".

S1603. In response to a first input by the user on the fourth interface, the terminal saves the coordinates of the sliding trajectory of the third gesture in the first-side trajectory model, the first input indicating that the third gesture was input by a finger on the user's first side.

For example, the first input on the fourth interface may be the user tapping the "Yes" option on the fourth interface 1602 shown in (b) of Fig. 16. After the user taps the "Yes" option on the fourth interface 1602, the phone 100 may save the coordinates of the sliding trajectory of the third gesture in the first-side trajectory model.

Further, in S1601–S1603, since the terminal can judge only one third gesture's sliding trajectory at a time, after S1603 the method of this application may further include S1604–S1605:

S1604. The terminal collects statistics on the coordinates of the sliding trajectories of at least two third gestures saved within a preset time, to obtain the coordinates of at least one first-side finger sliding trajectory, the third gestures being gestures input by a finger on the user's first side.

S1605. The terminal saves the coordinates of the at least one first-side finger sliding trajectory in the first-side trajectory model.

For S1604–S1605, reference may be made to the detailed descriptions of S1402–S1403 in this application, which are not repeated here.

In this application, the terminal can not only explicitly guide the user to input gestures with the left or right hand; after the user inputs gestures as instructed, the terminal can also use a two-stage determination process to establish which side's finger input the gesture. That is, the terminal first determines whether the third gesture was input by the user's left-side or right-side finger, then asks the user to confirm whether the terminal's determination is correct, and saves the corresponding trajectory coordinates only after the user confirms that it is correct. This two-stage determination process improves the accuracy of the trajectory coordinates saved in the first-side trajectory model.
Further, the terminal may determine, from the coordinates of the finger sliding trajectories in the first-side trajectory model, the high-frequency touch region of the user's first-side finger on the terminal interface, and then determine that region as the high-frequency touch region on the first side of the terminal. The high-frequency touch region is a touch region of the terminal interface that is touched by the user at a frequency, or operated by the user a number of times, higher than a preset threshold. Specifically, S301b may be replaced by S1701. For example, as shown in Fig. 17, S301b of Fig. 11 may be replaced by S1701:

S1701. According to the coordinates of at least one first-side finger sliding trajectory in the first-side trajectory model, the terminal determines the high-frequency touch region of the user's first-side finger on the terminal interface.

For the details of the first-side trajectory model, reference may be made to the introduction of the first-side trajectory model in the foregoing embodiments, which is not repeated here.

By way of example, taking the terminal determining its left-side high-frequency touch region, the terminal may divide the left display region of its touchscreen (for example, the left display region of the phone 100 shown in (a) of Fig. 10) into at least two display regions, then count the number of times each of those display regions is operated by the user, and determine the display regions operated more times than a preset threshold as the left-side high-frequency touch region. Optionally, the terminal's left display region includes but is not limited to the left display region of the phone 100 shown in (a) of Fig. 10.
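The region-counting approach just described can be sketched by bucketing touch points into a grid and keeping the cells whose count exceeds a threshold. The cell size and threshold below are illustrative assumptions, not values from this application:

```python
from collections import Counter

def high_freq_cells(points, cell=180, threshold=3):
    """Divide the screen into cell x cell squares, count how many recorded
    touch points fall into each square, and keep the squares whose count
    exceeds the threshold."""
    counts = Counter((x // cell, y // cell) for x, y in points)
    return {c for c, n in counts.items() if n > threshold}

# Frequent touches near the lower left, rare touches near the upper right
touches = [(100, 1700)] * 5 + [(900, 300)] * 2
print(high_freq_cells(touches))  # {(0, 9)}
```

The union of the surviving cells (or a bounding rectangle around them) would then serve as the high-frequency touch region for that side.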
Alternatively, the terminal may analyze the coordinate distribution of the left-hand trajectories in the left-hand trajectory model 1202 shown in Fig. 12 and determine the regions where the left-hand trajectories are densely distributed as the left-side high-frequency touch region, and analyze the coordinate distribution of the right-hand trajectories in the right-hand trajectory model 1203 shown in Fig. 12 and determine the regions where the right-hand trajectories are densely distributed as the right-side high-frequency touch region.

Or, taking the terminal determining its right-side high-frequency touch region, the terminal may select two high-frequency right-hand trajectories from the right-hand trajectory model 1203 shown in Fig. 12; these two high-frequency trajectories are, among the right-hand trajectories the user has triggered on the terminal's touchscreen, the two ranked first when sorted in descending order of the number or frequency of times they are triggered. The terminal may then determine the overlap region of the two high-frequency right-hand trajectories, and finally determine that overlap region as the right-side high-frequency touch region of the terminal.

For example, the sliding trajectory 1801 shown in (a) of Fig. 18 and the sliding trajectory 1802 shown in (b) of Fig. 18 are two high-frequency right-hand trajectories of the phone 100; the terminal may determine the overlap region 1803 of trajectories 1801 and 1802 shown in (c) of Fig. 18 as the right-side high-frequency touch region of the terminal.

Or again, still taking the sliding trajectory 1801 shown in (a) of Fig. 18 and the sliding trajectory 1802 shown in (b) of Fig. 18 as the two high-frequency right-hand trajectories of the phone 100: the terminal may determine the line OC passing through the origin O of the sector shown in (a) of Fig. 19 and the point C((x1 + x2)/2, (y1 + y2)/2) (the midpoint of the line connecting the start point A(x1, y1) and the end point B(x2, y2) of trajectory 1801), and its intersection D(x3, y3) with trajectory 1801; and determine the line OG passing through the origin O of the sector shown in (b) of Fig. 19 and the point G((x4 + x5)/2, (y4 + y5)/2) (the midpoint of the line connecting the start point E(x4, y4) and the end point F(x5, y5) of trajectory 1802), and its intersection H(x6, y6) with trajectory 1802. Then, as shown in (c) of Fig. 19, the terminal may determine the points D(x3, y3) and H(x6, y6), together with the overlap region 1901 of trajectories 1801 and 1802, as the right-side high-frequency touch region of the terminal.

The high-frequency touch region in this application may also be a display region of fixed shape that contains the aforementioned overlap region. For example, as shown in (a) of Fig. 20, the high-frequency touch region may be a rectangular display region 2001 containing the overlap region 1803; or, as shown in (b) of Fig. 20, the high-frequency touch region may be a circular display region 2002 containing the overlap region 1803.
The display method provided by this application can recognize the gestures the user inputs on the touchscreen to determine whether the user is operating the phone with the left hand or the right hand, thereby avoiding the higher cost of adding extra hardware. Moreover, when the terminal recognizes that the user is operating it with a finger on the first side (for example, the left hand or the right hand), it can display, near the first side of the terminal, a touch region usable to operate the terminal interface, so that the user can operate all the content on the terminal interface from that region. Thus, without degrading the user's visual and operating experience, the user can operate regions of the terminal interface that the first-side finger cannot reach.
This application provides a display method for a terminal interface; as shown in Fig. 21, the method includes S2101–S2102:

S2101. In response to a first gesture input by a user on a first interface, the terminal determines a high-frequency touch region on a first side of the terminal, the first gesture being a gesture input by a finger on the user's first side, the high-frequency touch region being a touch region of the terminal interface that is operated by the user at a frequency or a number of times higher than a first threshold, and the first interface including at least two application icons.

For the method of "the terminal determines the high-frequency touch region on the first side of the terminal in response to the first gesture input by the user on the first interface" in S2101, reference may be made to the detailed introduction of S301 in this application, which is not repeated here.

S2102. The terminal displays, in the high-frequency touch region on the first side, at least one high-frequency application icon, the at least one high-frequency application icon being an application icon, among the at least two application icons, that is operated by the user at a frequency or a number of times higher than a second threshold.

For example, suppose the "Alipay" application icon 2201 and the "WeChat" application icon 2202 shown in (a) of Fig. 22A are the high-frequency application icons of the phone 100; that is, the "Alipay" icon 2201 and the "WeChat" icon 2202 are application icons operated by the user more times than a preset threshold. Suppose also that the display region 2001 shown in (a) of Fig. 22A is the high-frequency touch region on the first side. Then, as shown in (b) of Fig. 22A, the phone 100 may display the high-frequency application icons, the "Alipay" icon 2201 and the "WeChat" icon 2202, in the high-frequency touch region 2001.
This application provides a display method for a terminal interface: when the terminal determines that the user is operating it with a finger on the first side, it can display the terminal's high-frequency application icons in the high-frequency touch region on that side, so that the user can operate those icons more conveniently and comfortably, improving the user experience.

Further, when the terminal has many high-frequency application icons, the high-frequency touch region on the first side may not be large enough to display all of them.

For this case, in one possible implementation, the terminal displays, in the high-frequency touch region on the first side, the top-N high-frequency application icons ranked in descending order of the frequency or number of times they are operated by the user, where N is the number of application icons that can be displayed in the high-frequency touch region on the first side.
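Selecting the top-N icons by operation count, as in the implementation just described, can be sketched as follows (the icon names and counts are made up for illustration):

```python
def top_icons(usage, n):
    """Pick the n most frequently operated app icons, most-used first.

    usage: dict mapping icon name -> number of times the user operated it.
    """
    ranked = sorted(usage.items(), key=lambda kv: kv[1], reverse=True)
    return [app for app, _ in ranked[:n]]

usage = {'Alipay': 42, 'WeChat': 57, 'Photos': 31, 'Clock': 4}
print(top_icons(usage, 3))  # ['WeChat', 'Alipay', 'Photos']
```

Here n would be set to the number of icon slots available in the high-frequency touch region on the first side.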
In another possible implementation, the terminal may display, in the high-frequency touch region on the first side, a folder icon containing all the high-frequency application icons. Specifically, as shown in Fig. 22B, S2102 of Fig. 21 may be replaced by S2102a:

S2102a. The terminal displays, in the high-frequency touch region on the first side, a folder icon containing at least one high-frequency application icon.

For example, suppose the "Alipay" application icon 2201, the "WeChat" application icon 2202, and the "Photos" application icon 2203 shown in (a) of Fig. 22C are the high-frequency application icons of the phone 100; that is, these icons are application icons operated by the user more times than a preset threshold. Suppose also that the display region 2001 shown in (a) of Fig. 22C is the high-frequency touch region on the first side. Then, as shown in (b) of Fig. 22C, the phone 100 may display, in the high-frequency touch region 2001, the folder icon 2204 containing the "Alipay" icon 2201, the "WeChat" icon 2202, and the "Photos" icon 2203.

In this application, the terminal may display, in the high-frequency touch region on the first side, a folder icon containing all the high-frequency application icons. This solves the problem that the terminal has many high-frequency application icons but the high-frequency touch region on the first side is not large enough to display all of them; and displaying such a folder icon in that region makes it convenient for the user to operate all the high-frequency application icons on the terminal.

Further, after S2102a, the method of this application may further include S2102b: in response to the user's input on the folder icon, the terminal displays, in the high-frequency touch region on the first side, a folder expansion window corresponding to the folder icon, the at least one high-frequency application icon being displayed in the folder expansion window.

By way of example, after the user taps the folder icon 2204 shown in (a) of Fig. 22D, the phone 100 may, in response to the tap on the folder icon 2204, display the folder expansion window 2205 corresponding to folder icon 2204 in the right-side high-frequency touch region 2001. As shown in (b) of Fig. 22D, the folder expansion window 2205 includes the "Alipay" icon 2201, the "WeChat" icon 2202, and the "Photos" icon.

In this application, the terminal may also, in response to the user's input on the folder icon containing the high-frequency application icons, display the corresponding folder expansion window in the high-frequency touch region on the first side, making it convenient for the user to operate all the high-frequency application icons on the terminal.
This application provides a terminal interface display method. As shown in FIG. 23, the method includes S2301 and S2302:
S2301. In response to a first gesture entered by the user on a first interface, the terminal determines that the first gesture is a gesture entered by a finger of the user's first side, where the first interface includes a first interface element, the first interface element includes a navigation bar icon and/or a dock bar icon, and the finger of the first side is a finger of the user's left hand or right hand.
For the method of "determining, by the terminal in response to the first gesture entered by the user on the first interface, that the first gesture is a gesture entered by a finger of the user's first side" in S2301, refer to the detailed description of S301 in this application; details are not repeated here.
The navigation bar (Navigation Bar) in this application is the shortcut button bar at the bottom of the phone screen, generally presented as virtual keys at the very bottom of the screen. By default it contains three buttons: the Back key, which returns to the previous interface; the Home key, which returns to the home screen; and the Recent key, which shows recently used applications. As shown in (b) of FIG. 24, the navigation bar 2402 includes a Back key 2403, a Home key 2404, and a Recent key 2405.
The dock bar (Dock Bar) in this application is a part of an interactive interface (an Activity, that is, an application region used to display application icons) that fills the entire window of the phone screen or floats above other windows. Visually, the dock bar is below the Activity and above the navigation bar. The dock bar and the navigation bar belong to two different window layers, with the dock bar in the layer below the navigation bar.
As shown in (a) of FIG. 24, the dock bar 2401 of the mobile phone 100 includes the following dock bar icons: the "WeChat" application icon, the "Dialer" icon, the "Contacts" icon, and the "Messages" icon.
S2302. The terminal moves the first interface element to a display region near the first side of the terminal for display.
For example, as shown in (a) of FIG. 24, when the user operates the mobile phone 100 with the right hand, the phone may determine that the entered gesture comes from a finger of the user's right hand, and may then move the first interface element (for example, the dock bar icons) to a display region near the right side of the terminal, that is, display the interface shown in (b) of FIG. 24.
Alternatively, the first interface element may include both the dock bar icons and the navigation bar icons. In that case, as shown in (a) of FIG. 24, when the user operates the mobile phone 100 with the right hand, the phone may determine that the entered gesture comes from a finger of the user's right hand, and may then move both the dock bar icons and the navigation bar icons to a display region near the right side of the terminal, that is, display the interface shown in FIG. 25.
可以理解,用户在使用上述终端的过程中,终端可能会显示提示窗、弹窗按钮和悬浮按钮等显示。一般而言,终端可以将提示窗、弹窗按钮和悬浮按钮等居中显示在终端界面上;但是,上述按钮或者窗口的居中显示可能并不方便用户操作。为了便于用户操作这些按钮或者窗口,本申请中的终端可以在确定用户使用第一侧的手指操作终端后,在靠近该终端的第一侧的显示区域显示待显示的按钮或者窗口。或者,终端还可以在确定用户使用第一侧的手指操作终端后,在第一侧的高频触控区域显示待显示的按钮或者窗口。
例如,以16中的(b)所示的提示窗1602为例,当手机100确定用户使用右手操 作终端后,则可以显示图26所示的显示界面。在图26所示的显示界面中,提示窗1602靠近手机100的右下方显示。
进一步的,用户手持上述终端时,可能会因为用户的多个手指同时接触到该终端的触摸屏,造成对触摸屏的误触。
例如,如图27中的(a)所示,用户右手持手机100,当用户的大拇指2704点击触摸屏上的“设置”图标2703,控制手机100显示设置界面时,用户右手的无名指2702可能会接触到触摸屏上的“照片”图标2701,即用户的右手无名指误触手机100的触摸屏。在这种情况下,采用本申请的方法,终端可以识别到用户使用右手操作手机,此时,如果终端同时检测到用户对左侧图标(触摸屏上靠近手机左侧的图标)的输入和用户对右侧图标(触摸屏上靠近手机右侧的图标)的输入时,该终端可以忽略或者屏蔽用户对左侧图标的输入,仅响应用户对右侧图标的输入。例如,手机100可以忽略用户右手无名指对“照片”图标2701的误触,仅响应用户大拇指2704对“设置”图标2703的点击操作,显示图27中的(b)所示的设置界面。
又例如,如图28中的(a)所示,用户右手持手机100,当用户的大拇指2802点击触摸屏上的“设置”图标2801,控制手机100显示设置界面时,大拇指2802的根部2804可能会接触到触摸屏上的“短消息”图标2803,即大拇指2802的根部2804误触手机100的触摸屏。在这种情况下,采用本申请的方法,终端可以识别到用户使用右手操作手机,此时,如果终端同时检测到用户对触摸屏的Activity的上半区域(即Activity中除了Launcher之外的区域)的输入,以及用户对Launcher或者导航栏的输入时,该终端可以忽略或者屏蔽用户对Launcher或者导航栏的输入,仅响应用户对Activity的上半区域的输入。例如,手机100可以忽略用户大拇指2802的根部2804对“短消息”图标2803的误触,仅响应用户大拇指2802对“设置”图标2801的点击操作,显示图28中的(b)所示的设置界面。
其中,本申请的终端中可以预先保存有左手误触模型和右手误触模型,该左手误触模型中包括至少一个左手防误触规则,该左手防误触规则用于指示当用户左手持手机时,如果同时检测到用户在触摸屏的不同区域的至少两个输入,该终端应该如何选择性响应该至少两个输入。同样的,该右手误触模型中包括至少一个右手防误触规则,该右手防误触规则用于指示当用户右手持手机时,如果同时检测到用户在触摸屏的不同区域的至少两个输入,该终端应该如何选择性响应至少两个输入。
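The anti-mistouch rules could be tabulated as a mapping from the holding hand and the set of simultaneously touched regions to the region whose input is honored. The region names and rule encoding below are illustrative assumptions, not the models defined in this application:

```python
# (holding hand, regions touched at the same time) -> region to respond to
ANTI_MISTOUCH_RULES = {
    ("right", frozenset({"left_icons", "right_icons"})): "right_icons",
    ("left", frozenset({"left_icons", "right_icons"})): "left_icons",
    ("right", frozenset({"activity_upper", "dock_or_navbar"})): "activity_upper",
    ("left", frozenset({"activity_upper", "dock_or_navbar"})): "activity_upper",
}

def respond_to(hand, touches):
    """Filter simultaneous inputs according to the side-specific rules.

    `touches` is a list of (region, target) pairs detected at the same time.
    If a rule matches the touched regions, only inputs in the honored region
    are kept; otherwise every input is responded to."""
    regions = frozenset(region for region, _ in touches)
    honored = ANTI_MISTOUCH_RULES.get((hand, regions))
    if honored is None:
        return touches
    return [touch for touch in touches if touch[0] == honored]
```

In the FIG. 27 scenario, a right-hand hold with simultaneous touches on "Photos" (left side) and "Settings" (right side) keeps only the "Settings" tap.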
With the terminal interface display method provided in this application, after the terminal determines that the user is operating it with a finger of the first side (the left or right hand), if the terminal simultaneously detects at least two inputs in different regions of the touchscreen, it can selectively respond to them as indicated by the anti-mistouch rules of the first-side mistouch model (that is, the left-hand or right-hand mistouch model), preventing the terminal from responding to an accidental touch and displaying a terminal interface that does not correspond to the user's operation.
It can be understood that, to implement the foregoing functions, the terminal and the like include corresponding hardware structures and/or software modules for performing each function. Those skilled in the art should readily appreciate that, in combination with the example units and algorithm steps described in the embodiments disclosed herein, the embodiments of the present invention can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered beyond the scope of the embodiments of the present invention.
The embodiments of this application may divide the terminal and the like into functional modules according to the foregoing method examples. For example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present invention is illustrative and is merely a logical functional division; other divisions are possible in actual implementation.
When each functional module is divided by function, FIG. 29 shows a possible schematic structural diagram of the terminal involved in the foregoing embodiments. The terminal 2900 includes an input unit 2901, a determining unit 2902, and a display unit 2903.
The input unit 2901 is configured to support the terminal in performing, in the foregoing method embodiments: receiving the first gesture described in S301, S301a, S2101, and S2301; receiving the second gesture described in S303; receiving the third gesture described in S1402 and S1602; receiving the first input described in S1603; S2302; and/or other processes of the techniques described herein. The determining unit 2902 is configured to support the terminal in performing: determining the high-frequency touch region in S301, S301b, S2101, and S2301; determining, in S301a and S1602, that the first gesture is a gesture entered by a finger of the user's first side; S901-S902; S1101-S1103; S1301; S1701; and/or other processes of the techniques described herein. The display unit 2903 is configured to support the terminal in performing: displaying the first interface in S301 and S301a; S302; displaying the second interface in S303; S1401; S1601; displaying the fourth interface described in S1602; S2102; S2102a; and/or other processes of the techniques described herein.
Further, the terminal 2900 may also include a statistics unit and a storage unit. The statistics unit is configured to support the terminal in performing: collecting the coordinates of the sliding trajectory in S1402; S1604; and/or other processes of the techniques described herein. The storage unit is configured to support the terminal in performing S1403, S1605, and/or other processes of the techniques described herein.
All related content of the steps involved in the foregoing method embodiments may be cited in the functional descriptions of the corresponding functional modules, and is not repeated here.
Of course, the terminal 2900 includes but is not limited to the units and modules listed above. For example, it may further include a communication unit configured to communicate with other terminals. The functions that the foregoing functional units can implement include but are not limited to the functions corresponding to the method steps described in the foregoing examples; for detailed descriptions of other units of the terminal 2900, refer to the detailed descriptions of the corresponding method steps, which are not repeated here.
When integrated units are used, the determining unit 2902, the statistics unit, and the like may be integrated into one processing module; the communication unit may be an RF circuit, a WiFi module, or a Bluetooth module of the terminal; the storage unit may be a storage module of the terminal; and the display unit may be a display module, such as a touchscreen.
FIG. 30 shows a possible schematic structural diagram of the terminal involved in the foregoing embodiments. The terminal 3000 includes a processing module 3001, a storage module 3002, a display module 3003, and a communication module 3004. The processing module 3001 is configured to control and manage the actions of the terminal. The display module 3003 is configured to display images generated by the processing module 3001. The storage module 3002 is configured to store program code and data of the terminal. The communication module 3004 is configured to communicate with other terminals, for example, to conduct voice communication with other terminals and to receive avatars from or send avatars to other terminals.
The processing module 3001 may be a processor or a controller, for example a Central Processing Unit (CPU), a general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various example logical blocks, modules, and circuits described in connection with the disclosure of the present invention. The processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 3004 may be a transceiver, a transceiver circuit, a communication interface, or the like. The storage module 3002 may be a memory.
When the processing module 3001 is a processor (the processor 101 shown in FIG. 2), the communication module 3004 is an RF circuit (the radio frequency circuit 102 shown in FIG. 2), the storage module 3002 is a memory (the memory 103 shown in FIG. 2), and the display module 3003 is a touchscreen (including the touchpad 104-1 and the display 104-2 shown in FIG. 2), the terminal provided in this application may be the mobile phone 100 shown in FIG. 2. The communication module 3004 may include not only an RF circuit but also a WiFi module and a Bluetooth module; communication modules such as the RF circuit, the WiFi module, and the Bluetooth module may be collectively referred to as a communication interface. The processor, the communication interface, the touchscreen, and the memory may be coupled together through a bus.
With reference to any one of FIG. 4, FIG. 6A, FIG. 6B, FIG. 8, and FIG. 15, an embodiment of this application further provides a graphical user interface (GUI) stored in a terminal. The terminal includes a touchscreen, a memory, a processor, and a communication interface, and the processor is configured to execute one or more computer programs stored in the memory. The graphical user interface includes: a first GUI; in response to a first gesture entered on the first GUI, a second GUI is displayed, where a high-frequency touch region on a first side of the second GUI includes a first touch panel, the first touch panel is used to operate the first GUI in response to a gesture entered by the user, the first gesture is a gesture entered by a finger of the user's first side, and the high-frequency touch region is a touch region of the second GUI that the user operates at a frequency or a number of times higher than a first threshold; in response to a second gesture entered on the first touch panel of the second GUI, a third GUI is displayed, where the third GUI includes an interface element that the terminal displays in response to a third gesture entered by the user at the corresponding position of the first GUI.
Further, with reference to FIG. 16, the foregoing GUI further includes a fourth GUI that includes first prompt information, the first prompt information being used to prompt the user to slide on the fourth GUI with a finger of the first side.
With reference to FIG. 22A, an embodiment of this application further provides a graphical user interface (GUI) stored in a terminal. The terminal includes a touchscreen, a memory, a processor, and a communication interface, and the processor is configured to execute one or more computer programs stored in the memory. The graphical user interface includes: a first GUI, where the first GUI includes at least two application icons; in response to a first gesture entered on the first GUI, a second GUI is displayed, where a high-frequency touch region on a first side of the second GUI includes at least one high-frequency application icon, the first gesture is a gesture entered by a finger of the user's first side, the high-frequency touch region is a touch region of the second GUI that the user operates at a frequency or a number of times higher than a first threshold, and the at least one high-frequency application icon is an application icon, among the at least two application icons, that the user operates at a frequency or a number of times higher than a second threshold.
Further, with reference to FIG. 22C, the second GUI includes a folder icon, and the folder icon includes the at least one high-frequency application icon.
Further, with reference to FIG. 22D, the foregoing GUI further includes: in response to an input on the folder icon in the second GUI, a third GUI is displayed, where the third GUI includes a folder expansion window corresponding to the folder icon, and the at least one high-frequency application icon is displayed in the folder expansion window.
With reference to FIG. 24 or FIG. 25, an embodiment of this application further provides a graphical user interface (GUI) stored in a terminal. The terminal includes a touchscreen, a memory, a processor, and a communication interface, and the processor is configured to execute one or more computer programs stored in the memory. The graphical user interface includes: a first GUI, where the first GUI includes a first interface element, and the first interface element includes a navigation bar icon and/or a dock bar icon; in response to a first gesture entered on the first GUI, a second GUI is displayed, where a display region on a first side of the second GUI includes the first interface element.
This application further provides a computer storage medium storing computer program code. When the foregoing processor executes the computer program code, the terminal performs the relevant method steps of any one of FIG. 3, FIG. 9, FIG. 11, FIG. 14, FIG. 17, FIG. 21, FIG. 22B, and FIG. 23 to implement the terminal interface display method in the foregoing embodiments.
This application further provides a computer program product that, when run on a computer, causes the computer to perform the relevant method steps of any one of FIG. 3, FIG. 9, FIG. 11, FIG. 14, FIG. 17, FIG. 21, FIG. 22B, and FIG. 23 to implement the terminal interface display method in the foregoing embodiments.
The terminal 2900, the terminal 3000, the computer storage medium, and the computer program product provided in this application are all used to perform the corresponding methods provided above; therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above, which are not repeated here.
From the description of the foregoing implementations, a person skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the foregoing functional modules is used as an example. In actual applications, the foregoing functions may be allocated to different functional modules as required, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or some of the functions described above. For the specific working processes of the system, apparatus, and units described above, refer to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is only a logical functional division, and other divisions are possible in actual implementation; multiple units or components may be combined or integrated into another system; or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected as required to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of this application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or some of the steps of the methods described in the embodiments of this application. The foregoing storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement within the technical scope disclosed in this application shall be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (37)

  1. A terminal interface display method, comprising:
    determining, by a terminal in response to a first gesture entered by a user on a first interface, a high-frequency touch region on a first side of the terminal, wherein the first gesture is a gesture entered by a finger of the user's first side, the high-frequency touch region is a touch region of the terminal interface that the user operates at a frequency or a number of times higher than a first threshold, and the first interface comprises at least two application icons;
    displaying, by the terminal in the high-frequency touch region on the first side, at least one high-frequency application icon, wherein the at least one high-frequency application icon is an application icon, among the at least two application icons, that the user operates at a frequency or a number of times higher than a second threshold;
    wherein when the finger of the first side is a left-hand finger, the first side of the terminal is the left side of the terminal, and when the finger of the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.
  2. The method according to claim 1, wherein the displaying, by the terminal in the high-frequency touch region on the first side, at least one high-frequency application icon comprises:
    displaying, by the terminal in the high-frequency touch region on the first side, a folder icon comprising the at least one high-frequency application icon.
  3. The method according to claim 2, wherein after the displaying, by the terminal in the high-frequency touch region on the first side, a folder icon comprising the at least one high-frequency application icon, the method further comprises:
    displaying, by the terminal in response to the user's input on the folder icon, in the high-frequency touch region on the first side, a folder expansion window corresponding to the folder icon, wherein the at least one high-frequency application icon is displayed in the folder expansion window.
  4. The method according to any one of claims 1 to 3, wherein the determining a high-frequency touch region on a first side of the terminal comprises:
    determining, by the terminal, the high-frequency touch region on the first side of the terminal according to coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model;
    wherein the first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model, the right-hand trajectory model comprises coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model comprises coordinates of at least one left-hand sliding trajectory.
  5. A terminal interface display method, comprising:
    determining, by a terminal in response to a first gesture entered by a user on a first interface, a high-frequency touch region on a first side of the terminal, wherein the first gesture is a gesture entered by a finger of the user's first side, and the high-frequency touch region is a touch region of the terminal interface that the user operates at a frequency or a number of times higher than a first threshold;
    displaying, by the terminal, a first touch panel in the high-frequency touch region on the first side, wherein the first touch panel is used to operate the first interface in response to a gesture entered by the user;
    displaying, by the terminal in response to a second gesture entered by the user on the first touch panel, a second interface, wherein the second interface comprises an interface element that the terminal displays in response to a third gesture entered by the user at a corresponding position of the first interface;
    wherein when the finger of the first side is a left-hand finger, the first side of the terminal is the left side of the terminal, and when the finger of the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.
  6. The method according to claim 5, wherein the determining a high-frequency touch region on a first side of the terminal comprises:
    determining, by the terminal, the high-frequency touch region on the first side of the terminal according to coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model;
    wherein the first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model, the right-hand trajectory model comprises coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model comprises coordinates of at least one left-hand sliding trajectory.
  7. The method according to claim 5, wherein the determining a high-frequency touch region on a first side of the terminal comprises:
    calculating, by the terminal, a tangent of the angle between the line connecting the start point and the end point of the sliding trajectory of the first gesture and the x axis or the y axis of the coordinate system;
    determining, by the terminal, the high-frequency touch region on the first side when the tangent falls within a value interval corresponding to the first side of the terminal and a preset proportion of the points in the sliding trajectory of the first gesture are close to the first side of the terminal.
  8. The method according to claim 5 or 6, wherein the determining, by the terminal, a high-frequency touch region on a first side of the terminal comprises:
    determining, by the terminal, start-point coordinates and end-point coordinates of the sliding trajectory of the first gesture;
    searching, by the terminal, a left-hand trajectory model and a right-hand trajectory model for a first sliding trajectory, wherein the distribution on the terminal interface of the start-point and end-point coordinates of the first sliding trajectory matches the start-point and end-point coordinates of the sliding trajectory of the first gesture, the left-hand trajectory model comprises coordinates of at least one left-hand sliding trajectory, and the right-hand trajectory model comprises coordinates of at least one right-hand sliding trajectory;
    determining the high-frequency touch region on the first side of the terminal when the terminal finds the first sliding trajectory in a first-side trajectory model, wherein the first-side trajectory model is the left-hand trajectory model or the right-hand trajectory model.
  9. The method according to claim 6 or 8, wherein before the determining, by the terminal in response to the first gesture entered by the user on the first interface, a high-frequency touch region on the first side of the terminal, the method further comprises:
    displaying, by the terminal, a third interface comprising first prompt information, wherein the first prompt information is used to prompt the user to slide on the terminal interface with a finger of the first side;
    collecting, by the terminal in response to at least two third gestures entered by the user on the third interface, coordinates of the sliding trajectories of the at least two third gestures to obtain coordinates of at least one first-side finger sliding trajectory, wherein the third gestures are gestures entered by a finger of the user's first side;
    storing, by the terminal, the coordinates of the at least one first-side finger sliding trajectory in the first-side trajectory model.
  10. The method according to claim 6 or 8, wherein before the determining, by the terminal in response to the first gesture entered by the user on the first interface, that the first gesture is a gesture entered by the user's left hand or a gesture entered by the user's right hand, the method further comprises:
    determining, by the terminal in response to a fourth gesture entered by the user on the terminal interface, that the fourth gesture is a gesture entered by a finger of the user's first side, and storing coordinates of the sliding trajectory of the fourth gesture in a first-side trajectory model.
  11. A terminal, comprising:
    an input unit, configured to receive a first gesture entered by a user on a first interface, wherein the first gesture is a gesture entered by a finger of the user's first side;
    a determining unit, configured to determine, in response to the first gesture received by the input unit on the first interface, a high-frequency touch region on a first side of the terminal, wherein the high-frequency touch region is a touch region of the terminal interface that the user operates at a frequency or a number of times higher than a first threshold, and the first interface comprises at least two application icons;
    a display unit, configured to display, in the high-frequency touch region on the first side determined by the determining unit, at least one high-frequency application icon, wherein the at least one high-frequency application icon is an application icon, among the at least two application icons, that the user operates at a frequency or a number of times higher than a second threshold;
    wherein when the finger of the first side is a left-hand finger, the first side of the terminal is the left side of the terminal, and when the finger of the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.
  12. The terminal according to claim 11, wherein the display unit is specifically configured to:
    display, in the high-frequency touch region on the first side, a folder icon comprising the at least one high-frequency application icon.
  13. The terminal according to claim 12, wherein the input unit is further configured to receive the user's input on the folder icon after the display unit displays, in the high-frequency touch region on the first side, the folder icon comprising the at least one high-frequency application icon;
    the display unit is further configured to display, in response to the user's input on the folder icon, in the high-frequency touch region on the first side, a folder expansion window corresponding to the folder icon, wherein the at least one high-frequency application icon is displayed in the folder expansion window.
  14. The terminal according to any one of claims 11 to 13, wherein the determining unit is specifically configured to:
    determine the high-frequency touch region on the first side of the terminal according to coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model;
    wherein the first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model, the right-hand trajectory model comprises coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model comprises coordinates of at least one left-hand sliding trajectory.
  15. A terminal, comprising:
    an input unit, configured to receive a first gesture entered by a user on a first interface;
    a determining unit, configured to determine, in response to the first gesture received by the input unit on the first interface, a high-frequency touch region on a first side of the terminal, wherein the first gesture is a gesture entered by a finger of the user's first side, and the high-frequency touch region is a touch region of the terminal interface that the user operates at a frequency or a number of times higher than a first threshold;
    a display unit, configured to display a first touch panel in the high-frequency touch region on the first side, wherein the first touch panel is used to operate the first interface in response to a gesture entered by the user;
    wherein the input unit is further configured to receive a second gesture entered by the user on the first touch panel displayed by the display unit;
    the display unit is further configured to display, in response to the second gesture received by the input unit on the first touch panel, a second interface, wherein the second interface comprises an interface element that the terminal displays in response to a third gesture entered by the user at a corresponding position of the first interface;
    wherein when the finger of the first side is a left-hand finger, the first side of the terminal is the left side of the terminal, and when the finger of the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.
  16. The terminal according to claim 15, wherein the determining unit is specifically configured to:
    determine the high-frequency touch region on the first side of the terminal according to coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model;
    wherein the first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model, the right-hand trajectory model comprises coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model comprises coordinates of at least one left-hand sliding trajectory.
  17. The terminal according to claim 15, wherein the determining unit is specifically configured to:
    calculate a tangent of the angle between the line connecting the start point and the end point of the sliding trajectory of the first gesture and the x axis or the y axis of the coordinate system;
    determine the high-frequency touch region on the first side when the tangent falls within a value interval corresponding to the first side of the terminal and a preset proportion of the points in the sliding trajectory of the first gesture are close to the first side of the terminal.
  18. The terminal according to claim 15 or 16, wherein the determining unit is specifically configured to:
    determine start-point coordinates and end-point coordinates of the sliding trajectory of the first gesture;
    search a left-hand trajectory model and a right-hand trajectory model for a first sliding trajectory, wherein the distribution on the terminal interface of the start-point and end-point coordinates of the first sliding trajectory matches the start-point and end-point coordinates of the sliding trajectory of the first gesture, the left-hand trajectory model comprises coordinates of at least one left-hand sliding trajectory, and the right-hand trajectory model comprises coordinates of at least one right-hand sliding trajectory;
    determine the high-frequency touch region on the first side of the terminal when the first sliding trajectory is found in a first-side trajectory model, wherein the first-side trajectory model is the left-hand trajectory model or the right-hand trajectory model.
  19. The terminal according to claim 16 or 18, wherein the display unit is further configured to display, before the determining unit determines the high-frequency touch region on the first side of the terminal, a third interface comprising first prompt information, wherein the first prompt information is used to prompt the user to slide on the terminal interface with a finger of the first side;
    the input unit is further configured to receive at least two third gestures entered by the user on the third interface;
    the terminal further comprises a statistics unit and a storage unit;
    the statistics unit is configured to collect, in response to the at least two third gestures received by the input unit on the third interface, coordinates of the sliding trajectories of the at least two third gestures to obtain coordinates of at least one first-side finger sliding trajectory, wherein the third gestures are gestures entered by a finger of the user's first side;
    the storage unit is configured to store the coordinates of the at least one first-side finger sliding trajectory in the first-side trajectory model.
  20. The terminal according to claim 16 or 18, wherein the input unit is further configured to receive, before the determining unit determines that the first gesture is a gesture entered by the user's left hand or a gesture entered by the user's right hand, a fourth gesture entered by the user on the terminal interface;
    the determining unit is further configured to determine, in response to the fourth gesture received by the input unit on the terminal interface, that the fourth gesture is a gesture entered by a finger of the user's first side;
    the terminal further comprises a storage unit;
    the storage unit is configured to store coordinates of the sliding trajectory of the fourth gesture in a first-side trajectory model.
  21. A terminal, comprising a processor, a memory, and a touchscreen, wherein the memory and the touchscreen are coupled to the processor, the memory is configured to store computer program code, and the computer program code comprises computer instructions which, when executed by the processor, cause the terminal to perform the following operations:
    the touchscreen is configured to display a first interface, wherein the first interface comprises at least two application icons;
    the processor is configured to determine, in response to a first gesture entered by the user on the first interface displayed on the touchscreen, a high-frequency touch region on a first side of the terminal, wherein the first gesture is a gesture entered by a finger of the user's first side, and the high-frequency touch region is a touch region of the terminal interface that the user operates at a frequency or a number of times higher than a first threshold;
    the touchscreen is further configured to display, in the high-frequency touch region on the first side determined by the processor, at least one high-frequency application icon, wherein the at least one high-frequency application icon is an application icon, among the at least two application icons, that the user operates at a frequency or a number of times higher than a second threshold;
    wherein when the finger of the first side is a left-hand finger, the first side of the terminal is the left side of the terminal, and when the finger of the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.
  22. The terminal according to claim 21, wherein the touchscreen is specifically configured to:
    display, in the high-frequency touch region on the first side, a folder icon comprising the at least one high-frequency application icon.
  23. The terminal according to claim 22, wherein the processor is further configured to receive, after the folder icon comprising the at least one high-frequency application icon is displayed in the high-frequency touch region on the first side, the user's input on the folder icon displayed on the touchscreen;
    the touchscreen is further configured to display, in response to the user's input on the folder icon, in the high-frequency touch region on the first side, a folder expansion window corresponding to the folder icon, wherein the at least one high-frequency application icon is displayed in the folder expansion window.
  24. The terminal according to any one of claims 21 to 23, wherein the processor is specifically configured to:
    determine the high-frequency touch region on the first side of the terminal according to coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model;
    wherein the first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model, the right-hand trajectory model comprises coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model comprises coordinates of at least one left-hand sliding trajectory.
  25. A terminal, comprising a processor, a memory, and a touchscreen, wherein the memory and the touchscreen are coupled to the processor, the memory is configured to store computer program code, and the computer program code comprises computer instructions which, when executed by the processor, cause the terminal to perform the following operations:
    the touchscreen is configured to display a first interface;
    the processor is configured to determine, in response to a first gesture entered by the user on the first interface displayed on the touchscreen, a high-frequency touch region on a first side of the terminal, wherein the first gesture is a gesture entered by a finger of the user's first side, and the high-frequency touch region is a touch region of the terminal interface that the user operates at a frequency or a number of times higher than a first threshold;
    the touchscreen is further configured to display a first touch panel in the high-frequency touch region on the first side determined by the processor, wherein the first touch panel is used to operate the first interface in response to a gesture entered by the user;
    the processor is further configured to receive a second gesture entered by the user on the first touch panel displayed on the touchscreen;
    the touchscreen is further configured to display, in response to the second gesture entered by the user on the first touch panel, a second interface, wherein the second interface comprises an interface element that the terminal displays in response to a third gesture entered by the user at a corresponding position of the first interface;
    wherein when the finger of the first side is a left-hand finger, the first side of the terminal is the left side of the terminal, and when the finger of the first side is a right-hand finger, the first side of the terminal is the right side of the terminal.
  26. The terminal according to claim 25, wherein the processor is specifically configured to:
    determine the high-frequency touch region on the first side of the terminal according to coordinates of at least one first-side finger sliding trajectory in a first-side trajectory model;
    wherein the first-side trajectory model is a left-hand trajectory model or a right-hand trajectory model, the right-hand trajectory model comprises coordinates of at least one right-hand sliding trajectory, and the left-hand trajectory model comprises coordinates of at least one left-hand sliding trajectory.
  27. The terminal according to claim 25, wherein the processor is specifically configured to:
    calculate a tangent of the angle between the line connecting the start point and the end point of the sliding trajectory of the first gesture and the x axis or the y axis of the coordinate system;
    determine the high-frequency touch region on the first side when the tangent falls within a value interval corresponding to the first side of the terminal and a preset proportion of the points in the sliding trajectory of the first gesture are close to the first side of the terminal.
  28. The terminal according to claim 25 or 26, wherein the processor is specifically configured to:
    determine start-point coordinates and end-point coordinates of the sliding trajectory of the first gesture;
    search a left-hand trajectory model and a right-hand trajectory model for a first sliding trajectory, wherein the distribution on the terminal interface of the start-point and end-point coordinates of the first sliding trajectory matches the start-point and end-point coordinates of the sliding trajectory of the first gesture, the left-hand trajectory model comprises coordinates of at least one left-hand sliding trajectory, and the right-hand trajectory model comprises coordinates of at least one right-hand sliding trajectory;
    determine the high-frequency touch region on the first side of the terminal when the first sliding trajectory is found in a first-side trajectory model, wherein the first-side trajectory model is the left-hand trajectory model or the right-hand trajectory model.
  29. The terminal according to claim 26 or 28, wherein the touchscreen is further configured to display, before the processor determines the high-frequency touch region on the first side of the terminal, a third interface comprising first prompt information, wherein the first prompt information is used to prompt the user to slide on the terminal interface with a finger of the first side;
    the processor is further configured to receive at least two third gestures entered by the user on the third interface displayed on the touchscreen, and to collect, in response to the at least two third gestures entered by the user on the third interface, coordinates of the sliding trajectories of the at least two third gestures to obtain coordinates of at least one first-side finger sliding trajectory, wherein the third gestures are gestures entered by a finger of the user's first side;
    the memory is further configured to store the coordinates of the at least one first-side finger sliding trajectory in the first-side trajectory model.
  30. The terminal according to claim 26 or 28, wherein the processor is further configured to determine, before determining in response to the first gesture entered by the user on the first interface that the first gesture is a gesture entered by the user's left hand or a gesture entered by the user's right hand, in response to a fourth gesture entered by the user on the terminal interface, that the fourth gesture is a gesture entered by a finger of the user's first side;
    the memory is further configured to store coordinates of the sliding trajectory of the fourth gesture in a first-side trajectory model.
  31. A graphical user interface (GUI) stored in a terminal, the terminal comprising a touchscreen, a memory, and a processor, the processor being configured to execute one or more computer programs stored in the memory, wherein the graphical user interface comprises:
    a first GUI displayed on the touchscreen, the first GUI comprising at least two application icons;
    a second GUI displayed in response to a first gesture entered on the first GUI, wherein a high-frequency touch region on a first side of the second GUI comprises at least one high-frequency application icon, the first gesture is a gesture entered by a finger of the user's first side, the high-frequency touch region is a touch region of the second GUI that the user operates at a frequency or a number of times higher than a first threshold, and the at least one high-frequency application icon is an application icon, among the at least two application icons, that the user operates at a frequency or a number of times higher than a second threshold.
  32. The GUI according to claim 31, wherein the second GUI comprises a folder icon, and the folder icon comprises the at least one high-frequency application icon.
  33. The GUI according to claim 32, wherein the GUI further comprises:
    a third GUI displayed in response to an input on the folder icon in the second GUI, wherein the third GUI comprises a folder expansion window corresponding to the folder icon, and the at least one high-frequency application icon is displayed in the folder expansion window.
  34. A graphical user interface (GUI) stored in a terminal, the terminal comprising a touchscreen, a memory, and a processor, the processor being configured to execute one or more computer programs stored in the memory, wherein the graphical user interface comprises:
    a first GUI displayed on the touchscreen;
    a second GUI displayed in response to a first gesture entered on the first GUI, wherein a high-frequency touch region on a first side of the second GUI comprises a first touch panel, the first touch panel is used to operate the first GUI in response to a gesture entered by the user, the first gesture is a gesture entered by a finger of the user's first side, and the high-frequency touch region is a touch region of the second GUI that the user operates at a frequency or a number of times higher than a first threshold;
    a third GUI displayed in response to a second gesture entered on the first touch panel of the second GUI, wherein the third GUI comprises an interface element that the terminal displays in response to a third gesture entered by the user at a corresponding position of the first GUI.
  35. The GUI according to claim 34, wherein the GUI further comprises:
    a fourth GUI displayed on the touchscreen, the fourth GUI comprising first prompt information, wherein the first prompt information is used to prompt the user to slide on the fourth GUI with a finger of the first side.
  36. A computer storage medium, comprising computer instructions which, when run on a terminal, cause the terminal to perform the terminal interface display method according to any one of claims 1 to 10.
  37. A computer program product which, when run on a computer, causes the computer to perform the terminal interface display method according to any one of claims 1 to 10.
PCT/CN2017/103288 2017-09-25 2017-09-25 Terminal interface display method and terminal WO2019056393A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2017/103288 WO2019056393A1 (zh) 2017-09-25 2017-09-25 Terminal interface display method and terminal
US16/650,264 US11307760B2 (en) 2017-09-25 2017-09-25 Terminal interface display method and terminal
CN201780073653.4A CN109997348B (zh) 2017-09-25 2017-09-25 Terminal interface display method and terminal


Publications (1)

Publication Number Publication Date
WO2019056393A1 true WO2019056393A1 (zh) 2019-03-28

Family

ID=65811032





Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014030456A1 (ja) * 2012-08-23 2014-02-27 株式会社エヌ・ティ・ティ・ドコモ ユーザインタフェース装置、ユーザインタフェース方法及びプログラム
WO2016145832A1 (zh) * 2015-08-04 2016-09-22 中兴通讯股份有限公司 终端的操作方法及装置
CN106899763A (zh) * 2017-02-27 2017-06-27 佛山市腾逸科技有限公司 一种大屏幕触摸手机的图标界面单手操作方法
CN107179875A (zh) * 2017-06-26 2017-09-19 深圳传音通讯有限公司 窗口调整的方法和装置




Also Published As

Publication number Publication date
CN109997348A (zh) 2019-07-09
US11307760B2 (en) 2022-04-19
US20200272322A1 (en) 2020-08-27
CN109997348B (zh) 2021-10-15


Legal Events

Code 121: the EPO has been informed by WIPO that EP was designated in this application (ref document number 17925597; country of ref document: EP; kind code: A1)
Code NENP: non-entry into the national phase (ref country code: DE)
Code 122: PCT application non-entry in European phase (ref document number 17925597; country of ref document: EP; kind code: A1)