WO2024114234A1 - One-handed operation method and electronic device - Google Patents

One-handed operation method and electronic device

Info

Publication number
WO2024114234A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
interface
control
display
electronic device
Prior art date
Application number
PCT/CN2023/127972
Other languages
English (en)
French (fr)
Inventor
高超 (Gao Chao)
赵增智 (Zhao Zengzhi)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024114234A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the embodiments of the present application relate to the technical field of electronic devices, and more particularly to a one-handed operation method and an electronic device.
  • the present application provides a one-handed operation method and an electronic device, which can enhance the one-handed operation experience of a user.
  • a one-hand operation method which can be applied to an electronic device, and the electronic device can include a display screen.
  • the one-hand operation method includes: displaying a first interface; responding to a trigger instruction of a one-hand operation mode, determining a first area and a second area of the display screen; wherein the first area is an area on the display screen that can be touched by a finger of a user's hand when the user holds the electronic device with one hand, and the second area is the remaining area on the display screen except the first area; displaying a reduced first interface in the second area; responding to a touch operation on the first area, executing a function corresponding to the touch operation on the reduced first interface.
  • the electronic device when the electronic device displays the first interface through the screen display area of the display screen, if a trigger instruction to enter the one-hand operation mode is detected, the electronic device can use the area on the display screen that can be touched by the user with one hand (i.e., part of the screen display area of the electronic device display screen) as the touch area of the one-hand operation mode to receive the touch operation of the user with one hand, and at the same time use the remaining area on the display screen except the touch area as the display area of the one-hand operation mode, so as to reduce the original screen display area of the display screen and the first interface displayed in the original screen display area and display them in the display area of the one-hand operation mode.
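To make the geometry concrete, the following is a minimal Kotlin sketch (not taken from the patent; the Rect type, the reachableHeight parameter, and the bottom-strip layout with uniform scaling are illustrative assumptions) of splitting the screen into the first area and the second area and scaling the original interface into the second area:

```kotlin
// Hypothetical geometry helpers for the one-handed mode described above.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    val width get() = right - left
    val height get() = bottom - top
}

// First area: the strip at the bottom of the screen the thumb can reach;
// second area: everything above it.
fun splitScreen(screen: Rect, reachableHeight: Float): Pair<Rect, Rect> {
    val first = Rect(screen.left, screen.bottom - reachableHeight, screen.right, screen.bottom)
    val second = Rect(screen.left, screen.top, screen.right, screen.bottom - reachableHeight)
    return first to second
}

// Scale the original interface uniformly so the reduced interface fits the second area.
fun reducedInterface(original: Rect, second: Rect): Rect {
    val scale = minOf(second.width / original.width, second.height / original.height)
    return Rect(second.left, second.top,
                second.left + original.width * scale,
                second.top + original.height * scale)
}
```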
  • the first interface may include a target control
  • the response to the touch operation on the first area and the execution of the function corresponding to the touch operation on the reduced first interface may include: responding to the touch operation on the target position in the first area, mapping the touch operation on the target position in the first area to a touch operation on the target control in the reduced first interface, wherein a coordinate mapping relationship is pre-established between the target position in the first area and the position of the target control in the first interface; and responding to the touch operation on the target control in the reduced first interface, executing the function corresponding to the target control on the reduced first interface.
  • the electronic device can map the user's touch operation at a certain position in the touch area to the touch operation at the corresponding position in the display area, so that the electronic device can perform the function required to be performed when the corresponding position in the display area is touched. Therefore, the user does not need to operate in the display area that cannot be touched by one hand, and the user can directly perform touch operations in the touch area that can be touched by one hand, realizing full-range operation of the original interface that is reduced in the display area.
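A sketch of this coordinate mapping, assuming a simple relative-position (linear) mapping between the first area and the reduced interface; the names are hypothetical and the Rect type is reused from the previous sketch:

```kotlin
data class Point(val x: Float, val y: Float)

// Map a touch at a relative position inside the first area to the same
// relative position inside the reduced first interface.
fun mapTouch(touch: Point, firstArea: Rect, reduced: Rect): Point {
    val rx = (touch.x - firstArea.left) / firstArea.width
    val ry = (touch.y - firstArea.top) / firstArea.height
    return Point(reduced.left + rx * reduced.width, reduced.top + ry * reduced.height)
}
```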
  • a coordinate mapping relationship can be pre-established between the center of the video playback interface and the center of the first area.
  • the above-mentioned response to the touch operation applied to the first area and executing the function corresponding to the touch operation on the reduced first interface can include: responding to the touch operation applied to the center of the first area, mapping the touch operation applied to the center of the first area to a touch operation applied to the playback control in the reduced video playback interface; responding to the touch operation applied to the playback control in the reduced video playback interface, and playing the video in the reduced video playback interface.
  • the electronic device can establish a mapping relationship based on the layout orientation of the target control in the first interface, such as the center position, upper left, upper right, lower left, lower right, etc. of the display area of the target control, and the corresponding position in the touch area of the one-handed operation mode, such as the center position, upper left, upper right, lower left, lower right, etc. of the touch area, to ensure that the user's touch position in the touch area corresponds to the position of the target control in the first interface, so that the user can quickly and accurately touch the control they want to operate in the touch area.
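The anchor-based variant might be sketched as follows, pairing a named layout orientation of the target control with the same anchor of the touch area; the Anchor enum and anchorPoint helper are assumptions, reusing Rect and Point from the sketches above:

```kotlin
enum class Anchor { CENTER, TOP_LEFT, TOP_RIGHT, BOTTOM_LEFT, BOTTOM_RIGHT }

// Resolve a named anchor of a rectangle to a concrete point.
fun anchorPoint(r: Rect, a: Anchor): Point = when (a) {
    Anchor.CENTER -> Point((r.left + r.right) / 2, (r.top + r.bottom) / 2)
    Anchor.TOP_LEFT -> Point(r.left, r.top)
    Anchor.TOP_RIGHT -> Point(r.right, r.top)
    Anchor.BOTTOM_LEFT -> Point(r.left, r.bottom)
    Anchor.BOTTOM_RIGHT -> Point(r.right, r.bottom)
}

// e.g. a playback control centered in the video interface is bound to the
// center of the touch area:
// anchorPoint(firstArea, Anchor.CENTER) <-> anchorPoint(playControlBounds, Anchor.CENTER)
```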
  • the first interface may include the target control, and after displaying the reduced first interface, the one-handed operation method further includes: displaying the target control in the first area. In this way, the electronic device can also map the controls in the first interface to the touch area for display, so that the user can intuitively and accurately control the controls they want to operate in the touch area.
  • the above-mentioned response to the touch operation on the first area and executing the function corresponding to the touch operation on the reduced first interface may include: responding to the touch operation on the target control in the first area and executing the function corresponding to the target control on the reduced first interface. Since the target control displayed in the touch area is mapped from the target control in the first interface, when the target control displayed in the touch area is triggered, it can be considered that the target control in the first interface is triggered, and the electronic device can directly execute the function required to be executed when the target control is triggered on the reduced first interface.
  • the one-handed operation method further includes: displaying the playback control in the first area.
  • the above-mentioned response to the touch operation on the first area and performing the function corresponding to the touch operation on the reduced first interface may include: responding to the touch operation on the playback control in the first area and playing the video in the reduced video playback interface.
  • the target control is a first control.
  • the one-hand operation method may further include: according to the display style of each control in the first interface, obtaining from the first interface a control that is fixedly displayed at a first position in the first interface as the first control; wherein the first position includes at least one of a top position and a bottom position.
  • controls located at the top or bottom of a user interface are usually fixed-position operation controls that do not move with page browsing. Therefore, the electronic device may map controls located at the top or bottom of the user interface to the touch area for display, so that the user can operate these fixed-position controls in the touch area at any time.
  • the controls mapped from the top or bottom of the user interface may correspondingly be displayed at the top or bottom of the touch area.
  • the first control may include at least one of a title bar, a navigation bar, and a menu bar.
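One plausible way to pick out such first controls, sketched under the assumption that each control exposes bounds and an isPinned flag (hypothetical stand-ins for real view metadata; Rect is from the earlier sketch):

```kotlin
data class Control(val id: String, val bounds: Rect, val isPinned: Boolean)

// Treat pinned controls within the top or bottom 15% of the screen as
// "first controls" (title bar, navigation bar, menu bar, etc.).
fun firstControls(controls: List<Control>, screen: Rect, edge: Float = 0.15f): List<Control> =
    controls.filter { c ->
        c.isPinned && (c.bounds.top <= screen.top + screen.height * edge ||
                       c.bounds.bottom >= screen.bottom - screen.height * edge)
    }
```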
  • the target control is a second control.
  • the one-hand operation method further includes: obtaining, from the first interface, a control that is non-fixedly displayed at a first position in the first interface as a second control according to the display style of each control in the first interface; wherein the first position includes at least one of a top position and a bottom position; displaying a selection cursor in the reduced first interface, the selection cursor including a boundary, which defines a selection area of the selection cursor in the reduced first interface, and the selection area is used to select at least one second control; displaying the target control in the first area includes: displaying the second control in the selection area of the selection cursor in the first area.
  • since the second control in the first interface moves with page browsing and has a non-fixed position, it is gradually displayed or hidden as the page is browsed. Therefore, the electronic device does not need to map the second control at all times, but only when the user browses to the second control.
  • the electronic device can determine the second control currently browsed by the user by displaying a selection cursor, so that the electronic device can map the second control selected by the selection cursor to the touch area for display.
  • the selection cursor can move with the user's sliding operation on the touch area.
  • the electronic device can follow the movement of the selection cursor and map the second control newly selected after the selection cursor moves to the touch area for display. In this way, the electronic device can map the control selected by the selection cursor to the touch area for display in real time.
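A minimal sketch of such a selection cursor: its rectangle moves with slide deltas from the touch area, and whichever second controls its selection area covers are the ones mirrored into the touch area (illustrative names; Rect and Control come from the earlier sketches):

```kotlin
// True when two rectangles intersect.
infix fun Rect.overlaps(o: Rect): Boolean =
    left < o.right && o.left < right && top < o.bottom && o.top < bottom

class SelectionCursor(var area: Rect) {
    // Follow the user's sliding operation on the touch area.
    fun moveBy(dx: Float, dy: Float) {
        area = Rect(area.left + dx, area.top + dy, area.right + dx, area.bottom + dy)
    }
    // The second controls currently inside the selection area.
    fun selected(controls: List<Control>): List<Control> =
        controls.filter { it.bounds overlaps area }
}
```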
  • the one-hand operation method may further include: when the selection area of the selection cursor includes the first option control, displaying the first option control in the first area.
  • the electronic device may display a selection cursor in the browsing interface, and the selection cursor may select any option control in the browsing interface.
  • the electronic device displays the selection control selected by the selection cursor in the touch area, so that the user can intuitively and accurately control the selection control with one hand in the touch area.
  • the one-hand operation method may further include: combining multiple second controls with matching display styles to obtain a combined control; and determining the area occupied by the combined control as the selection area of the selection cursor.
  • the above-mentioned displaying the second control in the selection area of the selection cursor in the first area includes: displaying the combined control in the selection area of the selection cursor in the first area.
  • the electronic device can recombine these second controls to obtain a combined control that can be mapped to the touch area for display as a whole, and the user can then directly operate the specific control they want within the combined control displayed in the touch area.
  • the one-hand operation method may further include: according to the display style of each second control, using multiple second controls arranged in the same direction as multiple second controls with matching display styles. It can be understood that since the controls arranged in the same direction are usually of the same size, the electronic device can recombine the multiple controls arranged in the same direction to obtain a combined control of a regular shape.
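A sketch of the combination step, assuming the combined control occupies the union of the member controls' bounds and that controls sharing roughly the same vertical center count as one horizontal row (the tolerance value is an arbitrary illustration; Control and Rect are from earlier sketches):

```kotlin
import kotlin.math.abs

// Bounding box of several second controls = the area of the combined control,
// which also becomes the selection area of the selection cursor.
fun combinedArea(controls: List<Control>): Rect {
    require(controls.isNotEmpty())
    return Rect(controls.minOf { it.bounds.left }, controls.minOf { it.bounds.top },
                controls.maxOf { it.bounds.right }, controls.maxOf { it.bounds.bottom })
}

// "Matching display styles" approximated as: aligned along the same horizontal row.
fun sameRow(a: Control, b: Control, tol: Float = 4f): Boolean {
    val ca = (a.bounds.top + a.bounds.bottom) / 2
    val cb = (b.bounds.top + b.bounds.bottom) / 2
    return abs(ca - cb) <= tol
}
```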
  • the first interface is an application interface
  • the one-hand operation method may further include: when the first interface is an application interface of a first application, multiple second controls arranged in a horizontal direction are used as multiple second controls with matching display styles; when the first interface is an application interface of a second application, multiple second controls arranged in a vertical direction are used as multiple second controls with matching display styles; wherein the first application is different from the second application.
  • the electronic device can adaptively generate matching combination controls according to different application types.
  • taking as an example a browsing interface that includes multiple second controls, where the multiple second controls include a first option control, a second option control, and a third option control arranged in a horizontal direction,
  • the one-hand operation method may also include: combining the first option control, the second option control, and the third option control arranged in a horizontal direction to obtain an option combination control; when the option combination control is included in the selection area of the selection cursor, displaying the option combination control in the first area; wherein the size of the selection area of the selection cursor matches the size of the area occupied by the option combination control.
  • the electronic device may combine multiple controls arranged in a row in a horizontal direction into a combination control, and the size of the combination control may be the size of the selection area of the selection cursor, so that when the selection cursor moves to the location of the combination control, the electronic device may directly map the combination control to the touch area for display, so that the user can intuitively and accurately control each control in the combination control with one hand in the touch area.
  • the one-hand operation method may further include: when it is detected that the size of the area occupied by the combined control does not match the size of the first area, adjusting the display style of each second control in the combined control to obtain an adjusted combined control; wherein the size of the area occupied by the adjusted combined control matches the size of the first area; and displaying the adjusted combined control in the first area.
  • the electronic device can adaptively adjust the size and position of each control in the combined control according to the size of the touch area, so that when the combined control displayed in the display area is too small, the combined control can be enlarged and displayed when mapped to the touch area, thereby improving the user's control experience in the touch area.
  • the above-mentioned adjustment of the display style of each second control in the combined control includes: adjusting the display size of each second control in the combined control according to the size of the first area; or adjusting the display spacing between two adjacent second controls in the combined control according to the size of the first area; or adjusting the display position of each second control in the combined control according to the size of the first area.
  • the electronic device can adaptively adjust, according to the size of the touch area, at least one of the size, position, or spacing of each control in the combined control.
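One possible fit policy, sketched below: lay the member controls out evenly across the touch area's width with uniform spacing and scale each to the resulting slot. This is just one strategy consistent with the description (size, spacing, and position adjusted together); the constants are placeholders:

```kotlin
// Re-lay-out the combined control's members to fill the first area.
fun fitToArea(controls: List<Control>, target: Rect, spacing: Float = 16f): List<Control> {
    val n = controls.size
    val slotWidth = (target.width - spacing * (n + 1)) / n
    val slotHeight = minOf(target.height, slotWidth)
    return controls.mapIndexed { i, c ->
        val left = target.left + spacing + i * (slotWidth + spacing)
        val top = target.top + (target.height - slotHeight) / 2
        c.copy(bounds = Rect(left, top, left + slotWidth, top + slotHeight))
    }
}
```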
  • taking the first interface as the homepage of an application as an example, responding to the touch operation on the first area and executing the function corresponding to the touch operation on the reduced first interface includes: in response to a left swipe operation on the first area, exiting the application and displaying the reduced desktop interface in the second area. In this way, the user can perform specific gestures in the touch area to achieve specific operations.
  • the above-mentioned response to the touch operation on the first area and the execution of the function corresponding to the touch operation on the reduced first interface include: in response to the left swipe operation on the first area, displaying the reduced second interface, the second interface being the upper level interface of the first interface. In this way, the same gesture may execute different functions in different interfaces.
  • the above-mentioned response to the touch operation on the first area and executing the function corresponding to the touch operation on the reduced first interface includes: detecting a preset gesture operation on the first area, the preset gesture operation is used to trigger a preset function in the third interface; responding to the preset gesture operation, switching the reduced first interface displayed in the second area to the reduced third interface, and executing the preset function.
  • the electronic device can bind a specific gesture operation to a function in a page, so that when the user performs the specific gesture operation in the touch area, the page can be opened with one click to execute the function.
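A minimal sketch of such gesture bindings, keyed by (current interface, gesture) so the same gesture can resolve to different actions on different interfaces; every identifier here is a hypothetical placeholder:

```kotlin
enum class Gesture { SWIPE_LEFT, SWIPE_RIGHT, DOUBLE_TAP }

sealed interface Action
object ExitToDesktop : Action                // leave the app, show reduced desktop
object GoToParentInterface : Action          // back to the upper-level interface
data class OpenAndRun(val targetInterface: String, val function: String) : Action

val bindings = mapOf(
    ("appHome" to Gesture.SWIPE_LEFT) to ExitToDesktop,
    ("detailPage" to Gesture.SWIPE_LEFT) to GoToParentInterface,
    ("anyPage" to Gesture.DOUBLE_TAP) to OpenAndRun("thirdInterface", "presetFunction"),
)

fun dispatch(currentInterface: String, g: Gesture): Action? =
    bindings[currentInterface to g] ?: bindings["anyPage" to g]
```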
  • the second area includes a target display area and a third area
  • displaying the reduced first interface in the second area includes: displaying the reduced first interface in the target display area; displaying multiple icon controls in the third area; wherein the icon controls include at least one of an application icon control and a shortcut function icon control.
  • the displaying of multiple icon controls in the third area includes: rearranging icon controls of multiple applications in the desktop interface; and displaying the rearranged icon controls of multiple applications in the third area.
  • the one-hand operation method further includes: displaying a switching control in the first area, wherein the action area of the first area is the target display area; in response to a touch operation on the switching control in the first area, determining that the action area of the first area is switched from the target display area to the third area; in response to the touch operation on the first area, performing a function corresponding to the touch operation on the first area on multiple icon controls in the third area.
  • the electronic device can provide a switching control in the touch area, so that the user can control whether the current touch area is to operate the first interface displayed in the original screen display area, or to operate multiple icons additionally displayed in the third area.
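The switching control can be sketched as a simple toggle over the action target of the first area (names illustrative):

```kotlin
enum class ActionTarget { TARGET_DISPLAY_AREA, THIRD_AREA }

class OneHandModeState(var target: ActionTarget = ActionTarget.TARGET_DISPLAY_AREA) {
    // Invoked when the switching control in the first area is touched.
    fun onSwitchControlTapped() {
        target = if (target == ActionTarget.TARGET_DISPLAY_AREA) ActionTarget.THIRD_AREA
                 else ActionTarget.TARGET_DISPLAY_AREA
    }
}
```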
  • an electronic device comprising: a display unit, a determination unit and an execution unit.
  • the display unit is used to display a first interface.
  • the determination unit is used to respond to a trigger instruction of a one-handed operation mode to determine a first area and a second area of the display screen; wherein the first area is an area on the display screen that can be touched by the fingers of a user's one hand when the user holds the electronic device with one hand, and the second area is the remaining area on the display screen except the first area.
  • the display unit is also used to display a reduced first interface in the second area.
  • the execution unit is used to respond to a touch operation applied to the first area and execute a function corresponding to the touch operation on the reduced first interface.
  • the determination unit may be used to: prompt the user to slide the finger of one hand on the display screen along a specified trajectory when holding the electronic device with one hand; determine the maximum area that the finger of one hand can touch on the display screen based on the sliding of the finger of one hand; and determine the first area of the display screen based on the maximum area.
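A minimal sketch of deriving the first area from this calibration slide, assuming a right-handed grip (the reachable region hugs the bottom-right corner); taking extremes of the sampled trajectory is an illustrative simplification, reusing Point and Rect from earlier sketches:

```kotlin
// The sampled touch points trace the farthest arc the thumb can reach;
// the first area is the screen region bounded by those extremes.
fun firstAreaFromTrajectory(samples: List<Point>, screen: Rect): Rect {
    val topMost = samples.minOf { it.y }   // highest reachable point
    val leftMost = samples.minOf { it.x }  // leftmost reachable point for a right-handed grip
    return Rect(leftMost, topMost, screen.right, screen.bottom)
}
```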
  • the first interface may include a target control
  • the execution unit may be used to: respond to a touch operation on a target position in the first area, map the touch operation on the target position in the first area to a touch operation on a target control in the reduced first interface; wherein a coordinate mapping relationship is pre-established between the target position in the first area and the position of the target control in the first interface; and respond to the touch operation on the target control in the reduced first interface, execute the function corresponding to the target control on the reduced first interface.
  • a coordinate mapping relationship may be pre-established between the center of the video playback interface and the center of the first area, and the execution unit may be used to: in response to a touch operation applied to the center of the first area, map the touch operation to a touch operation acting on the playback control in the reduced video playback interface; and in response to the touch operation acting on the playback control in the reduced video playback interface, play the video in the reduced video playback interface.
  • the first interface may include a target control
  • the display unit may be further configured to: display the target control in the first area.
  • the execution unit may be configured to: respond to a touch operation on the target control in the first area, and execute a function corresponding to the target control on the reduced first interface.
  • the display unit can also be used to: display the playback control in the first area.
  • the execution unit can be used to: respond to a touch operation on the playback control in the first area and play the video in the reduced video playback interface.
  • the above-mentioned target control is a first control
  • the electronic device may further include: a first acquisition unit, used to acquire, from the first interface, a control fixedly displayed at a first position in the first interface according to a display style of each control in the first interface, as the first control; wherein the first position includes at least one of a top position and a bottom position.
  • the first control may include at least one of a title bar, a navigation bar, and a menu bar.
  • the target control is a second control
  • the electronic device may further include: a second acquisition unit, for acquiring, from the first interface, a control that is not fixedly displayed at a first position in the first interface, as a second control, according to the display style of each control in the first interface; wherein the first position includes at least one of a top position and a bottom position.
  • the display unit may also be used to: display a selection cursor in the reduced first interface, the selection cursor including a border, the border defining a selection area of the selection cursor in the reduced first interface, the selection area being used to select at least one second control; and display a second control in the selection area of the selection cursor in the first area.
  • the above-mentioned display unit can also be used for: when the selection area of the selection cursor includes the first option control, displaying the first option control in the first area.
  • the electronic device may further include: a combination unit and an area generation unit.
  • the combination unit is used to combine multiple second controls with matching display styles to obtain a combined control;
  • the area generation unit is used to determine the area occupied by the combined control as the selection area of the selection cursor.
  • the above-mentioned display unit may also be used to: display the combined control in the selection area of the selection cursor in the first area.
  • the electronic device may further include: a matching unit, configured to, according to the display style of each second control, use a plurality of second controls arranged along the same direction as a plurality of second controls with matching display styles.
  • the first interface is an application interface
  • the matching unit may also be used for: when the first interface is an application interface of a first application, using multiple second controls arranged in a horizontal direction as multiple second controls that match the display style; when the first interface is an application interface of a second application, using multiple second controls arranged in a vertical direction as multiple second controls that match the display style; wherein the first application is different from the second application.
  • the browsing interface includes multiple second controls, and the multiple second controls include a first option control, a second option control, and a third option control arranged in a horizontal direction
  • the above-mentioned combination unit can be used to: combine the first option control, the second option control, and the third option control arranged in a horizontal direction to obtain an option combination control.
  • the above-mentioned display unit can also be used to: when the selection area of the selection cursor includes the option combination control, display the option combination control in the first area; wherein the size of the selection area of the selection cursor matches the size of the area occupied by the option combination control.
  • the electronic device may further include: an adjustment unit, configured to adjust the display style of each second control in the combined control when detecting that the size of the area occupied by the combined control does not match the size of the first area, to obtain an adjusted combined control; wherein the size of the area occupied by the adjusted combined control matches the size of the first area.
  • the above-mentioned display unit may also be configured to: display the adjusted combined control in the first area.
  • the above-mentioned adjustment unit can also be used to: adjust the display size of each second control in the combined control according to the size of the first area; or adjust the display spacing between two adjacent second controls in the combined control according to the size of the first area; or adjust the display position of each second control in the combined control according to the size of the first area.
  • the above-mentioned execution unit can be used to: respond to a left swipe operation on the first area, exit the application, and display a reduced desktop interface in the second area.
  • the execution unit may be used to: respond to a left swipe operation on the first area, display a reduced second interface, where the second interface is an upper-level interface of the first interface.
  • the above-mentioned execution unit can be used to: detect a preset gesture operation acting on the first area, the preset gesture operation is used to trigger a preset function in the third interface; in response to the preset gesture operation, switch the reduced first interface displayed in the second area to the reduced third interface, and execute the preset function.
  • the second area includes a target display area and a third area
  • the display unit can be used to: display the reduced first interface in the target display area; display multiple icon controls in the third area; wherein the icon controls include at least one of an application icon control and a shortcut function icon control.
  • the display unit may be used to: rearrange the icon controls of multiple application programs in the desktop interface; and display the rearranged icon controls of the multiple application programs in the third area.
  • the display unit may be used to display a switching control in a first area, wherein the active area of the first area is a target display area.
  • the electronic device may further include a switching unit, which is used to respond to a touch operation on the switching control in the first area and determine that the active area of the first area is switched from the target display area to a third area.
  • the execution unit may also be used to respond to a touch operation on the first area and execute a function corresponding to the touch operation on the first area on multiple icon controls in the third area.
  • the present application provides an electronic device, including a display screen, one or more processors, and one or more memories.
  • the display screen, the one or more memories are coupled to the one or more processors, the one or more memories are used to store computer program codes, the computer program codes include computer instructions, and when the one or more processors execute the computer instructions, the electronic device executes the one-handed operation method in any possible implementation of the first aspect.
  • the present application provides a one-handed operation device, which is included in an electronic device, and has the function of implementing the electronic device behavior in any of the above-mentioned first aspect and the possible implementation methods of the first aspect.
  • the function can be implemented by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules or units corresponding to the above-mentioned functions.
  • the present application provides a chip system, which is applied to an electronic device.
  • the chip system includes one or more interface circuits and one or more processors.
  • the interface circuit and the processor are interconnected by a line.
  • the interface circuit is used to receive a signal from a memory of the electronic device and send the signal to the processor, where the signal includes a computer instruction stored in the memory.
  • the processor executes the computer instruction
  • the electronic device executes the one-handed operation method in any possible implementation of the first aspect above.
  • the present application provides a computer storage medium, including computer instructions.
  • when the computer instructions are executed on an electronic device, the electronic device executes the one-handed operation method in any possible implementation of the first aspect.
  • the present application provides a computer program product.
  • when the computer program product is run on an electronic device, the electronic device executes the one-handed operation method in any possible implementation of the first aspect.
  • the beneficial effects that can be achieved by the electronic device of the second aspect and any possible implementation thereof, the electronic device of the third aspect, the one-handed operation device of the fourth aspect, the chip system of the fifth aspect, the computer storage medium of the sixth aspect, and the computer program product of the seventh aspect can be referred to the beneficial effects of the first aspect and any possible implementation thereof, and will not be repeated here.
  • FIG. 1A is a schematic diagram of the hardware structure of an electronic device provided in an embodiment of the present application;
  • FIG. 1B is a schematic diagram of an example of a software architecture of an electronic device provided in an embodiment of the present application;
  • FIG. 2 is a schematic diagram of a scenario in which a right-handed user operates an electronic device according to an embodiment of the present application;
  • FIG. 3 is a first schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 4 is a second schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 5 is a third schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 6 is a fourth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 7 is a fifth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 8 is a sixth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 9 is a seventh schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 10 is an eighth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 11 is a ninth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 12 is a tenth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 13 is an eleventh schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 14 is a twelfth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 15 is a thirteenth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 16 is a fourteenth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 17 is a fifteenth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 18 is a sixteenth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 19 is a seventeenth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 20 is an eighteenth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 21 is a nineteenth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 22 is a twentieth schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 23 is a twenty-first schematic diagram of an interface of an electronic device provided in an embodiment of the present application;
  • FIG. 24 is a method flow chart of a one-handed operation method provided in an embodiment of the present application;
  • FIG. 25 is a method flow chart of another one-handed operation method provided in an embodiment of the present application.
  • the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
  • the features defined as “first” and “second” may explicitly or implicitly include one or more of the features.
  • “multiple” means two or more.
  • the embodiment of the present application provides a one-handed operation method, which can be applied to electronic devices.
  • the electronic device can be a tablet, a mobile phone, a vehicle-mounted device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), and other electronic devices with a display screen.
  • the embodiment of the present application does not impose any restrictions on the specific type of the electronic device.
  • FIG. 1A shows a schematic diagram of the structure of an electronic device 100.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, a gravity sensor, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine some components, or split some components, or arrange the components differently.
  • the components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • Different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller may generate an operation control signal according to the instruction operation code and the timing signal to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory may store instructions or data that the processor 110 has just used or cyclically used. If the processor 110 needs to use the instruction or data again, it may be directly called from the memory. This avoids repeated access, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, pulse code modulation (pulse code modulation, PCM) interface, universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input/output (general-purpose input/output, GPIO) interface, subscriber identity module (subscriber identity module, SIM) interface, and/or universal serial bus (universal serial bus, USB) interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically can be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is only a schematic illustration and does not constitute a structural limitation on the electronic device 100.
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from a wired charger through the USB interface 130.
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. While the charging management module 140 is charging the battery 142, it may also power the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and provides power to the processor 110, the internal memory 121, the external memory, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle number, battery health status (leakage, impedance), etc.
  • the power management module 141 can also be set in the processor 110.
  • the power management module 141 and the charging management module 140 can also be set in the same device.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • the electronic device 100 implements the display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini LED, a micro LED, a micro OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the display screen 194 can be used to display a touch window and an application window.
  • the touch window is displayed on the display screen 194 in an area that can be touched by a user with one hand
  • the application window is displayed on the display screen 194 in the remaining area except the area where the touch window is located.
  • the electronic device 100 can realize the shooting function through ISP, camera 193, video codec, GPU, display screen 194 and application processor.
  • ISP is used to process the data fed back by camera 193. For example, when taking a photo, the shutter is opened, and the light is transmitted to the camera photosensitive element through the lens. The light signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to ISP for processing and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, ISP can be set in camera 193.
  • the camera 193 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, which is then transmitted to the ISP for conversion into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard RGB, YUV, or other format.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the digital signal processor is used to process digital signals, and can process not only digital image signals but also other digital signals. For example, when the electronic device 100 is selecting a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital videos.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a variety of coding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • the camera 193 can be used to collect the user's hand information.
  • the electronic device 100 prompts the user to enter the hand information.
  • the processor 110 turns on the camera 193 to capture the user's hand information.
  • the hand information may include information such as the size of the palm, the length of each finger, and the fingerprint of each finger.
  • the electronic device 100 obtains the user's hand information mainly to obtain the length of the user's thumb.
  • the length of the thumb is the distance between the farthest touch point that can be touched by the user's thumb and the holding point when performing a touch operation on the display screen 194.
  • the holding point may be the point where the user's palm contacts the edge of the display screen 194.
  • the distance between the farthest touch point 201 of the thumb and the holding point 202 is the length of the thumb.
  • the electronic device 100 obtains the user's hand information to obtain the length of the thumb, which is used to determine the position and size of the touch window display when the "one-handed operation" function is subsequently turned on, to ensure that the user can control the display content in the application window through touch operations acting on the touch window when holding the mobile phone.
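As a worked form of the definition above, the thumb length is just the Euclidean distance between the farthest touch point (201) and the holding point (202); a sketch reusing the Point type from earlier:

```kotlin
import kotlin.math.hypot

fun thumbLength(farthestTouch: Point, holdingPoint: Point): Float =
    hypot(farthestTouch.x - holdingPoint.x, farthestTouch.y - holdingPoint.y)
```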
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and videos can be stored in the external memory card.
  • the internal memory 121 can be used to store computer executable program codes, and the executable program codes include instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory 121.
  • the internal memory 121 may include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application required for at least one function (such as a sound playback function, an image playback function, etc.), etc.
  • the data storage area may store data created during the use of the electronic device 100 (such as audio data, a phone book, etc.), etc.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, a universal flash storage (UFS), etc.
  • the electronic device 100 can implement audio functions such as music playing and recording through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone jack 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 can be arranged in the processor 110, or some functional modules of the audio module 170 can be arranged in the processor 110.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A can be set on the display screen 194.
  • a capacitive pressure sensor may include at least two parallel plates of conductive material.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the touch operation intensity according to the pressure sensor 180A.
  • the electronic device 100 can also calculate the touch position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities can correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
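A sketch of this pressure-threshold dispatch; the threshold value and the returned action names are placeholders:

```kotlin
// Same touch position, different action depending on touch force.
fun onMessageIconTouched(pressure: Float, firstPressureThreshold: Float): String =
    if (pressure < firstPressureThreshold) "view short message" else "create new short message"
```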
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photography, fingerprint call answering, etc.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature sensor 180J to detect temperature, and executes a temperature handling strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 due to low temperature. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
  • the touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K can be set on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect touch operations (such as long press, swipe up, swipe left, single click, double click, etc.) acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K can also be set on the surface of the electronic device 100, which is different from the position of the display screen 194.
  • the key 190 includes a power key, a volume key, etc.
  • the key 190 may be a mechanical key or a touch key.
  • the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
  • Motor 191 can generate vibration prompts.
  • Motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects.
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power changes, messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be connected to or separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195.
  • the electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 can also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communications.
  • the electronic device 100 uses an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the electronic device 100 may be a terminal device equipped with an Android system, a Microsoft system, or another operating system; the embodiments of the present application do not limit the operating system installed in the electronic device.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. Taking the Android system with a layered architecture as an example, the software structure of the electronic device 100 is exemplified.
  • FIG1B is a software structure diagram of the electronic device 100 of an embodiment of the present application.
  • the layered architecture divides the software into several layers, each layer has a clear role and division of labor.
  • the layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, from top to bottom: the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications (APP) such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, etc.
  • the application on the electronic device can be a native application or a third-party application, which is not limited in the embodiments of the present application.
  • the application framework layer provides application programming interface (API) and programming framework for the applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager service (WMS), an activity manager service (AMS), an input manager service (IMS), a content provider, a view system, a phone manager, a resource manager, a notification manager, etc.
  • the window manager service is used to manage window programs.
  • the window manager service can obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • the activity manager service is responsible for managing activities, starting, switching, and scheduling components in the system, and managing and scheduling applications.
  • the input manager service can translate and encapsulate raw input events into input events containing more information and send them to the window manager service, which stores the clickable areas (such as controls) of each application, the position information of the focus window, and the like. The window manager service can therefore correctly dispatch input events to the specified control or focus window.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • the data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying images, etc.
  • the view system can be used to build applications.
  • a display interface can be composed of one or more views.
  • a display interface including a text notification icon can include a view for displaying text and a view for displaying images.
  • the phone manager is used to provide communication functions of the electronic device 100, such as management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for applications, such as localized strings, icons, images, layout files, video files, and so on.
  • the notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages and can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also be a notification that appears in the system top status bar in the form of a chart or scroll bar text, such as notifications of applications running in the background, or a notification that appears on the screen in the form of a dialog window. For example, a text message is displayed in the status bar, a prompt sound is emitted, an electronic device vibrates, an indicator light flashes, etc.
  • Android Runtime includes core libraries and virtual machines. Android Runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the function that needs to be called by the Java language, and the other part is the Android core library.
  • the application layer and the application framework layer run in a virtual machine.
  • the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules, such as surface manager, media library, 3D graphics processing library (such as OpenGL ES), 2D graphics engine (such as SGL), etc.
  • the surface manager is used to manage the display subsystem and provide the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • a 2D graphics engine is a drawing engine for 2D drawings.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can include display drivers, input/output device drivers (for example, keyboards, touch screens, headphones, speakers, microphones, etc.), device nodes, camera drivers, audio drivers, and sensor drivers. Users perform input operations through input devices, and the kernel layer can generate corresponding raw input events based on the input operations and store them in device nodes.
  • the embodiment of the present application provides a one-hand operation method.
  • the electronic device displays the first interface through the screen display area of the display screen
  • the electronic device can use the area of the display screen that the user can touch with one hand (i.e., part of the display screen of the electronic device) as the touch area of the one-hand operation mode to receive the touch operations of the user's hand, and use the remaining area of the display screen as the display area of the one-hand operation mode, so that the original screen display area of the display screen and the first interface displayed in it are reduced and displayed in the display area of the one-hand operation mode.
  • the electronic device can sense the user's touch operations in the touch area of the one-hand operation mode, and control, according to the sensed touch operations, the reduced original screen display area and the first interface displayed in it within the display area of the one-hand operation mode.
  • the user can perform a full range of operations on the original interface displayed by the electronic device by performing a touch operation in the touch area.
  • the electronic device can divide the display screen into two areas, one is the touch area of the one-handed operation mode, and the other is the display area of the one-handed operation mode.
  • the touch area of the one-handed operation mode can be the area that the user's finger can touch or easily touch in the display screen when the user operates the electronic device with one hand
  • the display area of the one-handed operation mode can be the remaining area of the display screen except the aforementioned touch area.
  • the electronic device can then reduce the content originally displayed on the display screen by a corresponding multiple so that it is concentrated in the display area of the one-handed operation mode, and establish a mapping relationship between the touch area of the one-handed operation mode and the display area of the one-handed operation mode; when the electronic device detects a touch operation in the touch area, it can execute, according to the mapping relationship, the function that would be executed if the display area were touched.
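  • A minimal sketch of such a mapping relationship follows, assuming both areas are axis-aligned rectangles; the Rect and Point types are simplified stand-ins introduced for illustration:

```kotlin
// A minimal sketch of the mapping relationship, assuming both areas are
// axis-aligned rectangles; Rect and Point are simplified stand-in types.
data class Rect(val left: Float, val top: Float, val width: Float, val height: Float)
data class Point(val x: Float, val y: Float)

// A point touched in the one-handed touch area is mapped linearly to the
// corresponding point in the reduced display area, where the function that
// would be executed on a direct touch can then be applied.
fun mapTouchToDisplay(touch: Point, touchArea: Rect, displayArea: Rect): Point {
    val rx = (touch.x - touchArea.left) / touchArea.width   // 0..1 within the touch area
    val ry = (touch.y - touchArea.top) / touchArea.height
    return Point(
        displayArea.left + rx * displayArea.width,
        displayArea.top + ry * displayArea.height
    )
}
```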
  • the entire screen display area of the display screen is usually used to display content, and when the electronic device switches from normal mode to one-handed operation mode, a display area for the one-handed operation mode will appear on the display screen of the electronic device. Since the size of the display area for the one-handed operation mode is smaller than the size of the entire screen display area of the display screen, it is necessary to reduce the original entire screen display area of the display screen and the display content therein to ensure that all displays in the normal mode are included in the display area of the one-handed operation mode, and that the information that should be displayed in the normal mode is not missing.
  • the electronic device may also display the touch area of the one-hand operation mode to let the user know the position of the area that can be touched by one hand on the display screen, so that the user can accurately perform touch operations on the display area of the one-hand operation mode within the touch area of the one-hand operation mode.
  • the electronic device may also display the display area of the one-hand operation mode to let the user know the position of the user interface on the display screen during one-hand operation.
  • the electronic device can display the operation area of the one-hand operation mode in the form of a window, and the displayed window can be called a touch window.
  • the position and size of the touch window are consistent with the position and size of the operation area of the one-hand operation mode. That is, once the one-hand operation mode of the electronic device is triggered, the electronic device will display the touch window in the operation area of the one-hand operation mode of the display screen, and will display the reduced original display area and the display content therein in the display area of the one-hand operation mode of the display screen.
  • the electronic device may also emphasize the touch area of the one-handed operation mode by highlighting it or by displaying it in a preset color.
  • the electronic device may also emphasize the display area of the one-handed operation mode by highlighting it or by displaying it in a preset color. The embodiments of the present application are not limited to this.
  • the size and position of the touch area in the one-hand operation mode may be preset by the operating system of the electronic device, for example, the operating system presets at least one of the coordinates and size of the touch area, generates a corresponding configuration file of the touch area, and displays the touch area on the display screen of the electronic device when the electronic device calls the above configuration file.
  • the electronic device may display the touch area in at least one of the upper left corner, lower left corner, upper right corner, and lower right corner of the display screen according to the configuration file, and the size may be smaller than the display size of the display screen.
  • the user can also customize the size and position of the touch area in the one-handed operation mode.
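  • For illustration only, the configuration file described above could take a shape like the following sketch; the field names, corner values, and default size are assumptions, not part of this application:

```kotlin
// A hypothetical shape for the touch-area configuration file mentioned
// above; field names, corner values, and the default size are assumptions.
enum class Corner { TOP_LEFT, BOTTOM_LEFT, TOP_RIGHT, BOTTOM_RIGHT }

data class TouchAreaConfig(
    val corner: Corner,  // which corner of the display the touch area sits in
    val widthPx: Int,    // smaller than the display width
    val heightPx: Int    // smaller than the display height
)

// The operating system could preset a default and let the user override it:
val defaultTouchArea = TouchAreaConfig(Corner.BOTTOM_RIGHT, widthPx = 420, heightPx = 420)
```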
  • the embodiment of the present application provides an initialization setting for the touch window in the one-hand operation mode.
  • the user normally holds the phone in one hand.
  • the phone analyzes the user's one-handed holding posture to determine the area on the display screen where the touch window suitable for the user's one-handed operation mode is located based on the range of positions that the user's thumb can touch on the display screen. In this way, the phone can ensure that the touch area set on the phone is more in line with each user's one-handed operation habits, and the user can operate various positions in the touch area when holding the phone in one hand.
  • when the mobile phone starts the "one-handed operation mode" function for the first time, the mobile phone can perform the initialization setting of the touch area of the one-handed operation mode.
  • the mobile phone can analyze the user's one-handed holding posture to determine the area that the user can touch with one hand on the mobile phone display screen, so that the mobile phone can adaptively adjust the position and size of the touch area of the one-handed operation mode according to the area of the display screen that the user can touch with one hand.
  • when the user needs to readjust the position and size of the touch area later, the user can also operate the mobile phone again to perform the initialization setting of the touch area of the one-handed operation mode.
  • the mobile phone may display a prompt animation to remind the user that the mobile phone has currently entered or exited the one-handed operation mode.
  • the display screen of the mobile phone may display a prompt for instructing the user to draw an arc on the display screen.
  • FIG3 shows the arc prompt when the mobile phone is held in the right hand, and when the user is accustomed to holding the mobile phone with the left hand, the display screen of the mobile phone may also display the arc prompt shown in FIG4.
  • the mobile phone can automatically identify the user's one-handed holding posture, so as to display the arc prompt shown in FIG4 according to the identified left-hand holding posture; and display the arc prompt shown in FIG3 according to the identified right-hand holding posture.
  • the following takes the one-handed holding posture as the right-hand holding posture as an example to introduce the technical methods of the embodiments of the present application.
  • the mobile phone can determine the user's one-hand holding posture by detecting the temperature of the hand by a temperature sensor, detecting the holding pressure by a pressure sensor, etc.
  • the embodiment of the present application does not limit the way in which the mobile phone recognizes the one-hand holding posture.
  • when the user holds the mobile phone in the right-hand holding posture shown in FIG2 and draws an arc on the display screen with the thumb according to the instruction (as shown by curve 501 in FIG5), the mobile phone can receive the touch positions corresponding to the user's arc operation. Then, the mobile phone determines the maximum area 502 that the user can touch with one hand according to the touch positions.
  • the mobile phone can determine the position and size of the touch area 503 adapted to the one-handed operation mode of the user according to the maximum area 502 that can be touched by the user with one hand.
  • the embodiment of the present application does not limit the shape of the touch area 503, for example, it can be square, circular, elliptical, etc.
  • the mobile phone can determine the maximum value of the length and width of the touch area according to the maximum area 502 that can be touched by the user with one hand, so as to determine the position and size of the touch area.
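  • A minimal sketch of this derivation follows, assuming the arc is sampled as touch points and the maximum reachable area is taken as their axis-aligned bounding box:

```kotlin
// A minimal sketch of deriving the touch area 503 from the arc (curve 501)
// drawn by the thumb: the sampled touch positions bound the maximum
// reachable area 502, taken here as their axis-aligned bounding box.
data class Rect(val left: Float, val top: Float, val width: Float, val height: Float)
data class TouchPoint(val x: Float, val y: Float)

fun maxReachableArea(arcPoints: List<TouchPoint>): Rect {
    require(arcPoints.isNotEmpty()) { "the arc must contain touch positions" }
    val left = arcPoints.minOf { it.x }
    val top = arcPoints.minOf { it.y }
    val right = arcPoints.maxOf { it.x }
    val bottom = arcPoints.maxOf { it.y }
    // The maximum length and width reachable by the thumb bound the size
    // of the touch area that is placed inside this rectangle.
    return Rect(left, top, right - left, bottom - top)
}
```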
  • the mobile phone can also display the maximum area 502 that can be touched by the user with one hand, so that the user can customize the shape and size of the touch area.
  • the mobile phone may also obtain the user's hand information to calculate the length of the user's thumb based on the obtained hand information.
  • the mobile phone can then calculate the maximum area that the user can touch with one hand based on the length of the user's thumb and the user's one-handed holding posture.
  • the mobile phone can determine the position and size of the touch area based on the maximum area that the user can touch with one hand.
  • the mobile phone can remind the user to enter the hand information. After the mobile phone enters the user's hand information, the mobile phone can calculate the length of the user's thumb based on the obtained hand information.
  • after the mobile phone determines the position and size of the touch area of the one-handed operation mode, the mobile phone can reduce the interface content displayed in the entire screen display area of the display screen into the remaining area of the display screen.
  • the remaining area is the other area of the display screen except the touch area.
  • after the mobile phone determines the position and size of the touch area 601 of the one-handed operation mode based on the maximum area that the user can touch with one hand, the mobile phone can use the remaining area 602 of the display screen except the touch area 601 as the display area of the one-handed operation mode, and display there the interface content originally displayed in the entire screen display area. The user can then perform a full range of operations on the interface content displayed in the remaining area 602 by operating in the touch area 601.
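  • The reduction into the remaining area could look like the following sketch, which computes a uniform scale factor and centers the reduced interface; the Rect type and the centering choice are assumptions for illustration:

```kotlin
// A minimal sketch of shrinking the full-screen interface into the
// remaining area 602: a uniform scale factor is computed and the reduced
// interface is centered. The Rect type and centering are assumptions.
data class Rect(val left: Float, val top: Float, val width: Float, val height: Float)

fun fitIntoRemainingArea(screen: Rect, remaining: Rect): Pair<Float, Rect> {
    val scale = minOf(remaining.width / screen.width, remaining.height / screen.height)
    val w = screen.width * scale
    val h = screen.height * scale
    val left = remaining.left + (remaining.width - w) / 2  // center horizontally
    val top = remaining.top + (remaining.height - h) / 2   // center vertically
    return scale to Rect(left, top, w, h)
}
```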
  • FIG. 7 shows a schematic diagram of an interface when a mobile phone is in normal mode and displays a desktop interface through a display screen.
  • the normal mode generally refers to a normal display mode in which the user interface is designed according to the size of the display screen of the mobile phone.
  • the user interface is generally displayed on the entire screen.
  • the desktop interface 701 displayed by the mobile phone through the display screen fills the entire screen display area of the display screen.
  • when the mobile phone is displaying the desktop interface 701 normally, if the mobile phone detects a trigger operation of the one-handed operation mode, as shown in (b) of FIG. 7, the mobile phone can respond to the trigger operation by displaying a touch area 702 of the one-handed operation mode on the display screen, and by reducing the desktop interface 701 originally displayed in the entire screen display area and displaying it in the remaining area 703 of the display screen.
  • the remaining area 703 is the other area of the display screen except the touch area 702.
  • the proportionally reduced desktop interface 704 is displayed in the remaining area 703 of the display screen. Therefore, the user can perform full-range operations on the proportionally reduced desktop interface 704 displayed in the remaining area 703 by operating in the touch area 702.
  • a function switch control for the one-handed operation mode can be added to the settings page of the mobile phone.
  • the triggering operation of the one-handed operation mode can be a user clicking the function switch control of the one-handed operation mode.
  • the triggering operation of the one-handed operation mode can also be a user pressing a specific physical button or button combination on the mobile phone, or a user's voice instruction, a user's quick gesture operation on the screen of the mobile phone (such as a gesture of drawing a circle trajectory or a long press gesture on the lower right corner of the screen), or a user's gesture operation in the air.
  • the embodiment of the present application does not limit the triggering operation of the one-handed operation mode.
  • the mobile phone can enter the initialization setting mode of the touch area of the one-handed operation mode to determine the position and size of the touch area of the one-handed operation mode.
  • the mobile phone then records the position and size of the touch area of the one-handed operation mode, so that when the mobile phone detects the triggering operation of the one-handed operation mode again, the touch area of the one-handed operation mode can be directly displayed on the display screen.
  • the mobile phone can determine the position and size of the remaining area in the display screen according to the position and size of the touch area in the one-hand operation mode.
  • the entire remaining area can be used as the display area in the one-hand operation mode.
  • FIG. 8 shows a schematic diagram of an interface when a mobile phone is in normal mode and displays a video application through a display screen.
  • the application interface 801 of the video application displayed by the mobile phone through the display screen fills the entire screen display area of the display screen.
  • when the mobile phone normally displays the application interface 801 of the video application, if the mobile phone detects a trigger operation of the one-handed operation mode, as shown in (b) of FIG8, the mobile phone can respond to the trigger operation, display the touch area 802 of the one-handed operation mode on the display screen, and reduce the application interface 801 of the video application originally displayed in the entire screen display area of the display screen into the remaining area 803 of the display screen.
  • the remaining area 803 is the other area of the display screen except the touch area 802.
  • the proportionally reduced application interface 804 of the video application is displayed in the remaining area 803 of the display screen. Therefore, the user can perform a full range of operations on the proportionally reduced application interface 804 displayed in the remaining area 803 by operating in the touch area 802.
  • the mobile phone may also shrink the user interface originally displayed on the entire screen display area of the display screen including application windows of multiple applications and display it within the display area of the one-handed operation mode.
  • FIG9 shows a schematic diagram of an interface for displaying split-screen windows through a display screen when a mobile phone is in normal mode.
  • the user interface 901 displayed by the mobile phone through the display screen includes upper and lower split-screen windows of a video playback application and a music playback application, and the application window of the video playback application and the application window of the music playback application each occupy half of the mobile phone display screen, and the two application windows do not overlap.
  • when the mobile phone normally displays the split-screen windows of the video playback application and the music playback application, if the mobile phone detects a trigger operation of the one-handed operation mode, as shown in (b) of FIG. 9, the mobile phone can respond to the trigger operation, display the touch area 902 of the one-handed operation mode on the display screen, and reduce the user interface 901 originally displayed in the entire screen display area of the display screen into the remaining area 903 of the display screen.
  • the remaining area 903 is the other area of the display screen except the touch area 902.
  • a proportionally reduced user interface 904 is displayed in the remaining area 903 of the display screen, and the split-screen windows of the video playback application and the music playback application in the user interface 904 are also proportionally reduced.
  • the user can operate in the touch area 902 to perform a full range of operations on the proportionally reduced user interface 904 including the split-screen windows of the video playback application and the music playback application displayed in the remaining area 903.
  • the scenario in which the user interface includes application windows of multiple applications is not limited to split screens; other scenarios such as floating windows, mini floating windows, multi-tasking windows, etc. are also applicable. The embodiments of the present application do not limit this.
  • the mobile phone can also rearrange and combine the interface content originally displayed in the entire screen display area of the mobile phone display according to the size of the remaining area, and display the rearranged and combined interface content in the remaining area. It can be understood that the rearranged and combined interface content can cover the entire remaining area for display.
  • the desktop interface 1001 displayed by the mobile phone through the display screen covers the entire screen display area of the display screen.
  • the desktop interface 1001 includes application icons of multiple applications.
  • when the mobile phone is displaying the desktop interface 1001 normally, if the mobile phone detects a trigger operation of the one-handed operation mode, as shown in (b) of FIG. 10, the mobile phone can respond to the trigger operation, display the touch area 1002 of the one-handed operation mode on the display screen, and rearrange and combine the application icons of multiple applications in the desktop interface 1001 originally displayed in the entire screen display area, so as to display the rearranged and combined application icons in the remaining area 1003 of the display screen.
  • the remaining area 1003 is the other area of the display screen except the touch area 1002.
  • as shown in (b) of FIG. 10, a new desktop interface 1004 with the rearranged and combined application icons is displayed in the remaining area 1003 of the display screen; the content arrangement of the new desktop interface 1004 is different from that of the original desktop interface 1001, and the desktop interface 1004 covers the entire remaining area of the display screen.
  • the user can perform full-range operations on the new desktop interface 1004 displayed in the remaining area 1003 by operating in the touch area 1002.
  • the mobile phone can determine, according to the position and size of the remaining area, the largest area in the remaining area that meets the screen aspect ratio of the display screen, and use it as the target display area for displaying the user interface originally displayed in the entire screen display area. In this way, the mobile phone can proportionally reduce that user interface into the target display area, and the reduced user interface can cover the target display area.
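  • A sketch of choosing such a target display area follows: the largest rectangle in the remaining area that keeps the display's aspect ratio. Anchoring the rectangle to the right edge (so the leftover strip sits on the left, like area 1104 discussed below) is an assumption for illustration:

```kotlin
// A sketch of choosing the target display area: the largest rectangle in
// the remaining area that keeps the display's aspect ratio. Anchoring to
// the right edge (so the leftover strip sits on the left, like area 1104)
// is an assumption for illustration.
data class Rect(val left: Float, val top: Float, val width: Float, val height: Float)

fun targetDisplayArea(remaining: Rect, screenAspect: Float /* width / height */): Rect {
    val wAtFullHeight = remaining.height * screenAspect
    return if (wAtFullHeight <= remaining.width) {
        Rect(remaining.left + remaining.width - wAtFullHeight, remaining.top,
             wAtFullHeight, remaining.height)
    } else {
        Rect(remaining.left, remaining.top,
             remaining.width, remaining.width / screenAspect)
    }
}
```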
  • the mobile phone can also rearrange the application icons of multiple applications displayed on the desktop interface and display them in the other area.
  • the user can also operate the multiple application icons displayed in other areas through the touch area.
  • it not only avoids the phenomenon that the display screen has black areas due to the existence of areas that do not display any content, but also realizes the full use of the entire display area of the mobile phone display screen, avoids the waste of large screen area, and the user can also quickly open other applications while browsing the original user interface.
  • after the mobile phone determines the position and size of the touch area 1101 of the one-handed operation mode, the mobile phone can use the remaining area 1102 of the display screen except the touch area 1101 as the display area of the one-handed operation mode. The mobile phone can then determine, according to the position and size of the remaining area 1102, the largest area in the remaining area 1102 that meets the screen aspect ratio of the display screen as the target display area 1103 for the user interface originally displayed in the entire screen display area. At this time, as shown in FIG. 11, in addition to the target display area 1103, the remaining area 1102 contains other areas 1104 that do not display content. Therefore, in an embodiment of the present application, the mobile phone can rearrange the application icons of multiple applications displayed on the desktop interface and display them in the other areas 1104, so that the entire display area of the mobile phone display screen is fully used.
  • FIG12 shows a schematic diagram of an interface when a mobile phone is in a one-handed operation mode and displays a video application through a display screen.
  • the mobile phone can display the touch area 1201 of the one-handed operation mode on the display screen, and display the application interface 1202 of the video application after proportional reduction in the remaining area of the display screen, and the remaining area is the other area of the display screen except the touch area 1201. Since the application interface 1202 of the video application does not cover the entire remaining area of the display screen, the remaining area also has a left area 1203 where no content is displayed. Therefore, as shown in (a) in FIG12, after the mobile phone enters the one-handed operation mode, the mobile phone can also display multiple application icons in the left area 1203 of the display screen at the same time.
  • the mobile phone can also display more application icons in the form of a switch list in the left area of the display screen.
  • the user can slide the switch list up and down by touching the touch area to display the hidden application icons in the left area of the display screen.
  • when the remaining area includes the target display area and other areas, the mobile phone can determine, according to an area switching instruction, which of the target display area and the other areas the touch area currently controls.
  • the area switching instruction can be a preset gesture operation of the user on the touch area, such as a horizontal sliding operation at the bottom of the touch area.
  • the embodiments of the present application do not limit the preset gesture operations.
  • upon detecting the area switching instruction, the mobile phone can switch the control scope of the touch area from the target display area to the other areas; at this time, the mobile phone can perform, according to the user's touch operations on the touch area, the functions corresponding to those touch operations on the content displayed in the other areas.
  • upon detecting the area switching instruction again, the mobile phone can switch the control scope of the touch area back to the target display area, and perform, according to the user's touch operations on the touch area, the functions corresponding to those touch operations on the proportionally reduced user interface displayed in the target display area.
  • the mobile phone can display, in the touch area, a control for switching control between the target display area and the other areas.
  • the area switching instruction can also be a user's click operation on the control.
  • the mobile phone can display a control 1204 in the touch area 1201.
  • when the user clicks the control 1204, the control scope of the touch area can be switched from the application interface 1202 of the video application to the multiple application icons displayed in the left area 1203.
  • the mobile phone can perform a function corresponding to the touch operation on the multiple application icons displayed in the left area 1203 according to the touch operation of the user acting on the touch area 1201.
  • when the user clicks the control 1204 again, the control scope of the touch area can be switched from the multiple application icons displayed in the left area 1203 back to the application interface 1202 of the video application.
  • the mobile phone can perform a function corresponding to the touch operation on the application interface 1202 of the video application according to the touch operation of the user acting on the touch area 1201.
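  • The switching behavior around control 1204 could be modeled as a simple router, as in this sketch; the type and method names are hypothetical:

```kotlin
// A sketch of the area-switching behavior around control 1204: the touch
// area controls either the reduced interface in the target display area or
// the icon strip in the other area. All names here are hypothetical.
enum class ControlledRegion { TARGET_DISPLAY_AREA, OTHER_AREA }

class TouchAreaRouter {
    var controlled = ControlledRegion.TARGET_DISPLAY_AREA
        private set

    // Clicking the switching control toggles which region receives input.
    fun onSwitchControlClicked() {
        controlled = if (controlled == ControlledRegion.TARGET_DISPLAY_AREA)
            ControlledRegion.OTHER_AREA
        else
            ControlledRegion.TARGET_DISPLAY_AREA
    }

    fun dispatch(touch: String): String = when (controlled) {
        ControlledRegion.TARGET_DISPLAY_AREA -> "apply '$touch' to the reduced interface"
        ControlledRegion.OTHER_AREA -> "apply '$touch' to the icons in the other area"
    }
}
```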
  • the mobile phone can also display application icons of applications frequently used by the user in the other areas, or the user can customize application icons of multiple applications displayed in the other areas.
  • the present application embodiment is not limited thereto.
  • the remaining area may also include the bottom area of the display screen. Since the bottom area also does not display content, in order to avoid wasting the screen, the mobile phone may also display application icons of multiple applications in the bottom area. In this way, when the user operates the original user interface displayed in the target display area through the touch area, the user may also operate the multiple application icons displayed in the bottom area through the touch area. In this way, the phenomenon of black areas appearing on the display screen due to the existence of areas where no content is displayed is avoided, and the entire display area of the mobile phone display is fully utilized to avoid wasting a large screen area, and the user may also quickly open other applications while browsing the original user interface.
  • FIG. 13 shows a schematic diagram of an interface when a mobile phone is in a one-handed operation mode and displays a video application through a display screen.
  • the mobile phone can display the touch area 1301 of the one-handed operation mode on the display screen, and display the application interface 1302 of the video application after proportional reduction in the remaining area of the display screen, and the remaining area is the other area of the display screen except the touch area 1301. Since the application interface 1302 of the video application does not cover the entire remaining area of the display screen, the remaining area also has a left area 1303 and a bottom area 1304 where no content is displayed. Therefore, as shown in FIG. 13, after the mobile phone enters the one-handed operation mode, the mobile phone can also display multiple application icons in the left area 1303 and the bottom area 1304 of the display screen at the same time.
  • the user may also directly perform touch operations on the multiple application icons displayed in the bottom area, instead of operating them through the touch area.
  • the mobile phone can also display more application icons in the form of a switch list in the bottom area of the display screen.
  • the user can slide the switch list left and right by touching the touch area to display the hidden application icons in the bottom area of the display screen.
  • the bottom area or left area of the remaining area where no content is displayed may not display application icons, but display multiple shortcut functions, such as screenshot, share, scan, quick payment, health code, etc.
  • the embodiment of the present application is not limited to this.
  • when the mobile phone detects a touch operation performed by the user on the touch area, it can respond to the touch operation and perform the function corresponding to the touch operation on the reduced original entire screen display area, and on the user interface therein, displayed in the remaining area.
  • the touch operation can be a common touch operation such as sliding up, sliding down, sliding left, sliding right, clicking, double-clicking, or long pressing, or a sliding gesture such as drawing a circle, drawing a check mark "√", or drawing a cross "×".
  • the embodiment of the present application does not limit this.
  • FIG14 shows a schematic diagram of an interface when a mobile phone is in a one-handed operation mode and displays a desktop interface through a display screen.
  • when the mobile phone displays a touch area 1401 of the one-handed operation mode on the display screen and displays the reduced original entire screen display area and the desktop interface 1402 originally displayed in the entire screen display area in the remaining area other than the touch area 1401, the mobile phone may detect a left swipe operation performed by the user on the touch area.
  • the mobile phone responds to the left swipe operation by mapping it to a left swipe operation on the reduced original entire screen display area and the desktop interface 1402 displayed therein in the remaining area, so that the mobile phone can perform the operation corresponding to the left swipe operation on them.
  • the mobile phone can perform a desktop interface switching function corresponding to the left swipe operation on the reduced original entire screen display area and the desktop interface 1402 displayed therein, so that the reduced original entire screen display area displays a new desktop interface 1403 (next page desktop interface).
  • FIG15 shows a schematic diagram of an interface when a mobile phone is in a one-handed operation mode and displays a video application through a display screen.
  • when the mobile phone displays a touch area 1501 of the one-handed operation mode on the display screen and displays the reduced original entire screen display area and the application interface 1502 of the video application originally displayed in the entire screen display area in the remaining area other than the touch area 1501, the mobile phone may detect a left swipe operation performed by the user on the touch area.
  • the mobile phone responds to the left swipe operation by mapping it to a left swipe operation on the reduced original entire screen display area and the application interface 1502 of the video application displayed therein, so that the mobile phone can perform the function corresponding to the left swipe operation on them.
  • the mobile phone can execute the video application exit function corresponding to the left swipe operation on the reduced original entire screen display area and the application interface 1502 of the video application displayed therein, so that the reduced original entire screen display area displays the desktop main interface 1503 to which the video application returns after exiting.
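  • A minimal sketch of classifying a swipe performed in the touch area before it is forwarded to the reduced interface follows; the minimum swipe distance is an assumption:

```kotlin
// A minimal sketch of classifying a swipe performed in the touch area
// before it is mapped onto the reduced interface; the minimum swipe
// distance is an assumption.
import kotlin.math.abs

data class Point(val x: Float, val y: Float)

enum class Swipe { LEFT, RIGHT, UP, DOWN, NONE }

fun classifySwipe(down: Point, up: Point, minDist: Float = 50f): Swipe {
    val dx = up.x - down.x
    val dy = up.y - down.y
    return when {
        maxOf(abs(dx), abs(dy)) < minDist -> Swipe.NONE
        abs(dx) >= abs(dy) -> if (dx < 0) Swipe.LEFT else Swipe.RIGHT
        else -> if (dy < 0) Swipe.UP else Swipe.DOWN
    }
}
// A LEFT result in the touch area is then forwarded as a left swipe on the
// reduced interface, e.g. switching the desktop page or exiting the video
// application as in FIG14 and FIG15.
```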
  • upon detecting a confirmation instruction for the user interface, the mobile phone may determine that the user needs to further manipulate specific content in the user interface. At this time, the mobile phone may execute, based on the touch operation performed by the user on the touch area, a function corresponding to the touch operation on certain content or part of the content in the user interface.
  • the mobile phone can use the touch area to display a switching control, and in this case, the confirmation instruction for the user interface can be a click operation of the user on the switching control.
  • the user can select the currently displayed user interface by clicking the switching control, and further manipulate the specific content in the user interface.
  • the user can also click the switching control again to cancel the selection of the currently displayed user interface, so as to cancel further manipulation of the specific content in the user interface.
  • the confirmation instruction for the user interface may also be a preset gesture operation of the user on the touch area, and the preset gesture operation may be a double-click gesture, drawing a check mark "√", etc., which is not limited in the embodiments of the present application.
  • taking the double-click gesture as an example, after the mobile phone uses the remaining area to display the proportionally reduced user interface, the user can select the currently displayed user interface and further manipulate the specific content in the user interface by performing a double-click gesture operation on the touch area.
  • the user can also perform a double-click gesture operation on the touch area again to deselect the currently displayed user interface to cancel further manipulation of the specific content in the user interface.
  • in order to ensure that the user can accurately locate and manipulate certain content displayed in the remaining area through touch operations in the touch area, the mobile phone can also determine the effective position of the touch area in the remaining area.
  • the mobile phone can display a cursor in the remaining area, and the position of the cursor can be used to prompt the user of the location of the content currently located by the mobile phone.
  • the user can control the movement of the cursor in the remaining area by performing touch operations (such as swiping up, down, left, and right) in the touch area to control the cursor in the remaining area to move to the position of a certain content that the user wants to locate and control, so that the user can further achieve precise control of the content through touch operations in the touch area.
  • FIG. 16 shows a schematic diagram of an interface when a mobile phone is in a one-handed operation mode and displays a desktop interface through a display screen.
  • when the mobile phone displays the touch area 1601 of the one-handed operation mode on the display screen and displays the reduced original entire screen display area and the desktop interface 1602 originally displayed therein in the remaining area other than the touch area 1601, if the user performs a double-click gesture operation on the touch area 1601, the mobile phone may determine that the user needs to further manipulate specific content in the desktop interface 1602.
  • in response to the double-click gesture operation, the mobile phone may display a cursor 1603 on the desktop interface 1602, which can be used to indicate the content currently selected by the mobile phone.
  • the mobile phone can use the interface content corresponding to the upper left corner of the original entire screen display area as the initial stop position of the cursor.
  • the initial stop position of the cursor 1603 can be the position where the first application icon in the upper left corner of the desktop interface 1602 is located.
  • when the mobile phone detects that the user has swiped right on the touch area, the mobile phone responds to the right swipe on the touch area and controls the cursor 1603 to move right in the remaining area to select the next content.
  • the cursor 1603 can be moved to the position of the second application icon in the upper left corner of the desktop interface 1602.
  • the user can control the movement of the cursor 1603 in the remaining area by swiping up, down, left, and right on the touch area 1601, so as to move the cursor 1603 to the position of the application icon that the user wants to locate and manipulate; the user can then perform a click operation in the touch area to trigger the mobile phone to open the application corresponding to the application icon currently selected by the cursor 1603.
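  • A minimal sketch of this cursor movement follows, assuming (as an illustration) that the desktop contents form a simple grid of icons:

```kotlin
// A minimal sketch of the cursor movement: each directional swipe in the
// touch area moves the cursor by one item. Modeling the desktop as a
// simple grid of icons is an assumption.
data class Cursor(var col: Int = 0, var row: Int = 0)  // starts at the upper left

fun moveCursor(cursor: Cursor, swipe: String, cols: Int, rows: Int) {
    when (swipe) {
        "right" -> cursor.col = minOf(cursor.col + 1, cols - 1)
        "left"  -> cursor.col = maxOf(cursor.col - 1, 0)
        "down"  -> cursor.row = minOf(cursor.row + 1, rows - 1)
        "up"    -> cursor.row = maxOf(cursor.row - 1, 0)
    }
}

fun main() {
    val cursor = Cursor()
    moveCursor(cursor, "right", cols = 4, rows = 6)
    println(cursor) // Cursor(col=1, row=0): the second icon in the top row
}
```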
  • the display screen originally displays a large amount of content; after the reduction, the size of the displayed content is reduced accordingly and the content becomes denser, which is inconvenient for users when finding content and increases the difficulty of accurately moving the cursor in the remaining area to a specific position through touch operations in the touch area.
  • the mobile phone may display a content selection box in the remaining area, and the area range of the content selection box may be the effective range of the touch area in the remaining area.
  • the area range framed by the content selection box may include multiple contents displayed in the remaining area, so that the user can control the movement of the content selection box in the remaining area by performing touch operations in the touch area (such as swiping up, swiping down, swiping left, and swiping right), so as to control the content selection box in the remaining area to move to the position of the area where a certain content that the user wants to locate and control is located, so that the user can further accurately locate the specific position of the content from the content selection box and realize precise control of the content through touch operations in the touch area.
  • when the mobile phone displays a touch area 1701 of the one-handed operation mode on the display screen, and displays the reduced original entire screen display area and the desktop interface 1702 originally displayed in the entire screen display area in the remaining area other than the touch area 1701, the mobile phone can also display a content selection box 1703 on the desktop interface 1702; the content selection box can be used to indicate to the user the area currently selected by the mobile phone, and the selected area can contain multiple interface contents.
  • the mobile phone can use the upper left corner position of the original entire screen display area as the initial stop position of the content selection box.
  • the initial stop position of the content selection box 1703 can be the upper left corner of the desktop interface 1702, and the area framed by the content selection box 1703 includes multiple application icons.
  • when the mobile phone detects that the user has swiped right on the touch area, the mobile phone responds to the right swipe on the touch area and controls the content selection box 1703 to move to the right in the remaining area to select the next area.
  • the content selection box 1703 can be moved to the upper right corner of the desktop interface 1702.
  • the user can control the movement of the content selection box 1703 in the remaining area by swiping up, down, left, and right on the touch area 1701, so as to control the content selection box 1703 in the remaining area to move to a certain area that the user wants to locate and control.
  • the user can further accurately locate the specific position of a certain content in the content selection box 1703 and realize precise control of the content through touch operations on the touch area 1701.
  • the area range framed by the content selection box can be the effective range of the touch area in the remaining area.
  • the mobile phone can establish a mapping relationship between the touch area and the area selected by the current content selection box, so that when the mobile phone detects a touch operation (such as a single click operation, a double click operation, a long press operation, etc.) performed by the user at a certain position in the touch area, the mobile phone can map the touch operation at a certain position in the touch area to the touch operation at a corresponding position in the content selection box according to the mapping relationship, so that the mobile phone can execute the function corresponding to the touch operation at the corresponding position in the content selection box.
  • the mobile phone can establish a mapping relationship between the touch area 1701 and the selected area of the current content selection box 1703.
  • the touch operations in the four touch sub-areas of the upper left, upper right, lower left, and lower right in the touch area can be mapped one-to-one to the touch operations on the four application icons in the current content selection box 1703.
  • the user can manipulate the four application icons in the current content selection box 1703 by performing touch operations in the four touch sub-areas of the upper left, upper right, lower left, and lower right in the touch area.
  • when the mobile phone detects that the user performs a single-click operation on the lower left corner of the touch area, the mobile phone can map the single-click operation on the lower left corner of the touch area to a single-click operation on the application icon 1704 of the music application in the content selection box 1703 according to the mapping relationship, so that the mobile phone can respond to the user's single-click operation on the lower left corner of the touch area and open the music application.
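  • The quadrant mapping described above might look like this sketch, assuming the area framed by the content selection box 1703 contains a 2x2 grid of application icons:

```kotlin
// A sketch of the one-to-one quadrant mapping: the four touch sub-areas of
// the touch area map to the four application icons framed by the content
// selection box. A 2x2 icon grid inside the box is assumed.
data class Rect(val left: Float, val top: Float, val width: Float, val height: Float)
data class Point(val x: Float, val y: Float)

fun quadrantIndex(touch: Point, touchArea: Rect): Int {
    val col = if (touch.x < touchArea.left + touchArea.width / 2) 0 else 1
    val row = if (touch.y < touchArea.top + touchArea.height / 2) 0 else 1
    return row * 2 + col  // 0 upper-left, 1 upper-right, 2 lower-left, 3 lower-right
}

fun onTouchAreaClick(touch: Point, touchArea: Rect, iconsInBox: List<String>) {
    // e.g. a click in the lower-left sub-area opens the music application.
    println("open application: ${iconsInBox[quadrantIndex(touch, touchArea)]}")
}
```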
  • the mobile phone can establish a mapping relationship between the touch area and the area currently selected by the content selection box in real time according to the area range framed by the content selection box, or it can establish a mapping relationship between the touch area and the area currently selected by the content selection box only when a confirmation operation of the user on the content selection box is detected.
  • the confirmation operation on the content selection box can be a double-click operation of the user on the touch area, a specific gesture operation (such as drawing a check mark), etc., which is not limited in the embodiments of the present application.
  • taking a double-click operation as an example, when the user manipulates the content selection box 1703 to move to the upper right corner of the desktop interface 1702 shown in (b) of FIG. 17, if the user determines that a specific operation needs to be performed on certain content in the current content selection box 1703, the user can perform a double-click operation in the touch area.
  • when the mobile phone detects the user's double-click operation on the touch area, it can determine that the user's confirmation operation on the content selection box has been detected. At this time, the mobile phone can respond to the double-click operation, enter the area range framed by the content selection box 1703, and establish a mapping relationship between the touch area and the area range framed by the current content selection box.
  • the mobile phone can also delete the mapping relationship between the touch area and the current content selection box selected area when detecting the user's exit operation on the content selection box.
  • the exit operation on the content selection box can be a double-click operation of the user on the touch area, a specific gesture operation (such as drawing a cross "×"), etc., which is not limited in the embodiment of the present application.
  • when the user performs a double-click operation in the touch area to control the mobile phone to enter the area range framed by the content selection box and to establish the mapping relationship between the touch area and that area range, the user can perform a double-click operation in the touch area again to control the mobile phone to exit the area range framed by the content selection box and delete the mapping relationship.
  • the mobile phone returns to the original operation logic, that is, the user can continue to control the movement of the content selection box by sliding in the touch area.
  • the mobile phone can adaptively display content selection boxes of different sizes according to the different contents displayed in the remaining area.
  • the mobile phone can generate content selection boxes of different sizes according to the application interfaces of different applications displayed in the original entire screen display area.
  • when the mobile phone displays, in the remaining area, the reduced original entire screen display area and the application interface 1801 of the short video application displayed in the original entire screen display area, the mobile phone can also display a content selection box 1802 as shown in FIG18 on the application interface 1801 of the short video application.
  • the content selection box can be used to indicate the area currently selected by the mobile phone, and the selected area can contain multiple controls such as the follow control, the collection control, the comment control, and the sharing control. Since these controls in the application interface 1801 of the short video application are arranged vertically, the mobile phone can adaptively adjust the size of the content selection box to match the vertically distributed interface content.
  • the mobile phone can obtain all the operation control views displayed in the remaining area, and then determine whether these views are in a horizontal or vertical position.
  • the mobile phone can select some views and reassemble them into a combined view by calculating the size and position of each view.
  • the range size of the combined view is the size of the content selection box.
  • the mobile phone can adaptively determine the size of the content selection box according to the horizontal arrangement or vertical arrangement of the views displayed in the remaining area.
  • when the mobile phone displays, in the remaining area, the reduced original entire screen display area and the application interface 1901 of the search application displayed in the original entire screen display area, the mobile phone can obtain all the views displayed in the application interface 1901 of the search application, determine which of these views are arranged horizontally and which are arranged vertically, and then calculate the size and position of each view and adaptively select suitable views to reassemble into a combined view, which determines the size of the content selection box.
  • as shown in (a) of FIG. 19, the mobile phone can reassemble a plurality of views arranged horizontally in a row in the application interface 1901 of the search application into a combined view 1902, and the range size of the combined view 1902 is the size of the content selection box.
  • the mobile phone can also reassemble a plurality of views arranged vertically in a column in the application interface 1901 of the search application into a combined view 1903, and the range size of the combined view 1903 is the size of the content selection box.
  • the mobile phone may also recombine the multiple views arranged vertically and the multiple views arranged horizontally in the application interface of the search application into a combined view 1904 , where the size of the combined view 1904 is the size of the content selection box.
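The row/column regrouping described above can be sketched in plain Java. This is only an illustrative sketch of the idea, not the patented implementation: the `Box` type, the pixel tolerance, and the group-by-aligned-centers heuristic are assumptions standing in for the platform's real view and layout types.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class CombinedViewBuilder {

    /** Minimal stand-in for a view's on-screen bounds (screen coordinates, origin top-left). */
    static class Box {
        final String name;
        final int left, top, right, bottom;
        Box(String name, int left, int top, int right, int bottom) {
            this.name = name; this.left = left; this.top = top;
            this.right = right; this.bottom = bottom;
        }
        int centerY() { return (top + bottom) / 2; }
    }

    /** Group views whose vertical centers align into horizontal rows ("combined views"). */
    static List<List<Box>> groupIntoRows(List<Box> views, int tolerancePx) {
        List<Box> sorted = new ArrayList<>(views);
        sorted.sort(Comparator.comparingInt(Box::centerY));
        List<List<Box>> rows = new ArrayList<>();
        for (Box v : sorted) {
            List<Box> lastRow = rows.isEmpty() ? null : rows.get(rows.size() - 1);
            if (lastRow != null && Math.abs(lastRow.get(0).centerY() - v.centerY()) <= tolerancePx) {
                lastRow.add(v);                 // same row: horizontally arranged views
            } else {
                List<Box> row = new ArrayList<>();
                row.add(v);                     // start a new row
                rows.add(row);
            }
        }
        return rows;
    }

    /** The bounding box of one group is the size of the content selection box. */
    static int[] boundsOf(List<Box> group) {
        int l = Integer.MAX_VALUE, t = Integer.MAX_VALUE;
        int r = Integer.MIN_VALUE, b = Integer.MIN_VALUE;
        for (Box v : group) {
            l = Math.min(l, v.left);  t = Math.min(t, v.top);
            r = Math.max(r, v.right); b = Math.max(b, v.bottom);
        }
        return new int[] {l, t, r, b};          // {left, top, right, bottom}
    }
}
```

Grouping by aligned vertical centers yields the horizontally arranged rows (such as combined view 1902); grouping by aligned horizontal centers instead would yield the vertically arranged columns (such as combined view 1903).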
  • the mobile phone may highlight the area framed by the content selection box to prompt the user of the area currently selected by the content selection box.
  • the mobile phone can also map the combined view in the content selection box to the touch area and set it to be visible, so that the combined view framed by the content selection box is displayed in the touch area; the user can then accurately control the corresponding view in the content selection box displayed in the remaining area by operating the corresponding view displayed in the touch area.
  • the mobile phone displays the combined view framed by the content selection box 2002, that is, the icon controls of the four application icons, in the touch area 2001.
  • since the views in the reduced user interface are relatively small, mapping them to a relatively small touch area is sometimes inconvenient for the user to operate, and when the layout of the views in the reduced user interface does not match the touch area, a direct mapping is unsuitable. Therefore, when mapping the combined view in the content selection box to the touch area, the mobile phone can also resize and reposition each view according to the size of the touch area, so that the adjusted combined view better matches the touch area and is more convenient for the user to operate in the touch area with one hand.
  • when the mobile phone maps the combined view in the content selection box 2101 displayed in the remaining area to the touch area 2102, the mobile phone can readjust the size and position of each view in the content selection box 2101 so that the adjusted combined view better matches the touch area 2102. The mobile phone can then display the adjusted combined view, as shown in FIG. 21, in the touch area 2102.
  • the size of the view in the adjusted combined view is relatively large, which is more convenient for user operation.
  • when the mobile phone maps the combined view in the content selection box 2201 displayed in the remaining area to the touch area 2202, since the focus control, collection control, comment control, and sharing control in the combined view are arranged vertically, the mobile phone can determine that the combined view in the content selection box 2201 does not match the touch area: if it were mapped directly, each control would be relatively small in the touch area, which is inconvenient for the user to operate. Therefore, the mobile phone can readjust the size and position of each of the focus, collection, comment, and sharing controls in the combined view, so that the adjusted combined view better matches the touch area 2202.
  • the mobile phone can display the adjusted combined view, as shown in FIG. 22, in the touch area 2202, in which the focus control, collection control, comment control, and sharing control are arranged horizontally and displayed at a relatively large size, which is more convenient for the user to operate in the touch area with one hand.
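How a vertical column of controls might be re-laid out horizontally to fit the touch area, as in the focus/collection/comment/share example above, can be sketched as follows. The method name, the gap parameter, and the square-cell layout are illustrative assumptions, not the actual adjustment algorithm.

```java
public class TouchAreaRelayout {

    /**
     * Spread vertically stacked controls evenly across one horizontal row sized
     * for the touch area. Returns {left, top, right, bottom} for each control,
     * in touch-area coordinates.
     */
    static int[][] relayoutHorizontally(int count, int areaWidth, int areaHeight, int gapPx) {
        int cellWidth = (areaWidth - gapPx * (count + 1)) / count;
        int side = Math.min(cellWidth, areaHeight - 2 * gapPx); // keep controls roughly square
        int top = (areaHeight - side) / 2;                      // vertically centered
        int[][] bounds = new int[count][];
        for (int i = 0; i < count; i++) {
            int left = gapPx + i * (cellWidth + gapPx) + (cellWidth - side) / 2;
            bounds[i] = new int[] {left, top, left + side, top + side};
        }
        return bounds;
    }

    public static void main(String[] args) {
        // Four controls (focus, collection, comment, share) in a 600x300 touch area.
        for (int[] b : relayoutHorizontally(4, 600, 300, 16)) {
            System.out.printf("(%d,%d)-(%d,%d)%n", b[0], b[1], b[2], b[3]);
        }
    }
}
```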
  • the mobile phone can also directly map the title bar, bottom sidebar, top sidebar and other fixed-position operation controls in the user interface displayed in the remaining area that do not move with page browsing to the touch area, and these controls do not need to be framed by the content selection box.
  • the user can directly operate these controls in the touch area.
  • the mapping positions of these controls in the touch area can correspond to the positions of these controls in the user interface.
  • the top sidebar is usually displayed at the top of the user interface, and the mobile phone can also map the top sidebar to the top of the touch area.
  • the bottom sidebar is usually displayed at the bottom of the user interface, and the mobile phone can also map the bottom sidebar to the bottom of the touch area.
  • the mobile phone can map the top sidebar 2301 and the bottom sidebar 2302 to the touch area, so that the mobile phone can display the top sidebar 2303 and the bottom sidebar 2304 shown in (a) of FIG. 23 in the touch area.
  • the user can directly perform touch operations on the top sidebar 2303 and the bottom sidebar 2304 in the touch area with one hand.
  • the mobile phone can also map the operation controls in the user interface that move with page browsing and are not fixed in position displayed in the remaining area to the touch window through the content selection box. That is, the mobile phone can map the combined view framed by the content selection box to the touch area. It can be understood that the title bar, bottom sidebar, top sidebar and other operation controls that do not move with page browsing and are fixed in position do not overlap with the combined view framed by the content selection box when displayed on the touch area.
  • the mobile phone can map the top sidebar 2301, the bottom sidebar 2302, and other fixed-position operation controls in the application interface of the shopping application displayed in the remaining area to the touch area, so that the user can directly touch the top sidebar 2303 and bottom sidebar 2304 in the touch area to implement touch operations on the top sidebar 2301 and bottom sidebar 2302 in the remaining area.
  • the mobile phone can also map the combined view in the content selection box 2305 to the touch area, so that the user can directly touch the combined view 2306 in the touch area to implement touch operations on the combined view framed by the content selection box 2305 in the remaining area.
  • when the mobile phone detects a sliding operation performed by the user on the touch area, the mobile phone responds to the sliding operation and controls the content selection box 2305 to move downward in the remaining area to select the next combined view.
  • the content selection box 2305 can move from the position shown in (b) of FIG. 23 to the position shown in (c) of FIG. 23 .
  • when the mobile phone displays a newly generated user interface in the remaining area according to the user's touch operation on a designated control in the touch area, the mobile phone may highlight the content related to the designated control in the newly generated user interface, or display the content controls related to the designated control in the touch area, so that the user can quickly focus on or manipulate content that may be of interest.
  • alternatively, the mobile phone may not highlight such content.
  • when the mobile phone maps the combined view in the content selection box 2305 to the touch area, if the user clicks the hat control 2307 in the combined view in the touch area, the mobile phone can respond to the click operation and enter the hat-related page corresponding to the hat control 2307, that is, the mobile phone can switch the page shown in (c) of FIG. 23 displayed in the remaining area to the hat-related page shown in (d) of FIG. 23.
  • at this time, as shown in (d) of FIG. 23, the mobile phone may highlight controls 2308 and 2309, which are similar to hat control 2307, in the hat-related page, and the mobile phone may also map controls 2308 and 2309 to the touch area, so that the user can quickly control content that may be of interest directly through the touch area.
  • the mobile phone can also hide each view in the combined view in the touch area, that is, set it to be invisible, so as to prevent the display screen from showing too much repeated content and degrading the user's visual experience.
  • the user can directly perform a touch operation on the combined view in the content selection box 2305 in the remaining area by performing a touch operation on the corresponding touch position in the touch area.
  • when the mobile phone detects a swipe operation performed by the user on the touch area, the mobile phone can synchronously control the page content in the user interface displayed in the remaining area to perform a swipe browsing operation.
  • the mobile phone can reacquire all views in non-fixed positions in the user interface displayed in the remaining area, and reassemble them to generate one or more combined views.
  • the user can synchronously control the content selection box in the user interface displayed in the remaining area by swiping down on the touch area, and switch among multiple combined views.
  • in addition to the target display area for the user interface originally displayed in the entire screen display area, the remaining area may have a vacant left area or bottom area that displays multiple application icons or shortcut functions
  • when the user needs to manipulate the display content of the vacant left area or bottom area, the user can also move the content selection box to the left area or bottom area through the touch area; when the content selection box moves to the left area or bottom area, the mobile phone can adaptively adjust the content selection box to a suitable size according to the vertical or horizontal arrangement of the multiple application icons or shortcut functions there. The mobile phone can then map the multiple application icons or shortcut functions in the content selection box to the touch area.
  • the user can also customize gesture operations, which can be bound to a function of a certain application interface.
  • the user can directly perform a customized gesture operation on the touch area to quickly start the corresponding function of the application bound to the gesture operation.
  • for example, one customized gesture operation can trigger the phone to directly open the call page of the phone application and execute the call function for a specified contact.
  • another customized gesture operation can trigger the phone to directly open the scan-code page of the payment application and execute the code scanning function.
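A minimal sketch of binding customized gestures to application functions might look like this; the gesture identifiers and the `Runnable` actions are hypothetical placeholders for whatever the gesture recognizer and the applications actually expose.

```java
import java.util.HashMap;
import java.util.Map;

public class GestureShortcutRegistry {

    private final Map<String, Runnable> bindings = new HashMap<>();

    /** Bind a recognized gesture in the touch area to an application function. */
    void bind(String gestureId, Runnable action) {
        bindings.put(gestureId, action);
    }

    /** Called when the recognizer classifies a stroke drawn in the touch area. */
    void onGestureRecognized(String gestureId) {
        Runnable action = bindings.get(gestureId);
        if (action != null) action.run();       // unbound gestures are ignored
    }

    public static void main(String[] args) {
        GestureShortcutRegistry registry = new GestureShortcutRegistry();
        // Hypothetical bindings standing in for "call a specified contact" and
        // "open the payment application's scan-code page".
        registry.bind("draw_C", () -> System.out.println("open call page, dial contact"));
        registry.bind("draw_S", () -> System.out.println("open scan-code page"));
        registry.onGestureRecognized("draw_C");
    }
}
```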
  • a one-handed operation method provided by an embodiment of the present application is described below in conjunction with the accompanying drawings, which can be applied to the scenarios shown in Figures 2 to 23 above.
  • the one-handed operation method is applied to an electronic device, which can be the above-mentioned mobile phone.
  • the method can include S2410-S2480.
  • S2410 The electronic device controls the screen display area of the display screen to display a first interface.
  • the first interface can be understood as the user interface presented by the mobile phone through the entire screen display area of the display screen.
  • the user interface is a medium interface for interaction and information exchange between the application or operating system and the user, which realizes the conversion between the internal form of information and the form acceptable to the user.
  • the user interface of the application is the source code written in a specific computer language such as Java and extensible markup language (XML).
  • the interface source code is parsed and rendered on the terminal device and finally presented as the interface elements in the user interface.
  • the user interface may include interface elements such as icons, windows, and controls.
  • controls are also called widgets.
  • Typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, pictures, and text.
  • the commonly used form of user interface is the graphical user interface (GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It can be an interface element such as an icon, window, or control displayed on the display screen of an electronic device, where controls can include visual interface elements such as pictures, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the first interface may include at least one of a desktop interface and an application interface of an application.
  • the first interface may be a desktop interface displayed in full screen or an application interface displayed in full screen.
  • the first interface may also be a combined interface consisting of a desktop interface displayed in full screen and a floating application interface displayed in suspension on the desktop interface.
  • the first interface may also be a combined interface consisting of at least one application interface, for example, the first interface may be an application interface of two applications displayed in a split-screen state.
  • the interface content of the first interface may cover the entire screen display area.
  • the interface content of the first interface may not cover the entire screen display area, that is, there is a black border area without displaying any content between the interface content boundary in the first interface and the entire screen display area boundary of the display screen.
  • the first interface is not limited in the embodiments of the present application.
  • S2420 The electronic device detects a trigger instruction for a one-handed operation mode.
  • the triggering instruction of the one-handed operation mode may be triggered by the user. It can be a click operation on a function option of the one-handed operation mode, a press of a specific physical button or button combination on the mobile phone, a voice instruction, a quick gesture operation on the screen of the mobile phone (such as a gesture of drawing a circular trajectory), an air gesture operation, or the like.
  • the embodiment of the present application does not limit the way in which the user triggers the one-handed operation mode.
  • the trigger instruction of the one-handed operation mode may also be triggered by an electronic device.
  • when the electronic device detects that the user is currently holding the electronic device in a one-handed holding posture, the electronic device may automatically trigger the one-handed operation mode.
  • the electronic device may also automatically trigger the one-handed operation mode when it detects that the user's holding temperature is lower than a preset temperature value.
  • the embodiment of the present application does not limit the manner in which the electronic device automatically triggers the one-handed operation mode.
  • the one-handed operation mode is an operation mode possessed by an electronic device, and the operation mode divides the display screen of the electronic device into a first area and a second area.
  • the first area is the area that the user's finger can touch or easily touch in the display screen when the user operates the electronic device with one hand
  • the second area is the remaining area of the display screen of the electronic device except the first area.
  • This operation mode reduces the original display area of the display screen of the electronic device (i.e., the display area of the electronic device before entering the one-handed operation mode) and the display content therein (such as the desktop interface, application interface, application icons, text, patterns, display windows, controls, etc.), and displays them in the second area.
  • the first area and the second area can establish a mapping relationship, that is, the touch operation performed by the user in the first area can be mapped to the second area, so that by touching the first area, the electronic device can perform the function required to be performed when the second area is touched. In this way, the user can perform normal operation of the electronic device in the first area with one hand.
  • for example, suppose the reduced original display area and the reduced display content are displayed in the second area of the display screen, where the displayed content includes an application icon of a text message application.
  • when the user performs a single-click operation at the corresponding position in the first area, the electronic device can respond to the single-click operation and determine that it maps to the application icon of the text message application displayed in the second area, so that the electronic device can execute the function to be executed when the application icon of the text message application is touched, that is, open the text message application corresponding to the icon.
  • the electronic device can implement all functions of the display area before the reduction in the second area, including displaying the reduced display content in the second area and performing touch operations in the second area. That is, the user can perform touch operations on the second area in the first area, or directly perform touch operations on the second area in the second area.
  • the electronic device may also display the first area to let the user know the location of the area that can be touched by one hand on the display screen.
  • the electronic device may add a touch window and place the touch window in the first area of the display screen for display. That is, once the one-handed operation mode of the electronic device is triggered, the electronic device will display the touch window in the first area of the display screen, and will display the reduced original display area and the reduced display content in the second area of the display screen.
  • S2430 The electronic device responds to the trigger instruction of the one-handed operation mode and determines a first area and a second area of the display screen.
  • the first area is the area on the display screen that can be touched by the finger of the user's single hand when the user holds the electronic device with one hand
  • the second area is the remaining area on the display screen except the first area.
  • the electronic device can receive the touch operation input by the user's single hand through the first area, and can display the content to be displayed by the electronic device through the second area.
  • when the electronic device responds to the trigger instruction of the one-handed operation mode for the first time, the electronic device can enter an initialization setting mode for the first area to determine the first area suitable for the current user's one-handed operation.
  • the electronic device can prompt the user to use the thumb to slide on the display screen according to a specified trajectory such as an arc, and then the electronic device can analyze the maximum sliding distance of the user's thumb on the screen through the sliding trajectory of the user's thumb to determine the maximum area range that the thumb can touch on the screen.
  • the electronic device can determine the area range of the first area based on the maximum area range that the thumb can touch on the screen.
  • the electronic device may also display the maximum area that can be touched by the thumb on the screen, so that the user can customize the area of the first area.
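The calibration step can be sketched as follows, assuming the thumb pivots near the bottom corner on the holding side and that the first area is a square anchored in that corner; both assumptions are illustrative, since the description leaves the exact geometry open.

```java
import java.util.List;

public class ThumbReachCalibrator {

    /**
     * Given the thumb's slide trajectory (points in screen coordinates, origin
     * top-left) and which hand holds the device, estimate the largest square
     * touch area anchored in the bottom corner on the holding side.
     */
    static int[] firstAreaFor(List<int[]> trajectory, int screenW, int screenH, boolean rightHanded) {
        int pivotX = rightHanded ? screenW : 0;  // assume the thumb pivots near a bottom corner
        int pivotY = screenH;
        double maxReach = 0;
        for (int[] p : trajectory) {
            maxReach = Math.max(maxReach, Math.hypot(p[0] - pivotX, p[1] - pivotY));
        }
        // A corner-anchored square of side s has its far corner at distance s*sqrt(2),
        // so the largest square inside the reachable arc has side maxReach/sqrt(2).
        int side = (int) Math.min(maxReach / Math.sqrt(2), Math.min(screenW, screenH));
        int left = rightHanded ? screenW - side : 0;
        return new int[] {left, screenH - side, left + side, screenH}; // {l, t, r, b}
    }
}
```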
  • the electronic device can use the remaining area of the display screen except the first area as the second area.
  • the first area can be understood as the touch area of the one-handed operation mode
  • the second area can be understood as the display area of the one-handed operation mode.
  • the electronic device may record the position and size of the first area, so that when the electronic device subsequently enters the one-handed operation mode again, the first area can be directly determined.
  • the shape of the first area is not limited in the embodiment of the present application, and can be, for example, square, circular, elliptical, etc. Taking the shape of the first area as a square as an example, the electronic device can determine the maximum length and width of the first area according to the maximum area that can be touched by the thumb on the screen, so that the electronic device can determine the position and size of the first area.
  • an upper limit value may be set for the size of the first area to prevent the first area from being too large, causing the second area to be too small, and causing the content displayed in the second area to be too small.
  • the electronic device may also reserve a bottom area of a preset size at the bottom of the display screen so that the first area can be located above the bottom area.
  • the bottom area can be used as part of the second area for displaying content.
  • the preset size can be reasonably set according to actual conditions and is not limited in the embodiments of the present application.
  • S2440 The electronic device reduces the screen display area of the display screen and the first interface displayed in the screen display area.
  • when the electronic device is in normal mode, the entire screen display area of the display screen is usually used to display the first interface. When the electronic device switches from normal mode to the one-handed operation mode, the electronic device divides the display screen into two areas, a first area and a second area, where the first area is the touch area of the one-handed operation mode and the second area is the display area of the one-handed operation mode.
  • since the size of the second area (i.e., the display area of the one-handed operation mode) is smaller than the original entire screen display area, when the electronic device switches from normal mode to the one-handed operation mode, it is necessary to reduce the original entire screen display area of the display screen and the display content therein to ensure that everything displayed in normal mode is included in the second area and that no information that should be displayed is missing. In this way, when the electronic device switches from normal mode to the one-handed operation mode, the user can view the original normal-mode display through the second area of the display screen.
  • the electronic device can determine, based on the position and size of the second area, the largest area in the second area that meets the screen aspect ratio of the display screen, as the target display area for the first interface originally displayed in the entire screen display area. The electronic device can then determine, based on the size of the target display area, the target ratio to which the original screen display area and the first interface displayed therein need to be reduced. The electronic device can then proportionally reduce the original screen display area and the first interface displayed therein to the target ratio, so as to ensure that the reduced first interface exactly covers the target display area.
  • the electronic device can record the target ratio so that when the electronic device subsequently enters the one-handed operation mode again, the first interface originally displayed in the entire screen display area of the electronic device can be directly reduced proportionally according to the target ratio based on the recorded target ratio.
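The target-ratio computation reduces to taking the more constraining of the two scale factors, as in this sketch (the example dimensions are made up):

```java
public class OneHandScaler {

    /**
     * Uniform scale factor for the full-screen interface: the largest ratio at
     * which a screenW x screenH rectangle still fits inside the second area.
     */
    static double targetRatio(int screenW, int screenH, int secondW, int secondH) {
        double byWidth  = (double) secondW / screenW;
        double byHeight = (double) secondH / screenH;
        return Math.min(byWidth, byHeight);     // the limiting dimension decides the ratio
    }

    public static void main(String[] args) {
        // Made-up example: a 1080x2400 screen whose second area is 1080x1700.
        double ratio = targetRatio(1080, 2400, 1080, 1700);
        System.out.printf("target ratio = %.3f -> target area %dx%d%n",
                ratio, Math.round(1080 * ratio), Math.round(2400 * ratio));
    }
}
```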
  • S2450 The electronic device controls the second area of the display screen to display the reduced display area and the reduced first interface.
  • the electronic device can determine from the second area the target display area for the first interface displayed in the original entire screen display area of the display screen, to ensure that the first interface displayed in the original entire screen display area can be proportionally reduced to the target display area for display.
  • the electronic device can determine whether the target display area is close to the left edge or the right edge of the display screen in the second area according to whether the user's one-handed holding posture is left-handed or right-handed. For example, when the user's one-handed holding posture is right-handed, the target display area in the second area can be close to the right edge of the display screen.
  • the electronic device may also reorder the applications on the desktop and control the third area of the display screen to display the reordered desktop applications.
  • the electronic device may also control the third area of the display screen to display applications or shortcut functions frequently used by the user.
  • the present application does not limit the content displayed in the third area.
  • S2460 The electronic device establishes a mapping relationship between the first area and the second area.
  • the mapping relationship between the first area and the second area may be a coordinate mapping relationship between the first area and the second area.
  • when the electronic device detects that the user performs a touch operation at a certain position in the first area, the touch operation may be mapped, according to the coordinate mapping relationship, to a touch operation at the corresponding position in the second area, so that the electronic device can execute the function required to be executed when the corresponding position in the second area is touched.
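A minimal sketch of such a coordinate mapping relationship, assuming a simple linear mapping between the two rectangles (the geometry in `main` is invented for illustration):

```java
public class AreaMapper {
    // Bounds of the first (touch) area and of the reduced interface in the second area.
    final int fLeft, fTop, fWidth, fHeight;   // first area
    final int sLeft, sTop, sWidth, sHeight;   // target display area in the second area

    AreaMapper(int fLeft, int fTop, int fWidth, int fHeight,
               int sLeft, int sTop, int sWidth, int sHeight) {
        this.fLeft = fLeft; this.fTop = fTop; this.fWidth = fWidth; this.fHeight = fHeight;
        this.sLeft = sLeft; this.sTop = sTop; this.sWidth = sWidth; this.sHeight = sHeight;
    }

    /** Map a touch point in the first area to the corresponding point in the second area. */
    int[] mapToSecondArea(int x, int y) {
        int mx = sLeft + (x - fLeft) * sWidth / fWidth;
        int my = sTop + (y - fTop) * sHeight / fHeight;
        return new int[] {mx, my};
    }

    public static void main(String[] args) {
        // Hypothetical geometry: a 500x500 touch area mapped onto a 765x1700 display area.
        AreaMapper m = new AreaMapper(580, 1900, 500, 500, 315, 100, 765, 1700);
        int[] p = m.mapToSecondArea(830, 2150); // the center of the first area...
        System.out.println(p[0] + "," + p[1]);  // ...lands at the center of the second area
    }
}
```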
  • the electronic device may establish a mapping relationship between the first area and the first interface displayed in the second area after being reduced.
  • the electronic device may establish a mapping relationship between the operation controls in the first interface displayed in the first area and the second area.
  • the operation controls in the first interface may include a title bar, a navigation bar, a bottom sidebar, a top sidebar, and other operation controls that do not move with page browsing and are fixed in position, and may also include pictures, buttons, text boxes, and other operation controls that move with page browsing and are not fixed in position.
  • the electronic device may establish a coordinate mapping relationship between the position of a target control in the first interface displayed in the second area and a target position in the first area.
  • the coordinate mapping relationship between these positions is used to map the position of the target control in the first interface to the first area.
  • the target control can be any operation control in the first interface
  • the target position can be any position in the first area.
  • when the electronic device detects that the user performs a touch operation on the target position in the first area, the touch operation can be mapped, according to the coordinate mapping relationship, to a touch operation on the position of the target control in the second area, so that the electronic device can execute the function required to be executed when the target control in the second area is touched.
  • the electronic device can display a new user interface generated when the target control is touched in the reduced screen display area located in the second area.
  • the electronic device may also map the target control in the first interface displayed in the second area to the first area for display, that is, the first area may display the target control.
  • the electronic device may then establish a functional mapping relationship between the function that the electronic device needs to execute when the target control displayed in the second area is touched and the target control displayed in the first area. In this way, when the electronic device detects that the user performs a touch operation on the target control displayed in the first area, the electronic device may directly execute the function that needs to be executed when the target control in the second area is touched.
  • the electronic device can map the title bar, bottom sidebar, top sidebar and other fixed-position operation controls that do not move with page browsing to the first area, while pictures, buttons, text boxes and other non-fixed-position operation controls that move with page browsing can be selectively mapped to the first area.
  • S2460 may include:
  • S2461 The electronic device obtains a first control and a second control in the first interface.
  • the first control can be a fixed-position operation control such as a title bar, bottom sidebar, top sidebar, etc. in the first interface that does not move with page browsing
  • the second control can be a non-fixed-position operation control that moves with page browsing in the first interface.
  • the electronic device can obtain the display style of each control in the first interface to determine the display position and display size of each control in the first interface. Then the electronic device can determine whether it is the first control by determining whether the control is fixedly displayed at a certain position in the first interface.
  • the title bar, navigation bar, menu bar and other fixed-position operation controls that do not move with page browsing are usually displayed at the top or bottom of the user interface, and the non-fixed-position operation controls that move with page browsing are usually displayed in the long preview page in the middle of the user interface.
  • the electronic device can obtain, from the first interface, a control that is fixedly displayed at a first position in the first interface as the first control, and obtain a control that is not fixedly displayed at the first position as the second control.
  • the first position includes at least one of the top position and the bottom position.
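The first/second-control split might be implemented roughly as below; the `Control` record, the `pinnedByLayout` flag, and the edge-band heuristic are assumptions standing in for the real display-style information the device would read from the interface.

```java
import java.util.ArrayList;
import java.util.List;

public class ControlClassifier {

    /** Minimal stand-in for a control's resolved display style. */
    record Control(String id, int top, int bottom, boolean pinnedByLayout) {}

    record Split(List<Control> firstControls, List<Control> secondControls) {}

    /**
     * Split controls into "first controls" (bars fixed at the top/bottom edge
     * that do not move with page browsing) and "second controls" (content that
     * scrolls with the page).
     */
    static Split classify(List<Control> all, int screenHeight, int edgeBandPx) {
        List<Control> first = new ArrayList<>();
        List<Control> second = new ArrayList<>();
        for (Control c : all) {
            boolean atEdge = c.top() <= edgeBandPx || c.bottom() >= screenHeight - edgeBandPx;
            if (c.pinnedByLayout() && atEdge) {
                first.add(c);   // e.g. title bar, top sidebar, bottom sidebar
            } else {
                second.add(c);  // e.g. pictures, buttons, text boxes in the page body
            }
        }
        return new Split(first, second);
    }
}
```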
  • S2462 The electronic device generates a combined control based on the second control.
  • the electronic device may determine multiple second controls with matching display styles according to the display style of each second control, and then combine the multiple second controls with matching display styles to obtain a combined control.
  • the electronic device may use multiple second controls arranged in the same direction as multiple second controls with matching display styles. The same direction may be horizontal or vertical.
  • the electronic device can determine whether the arrangement of the second controls in the first interface is horizontal or vertical, and select part of the second controls from the first interface according to the arrangement, position and size of the second controls in the first interface, and recombine them to obtain a combined control, so that the electronic device can map the combined control to the first area.
  • the electronic device can recombine all the second controls in the first interface to obtain multiple combined controls.
  • the electronic device may combine multiple second controls arranged horizontally in one or more rows to obtain a combined control.
  • the electronic device may also combine multiple second controls arranged vertically in one or more columns to obtain a combined control.
  • the electronic device may combine multiple second controls arranged horizontally and multiple second controls arranged vertically to obtain a combined control.
  • the electronic device may display a selection cursor in the second area, such as a content selection box 2101 shown in FIG. 21 .
  • the selection cursor may prompt the user which part of the content in the second area can be controlled in the first area.
  • the selection cursor includes a border, which defines a selection area of the selection cursor in the reduced first interface, and the selection area is used to select at least one second control.
  • the transparency of the selection area defined by the border may be 0 to 100%.
  • the content of the second area framed by the selection area is the content that the user can control through the first area.
  • the electronic device can move the selection cursor in the second area according to the user's sliding operation on the first area, so that the selection area of the selection cursor frames different content.
  • the size of the selection area of the selection cursor may be the size of the area occupied by the combined control. According to the size of the area occupied by the combined control, the size of the selection area of the selection cursor is determined to ensure that the selection area of the selection cursor can just frame the combined control. In this way, the electronic device can move the selection cursor among the multiple combined controls according to the sliding operation of the user on the first area. Thus, the user can determine the combined control currently selected by the electronic device through the selection cursor displayed by the electronic device.
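Moving the selection cursor among the combined controls in response to swipes can be sketched as a simple index over their bounding boxes; the clamping behavior at the first and last combined control is an assumption.

```java
import java.util.List;

public class SelectionCursor {

    private final List<int[]> combinedControlBounds; // {l, t, r, b} per combined control, top to bottom
    private int index = 0;

    SelectionCursor(List<int[]> combinedControlBounds) {
        this.combinedControlBounds = combinedControlBounds;
    }

    /** A swipe in the first area moves the cursor to the next/previous combined control. */
    int[] onSwipe(boolean towardPageEnd) {
        index = Math.max(0, Math.min(combinedControlBounds.size() - 1,
                                     index + (towardPageEnd ? 1 : -1)));
        // The returned bounds are both the border to draw in the second area and
        // the group of second controls to (re)map into the first area.
        return combinedControlBounds.get(index);
    }
}
```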
  • for a first control in the first interface that does not move with page browsing and has a fixed position, it is always displayed in the first interface, and the electronic device may keep the first control mapped to the first area during the display of the first interface, so that the user can operate the first control at any time.
  • for a second control in the first interface that moves with page browsing and has a non-fixed position, it gradually appears or disappears as the page is browsed, so the electronic device does not need to map the second control all the time, but maps it when the user browses to the second control.
  • the electronic device may map the combination control framed by the selection area of the selection cursor to the first area.
  • the electronic device may first map the first combination control to the first area, that is, first display the first combination control selected by the selection cursor in the first area, and then follow the movement of the selection cursor to cancel the mapping of the first combination control and re-map the second combination control to the first area, that is, the electronic device cancels the display of the first combination control in the first area and re-displays the second combination control newly selected by the selection cursor.
  • the electronic device can map the combination control framed by the selection area of the selection cursor to the first area in real time.
  • the first interface is a browsing interface
  • the browsing interface includes multiple second controls
  • the multiple second controls include a first option control, a second option control, and a third option control arranged in a horizontal direction.
  • the electronic device can combine the first option control, the second option control, and the third option control arranged in a horizontal direction to obtain an option combination control.
  • the selection cursor selects the option combination control
  • the electronic device can display the option combination control in the first area.
  • the selection area of the current selection cursor can just frame the option combination control.
  • the size of the selection area of the selection cursor may also be the size of the area occupied by each second control.
  • the electronic device may follow the movement of the selection cursor and map different second controls to the first area respectively.
  • the electronic device may map the second controls framed by the selection area of the selection cursor to the first area for display in real time.
  • the first interface is a browsing interface
  • the browsing interface includes multiple second controls
  • the multiple second controls include a first option control, a second option control, and a third option control arranged in a horizontal direction
  • the selection cursor can follow the user's sliding operation in the first area to select the first option control, the second option control, and the third option control one by one.
  • the electronic device can map the first option control to the first area for display.
  • alternatively, the electronic device may not map the selected second control to the first area for display; the user can instead directly perform a single-click or double-click operation in the first area to realize touch control of that single control.
  • the electronic device can adaptively display selection cursors of different sizes according to different applications.
  • the electronic device can use multiple second controls arranged in a horizontal direction as multiple second controls with matching display styles to generate a combined control.
  • the first interface is the homepage of a search application
  • controls such as the focus control, the news control, and the map control are usually arranged horizontally.
  • the electronic device can combine these multiple controls arranged in a horizontal direction to obtain a combined control, and the electronic device can display a selection cursor 1902 according to the area occupied by the combined control.
  • when the first interface is an application interface of a second application, where the first application is different from the second application, the electronic device can use multiple second controls arranged in the vertical direction as multiple second controls with matching display styles.
  • for example, for the application interface of the short video application shown in FIG. 18, the electronic device can combine multiple controls arranged in the vertical direction to obtain a combined control, and the electronic device can display the selection cursor 1802 according to the area occupied by the combined control.
  • the electronic device can adaptively display selection cursors of different sizes according to different contents in the same application.
  • the embodiment of the present application does not limit the size of the selection cursor.
  • S2463 The electronic device determines whether the first control and the combined control are adapted to the first area. If not, the electronic device first executes S2464 and then continues to execute S2463. If yes, the electronic device executes S2465.
  • S2464 The electronic device adjusts the position and size of the first control and the combined control to obtain the adjusted first control and the combined control.
  • S2465 The electronic device maps the first control and the combined control to the first area.
  • when there are relatively few second controls, the electronic device can directly determine whether the size of the area occupied by these second controls matches the size of the first area. If not, the electronic device can adjust the display style of these second controls to match the first area, and then map the adjusted second controls to the first area.
  • when there are relatively many second controls (for example, not less than 10), the electronic device can divide these second controls into multiple areas and recombine the second controls in each area to obtain multiple combined controls; the electronic device can then map the combined control selected by the selection cursor to the first area as the selection cursor moves.
  • the electronic device can determine whether the size of the area occupied by the combined control matches the size of the first area, and if not, the electronic device can adjust the display style of the combined control to match the first area.
  • the size of the area occupied by the adjusted combined control matches the size of the first area.
  • the electronic device can adjust the display style of each second control in the combined control, and then recombine each adjusted second control to obtain an adjusted new combined control, and then the electronic device can map the adjusted new combined control to the first area.
  • the electronic device can adjust the display size of each second control in the combined control according to the size of the first area, and can also adjust the display spacing between two adjacent second controls in the combined control, and can also adjust the display position of each second control in the combined control.
  • the embodiment of the present application does not limit the adjustment method of the display style of the combined control.
  • the electronic device can also directly determine whether the size of the area occupied by these first controls matches the size of the first area. If not, the electronic device can adjust the display style of these first controls to match the first area. The electronic device can then map the adjusted first controls to the first area. Optionally, the electronic device can also adjust the display style of these first controls, but display some of the first controls in the form of a switching list and hide some of the first controls. When the user slides the switching list on the first area, the electronic device displays the hidden first controls.
  • the electronic device may regroup the plurality of second controls to obtain a plurality of combined controls. The electronic device may then determine whether the size of the area occupied by the first controls and the combined controls selected by the selection cursor matches the size of the first area. If not, the electronic device may adjust the display styles of the first controls and the combined controls selected by the selection cursor to match the first area.
  • for example, when the difference between the layout or size of the first control and the combined control and the layout or size of the first area exceeds a preset value, the electronic device can determine that the layout and size of the first control and the combined control do not match the first area.
  • the preset value can be reasonably set according to the implementation application, and is not limited in the embodiments of the present application.
  • the electronic device can determine that the layout and size of the first control and the combined control match the first area, and the electronic device can directly map the first control and the combined control according to the original layout and size to the first area.
  • the electronic device can adjust the layout and size of the first control and the combined control to ensure that the adjusted layout and size of the first control and the combined control match the first area.
  • the electronic device then maps the adjusted first control and the combined control to the first area.
  • the electronic device may not adjust the layout and size of the first control, but display the first control in the form of a switch list according to the original layout and size.
  • the electronic device can display a switching list of sub-controls in the first area.
  • for example, for the five sub-controls "Home, VIP Member, Message, Shopping Cart, Mine" in the bottom sidebar 2302, the electronic device can display a switching list for the bottom bar, and the switching list can display the first three sub-controls "Home, VIP Member, Message".
  • when the user slides the switching list in the first area, the switching list can switch to "Message, Shopping Cart, Mine", that is, display the two previously hidden sub-controls and hide two of the sub-controls previously displayed.
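The switching-list behavior in this example can be sketched as a sliding window over the sub-controls; the window arithmetic below is an assumption chosen so that one swipe over a three-item window reproduces the "Home, VIP Member, Message" to "Message, Shopping Cart, Mine" transition described above.

```java
public class SwitchingList {

    private final String[] items;
    private final int windowSize;
    private int start = 0;

    SwitchingList(String[] items, int windowSize) {
        this.items = items;
        this.windowSize = windowSize;
    }

    String[] visible() {
        return java.util.Arrays.copyOfRange(items, start, start + windowSize);
    }

    /** Slide the visible window when the user swipes the list in the first area. */
    String[] onSwipe(boolean towardEnd) {
        int maxStart = Math.max(0, items.length - windowSize);
        start = towardEnd ? Math.min(maxStart, start + windowSize - 1)
                          : Math.max(0, start - (windowSize - 1));
        return visible();
    }

    public static void main(String[] args) {
        SwitchingList bottomBar = new SwitchingList(
                new String[] {"Home", "VIP Member", "Message", "Shopping Cart", "Mine"}, 3);
        System.out.println(String.join(", ", bottomBar.visible()));     // Home, VIP Member, Message
        System.out.println(String.join(", ", bottomBar.onSwipe(true))); // Message, Shopping Cart, Mine
    }
}
```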
  • the electronic device may also establish a coordinate mapping relationship between the position of the target control in the first interface and the target position in the first area, thereby mapping the target control of the first interface in the first area to the second area.
  • the electronic device can respond to the touch operation acting on the target position in the first area, and map the touch operation acting on the target position in the first area to the touch operation acting on the target control in the reduced first interface.
  • the electronic device can respond to the touch operation acting on the target control in the reduced first interface, and execute the function corresponding to the target control on the reduced first interface.
  • the target control can be the above-mentioned first control, the above-mentioned second control, or the above-mentioned third control.
  • S2470 The electronic device detects a touch operation performed by the user on the first area.
  • touch operations may include operations such as the user touching the display screen of the electronic device and moving on the display screen after touching. They include not only the user touching the display screen with a finger or another part of the body, but also the user touching the display screen with a touch device such as a stylus.
  • the touch here can be an operation in direct contact with the touch screen, or a hover control performed within a small vertical distance from the touch screen surface.
  • for example, a finger can directly contact the touch screen, or can control the touch screen from within a small vertical distance of its surface, that is, without directly contacting the touch screen surface.
  • the touch operation may be a common touch operation such as sliding up, sliding down, sliding left, sliding right, single-clicking, double-clicking, or long pressing, or a specific touch gesture such as drawing a circle, drawing a check mark (√), or drawing a cross (×).
  • the present application embodiment does not limit this.
  • the electronic device can monitor the user's touch gesture on the display screen in real time according to the gesture algorithm, and determine whether the touch gesture on the display screen is located in the first area of the display screen.
  • if the touch gesture is located in the first area, the electronic device can map the touch gesture to the second area.
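The dispatch decision, checking whether a gesture falls in the first area and remapping it if so, can be sketched as follows, again assuming a simple linear coordinate mapping:

```java
public class TouchDispatcher {

    private final int[] firstArea;  // {left, top, right, bottom} of the touch area
    private final int[] secondArea; // {left, top, right, bottom} of the reduced display

    TouchDispatcher(int[] firstArea, int[] secondArea) {
        this.firstArea = firstArea;
        this.secondArea = secondArea;
    }

    /** Step 1 of the dispatch: is the gesture inside the one-handed touch area? */
    boolean inFirstArea(int x, int y) {
        return x >= firstArea[0] && x < firstArea[2]
            && y >= firstArea[1] && y < firstArea[3];
    }

    /**
     * Step 2: remap coordinates that land in the first area onto the second
     * area; pass through everything else unchanged (the user may also touch
     * the reduced interface in the second area directly).
     */
    int[] dispatch(int x, int y) {
        if (!inFirstArea(x, y)) return new int[] {x, y};
        int fw = firstArea[2] - firstArea[0], fh = firstArea[3] - firstArea[1];
        int sw = secondArea[2] - secondArea[0], sh = secondArea[3] - secondArea[1];
        int mx = secondArea[0] + (x - firstArea[0]) * sw / fw;
        int my = secondArea[1] + (y - firstArea[1]) * sh / fh;
        return new int[] {mx, my};
    }
}
```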
  • S2480 The electronic device executes a function corresponding to the touch operation on the reduced screen display area displayed in the second area and the reduced first interface according to the mapping relationship.
  • when the first interface is the homepage of an application and the electronic device detects that the user's touch operation on the first area is a left-swipe operation, the electronic device can execute the exit function of the application. At this time, the electronic device can respond to the left-swipe operation and control the reduced screen display area to switch from displaying the first interface to displaying the desktop interface returned to after the application exits.
  • when the first interface is not the homepage of an application and the electronic device detects that the user's touch operation on the first area is a left-swipe operation, the electronic device can execute the application's return-to-previous-page function. At this time, the electronic device can respond to the left-swipe operation and control the reduced screen display area to switch from displaying the first interface to displaying the previous application interface.
  • when the first interface is a desktop interface and the electronic device detects that the user's touch operation on the first area is a left-swipe operation, the electronic device can execute the desktop switching function. At this time, the electronic device can respond to the left-swipe operation and control the reduced screen display area to switch from displaying the first interface to displaying a new desktop interface.
  • when the first interface is an application interface of an application and multiple operation controls in the content selection box in the application interface are mapped to the first area, if the electronic device detects a click operation by the user on a target operation control in the first area, the electronic device can execute the function performed when the target operation control in the application interface is touched.
  • when a video playback control in the reduced video playback application interface displayed in the second area is mapped to the first area, if the electronic device detects a click operation by the user on the video playback control in the first area, the electronic device can execute the function performed when the video playback control in the reduced video playback application interface is touched, that is, the video playback function, to control the reduced screen display area to display the video playback picture.
  • when the video playback control in the video playback application interface is located at the center of the video playback interface, if the electronic device has pre-established a coordinate mapping relationship between the center of the video playback interface and the center of the first area, then when the electronic device detects a click operation performed by the user on the center of the first area, the electronic device can map that click operation to a click operation on the playback control in the reduced video playback interface; the electronic device can then respond to the click operation on the playback control and play the video in the reduced video playback interface, to control the reduced screen display area to display the video playback picture.
  • with the one-handed operation method provided in the embodiments of the present application, the electronic device can adaptively adjust the touch area of the one-handed operation mode according to the user's one-handed holding posture, to ensure that the touch area is suitable for the user's one-handed operation.
  • the remaining area of the display screen except the touch area of the one-handed operation mode can be used as the display area of the one-handed operation mode.
  • the display area can display the user interface displayed in the original display area after being reduced.
  • the electronic device can also rearrange the desktop applications and display them in the remaining display area.
  • the electronic device also establishes a mapping relationship between the touch area of the one-handed operation mode and the display area of the one-handed operation mode, so that when the electronic device detects a touch operation in the touch area, the touch operation can be mapped, according to the mapping relationship, to a touch operation in the display area, and the electronic device can then execute the function that needs to be executed when the display area of the one-handed operation mode is touched.
  • by performing touch operations in the touch area, the user can perform full-range operations on the reduced original interface of the electronic device. Without requiring new development or adaptation of applications, this solves the problem that part of the content displayed on the screen may be out of reach of the user's fingers when operating the electronic device with one hand.
  • the electronic device includes hardware and/or software modules corresponding to the execution of each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is executed in the form of hardware or computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art can use different methods to implement the described functions for each specific application in combination with the embodiments, but such implementation should not be considered to be beyond the scope of the present application.
  • the electronic device can be divided into functional modules according to the above method example.
  • each functional module can be divided according to each function, or two or more functions can be integrated into one processing module.
  • the above integrated module can be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is only a logical function division. There may be other division methods in actual implementation.
  • An embodiment of the present application further provides a computer storage medium, in which computer instructions are stored.
  • when the computer instructions are executed on an electronic device, the electronic device executes the above-mentioned related method steps to implement the interface display method in the above-mentioned embodiments.
  • the embodiments of the present application also provide a computer program product.
  • when the computer program product is run on a computer, the computer executes the above-mentioned related steps to implement the interface display method executed by the electronic device in the above-mentioned embodiments.
  • an embodiment of the present application also provides a device, which may specifically be a chip, component or module, and the device may include a connected processor and memory; wherein the memory is used to store computer execution instructions, and when the device is running, the processor may execute the computer execution instructions stored in the memory so that the chip executes the interface display method executed by the electronic device in the above-mentioned method embodiments.
  • the electronic device, computer storage medium, computer program product or chip provided in this embodiment is used to execute the corresponding method provided above. Therefore, the beneficial effects that can be achieved can refer to the beneficial effects in the corresponding method provided above and will not be repeated here.
  • the disclosed devices and methods can be implemented in other ways.
  • the device embodiments described above are only schematic.
  • the division of the modules or units is only a logical function division. There may be other division methods in actual implementation, such as multiple units or components can be combined or integrated into another device, or some features can be ignored or not executed.
  • Another point is that the mutual coupling or direct coupling or communication connection shown or discussed can be through some interfaces, indirect coupling or communication connection of devices or units, which can be electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may be one physical unit or multiple physical units, that is, they may be located in one place or distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the present embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit may be implemented in the form of hardware or in the form of software functional units.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solution of the embodiments of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The software product is stored in a storage medium and includes a number of instructions for causing a device (which can be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a mobile hard disk, a read only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, and other media that can store program codes.

Abstract

一种单手操作方法及电子设备,涉及电子设备技术领域,电子设备在进入单手操作模式后,将显示屏划分为单手操作模式的触控区域和单手操作模式的显示区域两个区域,使得用户可以通过在触控区域内进行触摸操作实现对显示区域的全范围操作。该方案中,电子设备显示第一界面时,若检测到单手操作模式的触发指令,电子设备确定显示屏的第一区域和第二区域,并将显示屏的原显示区域及其中显示的第一界面进行缩小并显示于第二区域;然后电子设备响应于用户作用于第一区域的触摸操作,对第二区域显示的缩小的原显示区域及其中显示的第一界面执行与该触摸操作对应的功能。

Description

一种单手操作方法及电子设备
本申请要求于2022年11月30日提交国家知识产权局、申请号为202211521350.3、申请名称为“一种单手操作方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及电子设备技术领域,尤其涉及一种单手操作方法及电子设备。
背景技术
随着电子设备的不断发展,越来越多具有显示屏的电子设备被广泛应用于人们的日常生活和工作中,如具有显示屏的手机、平板等。且不难发现,随着屏幕技术的发展,电子设备的显示屏也变得越来越大,以给用户提供更丰富的信息,带给用户更好的使用体验。
然而，随着电子设备的显示屏越来越大，用户单手操作不便的问题也日益显现出来。例如用户在单手握持电子设备的情况下，很难用握持电子设备的那只手对显示屏进行全范围的操作，显示屏上的一大部分区域比较难操作到，此时用户不得不使用双手操作，即一只手握持电子设备，另一只手对电子设备的显示屏进行操作，影响了用户的单手操作体验。
发明内容
本申请提供一种单手操作方法及电子设备,可以提升用户的单手操作体验。
为达到上述目的,本申请实施例采用如下技术方案:
第一方面,提供一种单手操作方法,该方法可以应用于电子设备,该电子设备可以包括显示屏。该单手操作方法包括:显示第一界面;响应单手操作模式的触发指令,确定显示屏的第一区域和第二区域;其中,第一区域为用户单手握持电子设备时,用户单手的手指在显示屏上能触摸到的区域,第二区域为显示屏上除第一区域以外的剩余区域;显示缩小后的第一界面于第二区域;响应作用于第一区域的触摸操作,对缩小后的第一界面执行与触摸操作对应的功能。
上述第一方面提供的方案,电子设备在通过显示屏的屏幕显示区域显示第一界面时,若检测到进入单手操作模式的触发指令,则电子设备可以将显示屏上用户单手可触摸到的区域(即电子设备显示屏的部分屏幕显示区域),作为单手操作模式的触控区域,以接收用户单手的触摸操作,同时将显示屏上除该触控区域以外的剩余区域,作为单手操作模式的显示区域,以将显示屏的原屏幕显示区域以及原屏幕显示区域所显示的第一界面缩小显示于该单手操作模式的显示区域。然后电子设备可以感测用户在单手操作模式的触控区域内的触摸操作,以根据在该触控区域内感测到的触摸操作,对应控制单手操作模式的显示区域中所显示的缩小后的原屏幕显示区域以及原屏幕显示区域中显示的第一界面。如此,用户通过在触控区域内进行触摸操作,便可对电子设备缩小显示的原有界面进行全范围的操作。在无需应用程序新增开发或适配的情况下,解决了用户单手在电子设备上进行操作时屏幕显示的部分内容可能手指触摸不到的问题。
在一种可能的实现方式中,上述确定显示屏的第一区域,可以包括:提示用户单手握持电子设备时,使用单手的手指在显示屏上按照指定轨迹滑动;根据单手的手指的滑动,确定单手的手指在显示屏上能触摸到的最大区域;根据最大区域,确定显示屏的第一区域。如此,电子设备可以确保设置的触控区域更符合每个用户的单手操作习惯,用户在单手握持电子设备时,能够对触控区域的各个位置进行操作。
在一种可能的实现方式中,上述第一界面可以包括目标控件,上述响应作用于第一区域的触摸操作,对缩小后的第一界面执行与触摸操作对应的功能,可以包括:响应作用于第一区域中目标位置的触摸操作,将作用于第一区域中目标位置的触摸操作,映射为作用于缩小后的第一界面中目标控件的触摸操作;其中,第一区域中的目标位置与第一界面中目标控件所在的位置之间预先建立有坐标映射关系;响应作用于缩小后的第一界面中目标控件的触摸操作,对缩小后的第一 界面执行目标控件对应的功能。
如此,电子设备通过建立单手操作模式的触控区域与单手操作模式的显示区域之间的坐标映射关系,可以将用户对该触控区域内某个位置的触摸操作,映射到对该显示区域内对应位置的触摸操作,使得电子设备能够执行该显示区域内对应位置被触控时所需执行的功能。从而无需用户在单手无法触摸到的显示区域操作,用户可以直接在单手能触摸到的触控区域内进行触摸操作,实现对显示区域内缩小显示的原有界面进行全范围的操作。
可选地,以第一界面为视频播放界面,目标控件为位于视频播放界面的中心位置的播放控件为例,视频播放界面的中心位置与第一区域的中心位置可以预先建立有坐标映射关系,上述响应作用于第一区域的触摸操作,对缩小后的第一界面执行与触摸操作对应的功能,可以包括:响应作用于第一区域中心位置的触摸操作,将作用于第一区域中心位置的触摸操作,映射为作用于缩小后的视频播放界面中播放控件的触摸操作;响应作用于缩小后的视频播放界面中播放控件的触摸操作,对缩小后的视频播放界面中的视频进行播放。
如此,电子设备可以根据目标控件在第一界面中的布局方位,例如目标控件在显示区域的中心位置、左上方、右上方、左下方、右下方等,在单手操作模式的触控区域内相应位置,如触控区域的中心位置、左上方、右上方、左下方、右下方等,建立映射关系,以确保用户在触控区域内的触控位置,与目标控件在第一界面中的位置对应,便于用户在触控区域内可以快速准确地触控到想要操作的控件。在一种可能的实现方式中,第一界面可以包括目标控件,在显示缩小后的第一界面之后,单手操作方法还包括:显示目标控件于第一区域。如此,电子设备也可以将第一界面中控件映射到触控区域进行显示,以便用户在触控区域可以直观、精准地操控到想要操作的控件。
在该实现方式下,上述响应作用于第一区域的触摸操作,对缩小后的第一界面执行与触摸操作对应的功能,可以包括:响应作用于第一区域中目标控件的触摸操作,对缩小后的第一界面执行目标控件对应的功能。由于触控区域显示的目标控件由第一界面中的目标控件映射而来,因此当触控区域显示的目标控件被触发时,可以认为第一界面中的目标控件被触发,电子设备可以直接对缩小后的第一界面执行目标控件被触发时所需执行的功能。
可选地,以第一界面为视频播放界面,目标控件为播放控件为例,在显示缩小后的第一界面之后,单手操作方法还包括:显示播放控件于第一区域。上述响应作用于第一区域的触摸操作,对缩小后的第一界面执行与触摸操作对应的功能,可以包括:响应作用于第一区域中播放控件的触摸操作,对缩小后的视频播放界面中的视频进行播放。如此,电子设备以单手操作模式显示缩小后的视频播放界面时,电子设备可以将视频播放界面中的播放控件显示在触控区域,以便用户在触控区域可以直观、精准地单手操控到该播放控件。
在一种可能的实现方式中,上述目标控件为第一控件,在显示缩小后的第一界面之后,单手操作方法还可以包括:根据第一界面中每个控件的显示样式,从第一界面中获取固定显示于第一界面内第一位置的控件,作为第一控件;其中,第一位置包括顶部位置、底部位置中的至少一种。可以理解,一般位于用户界面中顶部或底部的控件是不随页面浏览而移动的、固定位置的操作控件,因此电子设备可以将位于用户界面中顶部或底部的控件一直映射到触控区域进行显示,以便用户可以随时在触控区域对这些固定位置的控件进行操作。可选地,映射到触控区域显示时,也可以对应映射到触控区域的顶部或底部进行显示。
可选地,第一控件可以包括标题栏、导航栏、菜单栏中的至少一种。
在一种可能的实现方式中,上述目标控件为第二控件,在显示缩小后的第一界面之后,单手操作方法还包括:根据第一界面中每个控件的显示样式,从第一界面中获取非固定显示于第一界面内第一位置的控件,作为第二控件;其中,第一位置包括顶部位置、底部位置中的至少一种;显示选择光标于缩小后的第一界面,选择光标包括一边界,边界定义了选择光标在缩小后的第一界面中的选择区域,选择区域用于选择至少一个第二控件;显示目标控件于第一区域,包括:显示选择光标的选择区域中的第二控件于第一区域。
由于第一界面中随页面浏览而移动的、非固定位置的第二控件,其会随着页面的浏览逐渐显示、或逐渐不显示,因此,电子设备可以不用一直映射第二控件,而是在用户浏览到该第二控件 时再对其进行映射显示。可选地,电子设备可以通过显示选择光标来确定用户当前浏览到的第二控件,从而电子设备可以将选择光标选定的第二控件映射到触控区域进行显示。
可选地,选择光标可以随用户在触控区域的滑动操作而移动,当用户操控选择光标移动时,电子设备可以跟随选择光标的移动,将选择光标移动后新选中的第二控件映射到触控区域进行显示。如此,电子设备可以实时将选择光标选定的控件映射到触控区域进行显示。
可选地，以第一界面为浏览界面，浏览界面包括多个第二控件，多个第二控件中包括第一选项控件为例，单手操作方法还可以包括：当选择光标的选择区域中包括第一选项控件时，显示第一选项控件于第一区域。如此，电子设备以单手操作模式显示缩小后的浏览界面时，电子设备可以在浏览界面显示选择光标，该选择光标可以选中浏览界面中的任一选择控件，电子设备通过将选择光标选中的选择控件显示在触控区域，以便用户在触控区域可以直观、精准地单手操控到该选择控件。
在一种可能的实现方式中，在从第一界面中获取非固定显示于第一界面内第一位置的控件，作为第二控件之后，单手操作方法还可以包括：对显示样式相匹配的多个第二控件进行组合处理，得到组合控件；将组合控件所占据的区域，确定为选择光标的选择区域。上述显示选择光标的选择区域内的第二控件于第一区域，包括：显示选择光标的选择区域中的组合控件于第一区域。如此，对于用户界面中存在比较多的随页面浏览而移动的、非固定位置的第二控件，如果按照单个控件逐一映射到触控区域，用户需要频繁执行滑动操作，可能才能操作到想要操控的控件，因此，电子设备可以将这些第二控件进行重新组合，以得到可以一起映射到触控区域显示的组合控件，然后用户可以直接在触控区域内显示的组合控件中，操作到想要操控的某一具体控件。
在一种可能的实现方式中,单手操作方法还可以包括:根据每个第二控件的显示样式,将沿同一方向排列的多个第二控件,作为显示样式相匹配的多个第二控件。可以理解,由于同向排列的控件通常尺寸大小相同,因此电子设备可以将同向排列的多个控件进行重新组合,以得到规则形状的组合控件。
在一种可能的实现方式中,第一界面为应用界面,单手操作方法还可以包括:当第一界面为第一应用程序的应用界面时,将沿水平方向排列的多个第二控件,作为显示样式相匹配的多个第二控件;当第一界面为第二应用程序的应用界面时,将沿垂直方向排列的多个第二控件,作为显示样式相匹配的多个第二控件;其中,第一应用程序不同于第二应用程序。如此,电子设备可以根据不同的应用程序类型,自适应生成匹配的组合控件。
可选地,以第一界面为浏览界面,浏览界面包括多个第二控件,多个第二控件中包括沿水平方向排列的第一选项控件、第二选项控件、第三选项控件为例,单手操作方法还可以包括:对沿水平方向排列的第一选项控件、第二选项控件、第三选项控件进行组合处理,得到选项组合控件;当选择光标的选择区域中包括选项组合控件时,显示选项组合控件于第一区域;其中,选择光标的选择区域的大小与选项组合控件所占用的区域大小相匹配。如此,电子设备以单手操作模式显示缩小后的浏览界面时,电子设备可以将水平方向排成一列的多个控件组合成一个组合控件,该组合控件的大小可以是选择光标的选择区域大小,从而当选择光标移动至该组合控件所在位置时,电子设备可以直接将组合控件映射到触控区域进行显示,以便用户在触控区域可以直观、精准地单手操控到该组合控件中的每个控件。
在一种可能的实现方式中,单手操作方法还可以包括:在检测到组合控件所占用的区域大小与第一区域的大小不匹配时,调整组合控件中每个第二控件的显示样式,得到调整后的组合控件;其中,调整后的组合控件所占用的区域大小与第一区域的大小匹配;显示调整后的组合控件于第一区域。如此,电子设备可以根据触控区域的大小,自适应调整组合控件中每个控件的大小和位置,这样当显示区域内显示的组合控件过小时,映射到触控区域中时可以对组合控件进行放大显示,提升用户在触控区域的操控体验。
在一种可能的实现方式中，上述调整组合控件中每个第二控件的显示样式，包括：根据第一区域的大小，调整组合控件中每个第二控件的显示大小；或者根据第一区域的大小，调整组合控件中相邻两个第二控件之间的显示间距；或者根据第一区域的大小，调整组合控件中每个第二控件的显示位置。如此，电子设备可以根据触控区域的大小，自适应选择调整组合控件中每个控件的大
在一种可能的实现方式中,以第一界面为应用程序的首页为例,上述响应作用于第一区域的触摸操作,对缩小后的第一界面执行与触摸操作对应的功能,包括:响应作用于第一区域的左滑操作,退出应用程序,并显示缩小后的桌面界面于第二区域。如此,用户可以在触控区域执行特定的手势实现特定的操作。
在一种可能的实现方式中,第一界面为应用程序的非首页时,上述响应作用于第一区域的触摸操作,对缩小后的第一界面执行与触摸操作对应的功能,包括:响应作用于第一区域的左滑操作,显示缩小后的第二界面,第二界面为第一界面的上一层级的界面。如此,相同的手势在不同的界面中时所执行的功能可以不同。
在一种可能的实现方式中,上述响应作用于第一区域的触摸操作,对缩小后的第一界面执行与触摸操作对应的功能,包括:检测到作用于第一区域的预设手势操作,预设手势操作用于触发第三界面中的预设功能;响应预设手势操作,将第二区域显示的缩小后的第一界面切换至缩小后的第三界面,并执行预设功能。如此,电子设备可以将特定的手势操作与某个页面中某个功能进行绑定,从而用户在触控区域执行该特定的手势操作时,可以一键打开该页面执行该功能。
在一种可能的实现方式中,上述第二区域包括目标显示区域和第三区域,显示缩小后的第一界面于第二区域,包括:显示缩小后的第一界面于目标显示区域;显示多个图标控件于第三区域;其中,图标控件包括应用程序的图标控件、快捷功能的图标控件中的至少一种。如此,既避免了显示屏因存在有未显示任何内容的区域而导致显示屏出现黑色区域的现象,也实现对电子设备显示屏的整个显示区域的充分利用,避免大屏幕面积浪费,且用户也可以在浏览原用户界面的同时,快速打开其他应用。
在一种可能的实现方式中,上述显示多个图标控件于第三区域,包括:将桌面界面中的多个应用程序的图标控件进行重新排列;显示重新排列后的多个应用程序的图标控件于第三区域。如此,电子设备在进入单手操作模式时,除了将原屏幕显示区域所显示的第一界面缩小显示于该单手操作模式的显示区域,还可以自动将桌面上的应用图标重新排列,显示在剩余的第三区域。
在一种可能的实现方式中,单手操作方法还包括:显示切换控件于第一区域,其中,第一区域的作用区域为目标显示区域;响应作用于第一区域中切换控件的触摸操作,确定第一区域的作用区域从目标显示区域切换至第三区域;响应作用于第一区域的触摸操作,对第三区域中的多个图标控件执行与第一区域的触摸操作对应的功能。如此,电子设备可以在触控区域提供一切换控件,以供用户控制当前触控区域是对原屏幕显示区域所显示的第一界面进行操控,还是对第三区域额外显示的多个图标进行操控。
第二方面,提供了一种电子设备,包括:显示单元、确定单元和执行单元。其中,显示单元,用于显示第一界面。确定单元,用于响应单手操作模式的触发指令,确定显示屏的第一区域和第二区域;其中,第一区域为用户单手握持电子设备时,用户单手的手指在显示屏上能触摸到的区域,第二区域为显示屏上除第一区域以外的剩余区域。确定单元,还用于显示缩小后的第一界面于第二区域。执行单元,用于响应作用于第一区域的触摸操作,对缩小后的第一界面执行与触摸操作对应的功能。
在一种可能的实现方式中,确定单元可以用于:提示用户单手握持电子设备时,使用单手的手指在显示屏上按照指定轨迹滑动;根据单手的手指的滑动,确定单手的手指在显示屏上能触摸到的最大区域;根据最大区域,确定显示屏的第一区域。
在一种可能的实现方式中,上述第一界面可以包括目标控件,上述执行单元可以用于:响应作用于第一区域中目标位置的触摸操作,将作用于第一区域中目标位置的触摸操作,映射为作用于缩小后的第一界面中目标控件的触摸操作;其中,第一区域中的目标位置与第一界面中目标控件所在的位置之间预先建立有坐标映射关系;响应作用于缩小后的第一界面中目标控件的触摸操作,对缩小后的第一界面执行目标控件对应的功能。
可选地,以第一界面为视频播放界面,目标控件为位于视频播放界面的中心位置的播放控件为例,视频播放界面的中心位置与第一区域的中心位置可以预先建立有坐标映射关系,上述执行单元可以用于:响应作用于第一区域中心位置的触摸操作,将作用于第一区域中心位置的触摸操 作,映射为作用于缩小后的视频播放界面中播放控件的触摸操作;响应作用于缩小后的视频播放界面中播放控件的触摸操作,对缩小后的视频播放界面中的视频进行播放。
在一种可能的实现方式中,第一界面可以包括目标控件,上述显示单元还可以用于:显示目标控件于第一区域。上述执行单元可以用于:响应作用于第一区域中目标控件的触摸操作,对缩小后的第一界面执行目标控件对应的功能。
可选地,以第一界面为视频播放界面,目标控件为播放控件为例,上述显示单元还可以用于:显示播放控件于第一区域。上述执行单元可以用于:响应作用于第一区域中播放控件的触摸操作,对缩小后的视频播放界面中的视频进行播放。
在一种可能的实现方式中,上述目标控件为第一控件,电子设备还可以包括:第一获取单元,用于根据第一界面中每个控件的显示样式,从第一界面中获取固定显示于第一界面内第一位置的控件,作为第一控件;其中,第一位置包括顶部位置、底部位置中的至少一种。
可选地,第一控件可以包括标题栏、导航栏、菜单栏中的至少一种。
在一种可能的实现方式中,上述目标控件为第二控件,电子设备还可以包括:第二获取单元,用于根据第一界面中每个控件的显示样式,从第一界面中获取非固定显示于第一界面内第一位置的控件,作为第二控件;其中,第一位置包括顶部位置、底部位置中的至少一种。上述显示单元还可以用于:显示选择光标于缩小后的第一界面,选择光标包括一边界,边界定义了选择光标在缩小后的第一界面中的选择区域,选择区域用于选择至少一个第二控件;显示选择光标的选择区域中的第二控件于第一区域。
可选地,以第一界面为浏览界面,浏览界面包括多个第二控件,多个第二控件中包括第一选项控件为例,上述显示单元还可以用于:当选择光标的选择区域中包括第一选项控件时,显示第一选项控件于第一区域。
在一种可能的实现方式中,电子设备还可以包括:组合单元和区域生成单元。其中,组合单元,用于对显示样式相匹配的多个第二控件进行组合处理,得到组合控件;区域生成单元用于,将组合控件所占据的区域,确定为选择光标的选择区域。上述显示单元还可以用于:显示选择光标的选择区域中的组合控件于第一区域。
在一种可能的实现方式中,电子设备还可以包括:匹配单元,用于根据每个第二控件的显示样式,将沿同一方向排列的多个第二控件,作为显示样式相匹配的多个第二控件。
在一种可能的实现方式中,第一界面为应用界面,上述匹配单元还可以用于:当第一界面为第一应用程序的应用界面时,将沿水平方向排列的多个第二控件,作为显示样式相匹配的多个第二控件;当第一界面为第二应用程序的应用界面时,将沿垂直方向排列的多个第二控件,作为显示样式相匹配的多个第二控件;其中,第一应用程序不同于第二应用程序。
可选地,以第一界面为浏览界面,浏览界面包括多个第二控件,多个第二控件中包括沿水平方向排列的第一选项控件、第二选项控件、第三选项控件为例,上述组合单元可以用于:对沿水平方向排列的第一选项控件、第二选项控件、第三选项控件进行组合处理,得到选项组合控件。上述显示单元还可以用于:当选择光标的选择区域中包括选项组合控件时,显示选项组合控件于第一区域;其中,选择光标的选择区域的大小与选项组合控件所占用的区域大小相匹配。
在一种可能的实现方式中,电子设备还可以包括:调整单元,用于在检测到组合控件所占用的区域大小与第一区域的大小不匹配时,调整组合控件中每个第二控件的显示样式,得到调整后的组合控件;其中,调整后的组合控件所占用的区域大小与第一区域的大小匹配。上述显示单元还可以用于:显示调整后的组合控件于第一区域。
在一种可能的实现方式中,上述调整单元还可以用于:根据第一区域的大小,调整组合控件中每个第二控件的显示大小;或者根据第一区域的大小,调整组合控件中相邻两个第二控件之间的显示间距;或者根据第一区域的大小,调整组合控件中每个第二控件的显示位置。
在一种可能的实现方式中,以第一界面为应用程序的首页为例,上述执行单元可以用于:响应作用于第一区域的左滑操作,退出应用程序,并显示缩小后的桌面界面于第二区域。
在一种可能的实现方式中,第一界面为应用程序的非首页时,上述执行单元可以用于:响应作用于第一区域的左滑操作,显示缩小后的第二界面,第二界面为第一界面的上一层级的界面。
在一种可能的实现方式中,上述执行单元可以用于:检测到作用于第一区域的预设手势操作,预设手势操作用于触发第三界面中的预设功能;响应预设手势操作,将第二区域显示的缩小后的第一界面切换至缩小后的第三界面,并执行预设功能。
在一种可能的实现方式中,上述第二区域包括目标显示区域和第三区域,上述显示单元可以用于:显示缩小后的第一界面于目标显示区域;显示多个图标控件于第三区域;其中,图标控件包括应用程序的图标控件、快捷功能的图标控件中的至少一种。
在一种可能的实现方式中,上述显示单元可以用于:将桌面界面中的多个应用程序的图标控件进行重新排列;显示重新排列后的多个应用程序的图标控件于第三区域。
在一种可能的实现方式中,上述显示单元可以用于:显示切换控件于第一区域,其中,第一区域的作用区域为目标显示区域。电子设备还可以包括:切换单元,用于响应作用于第一区域中切换控件的触摸操作,确定第一区域的作用区域从目标显示区域切换至第三区域。上述执行单元还可以用于:响应作用于第一区域的触摸操作,对第三区域中的多个图标控件执行与第一区域的触摸操作对应的功能。
第三方面,本申请提供了一种电子设备,包括显示屏、一个或多个处理器和一个或多个存储器。该显示屏、一个或多个存储器与一个或多个处理器耦合,一个或多个存储器用于存储计算机程序代码,计算机程序代码包括计算机指令,当一个或多个处理器执行计算机指令时,使得电子设备执行上述第一方面任一项可能的实现中的单手操作方法。
第四方面,本申请提供了一种单手操作装置,该装置包含在电子设备中,该装置具有实现上述第一方面及第一方面的可能实现方式中任一方法中电子设备行为的功能。该功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与上述功能相对应的模块或单元。
第五方面，本申请提供了一种芯片系统，该芯片系统应用于电子设备。该芯片系统包括一个或多个接口电路和一个或多个处理器。该接口电路和处理器通过线路互联。该接口电路用于从电子设备的存储器接收信号，并向处理器发送该信号，该信号包括存储器中存储的计算机指令。当处理器执行计算机指令时，电子设备执行上述第一方面任一项可能的实现中的单手操作方法。
第六方面,本申请提供了一种计算机存储介质,包括计算机指令,当计算机指令在电子设备上运行时,使得电子设备执行上述第一方面任一项可能的实现中的单手操作方法。
第七方面,本申请提供了一种计算机程序产品,当计算机程序产品在电子设备上运行时,使得电子设备执行上述第一方面任一项可能的实现中的单手操作方法。
可以理解地,上述提供的第二方面及其任一种可能的实现的电子设备,第三方面的电子设备、第四方面的单手操作装置,第五方面的芯片***,第六方面的计算机存储介质,第七方面的计算机程序产品所能达到的有益效果,可参考第一方面及其任一种可能的实现中的有益效果,此处不再赘述。
附图说明
图1A为本申请实施例提供的一种电子设备的硬件结构示意图;
图1B为本申请实施例提供的一种电子设备的软件架构实例示意图;
图2为本申请实施例提供的一种右手操作习惯时用户操作电子设备的情景示意图;
图3为本申请实施例提供的电子设备的界面示意图一;
图4为本申请实施例提供的电子设备的界面示意图二;
图5为本申请实施例提供的电子设备的界面示意图三;
图6为本申请实施例提供的电子设备的界面示意图四;
图7为本申请实施例提供的电子设备的界面示意图五;
图8为本申请实施例提供的电子设备的界面示意图六;
图9为本申请实施例提供的电子设备的界面示意图七;
图10为本申请实施例提供的电子设备的界面示意图八;
图11为本申请实施例提供的电子设备的界面示意图九;
图12为本申请实施例提供的电子设备的界面示意图十;
图13为本申请实施例提供的电子设备的界面示意图十一;
图14为本申请实施例提供的电子设备的界面示意图十二;
图15为本申请实施例提供的电子设备的界面示意图十三;
图16为本申请实施例提供的电子设备的界面示意图十四;
图17为本申请实施例提供的电子设备的界面示意图十五;
图18为本申请实施例提供的电子设备的界面示意图十六;
图19为本申请实施例提供的电子设备的界面示意图十七;
图20为本申请实施例提供的电子设备的界面示意图十八;
图21为本申请实施例提供的电子设备的界面示意图十九;
图22为本申请实施例提供的电子设备的界面示意图二十;
图23为本申请实施例提供的电子设备的界面示意图二十一;
图24为本申请实施例提供的一种单手操作方法的方法流程图;
图25为本申请实施例提供的另一种单手操作方法的方法流程图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
本申请实施例提供一种单手操作方法,可以应用于电子设备。其中,电子设备可以是平板、手机、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等具有显示屏的电子设备,本申请实施例对电子设备的具体类型不作任何限制。
示例性的,图1A示出了电子设备100的结构示意图。电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M,重力传感器等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了***的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路 (inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与***设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,外部存储器,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像，视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD)，有机发光二极管(organic light-emitting diode,OLED)，有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED)，柔性发光二极管(flex light-emitting diode,FLED)，Miniled，MicroLed，Micro-oLed，量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中，电子设备100可以包括1个或N个显示屏194，N为大于1的正整数。
在本申请实施例中,显示屏194可以用于显示触控窗口和应用窗口。其中,触控窗口显示于显示屏194上用户单手可触摸到的区域,应用窗口显示于显示屏194上除触控窗口所在区域以外的剩余区域。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信 号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
在本申请实施例中,可使用摄像头193采集用户的手部信息。例如,当用户首次启动“单手操作”功能时,电子设备100提示用户录入手部信息。处理器110在用户输入“打开摄像头193”的指令后,打开摄像头193进行拍摄,以获取用户的手部信息。其中,手部信息可以包括手掌大小、各个手指的长度、各个手指的指纹等信息。本申请实施例中电子设备100获取用户的手部信息,主要是为了得到用户的大拇指长度。
本申请实施例中,大拇指的长度为用户的大拇指在显示屏194上进行触摸操作时,可触到的最远触摸点与握持点之间的距离。其中,握持点可以为用户的手掌与显示屏194边缘相接触的位置点。示例性地,如图2所示,当用户采用图2所示的单手握持电子设备的姿势时,大拇指的最远触摸点201与握持点202之间的距离即为大拇指的长度。
本申请实施例中,电子设备100通过获取用户的手部信息,以获取大拇指的长度,用于在后续开启“单手操作”功能时,确定触控窗口显示的位置和尺寸大小,以保证用户在握持手机时,能够通过作用于该触控窗口内的触摸操作,对应用窗口内的显示内容进行控制。
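为便于理解，这里给出一个示意性的Java代码草图（其中的类名、方法名与示例坐标均为本示例的假设，并非本申请限定的实现），用于说明“大拇指长度”即最远触摸点与握持点之间欧氏距离的一种可能计算方式：

```java
// 示意性草图：由握持点与最远触摸点估算大拇指长度（单位：像素）。
// 类名与方法名均为本示例假设。
public final class ThumbLengthEstimator {

    /** 计算最远触摸点 (farX, farY) 与握持点 (gripX, gripY) 之间的欧氏距离。 */
    public static double thumbLengthPx(float gripX, float gripY,
                                       float farX, float farY) {
        return Math.hypot(farX - gripX, farY - gripY);
    }

    public static void main(String[] args) {
        // 假设握持点位于屏幕右缘 (1080, 1800)，最远触摸点为 (520, 1250)
        System.out.printf("估算的大拇指长度约为 %.1f 像素%n",
                thumbLengthPx(1080f, 1800f, 520f, 1250f));
    }
}
```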
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码，可执行程序代码包括指令。处理器110通过运行存储在内部存储器121的指令，从而执行电子设备100的各种功能应用以及数据处理。内部存储器121可以包括存储程序区和存储数据区。其中，存储程序区可存储操作系统，至少一个功能所需的应用程序(比如声音播放功能，图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据，电话本等)等。此外，内部存储器121可以包括高速随机存取存储器，还可以包括非易失性存储器，例如至少一个磁盘存储器件，闪存器件，通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备100根据压力传感器180A检测触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测 的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池142加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池142的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也称“触控面板”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作(例如长按、上滑、左滑、单击、双击等)。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的位置不同。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195，或从SIM卡接口195拔出，实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口，N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡，Micro SIM卡，SIM卡等。同一个SIM卡接口195可以同时插入多张卡。多张卡的类型可以相同，也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互，实现通话以及数据通信等功能。在一些实施例中，电子设备100采用eSIM，即：嵌入式SIM卡。eSIM卡可以嵌在电子设备100中，不能和电子设备100分离。
电子设备100可以是搭载安卓、微软或者其它操作系统的终端设备，本申请实施例对电子设备搭载的操作系统不作限制。
电子设备100的软件系统可以采用分层架构，事件驱动架构，微核架构，微服务架构，或云架构。本申请实施例以分层架构的系统为例，示例性说明电子设备100的软件结构。
图1B是本申请实施例的电子设备100的软件结构框图。分层架构将软件分成若干个层，每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中，将Android系统分为四层，从上至下分别为应用程序层，应用程序框架层，安卓运行时(Android runtime)和系统库，以及内核层。应用程序层可以包括一系列应用程序包。
如图1B所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序(application,APP)。为方便描述,以下将应用程序简称为应用。电子设备上的应用可以是原生的应用,也可以是第三方应用,本申请实施例不予限定。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图1B所示，应用程序框架层可以包括窗口管理服务器(window manager service,WMS)，活动管理服务器(activity manager service,AMS)、输入事件管理服务器(input manager service,IMS)、内容提供器，视图系统，电话管理器，资源管理器，通知管理器等。
其中,窗口管理服务器用于管理窗口程序。窗口管理服务器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
活动管理服务器(activity manager service,AMS)用于负责管理Activity，负责系统中各组件的启动、切换、调度及应用程序的管理和调度等工作。
输入事件管理服务器(input manager service,IMS)可以用于对原始输入事件进行翻译、封装等处理,得到包含更多信息的输入事件,并发送到窗口管理服务器,窗口管理服务器中存储有每个应用程序的可点击区域(比如控件)、焦点窗口的位置信息等。因此,窗口管理服务器可以正确的将输入事件分发到指定的控件或者焦点窗口。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图(view)系统包括可视控件，例如显示文字的控件，显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成。例如，包括短信通知图标的显示界面，可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息，可以用于传达告知类型的消息，可以短暂停留后自动消失，无需用户交互。比如通知管理器被用于告知下载完成，消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知，例如后台运行的应用程序的通知，还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息，发出提示音，电子设备振动，指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android Runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如：表面管理器(surface manager)，媒体库(Media Libraries)，三维图形处理库(例如：OpenGL ES)，2D图形引擎(例如：SGL)等。
表面管理器用于对显示子系统进行管理，并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层可以包含显示驱动,输入/输出设备驱动(例如,键盘、触摸屏、耳机、扬声器、麦克风等),设备节点,摄像头驱动,音频驱动以及传感器驱动等。用户通过输入设备进行输入操作,内核层可以根据输入操作产生相应的原始输入事件,并存储在设备节点中。
基于上文对本申请涉及的电子设备的软硬件的介绍,接下来将结合附图详细介绍本申请提供的单手操作方法流程。
目前，用户在使用电子设备时，如果用户的手掌尺寸不够大，则用户通常需要使用双手操作电子设备才能获得较好的操作体验。但这种操作方式比较单一。且在实际使用中，用户有时也需要解放其中一只手去做其他事情，如拎包、抓地铁扶手等。此时用户就需要单手操作电子设备。然而随着电子设备的显示屏越来越大，显示屏显示的界面中存在部分内容，是用户单手不能够操作到的，用户单手操作电子设备变得越来越困难，很多时候用户需要使用两只手同时操作才能完成相应的功能。
作为一种解决方案,开发人员可以重新设计应用程序的布局,以将应用界面中的功能按键的显示位置设于显示屏的右下角或左下角,如此,用户单手使用电子设备时可以比较方便地触摸到功能按键。然而,这种方案需要重新设计所有的应用程序,工作量巨大。
作为另一种解决方案,开发人员可以将下拉窗口转变为上拉窗口,如此,用户单手使用电子设备时,对于原本需要用户在显示屏顶部区域进行的触摸操作,用户可以比较方便在显示屏底部区域进行。然而,这种方案不仅容易导致用户误触,也无法解决屏幕显示的部分内容用户单手可 能操作不到的问题。
为了解决上述问题,本申请实施例提供了一种单手操作方法。电子设备在通过显示屏的屏幕显示区域显示第一界面时,若检测到进入单手操作模式的指令,则电子设备可以将显示屏上用户单手可触摸到的区域(即电子设备显示屏的部分区域),作为单手操作模式的触控区域,以接收用户单手的触摸操作,同时将显示屏上除该触控区域以外的剩余区域,作为单手操作模式的显示区域,以将显示屏的原屏幕显示区域以及原屏幕显示区域所显示的第一界面缩小显示于该单手操作模式的显示区域。然后电子设备可以感测用户在单手操作模式的触控区域内的触摸操作,以根据在该触控区域内感测到的触摸操作,对应控制单手操作模式的显示区域中所显示的缩小后的原屏幕显示区域以及原屏幕显示区域中显示的第一界面。如此,用户通过在触控区域内进行触摸操作,便可对电子设备缩小显示的原有界面进行全范围的操作。在无需应用程序新增开发或适配的情况下,解决了用户单手在电子设备上进行操作时屏幕显示的部分内容可能手指触摸不到的问题。
也就是说,在用户单手使用电子设备的过程中,如果电子设备显示的界面中,存在部分内容是用户单手不能够操作到的内容时,用户可以开启电子设备的单手操作模式。一旦触发电子设备的单手操作模式,电子设备可将显示屏划分为两个区域,一个是单手操作模式的触控区域,另一个是单手操作模式的显示区域。其中,单手操作模式的触控区域可以是用户单手操作电子设备时,用户的手指在显示屏中能够触摸到或者容易触摸到的区域,单手操作模式的显示区域可以是显示屏中除前述触控区域以外的剩余区域。
然后电子设备可将显示屏上原本显示的内容进行相应倍数地缩小以集中显示到该单手操作模式的显示区域中,并建立单手操作模式的操作区域与单手操作模式的显示区域之间的映射关系,从而当电子设备检测到在单手操作模式的触控区域的触控操作之后,电子设备可根据该映射关系,执行该单手操作模式的显示区域被触控时所需执行的功能。
可以理解,电子设备处于正常模式时,通常显示屏的整个屏幕显示区域都用于显示内容,而电子设备从正常模式切换到单手操作模式时,电子设备的显示屏上会出现一个单手操作模式的显示区域。由于单手操作模式的显示区域的尺寸比显示屏的整个屏幕显示区域的尺寸要小,因此,需要将显示屏原本整个屏幕显示区域及其中的显示内容进行缩小,以确保所有在正常模式中的显示均包括在单手操作模式的显示区域内,正常模式中应当显示的信息没有缺失。
在一些实施例中,电子设备进入单手操作模式后,电子设备也可以将单手操作模式的触控区域进行显示,以让用户知道单手可触控的区域在显示屏上的位置,从而用户可以准确在单手操作模式的触控区域内实现对单手操作模式的显示区域的触控操作。电子设备也可以将单手操作模式的显示区域进行显示,以让用户知道单手操作时用户界面在显示屏上的位置。
本申请实施例中,电子设备可以以窗口的形式将单手操作模式的操作区域进行显示,该显示的窗口可以称为触控窗口。该触控窗口的位置和大小与单手操作模式的操作区域的位置和大小一致。也即,一旦触发电子设备的单手操作模式,电子设备将会在显示屏的单手操作模式的操作区域显示触控窗口,并会在显示屏的单手操作模式的显示区域显示缩小后的原显示区域及其中的显示内容。
可选地,电子设备也可通过高亮显示,或者,通过预设的颜色显示等方式,实现对单手操作模式的触控区域的突出显示。可选地,电子设备也可通过高亮显示,或者,通过预设的颜色显示等方式,实现对单手操作模式的显示区域的突出显示。本申请实施例对此不作限定。
以下将以电子设备为手机为例,对本申请实施例提供的技术方案进行具体阐述。
本申请实施例中，单手操作模式的触控区域的大小和位置可以是电子设备的操作系统预先设置的，例如操作系统预先设置触控区域的坐标、尺寸中的至少一个，生成相应的触控区域的配置文件，使得电子设备调用上述配置文件的时候，将触控区域显示在电子设备的显示屏上。例如，电子设备可以根据配置文件，将触控区域显示在显示屏的左上角、左下角、右上角、右下角中的至少一个角落，并且尺寸小于显示屏的显示尺寸。
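作为一个非限定性的示意，下述Java代码草图给出了“按预置配置在某个角落确定触控区域”的一种可能实现（其中Corner、Config等类型与字段均为本示例的假设，并非既有操作系统接口）：

```java
// 示意性草图：根据预置的角落与尺寸配置，解析出触控区域矩形。
public final class TouchRegionPreset {
    public enum Corner { TOP_LEFT, TOP_RIGHT, BOTTOM_LEFT, BOTTOM_RIGHT }

    /** 简化的配置项：角落位置 + 触控区域宽高（应小于屏幕尺寸）。 */
    public record Config(Corner corner, int width, int height) {}

    public record Rect(int left, int top, int width, int height) {}

    public static Rect resolve(Config cfg, int screenW, int screenH) {
        int left = switch (cfg.corner()) {
            case TOP_LEFT, BOTTOM_LEFT -> 0;
            case TOP_RIGHT, BOTTOM_RIGHT -> screenW - cfg.width();
        };
        int top = switch (cfg.corner()) {
            case TOP_LEFT, TOP_RIGHT -> 0;
            case BOTTOM_LEFT, BOTTOM_RIGHT -> screenH - cfg.height();
        };
        return new Rect(left, top, cfg.width(), cfg.height());
    }
}
```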
本申请实施例中,用户还可以自定义单手操作模式的触控区域的大小和位置。
可以理解,不同的手机的尺寸不一致,且不同用户使用的手掌大小不一致,导致每个用户单手大拇指操作的范围也不相同。因此,本申请实施例提供了单手操作模式的触控窗口的初始化设 置功能。用户正常单手握持手机,手机通过对用户的单手握持姿态进行分析,以根据用户单手大拇指在显示屏上可触控到的位置范围,确定出适配于该用户的单手操作模式的触控窗口在显示屏中的所在区域。如此,手机可以确保手机设置的触控区域更符合每个用户的单手操作习惯,用户在单手握持手机时,能够对触控区域的各个位置进行操作。
可选地,当手机首次启动“单手操作模式”功能时,手机可以执行单手操作模式的触控区域的初始化设置。此时,手机可以对用户的单手握持姿态进行分析,以确定出用户单手在手机显示屏上可触摸到的区域,从而手机可以根据用户单手可触摸到的显示屏的区域,自适应调整单手操作模式的触控区域的位置和大小。可选地,当用户后续需要对触控区域的位置和大小重新进行调整时,用户也可以再次操作手机执行单手操作模式的触控区域的初始化设置。
在一些实施例中,手机可以显示提示动画,以提醒用户手机当前已进入或者退出单手操作模式。
示例性地,请参阅图3,在用户首次开启“单手操作模式”功能时,手机的显示屏可以显示一个用于指示用户在显示屏上划一个圆弧的提示。可以理解,图3示出了右手握持手机时的圆弧提示,当用户习惯左手握持手机时,手机的显示屏也可以显示图4所示的圆弧提示。
可选地，在用户首次开启“单手操作”功能时，手机可以自动识别用户的单手握持姿势，以根据识别出的左手握持姿势，显示图4所示的圆弧提示；根据识别出的右手握持姿势，显示图3所示的圆弧提示。下述均以单手握持姿势为右手握持姿势为例，对本申请实施例的技术方案进行介绍。
可选地,手机可以根据温度传感器检测手的温度、压力传感器检测握持压力等方式来确定用户的单手握持姿势。本申请实施例对手机识别单手握持姿势的方式并不做限定。
示例性地,请参阅图5,当用户采用图2所示的右手握持姿势握持手机,并按照指示,用大拇指在显示屏上划一个圆弧(如图5中曲线501所示)后,手机可以接收到用户的圆弧操作对应的触摸位置。然后手机根据该触摸位置,确定出用户单手可触摸到的最大区域502。
可选地,手机可以根据用户单手可触摸到的最大区域502,确定适配于该用户的单手操作模式的触控区域503的位置和大小。本申请实施例对触控区域503的形状并不作限定,例如可以是方形、圆形、椭圆形等。作为一种方式,以触控区域的形状为方形为例,手机可以根据用户单手可触摸到的最大区域502,确定触控区域的长和宽的最大值,从而可以确定出触控区域的位置和大小。作为另一种方式,手机也可以将用户单手可触摸到的最大区域502进行显示,以供用户自定义触控区域的形状和大小。
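下面给出一个示意性的代码草图（仅为一种可能思路；采样点结构、方法名与“贴向右下角”的取舍均为本示例的假设），说明如何由圆弧滑动的采样点求外接矩形作为单手可触摸到的最大区域，并在其中取方形触控区域：

```java
import java.util.List;

// 示意性草图：由滑动轨迹采样点确定可触及最大区域及方形触控区域。
public final class ReachableAreaCalculator {
    public record Point(float x, float y) {}
    public record Rect(float left, float top, float width, float height) {}

    /** 取所有采样点的外接矩形，近似为单手可触摸到的最大区域。 */
    public static Rect boundingBox(List<Point> arcSamples) {
        float minX = Float.MAX_VALUE, minY = Float.MAX_VALUE;
        float maxX = -Float.MAX_VALUE, maxY = -Float.MAX_VALUE;
        for (Point p : arcSamples) {
            minX = Math.min(minX, p.x());
            minY = Math.min(minY, p.y());
            maxX = Math.max(maxX, p.x());
            maxY = Math.max(maxY, p.y());
        }
        return new Rect(minX, minY, maxX - minX, maxY - minY);
    }

    /** 在最大区域内取一个方形触控区域；右手握持时贴向区域右下角。 */
    public static Rect squareTouchRegion(Rect maxArea, boolean rightHanded) {
        float side = Math.min(maxArea.width(), maxArea.height());
        float left = rightHanded
                ? maxArea.left() + maxArea.width() - side
                : maxArea.left();
        float top = maxArea.top() + maxArea.height() - side;
        return new Rect(left, top, side, side);
    }
}
```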
在一些实施例中,手机也可以获取用户的手部信息,以根据获取到的手部信息,计算出用户的大拇指长度。然后手机可以根据用户的大拇指长度以及用户的单手握持姿势,计算出用户单手可触摸到的最大区域。从而手机可以根据用户单手可触摸到的最大区域,确定触控区域的位置和大小。可选地,在用户开启“单手操作模式”功能时,手机可以提醒用户录入手部信息。手机在录入用户的手部信息后,手机可以根据获取的手部信息,计算出用户的大拇指长度。
本申请实施例中,手机在确定出单手操作模式的触控区域的位置和大小后,手机可以将显示屏整个屏幕显示区域显示的界面内容,缩小显示至显示屏的剩余区域。其中,该剩余区域为显示屏中除该触控区域以外的其他区域。示例性地,请参阅图6,手机根据用户单手可触摸到的最大区域,确定出单手操作模式的触控区域601的位置和大小后,手机可以将显示屏中除触控区域601以外的剩余区域602,作为单手操作模式的显示区域,显示原本整个屏幕显示区域显示的界面内容。从而用户可通过在触控区域601内进行操作,来对剩余区域602内显示的界面内容进行全范围的操作。
示例性地,图7中的(a)示出了一种手机处于正常模式时通过显示屏显示桌面界面时的界面示意图。其中,正常模式通常是指用户界面按照手机的显示屏的大小设计的正常的显示方式。手机在正常模式下时,用户界面通常显示在整个屏幕上。如图7中的(a)所示,手机处于正常模式时,手机通过显示屏显示的桌面界面701布满显示屏的整个屏幕显示区域。
手机在正常显示桌面界面701时,若手机检测到单手操作模式的触发操作,则如图7中的(b)所示,手机可以响应该触发操作,在显示屏上显示单手操作模式的触控区域702,并将显示屏原 本整个屏幕显示区域显示的桌面界面701缩小显示到显示屏的剩余区域703。该剩余区域703为显示屏中除该触控区域702以外的其他区域。如图7中的(b)所示,显示屏的剩余区域703内显示有等比缩小后的桌面界面704。从而用户可通过在触控区域702内进行操作,来对剩余区域703显示的等比缩小后的桌面界面704进行全范围的操作。
可选地,手机的设置页面可以增加单手操作模式的功能开关控件,单手操作模式的触发操作,可以是用户对该单手操作模式的功能开关控件的点击操作。可选地,单手操作模式的触发操作,也可以为用户对手机上特定的物理按键或按键组合的按压操作,还可以为用户的语音指示、用户在手机的屏幕上的快捷手势操作(如画一个圆圈轨迹的手势或者作用于屏幕右下角的长按手势)或用户的隔空手势操作。本申请实施例对单手操作模式的触发操作并不作限定。
可选地,当单手操作模式的触发操作为手机首次检测到时,手机可以进入单手操作模式的触控区域的初始化设置模式,以确定出单手操作模式的触控区域的位置和大小。然后手机将单手操作模式的触控区域的位置和大小进行记录,便于后续手机再次检测到单手操作模式的触发操作时,可以直接在显示屏上显示单手操作模式的触控区域。
手机根据单手操作模式的触控区域在显示屏中的位置和大小,可以确定剩余区域在显示屏中的位置和大小。可选地,整个剩余区域均可作为单手操作模式的显示区域。
示例性地,图8中的(a)示出了一种手机处于正常模式时通过显示屏显示视频应用时的界面示意图。如图8中的(a)所示,手机处于正常模式时,手机通过显示屏显示的视频应用的应用界面801布满显示屏的整个屏幕显示区域。
手机在正常显示视频应用的应用界面801时,若手机检测到单手操作模式的触发操作,则如图8中的(b)所示,手机可以响应该触发操作,在显示屏上显示单手操作模式的触控区域802,并将显示屏原本整个屏幕显示区域显示的视频应用的应用界面801缩小显示到显示屏的剩余区域803。该剩余区域803为显示屏中除该触控区域802以外的其他区域。如图8中的(b)所示,显示屏的剩余区域803内显示有等比缩小后的视频应用的应用界面804。从而用户可通过在触控区域802内进行操作,来对剩余区域803显示的等比缩小后的视频应用的应用界面804进行全范围的操作。
可选地,若显示屏原本整个屏幕显示区域显示的用户界面,包括多个应用的应用窗口,则手机进入单手操作模式时,手机也可以将显示屏原本整个屏幕显示区域显示的包括多个应用的应用窗口的用户界面,缩小显示至在单手操作模式的显示区域内。
示例性地,图9中的(a)示出了一种手机处于正常模式时通过显示屏显示分屏窗口的界面示意图。如图9中的(a)所示,手机处于正常模式时,手机通过显示屏显示的用户界面901包括视频播放应用和音乐播放应用的上下分屏窗口时,视频播放应用的应用窗口和音乐播放应用的应用窗口各占据手机显示屏的一半,且两个应用窗口之间未重叠。
手机正常显示视频播放应用和音乐播放应用的分屏窗口时,若手机检测到单手操作模式的触发操作,则如图9中的(b)所示,手机可以响应该触发操作,在显示屏上显示单手操作模式的触控区域902,并将显示屏原本整个屏幕显示区域显示的用户界面901缩小显示到显示屏的剩余区域903。该剩余区域903为显示屏中除该触控区域902以外的其他区域。如图9中的(b)所示,显示屏的剩余区域903内显示有等比缩小后的用户界面904,该用户界面904中的视频播放应用和音乐播放应用的分屏窗口同样也被等比缩小。从而用户可通过在触控区域902内进行操作,来对剩余区域903显示的等比缩小后的包括视频播放应用和音乐播放应用的分屏窗口的用户界面904进行全范围的操作。
可以理解,用户界面包括多个应用的应用窗口可以不局限于分屏场景,悬浮屏、迷你浮窗、多任务窗等其他场景也同样适用,本申请实施例对此并不作限定。
可选地,手机也可以根据剩余区域的大小,将手机显示屏原本整个屏幕显示区域显示的界面内容重新排列组合,并将重新排列组合后的界面内容显示于该剩余区域。可以理解,重新排列组合后的界面内容可以铺满整个剩余区域进行显示。
示例性地,以桌面界面为例,如图10中的(a)所示,手机处于正常模式时,手机通过显示屏显示的桌面界面1001布满显示屏的整个屏幕显示区域。其中,桌面界面1001中包括多个应用 的应用图标。
手机在正常显示桌面界面1001时,若手机检测到单手操作模式的触发操作,则如图10中的(b)所示,手机可以响应该触发操作,在显示屏上显示单手操作模式的触控区域1002,并将显示屏原本整个屏幕显示区域显示的桌面界面1001中的多个应用的应用图标进行重新排列组合,以将重新排列组合后的多个应用的应用图标显示于显示屏的剩余区域1003。该剩余区域1003为显示屏中除该触控区域1002以外的其他区域。如图10中的(b)所示,显示屏的剩余区域1003内显示有应用图标重新排列组合后的新的桌面界面1004,该新的桌面界面1004中的内容排布与原桌面界面1001的内容排布不同,且该桌面界面1004布满显示屏的整个剩余区域。同理,用户可通过在触控区域1002内进行操作,来对剩余区域1003显示的新的桌面界面1004进行全范围的操作。
可选地,由于显示屏显示的用户界面通常都与显示屏的屏幕长宽比例贴合,因此,手机可以根据剩余区域的位置和大小,确定剩余区域中,满足显示屏的屏幕长宽比例的最大区域,作为用于显示原本整个屏幕显示区域显示的用户界面的目标显示区域。如此,手机可以将手机显示屏原本整个屏幕显示区域显示的用户界面,等比缩小至该目标显示区域显示,同时缩小后的用户界面能够铺满该目标显示区域。
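为说明“在剩余区域内确定满足屏幕长宽比的最大目标显示区域”这一步，下面给出一个示意性的Java草图（其中Rect结构、对齐策略等均为本示例假设）：

```java
// 示意性草图：在剩余区域内求与屏幕长宽比一致的最大目标显示区域，
// 并计算界面等比缩小的目标比例。
public final class TargetDisplayArea {
    public record Rect(float left, float top, float width, float height) {}

    // remaining：剩余区域；screenW/screenH：原屏幕显示区域尺寸；
    // alignRight：右手握持时目标显示区域贴向显示屏右缘。
    public static Rect fit(Rect remaining, float screenW, float screenH,
                           boolean alignRight) {
        float ratio = screenW / screenH;          // 屏幕长宽比
        float w = remaining.width();
        float h = w / ratio;
        if (h > remaining.height()) {             // 受高度约束时改按高度计算
            h = remaining.height();
            w = h * ratio;
        }
        float left = alignRight
                ? remaining.left() + remaining.width() - w
                : remaining.left();
        return new Rect(left, remaining.top(), w, h);
    }

    /** 目标缩小比例 = 目标显示区域宽度 / 原屏幕宽度。 */
    public static float scaleOf(Rect target, float screenW) {
        return target.width() / screenW;
    }
}
```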
可选地，当手机利用剩余区域中的目标显示区域显示等比缩小后的用户界面后，若剩余区域中除了目标显示区域外还有未显示内容的其他区域，则手机也可以将桌面界面显示的多个应用的应用图标重新排列，显示于该其他区域。如此，用户在通过触控区域对目标显示区域内显示的原用户界面进行操作时，也可以通过触控区域对其他区域内显示的多个应用图标进行操作。如此，既避免了显示屏因存在有未显示任何内容的区域而导致显示屏出现黑色区域的现象，也实现对手机显示屏的整个显示区域的充分利用，避免大屏幕面积浪费，且用户也可以在浏览原用户界面的同时，快速打开其他应用。
示例性地,请参阅图11,手机确定出单手操作模式的触控区域1101的位置和大小后,手机可以将显示屏中除触控区域1101以外的剩余区域1102,作为单手操作模式的显示区域。然后手机可以根据剩余区域1102的位置和大小,确定剩余区域1102中,满足显示屏的屏幕长宽比例的最大区域,作为用于显示原本整个屏幕显示区域显示的用户界面的目标显示区域1103。此时,如图11所示,剩余区域1102中除了目标显示区域1103外还有未显示内容的其他区域1104。因此,本申请实施例中,手机可以将桌面界面显示的多个应用的应用图标重新排列,以显示于该其他区域1104。实现对手机显示屏的整个显示区域的充分利用。
作为一种示例,图12中的(a)示出了一种手机处于单手操作模式时通过显示屏显示视频应用时的界面示意图。如图12中的(a)所示,手机进入单手操作模式后,手机可以在显示屏上显示单手操作模式的触控区域1201,在显示屏的剩余区域显示等比缩小后的视频应用的应用界面1202,该剩余区域为显示屏中除该触控区域1201以外的其他区域。由于视频应用的应用界面1202并未铺满显示屏的整个剩余区域,该剩余区域还存在未显示有内容的左侧区域1203,因此,如图12中的(a)所示,手机进入单手操作模式后,手机同时还可以在显示屏的左侧区域1203显示多个应用图标。
可选地,手机也可以在显示屏的左侧区域以切换列表的方式将较多的应用图标进行显示,用户可以通过对触控区域的触控,来对该切换列表进行上下滑动控制,以使显示屏的左侧区域显示出处于隐藏显示的应用图标。
在一些实施例中,剩余区域包括上述目标显示区域和其他区域时,手机可以根据区域切换指令,确定触控区域当前可以对目标显示区域和其他区域中的哪一个区域进行操控。
可选地,区域切换指令可以是用户在触控区域上的预设手势操作,如在触控区域底部的横向滑动操作,本申请实施例对预设手势操作并不作限定。作为一种示例,用户当前利用触控区域对目标显示区域内显示的等比缩小后的用户界面进行操控时,若用户在触控区域底部执行从右到左的横向滑动手势,则手机可以将触控区域的作用范围从目标显示区域切换到其他区域,此时,手机可以根据用户作用于触控区域的触摸操作,对其他区域内显示的内容执行与该触摸操作对应的功能。若用户在触控区域底部执行从左到右的横向滑动手势,则手机可以将触控区域的作用范围 从其他区域切换回目标显示区域,此时,手机可以根据用户作用于触控区域的触摸操作,对目标显示区域内显示的等比缩小后的用户界面执行与该触摸操作对应的功能。
可选地,手机可以利用触控区域显示一控件,用于实现目标显示区域与其他区域的操控切换,此时区域切换指令也可以是用户对该控件的点击操作。作为一种示例,如图12中的(b)所示,手机可以在触控区域1201显示控件1204。当用户点击该控件1204时,触控区域的作用范围可以从视频应用的应用界面1202切换到左侧区域1203内显示的多个应用图标,此时手机可以根据用户作用于触控区域1201的触摸操作,对左侧区域1203内显示的多个应用图标执行与该触摸操作对应的功能。当用户再次点击该控件1204时,触控区域的作用范围可以从左侧区域1203内显示的多个应用图标切换回视频应用的应用界面1202,此时手机可以根据用户作用于触控区域1201的触摸操作,对视频应用的应用界面1202执行与该触摸操作对应的功能。
可选地,若剩余区域中除了目标显示区域外还有未显示内容的其他区域,手机也可以将用户频繁使用的应用的应用图标显示于该其他区域,或者用户可以自定义显示在该其他区域的多个应用的应用图标。本申请实施例并不作限定。
在一些实施例中,当单手操作模式的触控区域的位置在显示屏中比较靠上时,剩余区域也可以包括显示屏的底部区域。由于该底部区域也未显示内容,因为为了避免屏幕浪费,手机也可以在该底部区域显示多个应用的应用图标。如此,用户在通过触控区域对目标显示区域内显示的原用户界面进行操作时,也可以通过触控区域对底部区域内显示的多个应用图标进行操作。如此,既避免了显示屏因存在有未显示任何内容的区域而导致显示屏出现黑色区域的现象,也实现对手机显示屏的整个显示区域的充分利用,避免大屏幕面积浪费,且用户也可以在浏览原用户界面的同时,快速打开其他应用。
示例性地,请参阅图13,图13示出了一种手机处于单手操作模式时通过显示屏显示视频应用时的界面示意图。如图13所示,手机进入单手操作模式后,手机可以在显示屏上显示单手操作模式的触控区域1301,在显示屏的剩余区域显示等比缩小后的视频应用的应用界面1302,该剩余区域为显示屏中除该触控区域1301以外的其他区域。由于视频应用的应用界面1302并未铺满显示屏的整个剩余区域,该剩余区域还存在未显示有内容的左侧区域1303和底部区域1304,因此,如图13所示,手机进入单手操作模式后,手机同时还可以在显示屏的左侧区域1303和底部区域1304显示多个应用图标。
可选地,由于有的用户单手可以触摸到显示屏的底部区域,因此,用户也可以不通过触控区域对底部区域内显示的多个应用图标进行触控操作,而是直接对底部区域内的多个应用图标进行触控操作。
可选地,手机也可以在显示屏的底部区域以切换列表的方式将较多的应用图标进行显示,用户可以通过对触控区域的触控,来对该切换列表进行左右滑动控制,以使显示屏的底部区域显示出处于隐藏显示的应用图标。
可选地,剩余区域中未显示内容的底部区域或左侧区域也可以不显示应用图标,而是显示多个快捷功能,例如截图、分享、扫一扫、快捷支付、健康码等快捷功能。本申请实施例对此并不作限定。
本申请实施例中,手机在检测到用户作用于触控区域的触摸操作时,可以响应该触摸操作,对剩余区域显示的缩小后的原整个屏幕显示区域及其中的用户界面执行与该触摸操作对应的功能。其中,触摸操作可以是上滑、下滑、左滑、右滑、点击、双击、长按等常用触摸操作,也可以是画圆、画勾√、画叉×等滑动手势。本申请实施例对此不作限定。
示例性地,以左滑操作为例,图14中的(a)示出了一种手机处于单手操作模式时通过显示屏显示桌面界面时的界面示意图。如图14中的(a)所示,手机在显示屏上显示单手操作模式的触控区域1401,并在显示屏上除该触控区域1401以外的剩余区域,显示缩小后的原本整个屏幕显示区域及原本整个屏幕显示区域显示的桌面界面1402时,若手机检测到用户作用于触控区域的左滑操作时,则手机响应该触控区域的左滑操作,将该作用于触控区域的左滑操作映射为作用于剩余区域所显示的缩小后的原本整个屏幕显示区域及其中显示的桌面界面1402的左滑操作,从而手机可以对缩小后的原本整个屏幕显示区域及其中显示的桌面界面1402执行与该左滑操作对应 的功能。如图14中的(b)所示,手机可以对缩小后的原本整个屏幕显示区域及其中显示的桌面界面1402执行与该左滑操作对应的桌面界面切换功能,从而缩小后的原本整个屏幕显示区域显示新的桌面界面1403(下一页桌面界面)。
示例性地,以左滑操作为例,图15中的(a)示出了一种手机处于单手操作模式时通过显示屏显示视频应用时的界面示意图。如图15中的(a)所示,手机在显示屏上显示单手操作模式的触控区域1501,并在显示屏上除该触控区域1501以外的剩余区域,显示缩小后的原本整个屏幕显示区域及原本整个屏幕显示区域显示的视频应用的应用界面1502时,若手机检测到用户作用于触控区域的左滑操作时,则手机响应该触控区域的左滑操作,将该作用于触控区域的左滑操作映射为,作用于剩余区域所显示的缩小后的原本整个屏幕显示区域及其中显示的视频应用的应用界面1502的左滑操作,从而手机可以对缩小后的原本整个屏幕显示区域及其中显示的视频应用的应用界面1502执行与该左滑操作对应的功能。如图15中的(b)所示,手机可以对缩小后的原本整个屏幕显示区域及其中显示的视频应用的应用界面1502执行与该左滑操作对应的视频应用退出功能,从而缩小后的原本整个屏幕显示区域显示视频应用退出后回到的桌面主界面1503。
在一些实施例中,手机利用剩余区域显示等比缩小后的用户界面后,若手机检测到对该用户界面的确认指令,则手机可以认为用户需要进一步操控该用户界面中的具体内容,此时手机可以根据用户作用于触控区域的触摸操作,对该用户界面中的某个内容或某部分内容执行与该触摸操作对应的功能。
可选地,手机可以利用触控区域显示一切换控件,此时,对用户界面的确认指令可以是用户对该切换控件的点击操作。如此,用户可以通过点击该切换控件,实现对当前显示的用户界面的选中,以及实现对该用户界面中的具体内容的进一步操控。可选地,用户也可以再次点击该切换控件,实现对当前显示的用户界面的取消选中,以取消对该用户界面中的具体内容的进一步操控。
可选地,对用户界面的确认指令也可以是用户在触控区域上的预设手势操作,该预设手势操作可以是双击手势、画勾“√”等,本申请实施例对其并不作限定。以双击手势为例,手机利用剩余区域显示等比缩小后的用户界面后,用户可以通过在触控区域上执行双击手势操作,实现对当前显示的用户界面的选中,以及实现对该用户界面中的具体内容的进一步操控。可选地,用户也可以再次在触控区域上执行双击手势操作,实现对当前显示的用户界面的取消选中,以取消对该用户界面中的具体内容的进一步操控。在一些实施例中,为确保用户通过触控区域的触摸操作,能够精确定位和操控到剩余区域内显示的某个内容,手机也可以确定触控区域在剩余区域的作用位置。
作为一种方式,手机可以在剩余区域内显示一个光标,该光标的位置可以用于提示用户手机当前定位的内容的位置。用户可以通过在触控区域的触摸操作(如上滑、下滑、左滑、右滑),来控制剩余区域内光标的移动,以控制剩余区域内的光标移动到自身想要定位和操控的某个内容的位置,从而用户可以进一步通过在触控区域的触摸操作,实现对该内容精准控制。
示例性地，图16中的(a)示出了一种手机处于单手操作模式时通过显示屏显示桌面界面时的界面示意图。如图16中的(a)所示，手机在显示屏上显示单手操作模式的触控区域1601，并在显示屏上除该触控区域1601以外的剩余区域，显示缩小后的原本整个屏幕显示区域及原本整个屏幕显示区域显示的桌面界面1602时，若用户在触控区域1601上执行双击手势操作，则手机可以认为用户需要进一步操控该桌面界面1602中的具体内容，此时手机响应该双击手势操作，手机可以在桌面界面1602上显示一个光标1603，该光标可以用于提示用户手机当前选中的某个内容。可选地，手机可以将原本整个屏幕显示区域的左上角对应的界面内容作为光标的初始停留位置。如图16中的(a)所示，光标1603的初始停留位置可以为桌面界面1602左上角的第一个应用图标所在的位置。
当手机检测到用户作用于触控区域的右滑操作时,则手机响应该触控区域的右滑操作,在剩余区域内控制光标1603向右移动,以选中下一个内容。如图16中的(b)所示,光标1603可以移动至桌面界面1602左上角第二个应用图标所在的位置。如此,用户可以通过在触控区域1601的上滑、下滑、左滑、右滑等触摸操作,来控制剩余区域内光标1603的移动,以控制剩余区域内的光标1603移动到自身想要定位和操控的某个应用图标的位置,从而用户可以再在触控区域内执 行点击操作,触发手机打开光标1603当前选中的应用图标对应的应用。
可以理解，当剩余区域内显示的内容比较多时，用户可能需要在触控区域执行很多次触摸操作，才能控制光标在剩余区域内逐步移动到想要操控的某个内容位置处。尤其对于大屏手机来说，显示屏原本显示的内容就比较多，在单手操作模式下缩小后集中显示到剩余区域时，显示内容的尺寸也会相应缩小，显示内容也会更密集，既不方便用户寻找内容，也增加了用户通过在触控区域的触摸操作，将剩余区域内的光标精确定位到某个具体位置的难度。
因此，在一些实施例中，手机可以在剩余区域内显示一个内容选择框，该内容选择框的区域范围可以是触控区域在剩余区域的作用范围。其中，该内容选择框所框定的区域范围内可以包括剩余区域内显示的多个内容，从而用户可以通过在触控区域的触摸操作（如上滑、下滑、左滑、右滑），来控制剩余区域内内容选择框的移动，以控制剩余区域内的内容选择框移动到自身想要定位和操控的某个内容所在区域的位置，从而用户可以进一步通过在触控区域的触摸操作，从内容选择框中精确定位到该内容的具体位置并实现对该内容的精准操控。
示例性地，如图17中的(a)所示，手机在显示屏上显示单手操作模式的触控区域1701，并在显示屏上除该触控区域1701以外的剩余区域，显示缩小后的原本整个屏幕显示区域及原本整个屏幕显示区域显示的桌面界面1702时，手机也可以在桌面界面1702上显示一个内容选择框1703，该内容选择框可以用于提示用户手机当前选中的某个区域，选中的区域内可以包含多个界面内容。可选地，手机可以将原本整个屏幕显示区域的左上角位置作为内容选择框的初始停留位置。如图17中的(a)所示，内容选择框1703的初始停留位置可以为桌面界面1702左上角的位置，该内容选择框1703所框定的区域中包括多个应用图标。
当手机检测到用户作用于触控区域的右滑操作时，则手机响应该触控区域的右滑操作，在剩余区域内控制内容选择框1703向右移动，以选中下一个区域。如图17中的(b)所示，内容选择框1703可以移动至桌面界面1702右上角的位置。如此，用户可以通过在触控区域1701的上滑、下滑、左滑、右滑等触摸操作，来控制剩余区域内内容选择框1703的移动，以控制剩余区域内的内容选择框1703移动到自身想要定位和操控的某个区域。从而用户可以进一步通过在触控区域1701的触摸操作，精确定位到内容选择框1703中的某个内容的具体位置并实现对该内容的精准操控。
可选地,内容选择框所框定的区域范围可以是触控区域在剩余区域的作用范围。当剩余区域内的内容选择框停留在某个区域时,手机可以建立触控区域与当前内容选择框选定区域之间的映射关系,从而当手机检测到用户作用于触控区域内的某个位置的触摸操作(如单击操作、双击操作、长按操作等)时,手机可以根据该映射关系,将在触控区域内某个位置的触摸操作,映射到在内容选择框内相应位置的触摸操作,使得手机可以执行与该内容选择框中相应位置的触摸操作对应的功能。
示例性地,当用户操控内容选择框1703移动至如图17中的(b)所示的桌面界面1702右上角的位置时,手机可以建立触控区域1701与当前内容选择框1703选中区域之间的映射关系。例如,如图17中的(c)所示,在触控区域内左上方、右上方、左下方、右下方四个触控子区域内的触摸操作,可以一一对应映射到在当前内容选择框1703内4个应用图标上的触摸操作。这样,用户可以通过分别在触控区域内左上方、右上方、左下方、右下方四个触控子区域内执行触摸操作,分别对当前内容选择框1703内4个应用图标进行操控。
例如,如图17中的(d)所示,当手机检测到用户作用于触控区域内左下方的单击操作时,手机可以根据映射关系,将对触控区域内左下方的单击操作映射为对内容选择框1703内音乐应用的应用图标1704的单击操作,从而手机可以响应用户作用于触控区域内左下方的单击操作,打开音乐应用。
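上述“四个触控子区域与内容选择框内4个应用图标一一对应”的映射，可以用一个简单的象限判断来示意（以下草图中的函数名与编号约定均为本示例假设）：

```java
// 示意性草图：把触控区域内的触摸位置归入四个象限之一，
// 对应内容选择框中 2×2 排布的 4 个图标。
public final class QuadrantMapper {
    /** 返回 0=左上，1=右上，2=左下，3=右下。 */
    public static int quadrantOf(float x, float y,
                                 float regionLeft, float regionTop,
                                 float regionW, float regionH) {
        boolean right  = (x - regionLeft) >= regionW / 2f;
        boolean bottom = (y - regionTop)  >= regionH / 2f;
        return (bottom ? 2 : 0) + (right ? 1 : 0);
    }
}
```

得到的象限编号即可作为索引，查到当前内容选择框中对应位置的图标控件并触发其被触控时的功能。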
可选地,手机可以实时根据内容选择框所框定的区域范围,建立触控区域与当前内容选择框选定区域之间的映射关系,也可以在检测到用户对内容选择框的确定操作时,才建立触控区域与当前内容选择框选定区域之间的映射关系。其中,对内容选择框的确定操作,可以是用户作用触控区域的双击操作、特定的手势操作(如画勾√)等,本申请实施例对其并不作限定。
示例性地,以双击操作为例,当用户操控内容选择框1703移动至如图17中的(b)所示的桌 面界面1702右上角的位置时,若用户确定需要对当前内容选择框1703中的某个内容进行具体操控时,用户可以在触控区域内执行双击操作。手机在检测到用户作用于触控区域的双击操作时,可以确定检测到用户对内容选择框的确定操作,此时手机可以响应于该双击操作,进入到内容选择框1703所框定的区域范围,并建立触控区域与当前内容选择所框定的区域范围之间的映射关系。
可选地,手机也可以在检测到用户对内容选择框的退出操作时,删除触控区域与当前内容选择框选定区域之间的映射关系。其中,对内容选择框的确定操作,可以是用户作用触控区域的双击操作、特定的手势操作(如画叉×)等,本申请实施例对其并不作限定。
示例性地,以双击操作为例,当用户通过在触控区域内执行双击操作,以操控手机进入内容选择框所框定的区域范围,并建立触控区域与当前内容选择所框定的区域范围之间的映射关系后,用户也可以再次在触控区域内执行双击操作,以操控手机退出内容选择框所框定的区域范围,并删除触控区域与当前内容选择框所框定的区域范围之间的映射关系。手机回到原有操作逻辑,即用户可以通过在触控区域内的滑动操作,继续操控内容选择框的移动。
在一些实施例中,手机可以根据剩余区域显示的不同内容,自适应显示不同大小的内容选择框。可选地,由于不同应用显示的内容不同,且不同应用的应用界面的布局也不同,因此,手机可以根据原整个屏幕显示区域显示的不同应用的应用界面,生成不同大小的内容选择框。
示例性地，如图18所示，手机在剩余区域显示缩小后的原本整个屏幕显示区域及原本整个屏幕显示区域显示的短视频应用的应用界面1801时，手机也可以在短视频应用的应用界面1801上显示一个如图18所示的内容选择框1802，该内容选择框可以用于提示用户手机当前选中的某个区域，选中的区域内可以包含关注控件、收藏控件、评论控件、分享控件等多个控件。可以看出，由于短视频应用的应用界面1801中关注控件、收藏控件、评论控件、分享控件等多个控件处于一个垂直的位置，因此，手机可以自适应调整内容选择框的大小，以与垂直分布的界面内容匹配。
可选地,手机可以获取剩余区域内显示出来的所有操作控件视图view,然后判断这些view是否处于一个水平或者垂直的位置,手机可以通过对每个view的大小以及位置进行计算,以选取部分view重新组合成一个组合view。该组合view的范围大小即为内容选择框的大小。如此,手机可以实现根据剩余区域内显示出来的view的水平排布或垂直排布,自适应确定内容选择框的大小。
示例性地，如图19中(a)所示，手机在剩余区域显示缩小后的原本整个屏幕显示区域及原本整个屏幕显示区域显示的搜索应用的应用界面1901时，手机可以获取搜索应用的应用界面1901中显示出来的所有view，然后手机可以判断出这些view中哪些处于水平的位置，哪些处于垂直的位置。然后手机通过对每个view的大小以及位置进行计算，以根据每个view的水平排布或垂直排布，自适应选取合适的view重新组合成一个组合view，构成内容选择框的大小。如图19中(a)所示，手机可以将搜索应用的应用界面1901中一行水平排布的多个view，重新组合成组合view1902，该组合view1902的范围大小即为内容选择框的大小。如图19中(b)所示，手机也可以将搜索应用的应用界面1901中一列垂直排布的多个view，重新组合成组合view1903，该组合view1903的范围大小即为内容选择框的大小。如图19中(c)所示，手机也可以将搜索应用的应用界面中垂直排布的多个view和水平排布的多个view，重新组合成组合view1904，该组合view1904的范围大小即为内容选择框的大小。
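上述“判断view是否同行并重新组合”的处理，可以用如下示意性的Java草图表达（ViewBounds结构、容差参数等均为本示例假设，并非Android框架既有接口）：

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// 示意性草图：按中心纵坐标相近（视为同一行）对控件分组，
// 每个分组即可重新组合为一个组合view。
public final class ViewGrouper {
    public record ViewBounds(String id, float left, float top,
                             float width, float height) {
        float centerY() { return top + height / 2f; }
    }

    public static List<List<ViewBounds>> groupByRow(List<ViewBounds> views,
                                                    float tolerancePx) {
        List<ViewBounds> sorted = new ArrayList<>(views);
        sorted.sort(Comparator.comparingDouble(ViewBounds::centerY));
        List<List<ViewBounds>> rows = new ArrayList<>();
        for (ViewBounds v : sorted) {
            if (!rows.isEmpty()) {
                List<ViewBounds> last = rows.get(rows.size() - 1);
                if (Math.abs(last.get(0).centerY() - v.centerY()) <= tolerancePx) {
                    last.add(v);       // 与上一行中心纵坐标相近，归入同一行
                    continue;
                }
            }
            List<ViewBounds> row = new ArrayList<>();
            row.add(v);
            rows.add(row);             // 另起一行
        }
        return rows;
    }
}
```

同理，按中心横坐标分组即可得到垂直排布的组合view。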
可选地,手机可以将内容选择框所框定的区域进行高亮显示,以提示用户当前内容选择框所选中的区域。
可选地,手机也可以将内容选择框中的组合view映射到触控区域,并设置组合view可见,如此手机可以在触控区域将内容选择框所框定的组合view进行显示,从而用户可以通过操控触控区域内显示的某个view,来精准对应操控剩余区域内显示的内容选择框中的对应view。示例性地,如图20所示,手机在触控区域2001将内容选择框2002所框定的组合view即4个应用图标的图标控件进行显示。
可选地,由于缩小后的用户界面中的view比较小,映射到比较小的触控区域中时,用户有时也不便于操作,又或者,缩小后的用户界面中的view的排版与触控区域不匹配时,手机也无法对 应映射到触控区。因此,手机也可以在将内容选择框中的组合view映射到触控区域时,根据触控区域的大小,重新调整每个view的大小以及位置,使得调整后的组合view与触控区域更匹配,更方便用户单手在触控区域内操作。
示例性地,如图21所示,手机将剩余区域内显示的内容选择框2101中的组合view映射到触控区域2102时,手机可以重新调整内容选择框2101中每个view的大小以及位置,使得调整后的组合view与触控区域2102更匹配,然后手机可以在触控区域2102显示如图21所示的调整后的组合view,该调整后的组合view中view的大小比较大,更方便用户操作。
示例性地,如图22所示,手机将剩余区域内显示的内容选择框2201中的组合view映射到触控区域2202时,由于内容选择框2201内的组合view中的关注控件、收藏控件、评论控件、分享控件处于一个垂直的位置,手机可判断内容选择框2201内的组合view与触控区域不匹配,如果直接映射到触控区域,每个控件在触控区域会比较小,不便于用户操作。因此,手机可以根据关注控件、收藏控件、评论控件、分享控件中每个控件的大小和位置,重新调整组合view中每个控件的大小和位置,使得调整后的组合view与触控区域2202更匹配。然后手机可以在触控区域2202显示如图22所示的调整后的组合view,该调整后的组合view中关注控件、收藏控件、评论控件、分享控件横向排列且显示尺寸比较大,更便于用户单手在触控区域内操作。
可选地,手机也可以将剩余区域内显示的用户界面中的标题栏、底边栏、顶边栏等不随页面浏览而移动的、固定位置的这些操作控件直接映射到触控区域,这些控件无需通过内容选择框框定。从而用户可在触控区域内直接操作到这些控件。可选地,这些控件在触控区域内的映射位置可以与这些控件在用户界面中位置对应。例如,顶边栏通常显示在用户界面的顶部,手机也可以将顶边栏映射到触控区域的顶部。底边栏通常显示在用户界面的底部,手机也可以将底边栏映射到触控区域的底部。
示例性地,如图23中的(a)所示,由于剩余区域显示的购物应用的应用界面中包括顶边栏2301、底边栏2302等固定位置的操作控件,因此,手机可以将顶边栏2301、底边栏2302映射到触控区域,从而手机可以在触控区域内显示如图23中的(a)所示的顶边栏2303、底边栏2304。如此,用户单手可以直接在触摸区域内对顶边栏2303、底边栏2304进行触控操作。
可选地,手机在将用户界面中的标题栏、底边栏、顶边栏等不随页面浏览而移动的、固定位置的这些操作控件直接映射到触控区域后,也可以将剩余区域内显示的用户界面中的随页面浏览而移动的、非固定位置的操作控件,通过内容选择框映射到触控窗口。也即手机可以将内容选择框所框定的组合view映射到触控区域。可以理解,标题栏、底边栏、顶边栏等不随页面浏览而移动的、固定位置的这些操作控件与内容选择框所框定的组合view在触控区域上显示时互不重叠。
如图23中的(b)所示,手机不仅可以将剩余区域显示的购物应用的应用界面中的顶边栏2301、底边栏2302等固定位置的操作控件映射到触控区域中,以使用户可以直接触摸操作触控区域中的顶边栏2303、底边栏2304,来实现对剩余区域内的顶边栏2301、底边栏2302的触摸操作。手机还可以将内容选择框2305内的组合view映射到触控区域中,以使用户可以直接触摸操作触控区域中的组合view2306,来实现对剩余区域中内容选择框2305内的组合view2306的触摸操作。
可选地,当手机检测到用户作用于触控区域的下滑操作时,则手机响应该触控区域的下滑操作,在剩余区域内控制内容选择框2305向下移动,以选中下一个组合view。例如,内容选择框2305可以从如图23中的(b)所示的位置移动至如图23中的(c)所示的位置。
可选地,手机根据用户作用于触控区域中指定控件的触摸操作,在剩余区域显示新生成的用户界面时,手机可以将新生成的用户界面中与指定控件相关的内容进行高亮显示,或者手机可以将用户界面中与指定控件相关的内容控件显示在触控区域,以便用户可以快速关注到或操控到可能感兴趣的内容。可选地,若新生成的用户界面中没有与指定控件相关的内容,手机也可以不高亮显示。
作为一种示例,如图23中的(c)所示,手机将内容选择框2305内的组合view映射到触控区域中时,若用户点击触控区域内的组合view中的帽子控件2307,则手机可以响应该点击操作,进入该帽子控件2307对应的帽子相关页面,也即手机可以将剩余区域内显示的如图23中的(c)所示的页面,切换至如图23中的(d)所示的帽子相关页面。此时,如图23中的(d)所示,手 机可以将帽子相关页面中与帽子控件2307相似的控件2308和控件2309进行高亮显示,同时手机也可以将帽子相关页面中与帽子控件2307相似的控件2308和控件2309映射到触控区域,从而用户可以直接通过触控区域快速操控到可能感兴趣的内容。
可选地,当触控区域中的组合view与剩余区域中内容选择框2305内的组合view的位置一一对应时,手机也可以在触控区域中将组合view中的每个view进行隐藏,即设置为不可见,以避免显示屏显示过多的重复内容,影响用户观感。如此,用户直接通过在触控区域中对应的触控位置上执行触摸操作,即可实现对剩余区域中内容选择框2305内的组合view的触摸操作。
可选地,当手机检测到用户作用于触控区域的下滑操作时,手机可以同步控制剩余区域中显示的用户界面中的页面内容进行下滑浏览操作。从而手机可以重新获取剩余区域显示的用户界面中,非固定位置的所有view,并重新组合生成一个或多个组合view。用户可以通过作用于触控区域的下滑操作,同步控制剩余区域显示的用户界面中的内容选择框,从多个组合view中进行切换。
可选地,当剩余区域除了用于显示原整个屏幕显示区域显示的用户界面的目标显示区域外,还存在空余的左侧区域或底部区域显示有多个应用图标或快捷功能时,当用户需要对空余的左侧区域或底部区域的显示内容进行操控时,用户也可以通过触控区域操控内容选择框移动到左侧区域或底部区域,且内容选择框移动到左侧区域或底部区域时,手机可以根据左侧区域或底部区域中多个应用图标或快捷功能的垂直或水平排布,自适应将内容选择框调整为合适大小。然后手机可以将内容选择框中的多个应用图标或快捷功能映射到触控区域。
可选地,用户也可以自定义手势操作,该自定义手势操作可以绑定到某个应用界面的某个功能。从而用户可以直接在通过触控区域上执行自定义手势操作,快速启动该手势操作所绑定应用的对应功能。例如,用户在触控区域上执行画圆的手势时,可以触发手机直接打开电话应用的拨打电话页面,并执行对指定联系人的电话拨打功能。例如,用户在触控区域上执行画方框的手势时,可以触发手机直接打开支付应用的扫码页面,并执行扫码功能。
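这一“自定义手势绑定功能”的做法，可以用一个简单的注册表结构来示意（以下草图中手势标识与动作描述均为本示例假设，手势识别本身不在此展开）：

```java
import java.util.HashMap;
import java.util.Map;

// 示意性草图：把手势标识绑定到某个应用界面的某个功能。
public final class GestureBindings {
    private final Map<String, Runnable> bindings = new HashMap<>();

    public void bind(String gestureId, Runnable action) {
        bindings.put(gestureId, action);
    }

    /** 在触控区域识别到手势后调用；未绑定的手势返回 false。 */
    public boolean dispatch(String gestureId) {
        Runnable action = bindings.get(gestureId);
        if (action == null) return false;
        action.run();
        return true;
    }

    public static void main(String[] args) {
        GestureBindings b = new GestureBindings();
        b.bind("circle", () -> System.out.println("打开电话应用并拨打指定联系人"));
        b.bind("square", () -> System.out.println("打开支付应用的扫码页面"));
        b.dispatch("circle");
    }
}
```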
下面结合附图介绍本申请实施例提供的一种单手操作方法,可以应用于上述图2‐图23所示的场景。该单手操作方式应用电子设备,该电子设备可以是上述手机。如图24所示,该方法可以包括S2410‐S2480。
S2410、电子设备控制显示屏的屏幕显示区域显示第一界面。
该第一界面可以理解为手机通过显示屏的整个屏幕显示区域呈现的用户界面。用户界面是应用程序或操作系统与用户之间进行交互和信息交换的介质接口，它实现信息的内部形式与用户可以接受的形式之间的转换。应用程序的用户界面是通过java、可扩展标记语言(extensible markup language,XML)等特定计算机语言编写的源代码，界面源代码在终端设备上经过解析，渲染，最终呈现为用户界面中的界面元素。
可选地,用户界面中可以包含图标、窗口、控件等界面元素。其中,控件(control)也称为部件(widget),典型的控件有工具栏(toolbar)、菜单栏(menu bar)、文本框(text box)、按钮(button)、滚动条(scrollbar)、图片和文本。用户界面常用的表现形式是图形用户界面(graphic user interface,GUI),是指采用图形方式显示的与计算机操作相关的用户界面。它可以是在电子设备的显示屏中显示的一个图标、窗口、控件等界面元素,其中控件可以包括图片、按钮、菜单、选项卡、文本框、对话框、状态栏、导航栏、widget等可视的界面元素。
本申请实施例中,第一界面可以包括桌面界面,应用程序的应用界面中的至少一种。可选地,第一界面可以是全屏显示的桌面界面或全屏显示的应用界面。可选地,第一界面也可以是由全屏显示的桌面界面以及悬浮显示于该桌面界面上的悬浮应用界面组成的组合界面。可选地,第一界面还可以由至少一个应用界面组成的组合界面,例如,第一界面可以是以分屏状态显示的两个应用程序的应用界面。
可选地,第一界面的界面内容可以铺满整个屏幕显示区域。可选地,第一界面的界面内容也可以未铺满整个屏幕显示区域,也即第一界面中的界面内容边界与显示屏的整个屏幕显示区域边界之间,还存在未显示任何内容的黑边区域。本申请实施例对第一界面并不作限定。
S2420、电子设备检测到单手操作模式的触发指令。
可选地,单手操作模式的触发指令,可以由用户触发。可选地,单手操作模式的触发指令可 以为用户对单手操作模式的功能选项的点击操作,也可以为用户对手机上特定的物理按键或按键组合的按压操作,还可以为用户的语音指示、用户在手机的屏幕上的快捷手势操作(如画一个圆圈轨迹的手势)或用户的隔空手势操作等,本申请实施例对用户触发单手操作模式的方式并不作限定。
可选地,单手操作模式的触发指令,也可以由电子设备触发。作为一种实施方式,电子设备在检测到用户当前握持电子设备姿势为单手握持姿势时,电子设备可以自动触发单手操作模式。作为另一种方式,电子设备也可以在检测到用户的握持温度低于预设温度值时,自动触发单手操作模式。本申请实施例对电子设备自动触发单手操作模式的方式并不作限定。
其中,单手操作模式,是电子设备具备的一种操作模式,该操作模式将电子设备的显示屏划分为第一区域和第二区域。其中,第一区域为用户单手操作电子设备时,用户的手指在显示屏中能够触摸到或者容易触摸到的区域,第二区域为电子设备的显示屏中除第一区域以外的剩余区域。该操作模式将电子设备的显示屏的原显示区域(即进入单手操作模式前,电子设备的显示区域)及其中的显示内容(例如显示区域中的桌面界面、应用程序界面、应用图标、文字、图案、显示窗口、控件等)进行缩小,并将缩小后的显示区域及缩小后的显示内容置于显示屏的第二区域进行显示。且第一区域与第二区域可以建立映射关系,也即用户在第一区域内执行的触摸操作可以映射到第二区域,从而通过对第一区域的触控,能够使电子设备执行第二区域被触控时所需执行的功能。如此,用户单手就能够在第一区域实现对电子设备的正常操作。
例如,电子设备进入单手操作模式后,在显示屏的第二区域中显示缩小后的原显示区域及缩小后的显示内容,其中显示内容包括短信应用的应用图标,当用户在第一区域的某个位置进行单击操作后,电子设备可以响应该单击操作,确定该单击操作映射到第二区域内显示的短信应用的应用图标,从而电子设备可以执行短信应用的应用图标被触控时所需执行的功能,即打开短信应用图标对应的短信应用程序。
可选地,电子设备进入单手操作模式后,电子设备在第二区域可以实现缩小前显示区域的全部功能,包括在第二区域显示缩小后的显示内容和在第二区域进行触控操作。也即,用户可以在第一区域实现对第二区域的触控操作,也可以直接在第二区域实现对第二区域的触控操作。
可选地，电子设备也可以将第一区域进行显示，以让用户知道单手可触控的区域在显示屏上的位置。作为一种实施方式，电子设备可以新增一个触控窗口，并将该触控窗口置于显示屏的第一区域进行显示。也即，一旦触发电子设备的单手操作模式，电子设备将会在显示屏的第一区域显示触控窗口，并会在显示屏的第二区域显示缩小后的原显示区域及缩小后的显示内容。
S2430、电子设备响应单手操作模式的触发指令,确定显示屏的第一区域和第二区域。
其中,第一区域为用户单手握持所述电子设备时,用户单手的手指在所述显示屏上能触摸到的区域,第二区域为显示屏上除第一区域以外的剩余区域。电子设备可以通过第一区域接收用户单手输入的触摸操作,可以通过第二区域显示电子设备待显示的内容。
可选地,电子设备首次响应单手操作模式的触发指令时,电子设备可以进入第一区域的初始化设置模式,以确定适配于当前用户单手操作的第一区域。作为一种方式,用户正常单手握持电子设备时,电子设备可以提示用户使用大拇指在显示屏上按照指定轨迹如圆弧进行滑动,然后电子设备可以通过用户大拇指的滑动轨迹,对用户大拇指在屏幕上的最大滑动距离进行分析,以确定出大拇指在屏幕上能触摸到的最大区域范围。从而电子设备可以根据大拇指在屏幕上能触摸到的最大区域范围,确定出第一区域的区域范围。
可选地,电子设备也可以将大拇指在屏幕上能触摸到的最大区域范围进行显示,以供用户自定义第一区域的区域范围。
电子设备在确定出第一区域后，可以将显示屏中除第一区域以外的剩余区域作为第二区域。本申请实施例中，第一区域可以理解为单手操作模式的触控区域，第二区域可以理解为单手操作模式的显示区域。
可选地,电子设备在确定出第一区域后,可以将第一区域的位置和大小进行记录,以便后续电子设备再次进入单手操作模式时,可以直接确定第一区域。
可以理解,第一区域的形状本申请实施例并不作限定,例如可以是方形、圆形、椭圆形等。 以第一区域的形状为方形为例,电子设备可以根据大拇指在屏幕上能触摸到的最大区域范围,确定第一区域的长和宽的最大值,从而电子设备可以确定出第一区域的位置和大小。
可选地,第一区域的大小可以设置上限值,以避免第一区域过大,导致第二区域过小,使得第二区域内显示的内容也过小。
可选地,电子设备在确定第一区域的区域范围时,也可以先在显示屏的底部预留预设大小的底部区域,使得第一区域的位置可以在底部区域之上。从而在确定出第一区域时,该底部区域可以作为第二区域的一部分,用于显示内容。其中,预设大小可以根据实际情况合理设定,本申请实施例不作限定。
S2440、电子设备将显示屏的屏幕显示区域以及屏幕显示区域所显示的第一界面进行缩小。
可以理解,电子设备处于正常模式时,通常显示屏的整个屏幕显示区域都用于显示第一界面,而电子设备从正常模式切换到单手操作模式时,电子设备将显示屏划分为第一区域和第二区域两个区域,第一区域是单手操作模式的触控区域,第二区域是单手操作模式的显示区域。由于第二区域(即单手操作模式的显示区域)的尺寸比显示屏的整个屏幕显示区域的尺寸要小,因此,电子设备从正常模式切换到单手操作模式时,需要将显示屏原本整个屏幕显示区域及其中的显示内容进行缩小,以确保所有在正常模式中的显示均包括在第二区域内,正常模式中应当显示的信息没有缺失。如此,电子设备从正常模式切换到单手操作模式时,用户可以通过显示屏的第二区域查看到原本正常模式的显示。
可选地，由于显示屏显示的第一界面与显示屏的屏幕长宽比例贴合，因此，电子设备可以根据第二区域的位置和大小，确定第二区域中，满足显示屏的屏幕长宽比例的最大区域，作为用于显示原本整个屏幕显示区域显示的第一界面的目标显示区域，然后电子设备可以根据目标显示区域的大小，确定显示屏的原屏幕显示区域以及原屏幕显示区域所显示的第一界面需要缩小的目标比例。然后，电子设备可以将原屏幕显示区域及原屏幕显示区域显示的第一界面，按照目标比例等比缩小，如此，可以确保缩小后的第一界面能够铺满该目标显示区域。
可选地,电子设备在确定出目标比例后,可以将该目标比例进行记录,以便后续电子设备再次进入单手操作模式时,可以直接根据记录的目标比例,将电子设备原本整个屏幕显示区域显示的第一界面,按照目标比例等比缩小。
S2450、电子设备控制显示屏的第二区域显示缩小的显示区域以及缩小的第一界面。
可选地,电子设备可以从第二区域中,确定用于显示屏原整个屏幕显示区域显示的第一界面的目标显示区域,以确保原整个屏幕显示区域显示的第一界面能够等比缩小至目标显示区域进行显示。可选地,目标显示区域可以根据用户的单手握持姿势是左手握持还是右手握持,确定目标显示区域在第二区域中是靠向显示屏的左侧边缘还是右侧边缘。例如,用户的单手握持姿势是右手握持时,目标显示区域在第二区域中可以靠向显示屏的右侧边缘。
可选地,第二区域中除了目标显示区域还存在未显示内容的第三区域时,电子设备也可以将桌面上的应用程序进行重新排序,并控制显示屏的第三区域显示重新排序后的桌面应用程序。
可选地,电子设备也可以控制显示屏的第三区域显示用户常用的应用程序或者快捷功能。本申请并不对第三区域显示的内容进行限定。
S2460、电子设备建立第一区域与第二区域之间的映射关系。
可选地,第一区域与第二区域之间的映射关系,可以是第一区域与第二区域的坐标映射关系。如此,当电子设备检测到用户对第一区域内某个位置执行触摸操作之后,可根据该坐标映射关系,将用户对第一区域内某个位置的触摸操作,映射到对第二区域内对应位置的触摸操作,使得电子设备能够执行该第二区域内对应位置被触控时所需执行的功能。
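上述坐标映射关系的一种可能实现，是先把第一区域内的触摸点归一化，再线性映射到第二区域（以下Java草图中的类型与方法名均为本示例假设）：

```java
// 示意性草图：第一区域（触控区域）到第二区域（显示区域）的坐标映射。
public final class RegionCoordinateMapper {
    public record Rect(float left, float top, float width, float height) {}
    public record Point(float x, float y) {}

    /** 将第一区域内的触摸点 p 映射为第二区域内的对应点。 */
    public static Point map(Point p, Rect first, Rect second) {
        float u = (p.x() - first.left()) / first.width();   // 归一化横坐标
        float v = (p.y() - first.top())  / first.height();  // 归一化纵坐标
        return new Point(second.left() + u * second.width(),
                         second.top()  + v * second.height());
    }
}
```

映射得到的坐标即可用于命中第二区域中对应位置的控件，并执行该控件被触控时所需执行的功能。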
在一些实施例中,电子设备可以建立第一区域与第二区域中显示缩小后的第一界面之间的映射关系。可选地,电子设备可以建立第一区域与第二区域所显示的第一界面中操作控件之间的映射关系。第一界面中的操作控件可以包括标题栏、导航栏、底边栏、顶边栏等不随页面浏览而移动的、固定位置的操作控件,也可以包括图片、按钮、文本框等随页面浏览而移动的、非固定位置的操作控件。
可选地,电子设备可以建立第二区域内显示的第一界面中的目标控件的位置与第一区域内目 标位置之间的坐标映射关系,以将第一界面中的目标控件的位置映射到第一区域。其中,目标控件可以是第一界面中的任一操作控件,目标位置可以是第二区域内的任一位置。如此,当电子设备检测到用户对第一区域内目标位置执行触摸操作之后,可根据该坐标映射关系,将用户对第一区域内目标位置的触摸操作,映射到对第二区域内目标控件所在位置的触摸操作,使得电子设备能够执行该第二区域内目标控件被触控时所需执行的功能。例如,电子设备可以在位于第二区域内的缩小后屏幕显示区域中显示目标控件被触控时产生的新的用户界面。
可选地,电子设备也可以将第二区域内显示的第一界面中的目标控件,映射到第一区域进行显示,即第一区域可以显示该目标控件。然后电子设备可以建立第二区域内显示的目标控件被触控时电子设备所需执行的功能与第一区域内显示的目标控件之间的功能映射关系。如此,当电子设备检测到用户对第一区域内显示的目标控件执行触摸操作时,电子设备可以直接执行第二区域内目标控件被触控时所需执行的功能。
可选地,电子设备可以将标题栏、底边栏、顶边栏等不随页面浏览而移动的、固定位置的操作控件映射到第一区域,而图片、按钮、文本框等随页面浏览而移动的、非固定位置的操作控件可以选择性地映射到第一区域。
作为一种实施方式,请参阅图25,S2460可以包括:
S2461、电子设备获取第一界面中的第一控件和第二控件。
其中,第一控件可以是第一界面中标题栏、底边栏、顶边栏等不随页面浏览而移动的、固定位置的操作控件,第二控件可以第一界面中随页面浏览而移动的、非固定位置的操作控件。
电子设备可以获取第一界面中每个控件的显示样式，以确定每个控件在第一界面中的显示位置和显示大小。然后电子设备可以通过判断控件是否固定显示在第一界面中某个位置，来判断是否为第一控件。可选地，由于标题栏、导航栏、菜单栏等不随页面浏览而移动的、固定位置的操作控件，通常显示在用户界面的顶部或底部，而随页面浏览而移动的、非固定位置的操作控件，通常显示在用户界面中间的长预览页面中。因此，电子设备可以从第一界面中获取固定显示于第一界面内第一位置的控件，作为第一控件；从第一界面中获取非固定显示于第一界面内第一位置的控件，作为第二控件。其中，第一位置包括顶部位置、底部位置中的至少一种。
S2462、电子设备根据第二控件,生成组合控件。
可选地,电子设备可以根据每个第二控件的显示样式,确定显示样式相匹配的多个所述第二控件,然后对显示样式相匹配的多个第二控件进行组合处理,得到组合控件。作为一种方式,电子设备可将沿同一方向排列的多个第二控件,作为显示样式相匹配的多个第二控件。同一方向可以是水平方式,也可以是垂直方式。
可选地,电子设备可以判断第二控件在第一界面中的排布方式是水平排布还是垂直排布,以根据第一界面中第二控件的排布方式、位置和大小,从第一界面中选取部分第二控件,重新组合,得到组合后的组合控件,以便电子设备可以将组合控件一起映射到第一区域。可选地,电子设备可以对第一界面中的所有第二控件进行重新组合,以得到多个组合控件。
可选地,电子设备可以将水平排布成一行或多行的多个第二控件,进行组合得到组合控件。可选地,电子设备也可以是将垂直排布成一列或多列的多个第二控件进行组合得到组合控件。可选地,电子设备可以水平排布的多个第二控件和垂直排布的多个第二控件,进行组合得到组合控件。
可选地,电子设备可以在第二区域显示选择光标,如图21所示的内容选择框2101,选择光标可以提示用户当前在第一区域可以对应控制第二区域中的哪部分内容。
可选地,选择光标包括一边界,该边界定义了选择光标在缩小后的第一界面中的选择区域,选择区域用于选择至少一个第二控件。其中,上述边界所定义的上述选择区域的透明度可以为0到100%。
选择区域所框定的第二区域的内容即为用户可通过第一区域对应控制的内容。电子设备可以根据用户作用于第一区域的滑动操作,在第二区域内移动选择光标,以使选择光标的选择区域框定不同的内容。
可选地,选择光标的选择区域大小可以是组合控件所占用的区域大小。从而电子设备可以根 据组合控件所占据的区域大小,确定为选择光标的选择区域的大小,以确保选择光标的选择区域恰好可以框定组合控件。如此,电子设备可以根据用户作用于第一区域的滑动操作,将选择光标在多个组合控件中移动。从而用户可以通过电子设备显示的选择光标,确定电子设备当前选中的组合控件。
可选地,对于第一界面中不随页面浏览而移动的、固定位置的第一控件,其会一直显示在第一界面中,电子设备可以在第一界面的显示期间,将第一控件一直映射到第一区域,使得用户可以随时对第一控件进行操作。而对于第一界面中随页面浏览而移动的、非固定位置的第二控件,其会随着页面的浏览逐渐显示、或逐渐不显示,因此,电子设备可以不用一直映射第二控件,而是在用户浏览到该第二控件时再对其进行映射。
可选地,电子设备可以将选择光标的选择区域所框定的组合控件映射到第一区域。如此,当用户通过在第一区域执行滑动操作,控制第二区域内的选择光标从第一组合控件移动到第二组合控件时,电子设备可以先将第一组合控件映射到第一区域,即先在第一区域内显示选择光标选中的第一组合控件,然后跟随选择光标的移动,取消第一组合控件的映射,重新将第二组合控件映射到第一区域,即电子设备在第一区域内取消第一组合控件的显示,重新显示选择光标新选中的第二组合控件。从而电子设备可以实时将选择光标的选择区域所框定的组合控件映射到第一区域。
示例性地，第一界面为浏览界面，浏览界面包括多个第二控件，多个第二控件中包括沿水平方向排列的第一选项控件、第二选项控件、第三选项控件时，电子设备可以对沿水平方向排列的第一选项控件、第二选项控件、第三选项控件进行组合处理，以得到选项组合控件。这样当选择光标选中该选项组合控件时，电子设备可以在第一区域内显示该选项组合控件。其中，当前选择光标的选择区域可以恰好框定选项组合控件。
可选地,选择光标的选择区域大小也可以是每个第二控件所占用的区域大小。如此,当用户通过在第一区域执行滑动操作,控制第二区域内的选择光标在不同的第二控件之间移动时,电子设备可跟随选择光标的移动,分别将不同的第二控件映射到第一区域。从而电子设备可以实时将选择光标的选择区域所框定的第二控件映射到第一区域进行显示。
示例性地,第一界面为浏览界面,浏览界面包括多个第二控件,多个第二控件中包括沿水平方向排列的第一选项控件、第二选项控件、第三选项控件时,选择光标可以跟随用户在第一区域的滑动操作,逐个选中第一选项控件、第二选项控件、第三选项控件。可选地,当选择光标选中第一选项控件时,电子设备可以将该第一选项控件映射到第一区域进行显示。
可选地,选择光标选中单个对象时,电子设备也可以不将该对象映射到第一区域进行显示,直接在第一区域执行单击或双击操作来实现对该单个对象的触控。
可选地,电子设备可以根据不同的应用程序,自适应显示不同大小的选择光标。
作为一种方式,当第一界面为第一应用程序的应用界面时,电子设备可将沿水平方向排列的多个第二控件,作为显示样式相匹配的多个第二控件,以生成组合控件。如图19中的(a)所示,第一界面为搜索类应用程序的首页时,关注控件、新闻控件、新闻控件、地图控件通常处于一个水平的位置,此时电子设备可以将这些沿水平方向排列的多个控件进行组合,得到组合控件,电子设备可以根据该组合控件所占用的区域显示选择光标1902。
当第一界面为第二应用程序的应用界面时,电子设备可将沿垂直方向排列的多个第二控件,作为显示样式相匹配的多个第二控件。其中,第一应用程序不同于第二应用程序。如图18所示,第一界面为短视频类应用程序的视频播放界面时,关注控件、收藏控件、评论控件、分享控件通常处于一个垂直的位置,此时电子设备可以沿垂直方向排列的多个控件进行组合,得到组合控件,电子设备可以根据该组合控件所占用的区域显示选择光标1802。
可选地,电子设备可以根据同一的应用程序中内容的不同,自适应显示不同大小的选择光标。本申请实施例对选择光标的大小并不作限定。
S2463、电子设备判断第一控件和组合控件是否适配于第一区域。若否,则电子设备先执行S2464然后再继续执行S2463。若是,则电子设备执行S2465。
S2464、电子设备对第一控件和组合控件的位置及大小进行调整,得到调整后的第一控件和组合控件。
S2465、电子设备将第一控件和组合控件映射到第一区域。
当第一界面中存在较少的第二控件(如小于10个第二控件)时,电子设备可以直接判断这些第二控件所占用的区域大小与第一区域的大小是否匹配,如果不匹配,电子设备可以对这些第二控件的显示样式进行调整,以调整至与第一区域匹配。然后电子设备可以将调整后的第二控件映射到第一区域。
当第一界面中存在较多的第二控件(如不小于10个第二控件)时,电子设备可以将这些第二控件划分为多个区域,每个区域的第二控件进行重新组合,以得到多个组合后的组合控件,然后电子设备可以根据选择光标的移动,将选择光标选中的组合控件映射到第一区域。
可选地,电子设备可以判断组合控件所占用的区域大小与第一区域的大小是否匹配,如果不匹配,电子设备可以对组合控件的显示样式进行调整,以调整至与第一区域匹配。其中,所述调整后的组合控件所占用的区域大小与所述第一区域的大小匹配。
作为一种方式,电子设备可以对组合控件中每个第二控件的显示样式进行调整,再将调整后的每个第二控件重新组合,得到一个调整后的新的组合控件,然后电子设备可以将调整后的新的组合控件映射到第一区域。
可选地,电子设备可以根据第一区域的大小,调整组合控件中每个第二控件的显示大小,也可以调整组合控件中相邻两个第二控件之间的显示间距,还可以调整组合控件中每个第二控件的显示位置。本申请实施例对组合控件的显示样式的调整方式不作限定。
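以“横向等间距重排并放大”为例，下述示意性草图给出了按第一区域大小调整组合控件中各第二控件显示样式的一种可能做法（间距系数等数值均为本示例的经验假设）：

```java
import java.util.ArrayList;
import java.util.List;

// 示意性草图：将 n 个第二控件在宽 regionW、高 regionH 的第一区域内
// 横向等间距排列，同时调整各控件的显示大小与显示位置。
public final class ComboLayoutAdjuster {
    public record Bounds(float left, float top, float width, float height) {}

    public static List<Bounds> layoutHorizontally(int n, float regionW,
                                                  float regionH) {
        List<Bounds> out = new ArrayList<>();
        float gap = regionW / (n * 6f);               // 显示间距（经验值）
        float w = (regionW - gap * (n + 1)) / n;      // 调整后的显示大小
        float h = Math.min(w, regionH * 0.8f);
        float top = (regionH - h) / 2f;               // 垂直居中
        for (int i = 0; i < n; i++) {
            float left = gap + i * (w + gap);         // 调整后的显示位置
            out.add(new Bounds(left, top, w, h));
        }
        return out;
    }
}
```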
当第一界面中存在多个第一控件时，电子设备也可以直接判断这些第一控件所占用的区域大小与第一区域的大小是否匹配，如果不匹配，电子设备可以对这些第一控件的显示样式进行调整，以调整至与第一区域匹配。然后电子设备可以将调整后的第一控件映射到第一区域。可选地，电子设备也可以不对这些第一控件的显示样式进行调整，而是以切换列表的形式将部分第一控件进行显示，部分第一控件进行隐藏，当用户在第一区域上滑动切换列表时，电子设备再将隐藏的第一控件进行显示。
当第一界面中存在第一控件和较多的第二控件时,电子设备可以将这些较多的第二控件进行重新组合,以得到多个组合控件。然后电子设备可以判断这些第一控件和选择光标选中的组合控件所占用的区域大小与第一区域的大小是否匹配,如果不匹配,电子设备可以对这些第一控件和选择光标选中的组合控件的显示样式进行调整,以调整至与第一区域匹配。
可以理解，当第一控件和组合控件按照原有排版和大小显示时所占用的区域，远小于第一区域时，电子设备可以确定第一控件和组合控件的排版和大小与第一区域不匹配。可选地，当第一控件和组合控件按照原有排版和大小显示时所占用的区域，大于第一区域时，电子设备可以确定第一控件和组合控件的排版和大小与第一区域不匹配。可选地，当控件过小，即第一控件和组合控件的尺寸小于预设值时，电子设备可以确定第一控件和组合控件的排版和大小与第一区域不匹配。其中，预设值可以根据实际应用合理设定，本申请实施例中并不作限定。
当第一控件和组合控件按照原有排版和大小显示时所占用的区域与第一区域大概一致时,电子设备可以确定第一控件和组合控件的排版和大小与第一区域匹配,电子设备可以直接将第一控件和组合控件的按照原本排版和大小映射到第一区域。
当第一控件和组合控件的排版和大小与第一区域不匹配时，电子设备可以对第一控件和组合控件的排版和大小进行调整，以确保调整后的第一控件和组合控件的排版和大小与第一区域匹配。然后电子设备再将调整后的第一控件和组合控件映射到第一区域。可选地，电子设备也可以不调整第一控件的排版和大小，而是以切换列表的形式，将第一控件按照原有排版和大小进行显示。
以第一界面中的第一控件为底边栏为例,当底边栏中存在较多的子控件时,如图23所示的底边栏2302中的5个子控件,电子设备可以在第一区域显示子控件的切换列表。如图23所示的底边栏2302中的5个子控件“主页、VIP会员、消息、购物车、我的”,电子设备可以显示该底部栏的切换列表,该切换列表可以显示前3个子控件“主页、VIP会员、消息”,当用户滑动该切换列表时,该切换列表可以切换显示为“消息、购物车、我的”也即显示剩下隐藏的2个子控件,并隐藏之前显示的2个子控件。
可选地,电子设备也可以根据目标控件在第一界面内所处的位置,与第一区域内目标位置,建立这两个位置之间的坐标映射关系,从而实现将第一区域中第一界面的目标控件映射到第二区域。从而电子设备可以响应作用于第一区域中目标位置的触摸操作,将作用于第一区域中目标位置的触摸操作,映射为作用于缩小后的第一界面中目标控件的触摸操作。然后电子设备可以响应作用于缩小后的第一界面中目标控件的触摸操作,对缩小后的第一界面执行目标控件对应的功能。其中,目标控件可以是上述第一控件、也可以是上述第二控件、还可以是上述第三控件。
S2470、电子设备检测到用户作用于第一区域的触摸操作。
其中,触摸操作,可以包括用户对电子设备显示屏的触碰、触碰后在显示屏上的移动等操作,不仅包括用户通过手指或身体其他部位对显示屏的触摸,还包括用户通过触摸笔等触摸装置对显示屏的触摸,其中这里的触碰可以为直接接触触摸屏的操作,也可以为距离触摸屏表面的垂直距离在一定较小的距离范围内对显示屏的控制,例如手指可以直接接触触摸屏,也可以在距离触摸屏表面的垂直距离较小的距离范围内对触摸屏实现触碰控制,即不用直接接触触摸屏表面。
可选地,触摸操作可以是上滑、下滑、左滑、右滑、单击、双击、长按等常用触摸操作,也可以是画圆、画勾√、画叉×等特定触摸手势。本申请实施例对此不作限定。
可选地,电子设备可以根据手势算法实时监听用户作用于显示屏的触摸手势,并确定该显示屏的触摸手势是否位于显示屏的第一区域。当电子设备检测到用户作用于显示屏的触摸手势位于显示屏的第一区域时,电子设备可以将该触摸手势映射到第二区域。
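该监听与映射过程可以抽象为如下示意性草图（类型与回调接口均为本示例假设，并非Android输入框架的既有接口）：

```java
// 示意性草图：判断触摸事件是否落在第一区域内；若是，
// 则经线性坐标映射后转发到第二区域。
public final class TouchDispatcher {
    public record Rect(float left, float top, float width, float height) {
        boolean contains(float x, float y) {
            return x >= left && x < left + width
                && y >= top  && y < top + height;
        }
    }

    public interface MappedHandler { void onMappedTouch(float x, float y); }

    private final Rect first;
    private final Rect second;
    private final MappedHandler handler;

    public TouchDispatcher(Rect first, Rect second, MappedHandler handler) {
        this.first = first;
        this.second = second;
        this.handler = handler;
    }

    /** 返回 true 表示事件落在第一区域内并已被单手操作模式消费。 */
    public boolean onTouch(float x, float y) {
        if (!first.contains(x, y)) return false;
        float u = (x - first.left()) / first.width();
        float v = (y - first.top())  / first.height();
        handler.onMappedTouch(second.left() + u * second.width(),
                              second.top()  + v * second.height());
        return true;
    }
}
```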
S2480、电子设备响应于用户作用于第一区域的触摸操作,根据映射关系,对第二区域显示的缩小的屏幕显示区域以及缩小的第一界面执行与触摸操作对应的功能。
可选地,第一界面为某个应用程序的首页时,电子设备检测到用户作用于第一区域的触摸操作为左滑操作时,电子设备可以执行该应用程序的退出功能,此时电子设备可以响应该左滑操作,控制缩小的屏幕显示区域从显示第一界面,切换显示应用程序退出后返回的桌面界面。
可选地,第一界面为某个应用程序的非首页时,电子设备检测到用户作用于第一区域的触摸操作为左滑操作时,电子设备可以执行该应用程序的返回上一页功能,此时电子设备可以响应该左滑操作,控制缩小的屏幕显示区域从显示第一界面,切换显示上一级的应用界面。
可选地,第一界面为桌面界面时,电子设备检测到用户作用于第一区域的触摸操作为左滑操作时,电子设备可以执行该应用程序的桌面界面切换功能,此时电子设备可以响应该左滑操作,控制缩小的屏幕显示区域从显示第一界面,切换显示新的桌面界面。
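上述“同一左滑手势在不同界面执行不同功能”的分发逻辑，可以用如下示意性草图表达（PageType枚举与返回的动作描述均为本示例假设）：

```java
// 示意性草图：依据当前界面类型，把左滑手势派发为不同功能。
public final class LeftSwipeDispatcher {
    public enum PageType { DESKTOP, APP_HOME, APP_SUB_PAGE }

    public static String onLeftSwipe(PageType current) {
        return switch (current) {
            case DESKTOP      -> "切换显示下一页桌面界面";
            case APP_HOME     -> "退出应用程序，返回桌面界面";
            case APP_SUB_PAGE -> "返回上一层级的应用界面";
        };
    }

    public static void main(String[] args) {
        System.out.println(onLeftSwipe(PageType.APP_HOME));
    }
}
```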
可选地,第一界面为某个应用程序的应用界面时,当第一区域映射有该应用界面中内容选择框中的多个操作控件时,电子设备检测到用户作用于第一区域中目标操作控件的点击操作时,电子设备可以执行应用界面中该目标操作控件被触控时的功能。
例如,第一区域映射有第二区域显示的缩小后的视频播放应用界面中的视频播放控件时,电子设备检测到用户作用于第一区域中视频播放控件的点击操作时,电子设备可以执行缩小后的视频播放应用界面中的视频播放控件被触控时的功能,即视频播放功能,以控制缩小的屏幕显示区域显示视频播放画面。
又例如,视频播放应用界面中的视频播放控件位于视频播放界面的中心位置时,若电子设备预先建立有视频播放界面的中心位置与第一区域的中心位置之间的坐标映射关系,则电子设备检测到用户作用于第一区域中心位置的点击操作时,电子设备可以将作用于第一区域中心位置的点击操作,映射为作用于缩小后的视频播放界面中播放控件的点击操作;然后电子设备可以响应该作用于缩小后的视频播放界面中播放控件的点击操作,对缩小后的视频播放界面中的视频进行播放,以控制缩小的屏幕显示区域显示视频播放画面。
可以理解,上述触摸操作及对应执行的功能仅为举例,本申请实施例并不对其作限定。
本申请实施例提供的单手操作方法,电子设备可以根据用户的单手握持姿态,自适应调整单手操作模式的触控区域,以确保该触控区域适配于用户的单手操作。显示屏中除单手操作模式的触控区域外的剩余区域,均可作为单手操作模式的显示区域。该显示区域可以显示有缩小后的原显示区域显示的用户界面。此外,当单手操作模式的显示区域在显示了缩小后的原显示区域显示的用户界面之后,若还有剩余显示区域,电子设备也可以在该剩余显示区域将桌面应用程序进行 重新排列后显示。同时电子设备还建立单手操作模式的操作区域与单手操作模式的显示区域之间的映射关系,从而当电子设备检测到在单手操作模式的触控区域的触控操作之后,可根据该映射关系,映射到在单手操作模式的显示区域的触控操作,使得电子设备能够执行该单手操作模式的显示区域被触控时所需执行的功能。如此,用户通过在触控区域内进行触摸操作,便可对电子设备缩小显示的原有界面进行全范围的操作。在无需应用程序新增开发或适配的情况下,解决了用户单手在电子设备上进行操作时屏幕显示的部分内容可能手指触摸不到的问题。
可以理解的是,电子设备为了实现上述功能,其包含了执行各个功能相应的硬件和/或软件模块。结合本文中所公开的实施例描述的各示例的算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。本领域技术人员可以结合实施例对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本实施例可以根据上述方法示例对电子设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块可以采用硬件的形式实现。需要说明的是,本实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
本申请的实施例还提供一种计算机存储介质,该计算机存储介质中存储有计算机指令,当该计算机指令在电子设备上运行时,使得电子设备执行上述相关方法步骤实现上述实施例中的界面显示方法。
本申请的实施例还提供了一种计算机程序产品,当该计算机程序产品在计算机上运行时,使得计算机执行上述相关步骤,以实现上述实施例中电子设备执行的界面显示方法。
另外,本申请的实施例还提供一种装置,这个装置具体可以是芯片,组件或模块,该装置可包括相连的处理器和存储器;其中,存储器用于存储计算机执行指令,当装置运行时,处理器可执行存储器存储的计算机执行指令,以使芯片执行上述各方法实施例中电子设备执行的界面显示方法。
其中,本实施例提供的电子设备、计算机存储介质、计算机程序产品或芯片均用于执行上文所提供的对应的方法,因此,其所能达到的有益效果可参考上文所提供的对应的方法中的有益效果,此处不再赘述。
通过以上实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor) 执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (26)

  1. 一种单手操作方法,其特征在于,应用于电子设备,所述电子设备包括显示屏,所述方法包括:
    显示第一界面;
    响应单手操作模式的触发指令,确定所述显示屏的第一区域和第二区域;其中,所述第一区域为用户单手握持所述电子设备时,所述用户单手的手指在所述显示屏上能触摸到的区域,所述第二区域为所述显示屏上除所述第一区域以外的剩余区域;
    显示缩小后的所述第一界面于所述第二区域;
    响应作用于所述第一区域的触摸操作,对缩小后的所述第一界面执行与所述触摸操作对应的功能。
  2. 根据权利要求1所述的方法,其特征在于,所述确定所述显示屏的第一区域,包括:
    提示用户单手握持所述电子设备时,使用所述单手的手指在所述显示屏上按照指定轨迹滑动;
    根据所述单手的手指的所述滑动,确定所述单手的手指在所述显示屏上能触摸到的最大区域;
    根据所述最大区域,确定所述显示屏的第一区域。
  3. 根据权利要求1所述的方法,其特征在于,所述第一界面包括目标控件,所述响应作用于所述第一区域的触摸操作,对缩小后的所述第一界面执行与所述触摸操作对应的功能,包括:
    响应作用于所述第一区域中目标位置的触摸操作,将作用于所述第一区域中目标位置的触摸操作,映射为作用于缩小后的所述第一界面中所述目标控件的触摸操作;其中,所述第一区域中的目标位置与所述第一界面中所述目标控件所在的位置之间预先建立有坐标映射关系;
    响应作用于缩小后的所述第一界面中所述目标控件的触摸操作,对缩小后的所述第一界面执行所述目标控件对应的功能。
  4. 根据权利要求3所述的方法,其特征在于,所述第一界面为视频播放界面,所述目标控件为位于所述视频播放界面的中心位置的播放控件,所述视频播放界面的中心位置与所述第一区域的中心位置预先建立有坐标映射关系;
    所述响应作用于所述第一区域的触摸操作,对缩小后的所述第一界面执行与所述触摸操作对应的功能,包括:
    响应作用于所述第一区域中心位置的触摸操作,将作用于所述第一区域中心位置的触摸操作,映射为作用于缩小后的所述视频播放界面中所述播放控件的触摸操作;
    响应作用于缩小后的所述视频播放界面中所述播放控件的触摸操作,对缩小后的所述视频播放界面中的视频进行播放。
  5. 根据权利要求1所述的方法,其特征在于,所述第一界面包括目标控件,在显示缩小后的所述第一界面之后,所述方法还包括:
    显示所述目标控件于所述第一区域;
    所述响应作用于所述第一区域的触摸操作,对缩小后的所述第一界面执行与所述触摸操作对应的功能,包括:
    响应作用于所述第一区域中所述目标控件的触摸操作,对缩小后的所述第一界面执行所述目标控件对应的功能。
  6. 根据权利要求5所述的方法,其特征在于,所述第一界面为视频播放界面,所述目标控件为播放控件,在显示缩小后的所述第一界面之后,所述方法还包括:
    显示所述播放控件于所述第一区域;
    所述响应作用于所述第一区域的触摸操作,对缩小后的所述第一界面执行与所述触摸操作对应的功能,包括:
    响应作用于所述第一区域中所述播放控件的触摸操作,对缩小后的所述视频播放界面中的视频进行播放。
  7. 根据权利要求5所述的方法,其特征在于,所述目标控件为第一控件,在显示缩小后的所述第一界面之后,所述方法还包括:
    根据所述第一界面中每个控件的显示样式,从所述第一界面中获取固定显示于所述第一界面 内第一位置的控件,作为所述第一控件;其中,所述第一位置包括顶部位置、底部位置中的至少一种。
  8. 根据权利要求7所述的方法,其特征在于,所述第一控件包括标题栏、导航栏、菜单栏中的至少一种。
  9. 根据权利要求5所述的方法,其特征在于,所述目标控件为第二控件,在显示缩小后的所述第一界面之后,所述方法还包括:
    根据所述第一界面中每个控件的显示样式,从所述第一界面中获取非固定显示于所述第一界面内第一位置的控件,作为第二控件;其中,所述第一位置包括顶部位置、底部位置中的至少一种;
    显示选择光标于缩小后的所述第一界面,所述选择光标包括一边界,所述边界定义了所述选择光标在缩小后的所述第一界面中的选择区域,所述选择区域用于选择至少一个所述第二控件;
    所述显示所述目标控件于所述第一区域,包括:
    显示所述选择光标的选择区域中的所述第二控件于所述第一区域。
  10. 根据权利要求9所述的方法,其特征在于,所述第一界面为浏览界面,所述浏览界面包括多个所述第二控件,多个所述第二控件中包括第一选项控件,所述方法还包括:
    当所述选择光标的选择区域中包括所述第一选项控件时,显示所述第一选项控件于所述第一区域。
  11. 根据权利要求9所述的方法,其特征在于,在从所述第一界面中获取非固定显示于所述第一界面内第一位置的控件,作为第二控件之后,所述方法还包括:
    对所述显示样式相匹配的多个所述第二控件进行组合处理,得到组合控件;
    将所述组合控件所占据的区域,确定为所述选择光标的选择区域;
    所述显示所述选择光标的选择区域内的所述第二控件于所述第一区域,包括:
    显示所述选择光标的选择区域中的所述组合控件于所述第一区域。
  12. 根据权利要求11所述的方法,其特征在于,所述方法还包括:
    根据每个所述第二控件的显示样式,将沿同一方向排列的多个所述第二控件,作为所述显示样式相匹配的多个所述第二控件。
  13. 根据权利要求12所述的方法,其特征在于,所述第一界面为应用界面,所述方法还包括:
    当第一界面为第一应用程序的应用界面时,将沿水平方向排列的多个所述第二控件,作为所述显示样式相匹配的多个所述第二控件;
    当第一界面为第二应用程序的应用界面时,将沿垂直方向排列的多个所述第二控件,作为所述显示样式相匹配的多个所述第二控件;其中,所述第一应用程序不同于所述第二应用程序。
  14. 根据权利要求12所述的方法,其特征在于,所述第一界面为浏览界面,所述浏览界面包括多个所述第二控件,多个所述第二控件中包括沿水平方向排列的第一选项控件、第二选项控件、第三选项控件,所述方法还包括:
    对所述沿水平方向排列的第一选项控件、第二选项控件、第三选项控件进行组合处理,得到选项组合控件;
    当所述选择光标的选择区域中包括所述选项组合控件时,显示所述选项组合控件于所述第一区域;其中,所述选择光标的选择区域的大小与所述选项组合控件所占用的区域大小相匹配。
  15. 根据权利要求11-14任一项所述的方法,其特征在于,所述方法还包括:
    在检测到所述组合控件所占用的区域大小与所述第一区域的大小不匹配时,调整所述组合控件中每个所述第二控件的显示样式,得到调整后的组合控件;其中,所述调整后的组合控件所占用的区域大小与所述第一区域的大小匹配;
    显示所述调整后的组合控件于所述第一区域。
  16. 根据权利要求15所述的方法,其特征在于,所述调整所述组合控件中每个所述第二控件的显示样式,包括:
    根据所述第一区域的大小,调整所述组合控件中每个所述第二控件的显示大小;
    或者
    根据所述第一区域的大小,调整所述组合控件中相邻两个所述第二控件之间的显示间距;
    或者
    根据所述第一区域的大小,调整所述组合控件中每个所述第二控件的显示位置。
  17. 根据权利要求1-16任一项所述的方法,其特征在于,所述第一界面为应用程序的首页,所述响应作用于所述第一区域的触摸操作,对缩小后的所述第一界面执行与所述触摸操作对应的功能,包括:
    响应作用于所述第一区域的左滑操作,退出所述应用程序,并显示缩小后的桌面界面于所述第二区域。
  18. 根据权利要求1-16任一项所述的方法,其特征在于,所述第一界面为应用程序的非首页时,所述响应作用于所述第一区域的触摸操作,对缩小后的所述第一界面执行与所述触摸操作对应的功能,包括:
    响应作用于所述第一区域的左滑操作,显示缩小后的第二界面,所述第二界面为所述第一界面的上一层级的界面。
  19. 根据权利要求1-16任一项所述的方法,其特征在于,所述响应作用于所述第一区域的触摸操作,对缩小后的所述第一界面执行与所述触摸操作对应的功能,包括:
    检测到作用于所述第一区域的预设手势操作,所述预设手势操作用于触发第三界面中的预设功能;
    响应所述预设手势操作,将所述第二区域显示的缩小后的所述第一界面切换至缩小后的所述第三界面,并执行所述预设功能。
  20. 根据权利要求1-19任一项所述的方法,其特征在于,所述第二区域包括目标显示区域和第三区域,所述显示缩小后的所述第一界面于所述第二区域,包括:
    显示缩小后的所述第一界面于所述目标显示区域;
    显示多个图标控件于所述第三区域;其中,所述图标控件包括应用程序的图标控件、快捷功能的图标控件中的至少一种。
  21. 根据权利要求20所述的方法,其特征在于,所述显示多个图标控件于所述第三区域,包括:
    将桌面界面中的多个应用程序的图标控件进行重新排列;
    显示重新排列后的所述多个应用程序的图标控件于所述第三区域。
  22. 根据权利要求20所述的方法,其特征在于,所述方法还包括:
    显示切换控件于所述第一区域,其中,所述第一区域的作用区域为所述目标显示区域;
    响应作用于所述第一区域中所述切换控件的触摸操作,确定所述第一区域的作用区域从所述目标显示区域切换至所述第三区域;
    响应作用于所述第一区域的触摸操作,对所述第三区域中的所述多个图标控件执行与所述第一区域的触摸操作对应的功能。
  23. 一种电子设备,其特征在于,所述电子设备包括显示屏、存储器和一个或多个处理器;所述显示屏、所述存储器和所述处理器耦合;所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,当所述处理器执行所述计算机指令时,所述电子设备执行如权利要求1-22中任一项所述的方法。
  24. 一种芯片系统，其特征在于，所述芯片系统应用于电子设备；所述芯片系统包括一个或多个接口电路和一个或多个处理器；所述接口电路和所述处理器通过线路互联；所述接口电路用于从所述电子设备的存储器接收信号，并向所述处理器发送所述信号，所述信号包括所述存储器中存储的计算机指令；当所述处理器执行所述计算机指令时，所述电子设备执行如权利要求1-22中任一项所述的方法。
  25. 一种计算机存储介质，其特征在于，包括计算机指令，当所述计算机指令在电子设备上运行时，使得所述电子设备执行如权利要求1-22中任一项所述的方法。
  26. 一种计算机程序产品,其特征在于,当所述计算机程序产品在电子设备上运行时,使得所述电子设备执行如权利要求1-22中任一项所述的方法。
PCT/CN2023/127972 2022-11-30 2023-10-30 一种单手操作方法及电子设备 WO2024114234A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211521350.3 2022-11-30
CN202211521350.3A CN118113181A (zh) 2022-11-30 2022-11-30 一种单手操作方法及电子设备

Publications (1)

Publication Number Publication Date
WO2024114234A1 true WO2024114234A1 (zh) 2024-06-06

Family

ID=91216703

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/127972 WO2024114234A1 (zh) 2022-11-30 2023-10-30 一种单手操作方法及电子设备

Country Status (2)

Country Link
CN (1) CN118113181A (zh)
WO (1) WO2024114234A1 (zh)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012077273A1 (ja) * 2010-12-07 2012-06-14 パナソニック株式会社 電子機器
US20140289642A1 (en) * 2013-02-28 2014-09-25 Roy Varada Prasad Method of single-handed software operation of large form factor mobile electronic devices
CN104049883A (zh) * 2013-03-15 2014-09-17 广州三星通信技术研究有限公司 显示子屏幕和在子屏幕中进行操作的方法和系统
CN103176744A (zh) * 2013-04-12 2013-06-26 深圳市中兴移动通信有限公司 一种显示设备及其信息处理方法
CN103353826A (zh) * 2013-04-16 2013-10-16 深圳市中兴移动通信有限公司 一种显示设备及其信息处理方法
CN107924274A (zh) * 2015-07-31 2018-04-17 麦克赛尔株式会社 信息终端装置
CN111566606A (zh) * 2018-08-20 2020-08-21 华为技术有限公司 界面的显示方法及电子设备

Also Published As

Publication number Publication date
CN118113181A (zh) 2024-05-31
