CN112269512A - Single-hand operation method and device for mobile equipment - Google Patents

Single-hand operation method and device for mobile equipment

Info

Publication number
CN112269512A
CN112269512A (application CN202011193545.0A)
Authority
CN
China
Prior art keywords
display screen
graph
displaying
point control
virtual point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011193545.0A
Other languages
Chinese (zh)
Inventor
成双春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan MgtvCom Interactive Entertainment Media Co Ltd
Original Assignee
Hunan MgtvCom Interactive Entertainment Media Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan MgtvCom Interactive Entertainment Media Co Ltd filed Critical Hunan MgtvCom Interactive Entertainment Media Co Ltd
Priority to CN202011193545.0A priority Critical patent/CN112269512A/en
Publication of CN112269512A publication Critical patent/CN112269512A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a single-hand operation method and device for a mobile device. When the state of the mobile device meets a preset one-hand-mode trigger condition, the method displays a graphic for indicating action on the device's display screen and displays a semi-transparent virtual point control layer in the area near the user's finger. The actual distances the device moves in the two directions of the screen plane are each multiplied by a corresponding amplification value to obtain the graphic's target movement amounts in the two screen directions, and the graphic is moved by those amounts. When a user click is detected through the virtual point control layer, a click event is generated at the graphic's position. Interacting with on-screen content in this way requires no zooming or shifting of the displayed content, so the impact on the user's original experience is minimal; the method does not depend on the reach of the user's fingers, is not limited by the screen size of the mobile device, and suits a wider range of usage scenarios.

Description

Single-hand operation method and device for mobile equipment
Technical Field
The invention relates to the field of mobile devices, and in particular to a method and a device for operating a mobile device with one hand.
Background
With the development of the mobile-device industry, devices with large screens, such as large-screen smartphones and tablet computers, have become very popular. While enjoying the better visual experience a large screen brings, users also face a new problem: such devices are difficult to operate with one hand. At present, one-handed operation of large-screen devices is eased mainly by shrinking the screen content, pulling the interface down to hover, or pinning common buttons to the top or bottom of the screen.
Shrinking the screen content reduces what the device displays in one-handed mode, which makes certain interactions harder and degrades the user's look and feel. Pull-down hovering uses a pull-down gesture to shift the entire screen content downward, so part of the content moves off screen and can no longer be seen, changing what the screen displays; this solves the problem of the top of the screen being out of reach of one hand, but not the problem of screen width, and the lower half of the interface becomes invisible, reducing the user experience. Pinning common buttons to the top or bottom covers only a limited set of operations: not every interaction can be placed there, and deeper operations remain out of reach. A one-handed operation method is therefore needed that is convenient without degrading the user experience.
Disclosure of Invention
In view of this, the present invention provides a method and an apparatus for one-handed operation of a mobile device, intended to make one-handed operation convenient without degrading the user experience.
In order to achieve the above object, the following solutions are proposed:
in a first aspect, a method for one-handed operation of a mobile device is provided, which includes:
when the state of the mobile device meets a preset one-hand-mode trigger condition, displaying a graphic for indicating action on the display screen of the mobile device, and displaying a semi-transparent virtual point control layer in the near-finger area;
multiplying the actual distances the mobile device moves in the two directions of the display-screen plane by the corresponding amplification values to obtain the graphic's target movement amounts in the two screen directions, and moving the graphic by those amounts;
and when a user click is detected through the virtual point control layer, generating a click event at the position of the graphic.
Preferably, the preset one-hand mode triggering condition includes: the rotation angle of the mobile device is greater than an angle threshold.
Preferably, the initial position of the graphic is the geometric center of the display screen.
Preferably, after the state of the mobile device meets the preset one-hand-mode trigger condition and before the semi-transparent virtual point control layer is displayed in the near-finger area, the method further includes:
prompting the user, via a popup window, to choose between left-hand and right-hand operation;
when the user selects left-hand operation, displaying the semi-transparent virtual point control layer in the lower-left corner area of the display screen;
and when the user selects right-hand operation, displaying the semi-transparent virtual point control layer in the lower-right corner area of the display screen.
Preferably, when the graphic moves on the display screen, the method further includes:
when the center of the graphic moves into a clickable element displayed on the display screen, generating, on the uppermost layer of that element, a semi-transparent covering layer of the same size as the element, the graphic itself becoming invisible;
and when the center of the graphic moves out of the clickable element, canceling the semi-transparent covering layer and displaying the graphic again.
In a second aspect, a single-hand operation device for a mobile device is provided, which includes:
the condition triggering unit, used to display, when the state of the mobile device meets a preset one-hand-mode trigger condition, a graphic for indicating action on the display screen of the mobile device and a semi-transparent virtual point control layer in the near-finger area;
the icon moving unit, used to multiply the actual distances the mobile device moves in the two directions of the display-screen plane by the corresponding amplification values to obtain the graphic's target movement amounts in the two screen directions, and to move the graphic by those amounts;
and the click control unit, used to generate a click event at the position of the graphic when a user click is detected through the virtual point control layer.
Preferably, the preset one-hand mode triggering condition includes: the rotation angle of the mobile device is greater than an angle threshold.
Preferably, the initial position of the graphic is the geometric center of the display screen.
Preferably, the mobile device one-handed operation device further includes:
the left-hand operation mode selection unit is used for prompting a user to select one operation mode from left-hand operation and right-hand operation by utilizing a popup window form after the state of the mobile equipment meets a preset single-hand mode trigger condition and before a semitransparent virtual point control layer is displayed in a near finger area; when the user selects left-hand operation, displaying a semitransparent virtual point control layer in the lower left corner area of the display screen; and when the user selects the right-hand operation, displaying a semitransparent virtual point control layer in the lower right corner area of the display screen.
Preferably, the mobile device single-hand operation device further includes:
the absorption effect unit is used for generating a semitransparent covering layer with the same size as the clickable element on the uppermost layer of the clickable element when the center of the graph moves into the clickable element displayed on the display screen, and the graph exists in an invisible form; and when the center of the graph moves out of the clickable element displayed by the display screen, canceling the semi-transparent covering layer and displaying the graph.
Compared with the prior art, the technical scheme of the invention has the following advantages:
according to the technical scheme, the method comprises the steps that when the state of the mobile equipment meets a preset one-hand mode triggering condition, a graph used for indicating action is displayed through a display screen of the mobile equipment, and a semitransparent virtual point control layer is displayed in a near finger area; respectively multiplying the actual moving distances of the mobile equipment in two directions of the plane of the display screen by the corresponding amplification values to obtain target moving amounts of the graph in the two directions of the display screen, and controlling the graph to move by the corresponding target moving amounts in the two directions of the display screen; when the user clicking operation is detected through the virtual point control layer, a clicking event is generated at the position of the graph. When the method is used for interacting the content displayed by the display screen, the content displayed by the display screen does not need to be zoomed or moved, so that the influence on the original experience of a user is reduced to the minimum; the method is independent of the coverage area of the fingers of the user, is not limited by the size of the screen of the mobile device, and has wider use scenes.
Furthermore, the single-hand mode is triggered by rotating the mobile equipment, and other inlets such as buttons and the like do not need to be additionally arranged on a display screen in the hidden operation, so that the invasion of display contents and misoperation caused by sliding gestures are reduced.
Still further, when the graph moves into the clickable element, the adsorption effect is triggered, namely a semitransparent covering layer with the same size as the clickable element is generated on the uppermost layer of the clickable element, so that the clickable element is positioned, subsequent click operation is completed, and the situation that the target position cannot be positioned due to inaccurate displacement operation is avoided.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Drawings
To more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in their description are briefly introduced below. Obviously, the drawings described below show only embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for one-handed operation of a mobile device according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a mobile device being rotated to trigger a one-handed mode according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating content displayed on a display screen after entering a single-handed mode according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of two directions of a display screen according to an embodiment of the present invention;
FIG. 5 is a schematic illustration of the adsorption effect provided by an embodiment of the present invention;
fig. 6 is a schematic diagram of a single-handed operating device of a mobile device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a method for operating a mobile device with one hand provided by the embodiment may include the following steps:
s11: and when the state of the mobile equipment meets the preset one-hand mode triggering condition, displaying a graph for indicating action through a display screen of the mobile equipment, and displaying a semitransparent virtual point control layer in the near-finger area.
In some embodiments, the preset one-hand-mode trigger condition includes the rotation angle of the mobile device being greater than an angle threshold. For example, when the rotation angle exceeds 150°, the device's state is judged to meet the trigger condition, and the graphic for indicating action and the virtual point control layer are displayed. Fig. 2 is a schematic diagram of rotating the mobile device to trigger the one-hand mode; the dotted lines indicate the direction of rotation, which may be to the left or to the right.
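As a minimal sketch, the trigger check reduces to a comparison (the function name and the 150° threshold follow the example above and are illustrative assumptions, not details prescribed by the patent):

```python
# Illustrative sketch of the one-hand-mode trigger condition: the mode is
# entered when the device's rotation angle exceeds a threshold (150 degrees
# in the example above). Name and threshold are assumptions.
ANGLE_THRESHOLD_DEG = 150.0

def should_enter_one_hand_mode(rotation_angle_deg):
    """Return True when the absolute rotation angle exceeds the threshold."""
    return abs(rotation_angle_deg) > ANGLE_THRESHOLD_DEG
```

Rotation in either direction (left or right, as in Fig. 2) satisfies the condition, hence the absolute value.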
In some embodiments, the initial position of the graphic for indicating action is the geometric center of the display screen; that is, when the device's state meets the trigger condition, the graphic is displayed at the screen's geometric center. To minimize its effect on the displayed content, the graphic may be a small cursor-style pointer.
When the one-hand mode starts, a semi-transparent virtual point control layer is displayed in the near-finger area. Because the layer is semi-transparent, it does not block the content shown on the display screen.
Fig. 3 shows the graphic 31 for indicating action, the semi-transparent virtual point control layer 32, and an element 33 shown on the display screen. The user can still see the element 33 beneath the layer 32, so the impact on the user is small; the graphic 31 is small and does not affect the display of the element 33.
S12: multiply the actual distances the mobile device moves in the two directions of the display-screen plane by the corresponding amplification values to obtain the graphic's target movement amounts in the two screen directions, and move the graphic by those amounts.
The two directions of the screen plane are the horizontal and vertical directions; in Fig. 4, the x-axis is horizontal and the y-axis vertical. The device's actual movement along the x-axis and y-axis can be detected through the device's gyroscope as described (in practice, translational motion is typically derived from the inertial sensors, i.e., the accelerometer together with the gyroscope). An amplification value for the x-axis movement and one for the y-axis movement are set in advance. When the user moves the device to move the graphic, the actual x-axis distance is multiplied by its amplification value to obtain the graphic's target movement along the screen's x-axis; likewise, the actual y-axis distance is multiplied by its amplification value to obtain the target movement along the screen's y-axis. Because the graphic is driven by device motion rather than touch, it can reach any target position without being limited by finger length or device size.
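The mapping in S12 can be sketched as follows (an illustrative Python sketch: the gain values, the clamping to the screen bounds, and all names are assumptions rather than details from the patent):

```python
def move_graphic(pos, dx, dy, gain_x, gain_y, screen_w, screen_h):
    """Scale the device's physical displacement (dx, dy) in the screen plane
    by per-axis amplification values and apply it to the graphic's position,
    clamped to the screen bounds (clamping is an added assumption)."""
    x = min(max(pos[0] + dx * gain_x, 0.0), screen_w)
    y = min(max(pos[1] + dy * gain_y, 0.0), screen_h)
    return (x, y)

# Starting from the geometric center of a 1080x1920 screen, a device
# movement of 1 and 2 units with gains of 50 moves the graphic 50 and
# 100 pixels along x and y respectively.
```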
S13: when a user click is detected through the virtual point control layer, generate a click event at the position of the graphic.
With the one-handed operation method above, the graphic for indicating action is moved by moving the device itself, and the position under the graphic is clicked by tapping the virtual point control layer, which is convenient to operate. The method achieves truly dead-zone-free operation coverage of a large-screen device in one-hand mode, intrudes minimally on the displayed content, minimizes the impact on the user's original experience, and matches the reach of two-handed operation. Even in constrained situations where only one hand is free, the device remains fully operable, which greatly improves the ability to handle work during fragmented time; it also provides richer usage scenarios for teaching and demonstration with tablet computers in education, office work, and similar fields.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
In some embodiments, after the state of the mobile device meets the preset one-hand-mode trigger condition and before the semi-transparent virtual point control layer is displayed in the near-finger area, the method further includes: prompting the user, via a popup window, to choose between left-hand and right-hand operation; when the user selects left-hand operation, displaying the layer in the lower-left corner area of the display screen; when the user selects right-hand operation, displaying it in the lower-right corner area.
In other embodiments, the user may set left-hand or right-hand operation in advance; if the user has made no setting, one of the two is used as the default. If a preference has been set, then on entering one-hand mode via the trigger condition, the semi-transparent virtual point control layer is displayed in the corresponding corner of the screen according to that preference; if not, it is displayed in the corner corresponding to the default mode.
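The placement rule above can be sketched as follows (the function name and the right-hand default are illustrative assumptions):

```python
def point_layer_corner(handedness=None, default="right"):
    """Return the corner hosting the semi-transparent virtual point control
    layer: lower-left for left-hand operation, lower-right for right-hand
    operation; fall back to the default when the user has set nothing."""
    choice = handedness if handedness is not None else default
    if choice not in ("left", "right"):
        raise ValueError("handedness must be 'left' or 'right'")
    return "lower-left" if choice == "left" else "lower-right"
```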
Because the human wrist is flexible, rotating the device horizontally is easy, so triggering the one-hand mode is not constrained by external conditions or space. In implementing the present invention, however, the inventor found that simply moving the graphic to reach an intended position can run into two problems in practice.
The first problem: the graphic moves at whatever speed the user moves the device; if the user moves too fast, the graphic may fail to land accurately on the element intended for the subsequent operation. Locating an element is harder still when the screen holds many small elements. The farther the graphic's current position is from the target, the longer the movement takes; at the same time, the target element's size limits the usable speed, because a graphic moving too fast cannot stop exactly on arrival. The user therefore has to decelerate in advance according to the element's size, which lowers the approach speed and lengthens the time to reach the target; the smaller the element, the earlier the deceleration and the more time spent.
The second problem: a mobile application interface is designed for finger touches, which respond to a contact "surface" rather than the "point" of a traditional computer mouse pointer. When the graphic sits between two clickable elements, the difference may be only a few pixels, and the naked eye cannot tell which element currently contains the graphic's center; the user must keep moving the graphic until it is clearly inside one element, during which the first problem can recur.
To let the graphic for indicating action locate the target element accurately, the invention introduces the adsorption effect and the semi-transparent covering layer. The adsorption effect triggers whenever the center of the graphic is inside a clickable element: when the center moves into a clickable element on the display screen, a semi-transparent covering layer of the same size as the element is generated on the element's uppermost layer, and the graphic becomes invisible. As long as the center stays inside the element, further movement of the graphic leaves the covering layer unchanged, which visually reads as adsorption. The covering layer makes explicit which element the user has currently located; once the desired element is reached, the user naturally slows to a stop, and within a limited inertial distance the covering layer continues to mark the desired element. Even if the user moves the device too far, or the element is small and inertia carries the graphic's center out of it, a slight correction re-triggers the adsorption and locates the desired element, solving the first problem.
Second, although the graphic moves on the display screen with pixel-level precision, the covering layer over the target element clearly indicates which element the user can currently operate; small movements during operation do not change the covering layer's state as long as the center remains inside the element, as if the graphic were adsorbed onto it. This avoids secondary corrective movements and thereby solves the second problem. When the graphic's center moves out of the clickable element, the semi-transparent covering layer is canceled and the graphic is displayed again.
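The core of the adsorption effect is a hit-test of the graphic's center against the bounds of clickable elements; a minimal sketch (with assumed names `Rect` and `snapped_element`, and with layering and rendering omitted):

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned bounds of a clickable element (illustrative)."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def snapped_element(center, clickables):
    """Return the clickable element whose bounds contain the graphic's
    center, or None. While an element is returned, the UI would show the
    semi-transparent covering layer over it and hide the graphic; when None
    is returned, the covering layer is removed and the graphic reappears."""
    px, py = center
    for rect in clickables:
        if rect.contains(px, py):
            return rect
    return None
```

Because the result only changes when the center crosses an element boundary, small movements inside the element leave the covering layer untouched, which is exactly the adsorption behavior described above.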
Fig. 5 illustrates the adsorption effect. The dotted outline is the semi-transparent covering layer, with the graphic for indicating action beneath it; on the right of Fig. 5, a covering layer is generated on the uppermost layer of the element in the upper-left corner of the display. The adsorption effect triggers when the graphic's center moves into a clickable element on the display screen, and the covering layer, rather than the graphic, then responds to the user's click. The hand-off from graphic to covering layer appears as the graphic being adsorbed onto the clickable element. Note that the dotted covering layer in Fig. 5 is drawn to explain its position and size relative to the graphic; it does not mean the covering layer lies outside the display screen.
Although the operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented languages such as Java, Smalltalk, and C++, and conventional procedural languages such as the "C" language or similar. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, it may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
An embodiment of the apparatus, used to perform embodiments of the method of the present invention, is described below. For details not disclosed in the apparatus embodiments, reference is made to the method embodiments of the present invention.
Referring to fig. 6, a single-hand operation apparatus for a mobile device provided in this embodiment includes: a condition trigger unit 61, an icon moving unit 62, and a click control unit 63.
And the condition triggering unit 61 is used for displaying a graph for indicating action and displaying a semitransparent virtual point control layer in a near-finger area through a display screen of the mobile device when the state of the mobile device meets a preset one-hand mode triggering condition.
The icon moving unit 62 is configured to multiply the actual moving distances of the mobile device in the two directions of the display screen plane by the corresponding amplification values to obtain the target movement amounts of the graph in the two directions of the display screen, and to control the graph to move by the corresponding target movement amounts in the two directions of the display screen.
The click control unit 63 is configured to generate a click event at the position of the graph when a user click operation is detected through the virtual point control layer.
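The scaling performed by the icon moving unit 62 can be sketched in code. This is a minimal illustration, not the patent's implementation: the class and method names (`PointerMapper`, `targetMove`) and the per-axis amplification fields are assumptions, and only the described arithmetic (actual distance × amplification value per axis) is taken from the description.

```java
// Hypothetical sketch of the icon moving unit's scaling step: the device's
// actual movement in each direction of the display screen plane is
// multiplied by that direction's amplification value to obtain the graph's
// target movement amount. Names and structure are illustrative.
public class PointerMapper {
    private final double ampX; // amplification value for the horizontal axis (assumed)
    private final double ampY; // amplification value for the vertical axis (assumed)

    public PointerMapper(double ampX, double ampY) {
        this.ampX = ampX;
        this.ampY = ampY;
    }

    // Returns {targetMoveX, targetMoveY} for a physical displacement (dx, dy).
    public double[] targetMove(double dx, double dy) {
        return new double[] { dx * ampX, dy * ampY };
    }
}
```

Because the amplification values are greater than one, a small physical tilt or shift of the device translates into a larger on-screen movement of the graph, which is what lets a thumb-anchored hand reach the far side of the screen.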
The units described in the embodiments of the present disclosure are implemented by software; the name of a unit does not, in some cases, constitute a limitation on the unit itself.
In some embodiments, the preset one-handed mode trigger condition includes the rotation angle of the mobile device being greater than an angle threshold.
In some embodiments, the initial position of the icon is the geometric center of the display screen.
In some embodiments, the mobile device single-hand operation device further comprises an operation mode selection unit, which is configured to prompt the user, in the form of a popup window, to select one operation mode from left-hand operation and right-hand operation after the state of the mobile device meets the preset one-handed mode trigger condition and before the semi-transparent virtual point control layer is displayed in the near-finger area; when the user selects left-hand operation, the semi-transparent virtual point control layer is displayed in the lower-left corner area of the display screen; when the user selects right-hand operation, the semi-transparent virtual point control layer is displayed in the lower-right corner area of the display screen.
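The handedness choice only changes where the virtual point control layer is anchored. A minimal sketch of that mapping follows; the class, enum, and the idea of computing the region's left edge from the screen and layer widths are illustrative assumptions, not details given in the patent.

```java
// Hypothetical placement logic: the semi-transparent virtual point control
// layer sits in the lower-left corner for left-hand operation and in the
// lower-right corner for right-hand operation. Names are illustrative.
public class ControlLayerPlacement {
    public enum Hand { LEFT, RIGHT }

    // Returns the x-coordinate of the control layer's left edge in pixels:
    // 0 for left-hand use, (screenWidth - layerWidth) for right-hand use,
    // so the layer hugs the corner nearest the operating thumb.
    public static int regionLeft(Hand hand, int screenWidth, int layerWidth) {
        return hand == Hand.LEFT ? 0 : screenWidth - layerWidth;
    }
}
```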
In some embodiments, the mobile device single-handed operation apparatus further comprises an absorption effect unit, which is configured to generate, when the center of the graph moves into a clickable element displayed on the display screen, a semi-transparent cover layer of the same size as the clickable element on the uppermost layer of that element, with the graph existing in invisible form; and to cancel the semi-transparent cover layer and display the graph again when the center of the graph moves out of the clickable element.
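The absorption effect hinges on a hit test: the cover layer is shown (and the graph hidden) exactly while the graph's center lies inside a clickable element's bounds. A sketch under assumed names follows; modeling the clickable element as an axis-aligned rectangle is an assumption for illustration, not a detail from the patent.

```java
// Hypothetical hit test behind the absorption effect: while the graph's
// center is inside a clickable element's bounds, a semi-transparent cover
// layer of the element's size would be shown and the graph hidden; when
// the center leaves, the cover is cancelled and the graph redisplayed.
public class AbsorptionHitTest {
    // Axis-aligned bounds standing in for a clickable element (assumed model).
    // (cx, cy) is the center of the graph; edges count as inside.
    public static boolean centerInside(double left, double top,
                                       double right, double bottom,
                                       double cx, double cy) {
        return cx >= left && cx <= right && cy >= top && cy <= bottom;
    }
}
```

Treating entry and exit of this predicate as the show/hide events gives the "snap" feel the description aims at: the element itself lights up while the pointer graphic disappears into it.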
The above-described apparatus embodiments are merely illustrative. Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed across a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. One of ordinary skill in the art can understand and implement this without inventive effort.
In this document, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Each embodiment in the present description is described with emphasis on its differences from the other embodiments; for the same or similar parts, the embodiments may be referred to one another, and the features described in the embodiments may be replaced with or combined with one another.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method for one-handed operation of a mobile device, comprising:
when the state of the mobile equipment meets the preset one-hand mode triggering condition, displaying a graph for indicating action through a display screen of the mobile equipment, and displaying a semitransparent virtual point control layer in a near finger area;
respectively multiplying the actual moving distances of the mobile equipment in two directions of the plane of the display screen by the corresponding amplification values to obtain the target moving amounts of the graph in the two directions of the display screen, and controlling the graph to move by the corresponding target moving amounts in the two directions of the display screen;
and when a user click operation is detected through the virtual point control layer, generating a click event at the position of the graph.
2. The method of claim 1, wherein the preset one-handed mode trigger condition comprises:
the rotation angle of the mobile device is greater than an angle threshold.
3. The method of claim 1, wherein the initial position of the icon is:
a geometric center of the display screen.
4. The method for one-handed operation of a mobile device according to claim 1, further comprising, after the state of the mobile device meets a preset one-handed mode trigger condition and before displaying the semi-transparent virtual point control layer in the near-finger area:
prompting a user to select an operation mode from left-hand operation and right-hand operation by utilizing a popup window mode;
when the user selects left-hand operation, displaying a semitransparent virtual point control layer in the lower left corner area of the display screen;
and when the user selects the right-hand operation, displaying a semitransparent virtual point control layer in the lower right corner area of the display screen.
5. The method for one-handed operation of a mobile device according to any one of claims 1 to 4, further comprising, while the graph moves on the display screen:
when the center of the graph moves into a clickable element displayed on the display screen, generating a semi-transparent covering layer with the same size as the clickable element on the uppermost layer of the clickable element, wherein the graph exists in an invisible form;
and when the center of the graph moves out of the clickable element displayed by the display screen, canceling the semi-transparent covering layer and displaying the graph.
6. A mobile device one-handed operation apparatus, comprising:
the condition triggering unit is used for displaying a graph for indicating action through a display screen of the mobile equipment and displaying a semitransparent virtual point control layer in a near finger area when the state of the mobile equipment meets a preset one-hand mode triggering condition;
the icon moving unit is used for multiplying the actual moving distance of the mobile equipment in two directions of the plane of the display screen by the corresponding amplification value respectively to obtain the target moving amount of the graph in the two directions of the display screen and controlling the graph to move by the corresponding target moving amount in the two directions of the display screen;
and the click control unit is used for generating a click event at the position of the graph when the click operation of the user is detected through the virtual point control layer.
7. The device of claim 6, wherein the preset one-handed mode trigger condition comprises:
the rotation angle of the mobile device is greater than an angle threshold.
8. The mobile device one-handed operation device according to claim 6, wherein the initial position of the icon is:
a geometric center of the display screen.
9. The mobile device one-handed operation apparatus according to claim 6, further comprising:
the left-hand operation mode selection unit is used for prompting a user to select one operation mode from left-hand operation and right-hand operation by utilizing a popup window form after the state of the mobile equipment meets a preset single-hand mode trigger condition and before a semitransparent virtual point control layer is displayed in a near finger area; when the user selects left-hand operation, displaying a semitransparent virtual point control layer in the lower left corner area of the display screen; and when the user selects the right-hand operation, displaying a semitransparent virtual point control layer in the lower right corner area of the display screen.
10. The device of any one of claims 6 to 9, further comprising:
the absorption effect unit is used for generating a semitransparent covering layer with the same size as the clickable element on the uppermost layer of the clickable element when the center of the graph moves into the clickable element displayed on the display screen, and the graph exists in an invisible form; and when the center of the graph moves out of the clickable element displayed by the display screen, canceling the semi-transparent covering layer and displaying the graph.
CN202011193545.0A 2020-10-30 2020-10-30 Single-hand operation method and device for mobile equipment Pending CN112269512A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011193545.0A CN112269512A (en) 2020-10-30 2020-10-30 Single-hand operation method and device for mobile equipment


Publications (1)

Publication Number Publication Date
CN112269512A true CN112269512A (en) 2021-01-26

Family

ID=74344445

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011193545.0A Pending CN112269512A (en) 2020-10-30 2020-10-30 Single-hand operation method and device for mobile equipment

Country Status (1)

Country Link
CN (1) CN112269512A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103309564A (en) * 2013-07-01 2013-09-18 贝壳网际(北京)安全技术有限公司 Element information display method and device
CN106371688A (en) * 2015-07-22 2017-02-01 小米科技有限责任公司 Full-screen single-hand operation method and apparatus
CN106775199A (en) * 2016-11-11 2017-05-31 北京奇虎科技有限公司 The touch operation method and terminal of screen interface
CN109189285A (en) * 2018-08-16 2019-01-11 恒生电子股份有限公司 Operation interface control method and device, storage medium, electronic equipment
CN110321056A (en) * 2019-07-15 2019-10-11 深圳传音控股股份有限公司 Control moving method, mobile phone and storage medium based on terminal
CN110389704A (en) * 2019-06-18 2019-10-29 中国平安财产保险股份有限公司 One-handed performance method, mobile terminal and the storage medium of mobile terminal
CN111343341A (en) * 2020-05-20 2020-06-26 北京小米移动软件有限公司 One-hand mode implementation method based on mobile equipment



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210126