CN113476822A - Touch method and device - Google Patents

Touch method and device

Info

Publication number
CN113476822A
CN113476822A (application CN202110654261.5A)
Authority
CN
China
Prior art keywords
touch
control
sliding
operation area
compass
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110654261.5A
Other languages
Chinese (zh)
Other versions
CN113476822B (en)
Inventor
郭宇
张潇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110654261.5A priority Critical patent/CN113476822B/en
Publication of CN113476822A publication Critical patent/CN113476822A/en
Application granted granted Critical
Publication of CN113476822B publication Critical patent/CN113476822B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads, the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/22 Setup operations, e.g. calibration, key configuration or button assignment
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1068 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F2300/1075 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game, characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad, using a touch screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of this application disclose a touch method and a touch device, relate to the field of touch control, and solve the problem that the movement of a compass control lags when the user performs a turn-back slide toward the movement compass while outside its compass operation area. The scheme is as follows: an electronic device displays a first interface that includes a control operation area and a sliding control located in the control operation area, where the sliding control moves within the control operation area to control the movement of a controlled object. On detecting a first operation in which a touch medium slides in a first direction from inside the control operation area to outside it, the device moves the sliding control in the first direction to a first position on the boundary of the control operation area. On detecting a second operation in which the touch medium, while outside the control operation area, changes direction from the first direction and slides in a second direction, the device moves the sliding control in the second direction within the control operation area.

Description

Touch method and device
Technical Field
The embodiments of this application relate to the field of touch control, and in particular to a touch method and device.
Background
At present, many mobile phone games provide a movement compass (also called a movement roulette). When playing a game on a mobile phone, the user can control the movement of in-game characters through this compass. For example, as shown in fig. 1, the movement compass may consist of two circles: the larger circle bounds the user-operable area (i.e., the compass operation area), and the smaller circle serves as the compass control. By touching and dragging, the user can move the smaller circle within the larger one, thereby controlling an in-game character to move in the corresponding direction. That is, the user drags the compass control within the compass operation area to steer the in-game character.
Generally, after the user drags the compass control within the compass operation area, the direction from the initial position (also called the default position) of the compass control to its current position is the direction in which the in-game character moves. Thus, when the user needs the character to reverse course, the user drags the compass control in the opposite direction within the compass operation area (a turn-back drag). However, because the compass operation area is usually small, the user's finger often slides out of it while dragging. At that point, if the user needs to drag the compass control in the opposite direction, the finger must first slide some distance back into the compass operation area before the control can be dragged again. As a result, when the user slides a finger in the opposite direction to make the character turn back, the movement of the compass control lags, and so does the control of the character's movement.
Disclosure of Invention
The embodiments of this application provide a touch method and touch device, and solve the problem that the movement of a compass control lags when the user performs a turn-back slide toward the movement compass while outside its compass operation area.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, an embodiment of this application provides a touch method applicable to an electronic device. The method includes: the electronic device displays a first interface that includes a control operation area and a sliding control located in the control operation area, where the sliding control moves within the control operation area to control the movement of a controlled object; on detecting a first operation in which a touch medium slides in a first direction from inside the control operation area to outside it, the device moves the sliding control in the first direction to a first position on the boundary of the control operation area; on detecting a second operation in which the touch medium, while outside the control operation area, changes direction from the first direction and slides in a second direction, the device moves the sliding control in the second direction within the control operation area. Here, the extension line of the path along which the touch medium slides in the second direction intersects the control operation area.
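The extension-line condition above reduces to a ray-circle intersection test if the control operation area is assumed circular. A minimal sketch under that assumption (function and parameter names are illustrative, not from the patent):

```python
import math

def ray_hits_circle(origin, direction, center, radius):
    """Return True if the ray from `origin` along `direction` intersects
    the circle (center, radius), i.e. the extension line of the slide
    path crosses the control operation area."""
    ox, oy = origin[0] - center[0], origin[1] - center[1]
    dx, dy = direction
    norm = math.hypot(dx, dy)
    if norm == 0:
        return False                # no direction of motion
    dx, dy = dx / norm, dy / norm
    # Substitute P(t) = O + t*D into |P|^2 = r^2:
    #   t^2 + 2(O.D)t + |O|^2 - r^2 = 0
    b = ox * dx + oy * dy
    c = ox * ox + oy * oy - radius * radius
    disc = b * b - c
    if disc < 0:
        return False                # the whole line misses the circle
    root = math.sqrt(disc)
    # The ray hits the circle if either solution t = -b +/- root is >= 0.
    return (-b - root) >= 0 or (-b + root) >= 0
```

A turn-back slide aimed back at the compass satisfies this test (second operation); a slide angled away from it does not (third operation).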
With this scheme, when the user needs to drag the sliding control through a touch medium (such as a finger or a stylus) to make the controlled object turn back, the sliding control follows the turn-back direction of the touch medium even if the direction change happens outside the control operation area, so the controlled object moves accordingly. This avoids the situation where, because the touch medium is outside the control operation area, the sliding control can only be dragged after the medium has slid some distance back into the area. The latency of dragging the sliding control is thereby reduced, and so is the latency of controlling the controlled object.
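The change from the first direction to a second direction can be detected by comparing successive displacement vectors derived from the touch report points. A hedged sketch, where the 120-degree threshold is an assumed tuning value (the patent does not specify one):

```python
import math

def is_turn_back(prev_disp, cur_disp, cos_threshold=-0.5):
    """Return True when the current displacement points sharply away
    from the previous one (angle above ~120 degrees), signalling a
    turn-back slide."""
    n1 = math.hypot(*prev_disp)
    n2 = math.hypot(*cur_disp)
    if n1 == 0 or n2 == 0:
        return False                # no movement, no direction to compare
    dot = prev_disp[0] * cur_disp[0] + prev_disp[1] * cur_disp[1]
    return dot / (n1 * n2) < cos_threshold
```

A full reversal gives a cosine of -1 and always triggers; a gentle curve does not, which keeps ordinary drags from being misread as turn-backs.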
In one possible implementation, after the second operation in which the touch medium changes direction outside the control operation area from the first direction and slides in a second direction is detected, the method further includes: detecting a third operation in which the touch medium, while outside the control operation area, changes direction from the second direction and slides in a third direction, and moving the sliding control to a second position, where the second position is the intersection of the boundary of the control operation area with the line connecting the touch position of the touch medium (while sliding in the third direction) and the default position of the sliding control. Here, the extension line of the path along which the touch medium slides in the third direction does not intersect the control operation area.
In this way, when the user first slides the touch medium toward the control operation area from outside without entering it, and then slides the medium away from the control operation area, the electronic device no longer generates predicted points from the touch operation but instead drives the sliding control directly from the actual touch report points of the current slide. This further improves how closely the sliding control follows the touch medium.
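If the control operation area is a circle whose center is the slider's default position, the second position above is simple geometry: scale the vector from the center to the touch point down to the radius. A sketch under that assumption (names are illustrative):

```python
import math

def second_position(touch_pos, center, radius):
    """Intersect the line from the touch position to the default
    (center) position with the circular boundary by scaling the
    offset vector down to the radius."""
    vx, vy = touch_pos[0] - center[0], touch_pos[1] - center[1]
    dist = math.hypot(vx, vy)
    if dist == 0:
        return center               # touch at the default position itself
    return (center[0] + vx * radius / dist,
            center[1] + vy * radius / dist)
```

The slider thus snaps to the boundary point facing the finger, so the controlled object keeps moving in the direction the user is actually pointing.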
In another possible implementation, the method further includes: controlling the controlled object to move in the direction from the default position of the sliding control to its current position.
This makes it convenient for the user to control the movement of the controlled object by dragging the sliding control.
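The mapping from slider position to movement direction is a normalized vector from the default position to the current position. A minimal sketch (names illustrative):

```python
import math

def character_direction(default_pos, control_pos):
    """Unit vector from the sliding control's default position to its
    current position -- the direction the controlled object moves."""
    dx = control_pos[0] - default_pos[0]
    dy = control_pos[1] - default_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)           # control at rest: no movement
    return (dx / dist, dy / dist)
```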
In another possible implementation, moving the sliding control in the first direction to the first position on the boundary of the control operation area includes: according to the touch report points of the touch medium during the first operation, moving the sliding control along with the touch medium in the first direction until it reaches the first position on the boundary of the control operation area.
Since the sliding control moves according to the touch report points of the user's touch operation, its movement stays synchronized with the movement of the touch medium, giving the control a follow-hand feel.
In another possible implementation, moving the sliding control in the second direction within the control operation area includes: determining, from the touch report points of the touch medium during the second operation, that the touch medium has changed direction outside the control operation area from the first direction and is sliding in the second direction; generating predicted points in the second direction within the control operation area from the touch report points of the touch medium during the second operation; and moving the sliding control in the second direction within the control operation area according to the predicted points.
Because the predicted points are generated from the touch report points of the touch medium, each predicted point corresponds to a report point. Driving the sliding control from the predicted points lets it move in the sliding direction of the touch medium even before the medium slides back into the control operation area, which reduces the latency of the user's control over the sliding control.
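The patent leaves the exact mapping from outside report points to inside predicted points open. One plausible scheme is to apply each report-to-report displacement to the slider's last position and clamp the result to the circular area; the names and the clamping rule below are assumptions, not the patent's fixed method:

```python
import math

def advance_predicted_point(prev_point, report_prev, report_cur,
                            center, radius):
    """Move the slider's predicted point by the displacement the touch
    medium made between two consecutive report points outside the area,
    then clamp the result back onto the circular operation area."""
    dx = report_cur[0] - report_prev[0]
    dy = report_cur[1] - report_prev[1]
    px, py = prev_point[0] + dx, prev_point[1] + dy
    vx, vy = px - center[0], py - center[1]
    dist = math.hypot(vx, vy)
    if dist > radius:               # keep the point on/inside the boundary
        px = center[0] + vx * radius / dist
        py = center[1] + vy * radius / dist
    return (px, py)
```

Fed with the outside report points of a turn-back slide, this walks the slider back across the area in the second direction while the finger is still outside it.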
In another possible implementation, a pre-judgment area is set outside the control operation area, and the second operation is the touch medium changing direction within the pre-judgment area from the first direction and sliding in the second direction. Determining from the touch report points of the second operation that the touch medium has changed direction outside the control operation area then includes: determining, from those report points, that the touch medium has changed direction within the pre-judgment area from the first direction and is sliding in the second direction.
Setting a pre-judgment area avoids monitoring touch operations over the entire touch surface, which reduces the processing load. It also avoids misjudging other operations the user performs outside the control operation area.
In another possible implementation, the pre-judgment area is preset, or it is generated from historical touch operations of the touch medium within a preset range around the position of the control operation area.
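A preset pre-judgment area could be as simple as an annular band hugging the outside of the circular operation area; the band width below is an illustrative parameter. (Deriving the area from historical touch traces, as the text also allows, would replace the fixed width with measured slide-out distances.)

```python
import math

def in_prejudgment_area(pos, center, inner_radius, band_width):
    """True if `pos` falls in the annular pre-judgment band just
    outside the circular control operation area. Only direction
    changes inside this band trigger predicted-point generation."""
    d = math.hypot(pos[0] - center[0], pos[1] - center[1])
    return inner_radius < d <= inner_radius + band_width
```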
In another possible implementation, after moving the sliding control in the second direction within the control operation area according to the predicted points, the method further includes: when the touch medium slides into the control operation area along the second direction, moving the sliding control along with the touch medium in the second direction within the control operation area according to the touch report points of the touch medium.
In this way, when the user slides the touch medium toward the control operation area, the sliding control moves according to the predicted points while the medium is still outside the area, and switches to the actual touch report points once the medium slides in. The user's dragging of the sliding control is thus more accurate once the touch medium is inside the control operation area.
In another possible implementation, moving the sliding control to the second position includes: determining, from the touch report points of the touch medium during the third operation, that the touch medium has changed direction outside the control operation area from the second direction and is sliding in the third direction; and moving the sliding control to the second position.
In another possible implementation, the first interface includes a game interface, the control operation area includes a compass operation area, the sliding control includes a compass control, and the controlled object includes a game character in the game interface.
In another possible implementation, the touch medium includes a stylus, a user's finger, or the like.
In a second aspect, an embodiment of this application provides a touch apparatus applicable to an electronic device, for implementing the method of the first aspect. The functions of the apparatus may be implemented by hardware, or by hardware executing corresponding software; the hardware or software includes one or more modules corresponding to the functions above, for example a display module and a processing module.
The display module may be configured to display a first interface that includes a control operation area and a sliding control located in the control operation area, where the sliding control moves within the control operation area to control the movement of a controlled object. The processing module may be configured to detect a first operation in which a touch medium slides in a first direction from inside the control operation area to outside it, and move the sliding control in the first direction to a first position on the boundary of the control operation area; and to detect a second operation in which the touch medium, while outside the control operation area, changes direction from the first direction and slides in a second direction, and move the sliding control in the second direction within the control operation area. Here, the extension line of the path along which the touch medium slides in the second direction intersects the control operation area.
In a possible implementation, the processing module may be further configured to detect a third operation in which the touch medium, while outside the control operation area, changes direction from the second direction and slides in a third direction, and to move the sliding control to a second position, where the second position is the intersection of the boundary of the control operation area with the line connecting the touch position of the touch medium and the default position of the sliding control. Here, the extension line of the path along which the touch medium slides in the third direction does not intersect the control operation area.
In another possible implementation, the processing module may be further configured to control the controlled object to move in the direction from the default position of the sliding control to its current position.
In another possible implementation, the processing module is specifically configured to move the sliding control along with the touch medium in the first direction to the first position on the boundary of the control operation area, according to the touch report points of the touch medium during the first operation.
In another possible implementation, the processing module is specifically configured to determine, from the touch report points of the touch medium during the second operation, that the touch medium has changed direction outside the control operation area from the first direction and is sliding in the second direction; to generate predicted points in the second direction within the control operation area from those touch report points; and to move the sliding control in the second direction within the control operation area according to the predicted points.
In another possible implementation, a pre-judgment area is set outside the control operation area; the second operation is the touch medium changing direction within the pre-judgment area from the first direction and sliding in the second direction; and the processing module is specifically configured to determine, from the touch report points of the second operation, that the touch medium has changed direction within the pre-judgment area from the first direction and is sliding in the second direction.
In another possible implementation, the pre-judgment area is preset, or it is generated from historical touch operations of the touch medium within a preset range around the position of the control operation area.
In another possible implementation, the processing module may be further configured to move the sliding control along with the touch medium in the second direction within the control operation area, according to the touch report points of the touch medium, once the touch medium slides into the control operation area along the second direction.
In another possible implementation, the processing module is specifically configured to determine, from the touch report points of the touch medium during the third operation, that the touch medium has changed direction outside the control operation area from the second direction and is sliding in the third direction; and to move the sliding control to the second position.
In another possible implementation, the first interface includes a game interface, the control operation area includes a compass operation area, the sliding control includes a compass control, and the controlled object includes a game character in the game interface.
In another possible implementation, the touch medium includes a stylus, a user's finger, or the like.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory for storing instructions executable by the processor. The processor is configured to execute these instructions so that the electronic device implements the touch method of the first aspect or any of its possible implementations.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having computer program instructions stored thereon. The computer program instructions, when executed by the electronic device, cause the electronic device to implement the touch method of the first aspect or any of its possible implementations.
In a fifth aspect, an embodiment of the present application provides a computer program product, which includes computer readable code, and when the computer readable code is executed in an electronic device, causes the electronic device to implement the touch method according to the first aspect or any one of the possible implementation manners of the first aspect.
It should be understood that the beneficial effects of the second to fifth aspects can be seen from the description of the first aspect, and are not repeated herein.
Drawings
Fig. 1 is a schematic composition diagram of a mobile compass provided in the related art;
FIG. 2 is a schematic view of a scene of a game character controlled by a mobile compass according to the related art;
FIG. 3 is a schematic view of another related art scenario for controlling a game character using a mobile compass;
fig. 4 is a schematic diagram illustrating a position of a touch report point during a user touch operation according to the related art;
fig. 5 is a schematic composition diagram of an electronic device according to an embodiment of the present disclosure;
fig. 6 is a simplified schematic diagram of a software architecture of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic position diagram of a prediction area according to an embodiment of the present application;
fig. 8 is a schematic position diagram of a touch track according to an embodiment of the present disclosure;
fig. 9 is a schematic position diagram of another prediction area provided in the embodiment of the present application;
fig. 10 is a flowchart illustrating a touch method according to an embodiment of the present disclosure;
fig. 11 is a schematic interface diagram when a touch method provided in the embodiment of the present application is applied;
fig. 12 is a schematic view of a scene when a touch method according to an embodiment of the present disclosure is applied;
fig. 13 is a schematic diagram illustrating a position of a touch point during a user touch operation according to an embodiment of the present disclosure;
fig. 14 is a schematic diagram illustrating positions of a predicted point and a touch report point according to an embodiment of the present application;
fig. 15 is a schematic diagram illustrating positions of another predicted point and a touch point according to an embodiment of the present disclosure;
fig. 16 is a schematic diagram illustrating positions of another predicted point and a touch point according to an embodiment of the present disclosure;
fig. 17 is a schematic view of another scenario when the touch method provided in the embodiment of the present application is applied;
fig. 18 is a schematic diagram illustrating positions of another predicted point and a touch point according to an embodiment of the present disclosure;
fig. 19 is a schematic structural diagram of a touch device according to an embodiment of the present disclosure.
Detailed Description
With the continuous development of electronic devices, many of them now have touch functions (for example, devices with touch screens such as mobile phones and tablet computers). Taking a mobile phone as an example, many game applications on mobile phones currently provide a movement compass through which the user can control the movement of in-game characters. For example, as shown in fig. 1, the movement compass may consist of two concentric circles: the larger circle bounds the user-operable area (i.e., the compass operation area 101), and the smaller circle serves as the compass control 102. By touching and dragging, the user can move the smaller circle within the larger one, thereby controlling an in-game character to move in the corresponding direction. That is, the user drags the compass control 102 within the compass operation area 101 to steer the in-game character.
Generally, after the user drags the compass control within the compass operation area, the direction from the initial position of the compass control (also called the default position, e.g., the geometric center of the compass operation area) to its current position is the direction in which the in-game character moves. For example, as shown in fig. 2 (a), the compass operation area 201 is a circular region, the compass control 202 lies within it, and when the compass control 202 is at its initial position, its geometric center coincides with the center of the compass operation area 201. When the user drags the compass control 202, it follows the user's finger; as shown in fig. 2 (b), after the compass control 202 moves, the direction from its initial position to its current position is the direction in which the character 203 controlled by the movement compass moves.
When the user slides a finger to drag the compass control, the compass control moves according to the touch report points sampled during the slide. Hence the compass control follows the finger (i.e., is dragged) only while the finger slides within the compass operation area; once the finger slides outside it, the compass control can no longer be dragged. For example, as shown in fig. 3 (a), when the user drags the compass control 301 from its initial position in some direction and the finger slides out of the compass operation area 302, the compass control 301 stops following the finger once it reaches the boundary of the compass operation area 302. That is, after the finger leaves the compass operation area 302, the compass control 301 no longer follows the finger's slide. As shown in fig. 3 (b), when the finger then slides outside the compass operation area 302 in the direction opposite to that in fig. 3 (a), the compass control 301 remains at the boundary of the compass operation area 302 and does not follow the finger until the finger has slid back into the compass operation area 302.
When the user needs to control the game character to move back, the user can drag the compass control in the opposite direction within the compass operation area (i.e., a fold-back drag). However, when dragging the compass control, the user's finger often slides out of the compass operation area; at that point, if the user wants to drag the compass control in the opposite direction, the finger must first slide some distance in the opposite direction, and the compass control can be dragged only after the finger re-enters the compass operation area. For example, as shown in fig. 4, when the user slides a finger to drag the compass control back, the mobile phone obtains the touch report points of the sliding finger at a certain sampling rate. The touch report points of the finger are, in order along the sliding trajectory, point 1 to point 17. Touch report points 1 to 4 are located in the compass operation area 401, so while the finger slides from the position of point 1 to the position of point 4 (i.e., while the finger slides within the compass operation area 401), the compass control (not shown in the figure) can move according to points 1 to 4. While the finger slides from the position of point 5 to the position of point 13, both outside the compass operation area 401, the compass control remains at the position of point 4, because the touch report points are outside the compass operation area 401. When the finger slides from the position of point 14 to the position of point 17 within the compass operation area 401 (i.e., after the finger has slid back into the compass operation area 401), the compass control moves along with the finger according to points 14 to 17, since these touch report points are located in the compass operation area 401.
As described above, when the user drags the compass control by sliding a finger to make the game character move back, the movement of the compass control may be delayed, and the control of the game character's movement is therefore delayed as well.
To solve the above problem, embodiments of the present application provide a touch method and device. The touch method can be applied, in an electronic device with touch and display functions, to a scenario in which a user controls a controlled object (such as a character, an object, or another virtual entity in a game) to move by means of a sliding control located in a control operation area displayed on the electronic device. For example, taking the electronic device as a mobile phone and the controlled object as a character in a game: when the mobile phone runs the game, a mobile compass (or mobile roulette) is displayed in the game picture together with the game character, and the mobile compass includes a compass operation area (i.e., the control operation area) and a compass control (i.e., the sliding control) located in the compass operation area. The user can control the movement of the game character by dragging the compass control. For another example, taking the electronic device as a mobile phone and the controlled object as an unmanned aerial vehicle: after the mobile phone establishes a communication connection with the unmanned aerial vehicle, the mobile phone may display a control joystick, which may include a joystick operation area (i.e., the control operation area) and a joystick control (i.e., the sliding control) located in the joystick operation area. The user can control the movement of the unmanned aerial vehicle by dragging the joystick control within the joystick operation area.
The touch method may include the following. The electronic device displays a first interface. The first interface may include a control operation area and a sliding control located within the control operation area. The controlled object can be controlled to move by sliding the sliding control within the control operation area. When the touch medium performs a touch operation (referred to as a first operation) of sliding in a first direction from inside the control operation area to outside it, in response to the first operation, the electronic device may control the sliding control to slide along with the touch medium in the first direction to a first position on the boundary of the control operation area. Then, with the touch medium having slid out of the control operation area in the first direction, if the touch medium performs a touch operation (referred to as a second operation) of changing direction outside the control operation area and sliding in a second direction, the electronic device may, in response to the second operation, control the sliding control to slide in the second direction within the control operation area. The touch medium is the medium with which the user performs touch operations, and may be the user's finger or a stylus. The extension line of the path along which the touch medium slides in the second direction intersects the control operation area. That is, when the user drags the sliding control through the touch medium to control the movement of the controlled object, if the touch medium slides out of the control operation area in the first direction and then folds back toward the control operation area in the second direction, the sliding control starts to move in the second direction even before the touch medium has slid back into the control operation area.
With this touch method, when the user needs to drag the sliding control through a touch medium (such as a finger or a stylus) to make the controlled object move back, the sliding control can move according to the fold-back sliding direction of the touch medium even if the touch medium is outside the control operation area when it changes sliding direction, so that the controlled object moves correspondingly. This avoids the situation in which, because the touch medium is outside the control operation area, the sliding control can be dragged only after the touch medium has slid some distance back into the control operation area. The delay in dragging the sliding control is thereby reduced, and the delay in controlling the controlled object is reduced in turn.
Hereinafter, a touch method provided in an embodiment of the present application will be described with reference to the drawings.
In the embodiment of the present application, the electronic device may be a mobile phone, a tablet computer, a handheld computer, a PC, a cellular phone, a Personal Digital Assistant (PDA), a wearable device (e.g., a smart watch, a smart band), a smart home device (e.g., a television), a vehicle machine (e.g., a vehicle-mounted computer), a smart screen, a game machine, and an Augmented Reality (AR)/Virtual Reality (VR) device. The embodiment of the present application is not particularly limited to the specific device form of the electronic device.
Exemplarily, taking an electronic device as a mobile phone as an example, fig. 5 shows a schematic structural diagram of an electronic device provided in an embodiment of the present application. That is, the electronic device shown in fig. 5 may be a cellular phone, for example.
As shown in fig. 5, the electronic device may include: radio Frequency (RF) circuit 510, memory 520, input unit 530, display unit 540, sensor 550, audio circuit 560, wireless fidelity (WiFi) module 570, processor 580, power supply 590, and bluetooth module 5100. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 5 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
A detailed description of some of the components of the electronic device is provided below with reference to fig. 5:
the memory 520 may be used to store software programs and modules, and the processor 580 executes various functional applications and data processing of the electronic device by running the software programs and modules stored in the memory 520. The memory 520 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), a boot loader, and the like; the data storage area may store data created according to the use of the electronic device (such as audio data, a phonebook, etc.), and the like. Further, the memory 520 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 530 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the input unit 530 may include a touch panel 531 and other input devices 532. The touch panel 531, also called a touch screen, can collect touch operations of a user on or near the touch panel 531 (for example, operations of the user on or near the touch panel 531 by using any suitable object or accessory such as a finger or a stylus pen), and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 531 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 580, and can receive and execute commands sent by the processor 580. In addition, the touch panel 531 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 530 may include other input devices 532 in addition to the touch panel 531. In particular, other input devices 532 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 540 may be used to display information input by a user or information provided to the user and various menus of the electronic device. The display unit 540 may include a display panel 541, and optionally, the display panel 541 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like. Further, the touch panel 531 may cover the display panel 541; when the touch panel 531 detects a touch operation on or near it, the touch operation is transmitted to the processor 580 to determine the type of the touch event, and the processor 580 then provides a corresponding visual output on the display panel 541 according to the type of the touch event. Although in fig. 5 the touch panel 531 and the display panel 541 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 531 and the display panel 541 may be integrated to implement the input and output functions of the electronic device.
The processor 580 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs or modules stored in the memory 520 and calling data stored in the memory 520, thereby integrally monitoring the electronic device. Alternatively, processor 580 may include one or more processing units; preferably, the processor 580 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 580.
Of course, it should be understood that fig. 5 is only an exemplary illustration of the electronic device in the form of a mobile phone. If the electronic device is a tablet computer, a handheld computer, a PC, a PDA, a wearable device (e.g., a smart watch, a smart bracelet), a smart home device (e.g., a television), a vehicle machine (e.g., a vehicle-mounted computer), a smart screen, a game machine, an AR/VR device, and other device forms, the structure of the electronic device may include fewer structures than those shown in fig. 5, or more structures than those shown in fig. 5, which is not limited herein.
The methods in the following embodiments may be implemented in an electronic device having the above hardware structure.
The electronic device in the embodiment of the present application may be an electronic device capable of running an operating system and installing application programs. The operating system run by the electronic device may be the Android system, the HarmonyOS (Hongmeng) system, or another operating system, which is not limited herein.
In the embodiment of the present application, the software architecture of the electronic device may refer to fig. 6. Fig. 6 shows a simplified schematic diagram of a software architecture of an electronic device according to an embodiment of the present application. As shown in fig. 6, the software architecture of the electronic device may include a bottom layer 601, a middle layer 602, and an application layer 603. The bottom layer 601 can generate touch report points according to the user's touch operations, and the middle layer 602 can process the touch report points. Each application in the electronic device is deployed in the application layer 603.
The following describes the embodiments of the present application in detail, taking the electronic device as a mobile phone and taking the scenario in which a user controls a game character through a mobile compass displayed on the mobile phone (that is, the control operation area is the compass operation area, the sliding control is the compass control, and the controlled object is the game character) as an example.
In this embodiment of the application, in the first interface displayed by the electronic device, a prediction area may be set in the region outside the compass operation area of the mobile compass. When the user drags the compass control of the mobile compass through a touch medium (e.g., a finger or a stylus) to control the movement of the controlled object, if the touch medium slides out of the compass operation area in a first direction and then folds back toward the compass operation area in a second direction within the prediction area, the electronic device may generate predicted points inside the compass operation area according to the touch report points produced while the touch medium slides in the second direction within the prediction area, so that the compass control of the mobile compass can move according to the predicted points. In this way, the compass control starts to move in the second direction while the touch medium is sliding in the second direction but has not yet slid back into the compass operation area, which reduces the delay when the user drags the compass control back through a fold-back slide of the touch medium.
The prediction area set in the first interface of the electronic device may be preset in the electronic device, or may be generated by the electronic device according to collected historical touch operations of the user (or the touch medium), such as the historical touch report points or touch trajectories generated by those operations, within a preset range around the position of the compass operation area. Of course, in some embodiments, the prediction area may also start as a preset default area; then, as the user uses it, the electronic device optimizes and refreshes the prediction area according to the collected historical touch report points generated by the user's historical touch operations. By collecting the touch report points corresponding to the user's touch operations and optimizing the prediction area accordingly, the electronic device makes the prediction area better adapted to that user.
For example, a default prediction area may be preset in the electronic device. As shown in fig. 7, a mobile compass is displayed on the electronic device, and the mobile compass includes a compass operation area 701 and a compass control 702 located in the compass operation area 701. The default prediction area 703 may be set as a ring surrounding the compass operation area 701 at its periphery. The width of the ring may be set according to the actual situation; alternatively, based on internal test big data (i.e., the touch report point or touch trajectory data of test users' touch operations obtained during internal testing of the electronic device), the area near the mobile compass where test users operate relatively frequently (i.e., where the recorded touch report points or touch trajectories are relatively dense) may be used as the default prediction area 703. Then, during actual use, the electronic device may collect the touch report points corresponding to the touch operations of the user (or the touch medium); that is, while the mobile compass is displayed, the electronic device may collect the touch report points of the user's touch operations, or the touch trajectories formed by those points. For example, as shown in fig. 8, when the electronic device displays the mobile compass 801, it collects a touch trajectory 802 composed of the touch report points of the user's touch operations near the mobile compass 801. The electronic device can then take the area near the mobile compass where the touch report points are relatively dense as the optimized prediction area. For example, taking the collected touch report points or trajectories shown in fig. 8 as an example, as shown in fig. 9, the prediction area 901 optimized according to them is a circular area covering most of the touch report points or trajectories, as illustrated in the figure. It should be noted that before collecting the touch report points or trajectories corresponding to the user's touch operations, the position of the mobile compass may be determined according to the display frame or user settings, and then the touch report points or trajectories of touch operations near the mobile compass (or the compass operation area) may be collected.
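Assuming the default prediction area is a ring around the circular compass operation area, as in fig. 7, membership of a touch report point can be tested as follows (the radii and function name are illustrative assumptions):

```python
import math

def in_prediction_area(center, inner_radius, outer_radius, touch_point):
    """True when a touch report point falls in the ring-shaped prediction
    area surrounding the circular compass operation area."""
    d = math.hypot(touch_point[0] - center[0], touch_point[1] - center[1])
    return inner_radius < d <= outer_radius
```

A point just outside the operation area (between the two radii) is inside the prediction area; points inside the operation area or beyond the ring are not.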
Fig. 10 shows a flowchart of a touch method according to an embodiment of the present application. As shown in fig. 10, the touch method may include the following S1001-S1004.
S1001, in response to a touch operation of a user (or a touch medium), the electronic device generates a corresponding touch report point according to the touch operation of the user through a bottom layer of the software architecture.
When the user performs a touch operation, the electronic device can generate the touch report points corresponding to the touch operation in real time. For example, when the touch operation is a sliding operation performed by the user through a touch medium (e.g., the user's finger, a stylus, etc.), the electronic device may generate the touch report points at a certain sampling rate (for example, when the user performs a sliding operation, the touch report points shown in fig. 4 are generated).
Optionally, in this embodiment of the application, before S1001, the electronic device may further enable the function implementing this touch method in response to a user operation for enabling it, so that the user can manually turn the function on or off. Then, when the user has enabled the function, the electronic device implements the corresponding behavior based on the touch method shown in fig. 10. Of course, if the prediction area needs to be set according to collected touch report points of the user's touch operations, the electronic device may collect those touch report points and generate the prediction area after the user enables the function.
For example, as shown in fig. 11, a user may open a system setting of the electronic device to turn on a touch optimization function. As shown in fig. 11 (a), the user may open a game mode option in the system setup. The game mode option may include a setting option "touch optimization". As shown in (b) of fig. 11, the user can turn on the touch optimization function by clicking the setting option "touch optimization".
Of course, in some other possible embodiments, the electronic device may also implement the method shown in fig. 10 by default, i.e., without the user setting on.
After the electronic device generates the touch report points according to the user's touch operation, the bottom layer may send (report) the touch report points to the middle layer in the software architecture of the electronic device. After the middle layer receives the touch report points, the electronic device may perform the following S1002.
S1002, the electronic device determines, through the middle layer of the software architecture and according to the touch report points, that the user's touch operation is a fold-back slide toward the compass operation area within the prediction area (a fold-back slide meaning that, outside the compass operation area, the touch medium changes from sliding in a first direction to sliding in a second direction).
In some embodiments, the electronic device may determine the user's touch operation within the prediction area according to the touch report points generated in real time. When the sliding touch trajectory formed by connecting the sequentially generated touch report points folds back, it can be determined that the user's touch operation is a fold-back slide toward the compass operation area (or toward the mobile compass) within the prediction area. For example, as shown in fig. 12, the user may perform a sliding touch operation using a finger as the touch medium. The user may start sliding from within the compass operation area 1201 in a certain direction (e.g., a first direction) into the prediction area 1202 outside the compass operation area 1201, and then change direction and slide in another direction (e.g., a second direction), i.e., fold back. As shown in fig. 13, when the user performs the sliding touch operation shown in fig. 12, the touch report points sequentially generated by the electronic device along with the user's operation are point 1 to point 10. Points 1 to 4 are located in the compass operation area 1301, and points 5 to 10 are located in the prediction area 1302. When point 10 is generated, since the sliding touch trajectory formed by connecting points 1 to 10 folds back within the prediction area 1302, the electronic device can determine, at the moment point 10 is generated, that the user's touch operation is a fold-back slide.
Alternatively, when the user performs a sliding touch through the touch medium, the fold-back of the sliding touch trajectory composed of the sequentially connected touch report points may be determined to occur when the included angle between the portion of the trajectory before the fold-back (for example, the trajectory composed of points 1 to 9 shown in fig. 13) and the portion after the fold-back (for example, the trajectory composed of points 9 to 10 shown in fig. 13) is smaller than a predetermined angle (for example, 30 degrees, 45 degrees, or the like). That is, after the electronic device forms the sliding touch trajectory from the sequentially generated touch report points, if the sliding direction of the trajectory changes and the included angle between the trajectory segments before and after the change is smaller than a certain angle, the electronic device may determine that the user's current touch operation is a fold-back slide. Of course, in other embodiments, a fold-back may be determined to occur whenever the sliding touch trajectory forms a kink.
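The included-angle criterion can be sketched as follows, measuring the angle at the turn point between the segment leading into the turn and the segment leaving it (a small angle means the finger doubled back; the 45-degree default and the names are assumptions taken from the example angles in the text):

```python
import math

def is_foldback(p_prev, p_turn, p_new, threshold_deg=45.0):
    """Detect a fold-back at p_turn: the included angle between the segment
    entering the turn and the segment leaving it is below the threshold."""
    v_back = (p_prev[0] - p_turn[0], p_prev[1] - p_turn[1])  # back along the incoming path
    v_out = (p_new[0] - p_turn[0], p_new[1] - p_turn[1])     # outgoing direction
    n1 = math.hypot(*v_back)
    n2 = math.hypot(*v_out)
    if n1 == 0 or n2 == 0:
        return False                      # degenerate segment: no decision
    cos_a = (v_back[0] * v_out[0] + v_back[1] * v_out[1]) / (n1 * n2)
    cos_a = max(-1.0, min(1.0, cos_a))    # guard against rounding
    return math.degrees(math.acos(cos_a)) < threshold_deg
```

A slide that continues straight on gives an angle of 180 degrees (no fold-back), while a near-reversal gives a small angle and is classified as a fold-back.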
In the embodiment of the present application, if the predetermined area is generated according to the touch report point corresponding to the collected user touch operation, the predetermined area may be generated before S1002 or S1001.
When the electronic device determines that the user's touch operation is a fold-back slide toward the compass operation area within the prediction area, the electronic device can generate predicted points according to the subsequent touch report points. For example, the electronic device may perform the following S1003.
S1003, the electronic device generates, through the middle layer of the software architecture and according to the touch report points, predicted points located in the compass operation area of the mobile compass.
In some possible embodiments, when the electronic device determines that the user's touch operation is a fold-back slide (e.g., the sliding touch operation shown in fig. 12), the electronic device may generate the predicted points located in the compass operation area according to the subsequent touch report points (i.e., the touch report points produced during the fold-back slide), based on the sliding direction of the fold-back slide. For example, as shown in fig. 14, the electronic device sequentially generates points 1 to 10 according to the user's operation, where points 1 to 4 are located in the compass operation area 1401 and points 5 to 10 are located in the prediction area 1402. When point 10 is generated, the electronic device determines, from the sliding touch trajectory formed by connecting points 1 to 10 in sequence, that the user's touch operation at this moment is a fold-back slide. The electronic device may then predict the sliding trajectory of the fold-back slide (for example, indicated by the dotted line in the figure) from the fold-back portion of the trajectory, i.e., the line between point 9 and point 10. Predicted points (e.g., predicted points 11 to 14 shown in fig. 14) can then be generated within the compass operation area 1401 of the mobile compass along that sliding trajectory.
Alternatively, the predicted slide trajectory at the time of the fold-back slide may be determined from a connection between touch points (e.g., points 9 and 10 in fig. 14) generated at the time of the fold-back slide. Therefore, the position of the first predicted point may be an intersection point between a connection line between touch points generated when the slide is folded back and a boundary of a compass operation area of the mobile compass.
Illustratively, the compass operation area is a circular area. As shown in fig. 15, the center of the compass operation area is D0(x0, y0) and the radius is r. The sequentially generated touch report points include D1 to D6, where the touch report points during the fold-back slide are D5(x5, y5) and D6(x6, y6). Here, x0, y0 and r (i.e., the center coordinates and radius of the compass operation area) can be determined according to the application program corresponding to the mobile compass, or obtained from the parameters related to its display, which is not limited herein. x5, y5, x6 and y6 can all be obtained from the touch report point data. Therefore, the position coordinates of the first predicted point may satisfy the following formulas:
(x - x0)^2 + (y - y0)^2 = r^2
(y - y5)(x6 - x5) = (x - x5)(y6 - y5)
The solutions (x, y) of the above system of equations can be used as the position coordinates of the first predicted point. When no (x, y) exists, the line between the touch report points of the fold-back slide (i.e., the fold-back sliding direction) does not intersect the compass operation area, so the electronic device does not generate a predicted point. When only one value of (x, y) exists, the line between the touch report points of the fold-back slide is tangent to the compass operation area, so the electronic device likewise does not generate a predicted point. When two values of (x, y) exist, the line between the touch report points of the fold-back slide intersects the compass operation area, so the electronic device can generate a predicted point, taking the one of the two coordinate points that is closer to the touch report points of the fold-back slide (e.g., points D5 and D6 shown in fig. 15) as the predicted point (e.g., point B1 shown in fig. 15).
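The intersection of the fold-back line with the circular operation area, and the choice of the nearer of the two intersections, can be sketched as follows, writing the line through D5 and D6 in parametric form and substituting it into the circle equation (names and the tie-breaking rule are illustrative):

```python
import math

def first_predicted_point(center, r, d5, d6):
    """Intersect the line through fold-back report points D5 and D6 with the
    circular operation area; return the intersection closer to the report
    points, or None when the line misses or is merely tangent to the circle."""
    x0, y0 = center
    x5, y5 = d5
    x6, y6 = d6
    dx, dy = x6 - x5, y6 - y5
    fx, fy = x5 - x0, y5 - y0
    # substitute the parametric point (x5 + t*dx, y5 + t*dy) into the circle
    a = dx * dx + dy * dy
    b = 2 * (fx * dx + fy * dy)
    c = fx * fx + fy * fy - r * r
    disc = b * b - 4 * a * c
    if disc <= 0:          # no intersection (disc < 0) or tangent (disc == 0)
        return None
    sq = math.sqrt(disc)
    t1 = (-b - sq) / (2 * a)
    t2 = (-b + sq) / (2 * a)
    p1 = (x5 + t1 * dx, y5 + t1 * dy)
    p2 = (x5 + t2 * dx, y5 + t2 * dy)
    # keep the intersection closer to the latest report point D6
    return min((p1, p2), key=lambda p: math.hypot(p[0] - x6, p[1] - y6))
```

For a circle of radius 5 at the origin and a fold-back slide from (10, 0) toward (8, 0), the line meets the circle at (5, 0) and (-5, 0), and the nearer point (5.0, 0.0) becomes the first predicted point.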
Optionally, after the electronic device generates the first predicted point, it may continue generating corresponding predicted points according to the touch report points produced during the subsequent fold-back sliding of the user's touch operation. That is, as the user continues the fold-back slide and the electronic device generates touch report points at a certain sampling rate, the electronic device can generate one predicted point in the compass operation area each time one touch report point is generated, so that the generated predicted points stay synchronized with the user's fold-back sliding operation. Of course, in some other possible embodiments, the electronic device may instead continuously generate predicted points at certain time intervals as the user continues the fold-back slide, which is not limited herein.
The spacing between successive predicted points can be set according to a certain rule, such as a geometric series or an arithmetic series. The spacing can also be set according to other rules; for example, the spacings between the predicted points may all be set equal. This is not limited herein, and the spacing between the predicted points can be set according to the actual situation.
For example, in some possible implementations, the predicted points subsequently generated by the electronic device may be generated in order from the first predicted point along the sliding direction of the fold-back slide (e.g., along the line through points D5 and D6 shown in fig. 15), with spacings following a geometric series. The distance between the first predicted point and the next one (e.g., the distance d1 between B1 and B2 shown in fig. 15) may be set to the distance between the touch report points generated during the fold-back slide (e.g., touch report points D5 and D6 shown in fig. 15), or, when the distances between those touch report points are unequal, the average of those distances. Of course, the distance between the first predicted point and the next one may also be preset according to the actual situation or set according to other rules, which is not limited here. The common ratio of the spacings between the predicted points can be set according to the actual situation, for example 0.5 or 0.8, and is not limited herein.
In other possible embodiments, the predicted points subsequently generated by the electronic device may be generated in order from the first predicted point with a fixed spacing, along the sliding direction of the fold-back sliding (e.g., the direction of the line through points D5 and D6 shown in fig. 15). The fixed value may be the distance between the touch report points generated during the fold-back sliding (e.g., touch report points D5 and D6 shown in fig. 15), or, when those distances are not equal, their average. The fixed value may also be another value set according to the actual situation or according to other rules, and is not limited herein.
Illustratively, take the case where the spacings between predicted points form a geometric series, the first term of the series (i.e., the distance between the first predicted point and the next one, e.g., the distance d1 between predicted points B1 and B2 shown in FIG. 15) is the distance between the touch report points generated during the fold-back sliding (e.g., touch report points D5 and D6 shown in FIG. 15), and the common ratio is 0.5. Referring to fig. 15, the touch report points during the fold-back sliding are D5(x5, y5) and D6(x6, y6), i.e., x5, y5, x6 and y6 can be determined as known values from the touch report point data, and the first generated predicted point is B1(x1, y1), i.e., the coordinates x1, y1 of the first predicted point are known values. After the first predicted point (e.g., predicted point B1 shown in FIG. 15) is generated, each subsequent predicted point Ba(xa, ya) can be calculated according to the following formula:
x_a = x_1 + ((x_6 − x_5) / d_1) · Σ_{n=1}^{a−1} d_n,  y_a = y_1 + ((y_6 − y_5) / d_1) · Σ_{n=1}^{a−1} d_n,  with d_n = d_1 · 0.5^(n−1)
where, as shown in FIG. 15, d_n is the spacing between predicted point B_n and the next predicted point; under the assumptions above, d_1 is also the distance between D_5 and D_6, so that (x_6 − x_5)/d_1 and (y_6 − y_5)/d_1 form the unit vector of the fold-back sliding direction.
It should be noted that, when the spacing between predicted points is set according to the above geometric series, the limit of the geometric series sum is:
lim_{a→∞} Σ_{n=1}^{a} d_n = Σ_{n=1}^{∞} d_1 · 0.5^(n−1) = d_1 / (1 − 0.5) = 2d_1
It follows that, when the compass control of the mobile compass moves according to the predicted points, the maximum moving distance is 2d_1.
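As a rough illustration (not part of the patent; the function name and coordinate conventions are assumed), the geometric-spacing scheme above can be sketched in Python:

```python
import math

def predicted_points(d5, d6, b1, ratio=0.5, count=5):
    """Generate predicted points after B1 along the D5 -> D6 direction.

    The spacing between consecutive predicted points follows a geometric
    series whose first term d1 equals the distance |D5 D6|, so the total
    travel from B1 approaches d1 / (1 - ratio), i.e. 2 * d1 for ratio 0.5.
    """
    d1 = math.dist(d5, d6)                      # first spacing, |D5 D6|
    ux = (d6[0] - d5[0]) / d1                   # unit vector of the
    uy = (d6[1] - d5[1]) / d1                   # fold-back direction
    points = [b1]
    offset, spacing = 0.0, d1                   # cumulative distance, d_n
    for _ in range(count):
        offset += spacing
        points.append((b1[0] + ux * offset, b1[1] + uy * offset))
        spacing *= ratio                        # d_{n+1} = d_n * ratio
    return points
```

With `ratio=0.5` the generated points converge toward a point 2·d1 past B1, matching the maximum moving distance noted above.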
Optionally, in some possible implementations, the above S1002 and S1003 may be executed only when a generated touch report point is located in the pre-judgment area (i.e., when the user's touch operation enters the pre-judgment area), so as to save computing power of the electronic device when the user normally drags the compass control within the compass operation area.
After the electronic device generates a predicted point, the middle layer may send (report) the predicted point to the application layer in the software architecture of the electronic device, so that the compass control of the mobile compass can move correspondingly according to the predicted point. For example, the electronic device performs the following S1004.
S1004: the electronic device controls, through the application layer in the software architecture, the compass control of the mobile compass to move according to the predicted point, thereby controlling the game character to move correspondingly. That is, the electronic device controls the compass control to move to the position of the predicted point.
Optionally, in some other embodiments, when the picture including the mobile compass displayed by the electronic device is a picture projected or transmitted from another electronic device, the electronic device may send the predicted points to the other electronic device, so that the other electronic device controls the compass control of the mobile compass to move according to the predicted points.
Optionally, in this embodiment of the application, when the electronic device is generating predicted points, if during the user's touch operation the touch medium slides back into the compass operation area of the mobile compass, the electronic device may stop generating predicted points, and the compass control of the mobile compass subsequently moves directly according to the touch report points generated by the electronic device as the touch medium slides (i.e., the actual report points touched by the user). For example, after the user slides back into the compass operation area, the middle layer no longer generates predicted points after receiving the touch report points generated by the bottom layer according to the user's touch operation, and directly reports the touch report points to the application layer so that the compass control moves according to them.
For example, as shown in fig. 16, the touch report points corresponding to the user's touch operation include points 1 to 15; points 5 to 13 are located in the pre-judgment area 1601, and the remaining points are located in the compass operation area 1602. When report point 10 is generated (i.e., the user's touch slides to the position of report point 10), the electronic device may determine that the user's touch operation at this time is a fold-back sliding. Subsequently, as the user continues the fold-back sliding, the electronic device generates touch report points 11 to 13. Since report points 11 to 13 are located in the pre-judgment area 1601 and have not entered the compass operation area 1602, the electronic device may, as report points 11 to 13 are generated, sequentially generate predicted points in the compass operation area 1602 (e.g., points a, b and c shown in the figure). When report point 14 is generated (i.e., the user's touch slides to the position of report point 14), the electronic device may determine from report point 14 that the user's fold-back sliding has entered the compass operation area 1602. From then on, when the electronic device generates the next touch report point according to the user's touch operation (e.g., report point 15 shown in the figure), it no longer generates predicted points, but directly controls the compass control to move according to report point 15 and the subsequent touch report points.
It should be noted that, in some other possible embodiments, after the user's fold-back sliding enters the compass operation area of the mobile compass, the electronic device stops generating predicted points only when the latest touch report point of the fold-back sliding approximately coincides with the latest predicted point continuously generated in the compass operation area, or when the latest touch report point is further along the fold-back sliding direction than the latest predicted point (i.e., the latest touch report point has caught up with the latest predicted point in the fold-back sliding direction); the compass control then moves directly according to the touch report points generated by the electronic device as the touch medium slides (i.e., the actual report points touched by the user). This avoids the situation where, if the electronic device stopped generating predicted points as soon as the user slid back into the compass operation area, a touch report point that had not yet caught up with the latest predicted point would move the compass control in the direction opposite to the fold-back sliding (i.e., the compass control would be pulled back).
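A minimal sketch of this catch-up check, with assumed names, using the projection onto the fold-back direction to stand in for "further along the fold-back sliding direction":

```python
def report_caught_up(latest_report, latest_predicted, direction):
    """True when the actual touch report point has reached or passed the
    latest predicted point along the fold-back sliding direction.

    Points are (x, y) tuples; direction is the (dx, dy) fold-back vector.
    Comparing projections onto the direction vector tells whether the
    report point coincides with, or is ahead of, the predicted point.
    """
    proj_report = latest_report[0] * direction[0] + latest_report[1] * direction[1]
    proj_pred = latest_predicted[0] * direction[0] + latest_predicted[1] * direction[1]
    return proj_report >= proj_pred
```

Once this returns true, the sketch's caller would stop generating predicted points and drive the control from the actual report points.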
Optionally, in this embodiment of the application, suppose the touch medium, while sliding in the second direction without having entered the compass operation area, performs a touch operation (or called a third operation) that changes direction from the fold-back sliding (sliding in the second direction) to sliding in a third direction outside the compass operation area. Then, in response to the third operation, the electronic device may control the compass control to slide to a second position. The second position is the intersection point of the boundary of the compass operation area with the line connecting the touch medium and the default position (or called initial position) of the compass control. The extension line of the path along which the touch medium slides in the third direction does not intersect the compass operation area. It should be noted that the default position of the compass control may be the geometric center of the compass operation area. That is, when the electronic device determines, according to the touch report points corresponding to the user's touch operation, that the touch operation is a fold-back sliding in the pre-judgment area, it generates predicted points in the compass operation area of the mobile compass according to the subsequent touch report points. If the electronic device then determines, according to the touch report points, that the sliding direction changes again in the pre-judgment area before the user's touch operation has entered the compass operation area (that is, the sliding direction changes again while the user, having folded back, has not yet slid into the compass operation area), and the changed sliding direction is opposite to the direction of the fold-back sliding, the electronic device may stop generating predicted points and instead directly control the compass control of the mobile compass according to the subsequent touch report points (for example, control the compass control to move to the second position). For example, when the user folds back but, before sliding into the compass operation area, changes the sliding direction in the pre-judgment area to a direction opposite to that of the fold-back sliding, the middle layer no longer generates predicted points after receiving the touch report points generated by the bottom layer according to the user's touch operation, and directly reports the touch report points to the application layer so that the compass control can move according to them.
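Assuming a circular compass operation area whose geometric center is the control's default position (the function name and parameters are illustrative, not from the patent), the second position described above can be sketched as:

```python
import math

def second_position(touch, center, radius):
    """Intersection of the area boundary with the line from the touch
    point to the control's default position (the area's center).

    Assumes a circular operation area of the given radius; the result is
    the boundary point on the ray from the center toward the touch point.
    """
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    dist = math.hypot(dx, dy)                   # distance touch -> center
    return (center[0] + radius * dx / dist,
            center[1] + radius * dy / dist)
```

The control is thus pinned to the boundary point nearest the finger's current heading, rather than following a predicted point.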
For example, as shown in fig. 17, the user may perform a sliding touch operation using a finger as the touch medium. As shown in fig. 17 (a), the user may slide from within the compass operation area 1701 in a certain direction (e.g., a first direction) into the pre-judgment area 1702 outside the compass operation area 1701. At that point, compass control 1703 may slide in the first direction to the boundary of the compass operation area 1701. The user may then turn and slide in another direction (e.g., a second direction) toward the compass operation area 1701 (i.e., a fold-back sliding). At this point, the compass control 1703 may begin sliding in the second direction from the boundary of the compass operation area 1701 according to the predicted points. As shown in fig. 17 (b), before sliding into the compass operation area 1701 along the second direction, the user changes direction again (e.g., slides in a third direction). At this time, the compass control 1703 may move according to the actual touch report points currently touched by the user. As shown in fig. 18, when the user performs the sliding touch operation shown in fig. 17, the touch report points corresponding to the user's touch operation include points 1 to 16; points 5 to 16 are located in the pre-judgment area 1801, and the remaining points are located in the compass operation area 1802. When report point 10 is generated (i.e., the user's touch slides to the position of report point 10), the electronic device may determine that the user's touch operation at this time is a fold-back sliding. Subsequently, as the user continues the fold-back sliding, the electronic device generates touch report points 11 to 13.
Since report points 11 to 13 are located in the pre-judgment area 1801 and have not entered the compass operation area 1802, the electronic device may, as report points 11 to 13 are generated, sequentially generate predicted points in the compass operation area 1802 (e.g., points a, b and c shown in the figure). When report point 14 is generated (i.e., the user's touch slides to the position of report point 14), the electronic device may determine from report point 14 that the sliding direction of the user's touch operation has changed within the pre-judgment area 1801, and that the changed direction is opposite to the direction of the fold-back sliding. At this time, the electronic device no longer generates predicted points, but directly controls the compass control to move according to report point 14 and the subsequently generated touch report points (e.g., report points 15 and 16 shown in the figure).
The electronic device may determine whether the sliding direction has changed, and whether the changed sliding direction is opposite to the direction of the fold-back sliding, according to the sliding trajectory of the fold-back sliding and the sliding trajectory after the direction change. For example, when the horizontal component (i.e., along the lateral direction of the display panel of the electronic device) of the direction of the changed sliding trajectory (e.g., the direction from report point 13 to report point 14 shown in fig. 18) is opposite to the horizontal component of the direction of the fold-back sliding, it can be determined that the sliding direction of the user's touch operation has changed and that the changed direction is opposite to the direction of the fold-back sliding.
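The horizontal-component comparison in the example above can be sketched as follows (hypothetical helper, assuming screen coordinates where x runs along the display's lateral direction):

```python
def direction_reversed(foldback_from, foldback_to, new_from, new_to):
    """True when the horizontal component of the new sliding trajectory
    is opposite to that of the fold-back trajectory.

    Each argument is an (x, y) touch report point; opposite-signed
    horizontal deltas mean the changed sliding direction is opposite to
    the fold-back sliding direction.
    """
    dx_foldback = foldback_to[0] - foldback_from[0]
    dx_new = new_to[0] - new_from[0]
    return dx_foldback * dx_new < 0
```

For instance, with a fold-back from report point 9 toward report point 13 moving left, a 13-to-14 segment moving right would make the product of the deltas negative and flag a reversal.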
By adopting the method in the above embodiments, when a user drags the sliding control through a touch medium (e.g., a finger or a stylus) to control the controlled object, the sliding control can move according to the fold-back sliding direction of the touch medium, and thereby control the controlled object to move correspondingly, even if the touch medium is not within the control operation area when the sliding direction changes into a fold-back sliding. This avoids the situation in which, because the touch medium is outside the control operation area, the sliding control can only be dragged after the touch medium has slid some distance back into the control operation area. The delay in dragging the sliding control is reduced, and in turn the delay in controlling the controlled object is reduced.
Corresponding to the method in the foregoing embodiment, the present application further provides a touch device. The apparatus may be applied to an electronic device for implementing the method in the foregoing embodiments. The functions of the device can be realized by hardware, and can also be realized by executing corresponding software by hardware. The hardware or software includes one or more modules corresponding to the above-described functions. For example, fig. 19 shows a schematic structural diagram of a touch device, and as shown in fig. 19, the touch device includes: a display module 1901, a processing module 1902, and the like.
The display module 1901 may be configured to display a first interface, where the first interface includes a control operation area and a sliding control located in the control operation area, and the sliding control is used to move within the control operation area to control the movement of a controlled object. The processing module 1902 may be configured to detect a first operation in which the touch medium slides from inside the control operation area to outside it along a first direction, and control the sliding control to move along the first direction to a first position on the boundary of the control operation area; and to detect a second operation in which the touch medium changes direction from sliding in the first direction and slides in a second direction outside the control operation area, and control the sliding control to move along the second direction within the control operation area, where an extension line of the path along which the touch medium slides in the second direction intersects the control operation area.
In a possible implementation manner, the processing module 1902 may be further configured to detect a third operation in which the touch medium changes direction from sliding in the second direction outside the control operation area and slides in a third direction, and control the sliding control to move to a second position, where the second position is the intersection point of the control operation area boundary with the line connecting the touch position of the touch medium and the default position of the sliding control; and an extension line of the path along which the touch medium slides in the third direction does not intersect the control operation area.
In another possible implementation manner, the processing module 1902 may be further configured to control the controlled object to move along a direction in which the default position of the sliding control points to the position of the sliding control.
In another possible implementation manner, the processing module 1902 is specifically configured to control, according to the touch report points corresponding to the touch medium during the first operation, the sliding control to slide, following the touch medium, along the first direction to the first position on the boundary of the control operation area.
In another possible implementation manner, the processing module 1902 is specifically configured to determine, according to the touch report points corresponding to the touch medium during the second operation, that the touch medium changes direction from sliding in the first direction outside the control operation area and slides in the second direction; generate predicted points along the second direction in the control operation area according to the touch report points corresponding to the touch medium during the second operation; and control the sliding control to move along the second direction in the control operation area according to the predicted points.
In another possible implementation manner, a pre-judgment area is provided outside the control operation area; the second operation is that the touch medium changes direction from sliding in the first direction and slides in the second direction within the pre-judgment area; and the processing module 1902 is specifically configured to determine, according to the corresponding touch report points during the second operation, that the touch medium changes direction from sliding in the first direction and slides in the second direction within the pre-judgment area.
In another possible implementation manner, the pre-judgment area is preset, or the pre-judgment area is generated according to historical touch operations of the touch medium within a preset range around the position of the control operation area.
In another possible implementation manner, the processing module 1902 may be further configured to, when the touch medium slides into the control operation area along the second direction, control the sliding control to slide, following the touch medium, along the second direction in the control operation area according to the touch report points corresponding to the touch medium.
In another possible implementation manner, the processing module 1902 is specifically configured to determine, according to the touch report points corresponding to the touch medium during the third operation, that the touch medium changes direction from sliding in the second direction outside the control operation area and slides in the third direction; and control the sliding control to slide to the second position.
In another possible implementation manner, the first interface comprises a game interface, the control operation area comprises a compass operation area, the slide control comprises a compass control, and the controlled object comprises a game character in the game interface.
In another possible implementation, the touch medium includes a stylus, a user's finger, and the like.
It should be understood that the division of units or modules (hereinafter referred to as units) in the above apparatus is only a division of logical functions, and may be wholly or partially integrated into one physical entity or physically separated in actual implementation. And the units in the device can be realized in the form of software called by the processing element; or may be implemented entirely in hardware; part of the units can also be realized in the form of software called by a processing element, and part of the units can be realized in the form of hardware.
For example, each unit may be a processing element separately set up, or may be implemented by being integrated into a chip of the apparatus, or may be stored in a memory in the form of a program, and a function of the unit may be called and executed by a processing element of the apparatus. In addition, all or part of the units can be integrated together or can be independently realized. The processing element described herein, which may also be referred to as a processor, may be an integrated circuit having signal processing capabilities. In the implementation process, the steps of the method or the units above may be implemented by integrated logic circuits of hardware in a processor element or in a form called by software through the processor element.
In one example, the units in the above apparatus may be one or more integrated circuits configured to implement the above method, such as: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of at least two of these integrated circuit forms.
As another example, when a unit in a device may be implemented in the form of a processing element scheduler, the processing element may be a general purpose processor, such as a CPU or other processor capable of invoking programs. As another example, these units may be integrated together and implemented in the form of a system-on-a-chip (SOC).
In one implementation, the means for implementing the respective corresponding steps of the above method by the above apparatus may be implemented in the form of a processing element scheduler. For example, the apparatus may include a processing element and a memory element, the processing element calling a program stored by the memory element to perform the method described in the above method embodiments. The memory elements may be memory elements on the same chip as the processing elements, i.e. on-chip memory elements.
In another implementation, the program for performing the above method may be in a memory element on a different chip than the processing element, i.e. an off-chip memory element. At this time, the processing element calls or loads a program from the off-chip storage element onto the on-chip storage element to call and execute the method described in the above method embodiment.
For example, an embodiment of the present application may also provide an apparatus, such as an electronic device, which may include a processor and a memory for storing instructions executable by the processor. The processor is configured to execute the above instructions so that the electronic device implements the touch method described in the foregoing embodiments. The memory may be located within the electronic device or external to it, and the electronic device may include one or more processors.
In yet another implementation, the unit of the apparatus for implementing the steps of the method may be configured as one or more processing elements, and these processing elements may be disposed on the electronic device corresponding to the foregoing, where the processing elements may be integrated circuits, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of these types of integrated circuits. These integrated circuits may be integrated together to form a chip.
For example, the embodiment of the present application also provides a chip, and the chip can be applied to the electronic device. The chip includes one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected through a line; the processor receives and executes computer instructions from the memory of the electronic device through the interface circuitry to implement the methods described in the method embodiments above.
Embodiments of the present application further provide a computer program product including computer instructions that, when executed by an electronic device such as the electronic device described above, cause the electronic device to implement the methods described in the foregoing method embodiments.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, for example a program. The software product is stored in a program product, such as a computer-readable storage medium, and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
For example, embodiments of the present application may also provide a computer-readable storage medium having stored thereon computer program instructions. The computer program instructions, when executed by the electronic device, cause the electronic device to implement the touch method as described in the aforementioned method embodiments.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A touch method, comprising:
displaying a first interface, wherein the first interface comprises a control operation area and a sliding control positioned in the control operation area, and the sliding control is used for moving in the control operation area to control the movement of a controlled object;
detecting a first operation that a touch medium slides from the inside of the control operation area to the outside of the control operation area along a first direction, and controlling the sliding control to move to a first position of the boundary of the control operation area along the first direction;
detecting a second operation that the direction of the touch medium is changed by sliding in the first direction outside the control operation area and the touch medium slides in a second direction, and controlling the sliding control to move along the second direction in the control operation area; wherein an extension line of a path along which the touch medium slides in the second direction intersects with the control operation area.
2. The method of claim 1, wherein after the detecting a second operation that the touch medium changes direction from sliding in the first direction outside the control operation area and slides in a second direction, and controlling the sliding control to move in the second direction within the control operation area, the method further comprises:
detecting a third operation that the direction of the touch medium is changed by sliding in the second direction outside the control operation area and the touch medium slides in a third direction, and controlling the sliding control to move to a second position, wherein the second position is an intersection point of a connection line between a touch position of the touch medium sliding in the third direction and a default position of the sliding control and the control operation area boundary; wherein, the extension line of the path of the touch medium sliding along the third direction does not intersect with the control operation area.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
and controlling the controlled object to move along the direction of the default position of the sliding control pointing to the position of the sliding control.
4. The method of claim 1, wherein the controlling the sliding control to move in the first direction to a first position of the boundary of the control operation area comprises:
and controlling the sliding control to slide to a first position of the boundary of the control operation area along the first direction along with the touch medium according to a touch report point corresponding to the touch medium in the first operation process.
5. The method of claim 1, wherein the controlling the sliding control to move along the second direction within the control operation area comprises:
determining, according to touch report points corresponding to the touch medium during the second operation, that the touch medium has changed direction from sliding in the first direction outside the control operation area to sliding in the second direction;
generating predicted points along the second direction within the control operation area according to the touch report points corresponding to the touch medium during the second operation; and
controlling, according to the predicted points, the sliding control to move along the second direction within the control operation area.
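The predicted-point generation of claim 5 can be sketched as stepping along the second direction and clamping each point to the circular operation area; the step size, point count, and all names are illustrative assumptions, not values specified by the patent:

```python
import math

def predicted_points(entry_point, direction, center, radius,
                     step=2.0, count=5):
    """Generate predicted report points along the second sliding
    direction, clamped to the circular control operation area, so the
    sliding control can track the finger before it re-enters the area."""
    dx, dy = direction
    n = math.hypot(dx, dy) or 1.0
    dx, dy = dx / n, dy / n              # normalise the direction
    pts = []
    for i in range(1, count + 1):
        x = entry_point[0] + dx * step * i
        y = entry_point[1] + dy * step * i
        # clamp to the boundary if the predicted point falls outside
        cx, cy = x - center[0], y - center[1]
        d = math.hypot(cx, cy)
        if d > radius:
            x = center[0] + cx / d * radius
            y = center[1] + cy / d * radius
        pts.append((x, y))
    return pts
```

Per claim 8, once the touch medium actually re-enters the area, the control would switch back to following the real touch report points.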
6. The method according to claim 5, wherein a prediction area is provided outside the control operation area, and the second operation is that the touch medium changes direction, within the prediction area, from sliding in the first direction to sliding in the second direction;
the determining, according to the touch report points corresponding to the second operation, that the touch medium has changed direction from sliding in the first direction outside the control operation area to sliding in the second direction comprises:
determining, according to the touch report points corresponding to the second operation, that the touch medium has changed direction, within the prediction area, from sliding in the first direction to sliding in the second direction.
7. The method according to claim 6, wherein the prediction area is preset, or the prediction area is generated according to historical touch operations of the touch medium within a preset range around the position of the control operation area.
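A preset prediction area, as allowed by claim 7, could be sketched as an annulus just outside the circular operation area. The `margin` width is a hypothetical placeholder; claim 7 also permits deriving the area from historical touch operations instead:

```python
import math

def in_prediction_area(point, center, radius, margin=20.0):
    """Is the touch point inside the annular prediction area that
    surrounds the circular control operation area? Direction changes
    detected here trigger the predicted-point logic of claim 5."""
    d = math.hypot(point[0] - center[0], point[1] - center[1])
    return radius < d <= radius + margin
```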
8. The method according to any one of claims 5 to 7, wherein after the controlling, according to the predicted points, the sliding control to move along the second direction within the control operation area, the method further comprises:
in a case where the touch medium slides into the control operation area along the second direction, controlling, according to touch report points corresponding to the touch medium, the sliding control to follow the touch medium and slide along the second direction within the control operation area.
9. The method of claim 2, wherein the controlling the sliding control to move to the second position comprises:
determining, according to touch report points corresponding to the touch medium during the third operation, that the touch medium has changed direction from sliding in the second direction outside the control operation area to sliding in the third direction; and
controlling the sliding control to slide to the second position.
10. The method of claim 1, wherein the first interface comprises a game interface, the control operation area comprises a compass operation area, the sliding control comprises a compass control, and the controlled object comprises a game character in the game interface.
11. The method of claim 1, wherein the touch medium comprises a stylus, a user's finger, or the like.
12. An electronic device, comprising: a processor; and a memory for storing instructions executable by the processor; wherein the processor is configured to execute the instructions to cause the electronic device to implement the method of any one of claims 1 to 11.
13. A computer-readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by an electronic device, cause the electronic device to implement the method of any one of claims 1 to 11.
CN202110654261.5A 2021-06-11 2021-06-11 Touch method and device Active CN113476822B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110654261.5A CN113476822B (en) 2021-06-11 2021-06-11 Touch method and device


Publications (2)

Publication Number Publication Date
CN113476822A true CN113476822A (en) 2021-10-08
CN113476822B CN113476822B (en) 2022-06-10

Family

ID=77934845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110654261.5A Active CN113476822B (en) 2021-06-11 2021-06-11 Touch method and device

Country Status (1)

Country Link
CN (1) CN113476822B (en)


Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030008696A1 (en) * 2001-07-09 2003-01-09 Abecassis David H. Area-based resource collection in a real-time strategy game
US20110093819A1 (en) * 2000-05-11 2011-04-21 Nes Stewart Irvine Zeroclick
CN106598465A (en) * 2016-12-20 2017-04-26 上海逗屋网络科技有限公司 Control method, device and equipment based on virtual rocker
CN107008003A (en) * 2017-04-13 2017-08-04 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and computer-readable recording medium
CN107168611A (en) * 2017-06-16 2017-09-15 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN109753327A (en) * 2018-11-19 2019-05-14 努比亚技术有限公司 A kind of control layout method, terminal and computer readable storage medium
CN111481923A (en) * 2020-05-18 2020-08-04 网易(杭州)网络有限公司 Rocker display method and device, computer storage medium and electronic equipment
CN111589127A (en) * 2020-04-23 2020-08-28 腾讯科技(深圳)有限公司 Control method, device and equipment of virtual role and storage medium
CN111589112A (en) * 2020-04-24 2020-08-28 腾讯科技(深圳)有限公司 Interface display method, device, terminal and storage medium
CN111694494A (en) * 2020-06-10 2020-09-22 维沃移动通信有限公司 Control method and device
CN112698780A (en) * 2020-12-29 2021-04-23 贵阳动视云科技有限公司 Virtual rocker control method and device


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114489457A (en) * 2022-01-27 2022-05-13 北京字跳网络技术有限公司 Control method and device of virtual object, readable medium and electronic equipment
CN114489457B (en) * 2022-01-27 2024-01-19 北京字跳网络技术有限公司 Virtual object control method and device, readable medium and electronic equipment

Also Published As

Publication number Publication date
CN113476822B (en) 2022-06-10

Similar Documents

Publication Publication Date Title
TWI463361B (en) Control method and system by partial touch panel
US10795448B2 (en) Tactile glove for human-computer interaction
TWI529574B (en) Electronic device and operation method thereof
US20150185953A1 (en) Optimization operation method and apparatus for terminal interface
CN104185831B (en) For the system and method using input unit dynamic regulation user-interface parameters
JP5507494B2 (en) Portable electronic device with touch screen and control method
EP2508972A2 (en) Portable electronic device and method of controlling same
US9870144B2 (en) Graph display apparatus, graph display method and storage medium
CN112162665B (en) Operation method and device
WO2015084684A2 (en) Bezel gesture techniques
JP5780438B2 (en) Electronic device, position designation method and program
KR20110063561A (en) Device for controlling an electronic apparatus by handling graphic objects on a multi-contact touch screen
US20130120286A1 (en) Touch control device and method
KR20160049455A (en) Method of displaying an image by using a scroll bar and apparatus thereof
CN102929556A (en) Method and equipment for interaction control based on touch screen
US20120242576A1 (en) Information processing apparatus, information processing method, and program
WO2015102974A1 (en) Hangle-based hover input method
TWI482064B (en) Portable device and operating method thereof
US20120023426A1 (en) Apparatuses and Methods for Position Adjustment of Widget Presentations
CN101482799A (en) Method for controlling electronic equipment through touching type screen and electronic equipment thereof
CN113476822B (en) Touch method and device
CN112783408A (en) Gesture navigation method and device of electronic equipment, equipment and readable storage medium
WO2011079438A1 (en) An apparatus, method, computer program and user interface
CN112764647A (en) Display method, display device, electronic equipment and readable storage medium
WO2022228097A1 (en) Display method, display apparatus and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant