CN113663333A - Game control method and device, electronic equipment and storage medium


Info

Publication number
CN113663333A
CN113663333A (application CN202110976432.6A)
Authority
CN
China
Prior art keywords
virtual character
current position
game
game scene
touch
Prior art date
Legal status
Pending
Application number
CN202110976432.6A
Other languages
Chinese (zh)
Inventor
Chen Qiming (陈启明)
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110976432.6A
Publication of CN113663333A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/822: Strategy games; Role-playing games
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a game control method and device, an electronic device and a storage medium. A terminal device provides a graphical user interface that includes a movement control; the content displayed by the graphical user interface at least partially includes a game scene containing a virtual character. The method includes: in response to a touch sliding operation on the movement control, acquiring a first relative positional relationship between the current position of the touch point on the graphical user interface and the origin position of the movement control, and controlling the virtual character to move in the game scene in the movement direction determined by that relationship; in response to the movement of the virtual character, if an obstacle exists within a preset search range centered on the current position of the virtual character, determining a target position on the graphical user interface based on the current position of the touch point; and performing haptic feedback at the target position. The method improves game operation efficiency.

Description

Game control method and device, electronic equipment and storage medium
Technical Field
The present application relates to game technologies, and in particular, to a method and an apparatus for controlling a game, an electronic device, and a storage medium.
Background
Pathfinding is a function in online games: through pathfinding, a virtual character can be controlled to move from its current position to a destination so as to complete tasks in the game.
Pathfinding includes manual pathfinding and automatic pathfinding. In manual pathfinding, the player decides the movement path of the virtual character according to their own judgment and manually controls the character to move along that path. In automatic pathfinding, the computer automatically plans a path that can reach the end point according to a pathfinding algorithm and the start and end points selected by the player, and automatically controls the character to move along that path.
During pathfinding, if the character runs into an impassable situation, the player needs to adjust its movement path manually, which is inconvenient and makes operation inefficient.
Disclosure of Invention
The application provides a game control method, a game control device, an electronic device and a storage medium, to solve the inconvenience and low operation efficiency that arise when a player must manually adjust the movement path of a virtual character that becomes blocked while pathfinding in a game.
In a first aspect, the present application provides a game control method in which a terminal device provides a graphical user interface that includes a movement control, the content displayed by the graphical user interface at least partially includes a game scene containing a virtual character, and the method includes: in response to a touch sliding operation on the movement control, acquiring a first relative positional relationship between the current position of the touch point on the graphical user interface and the origin position of the movement control, and controlling the virtual character to move in the game scene in the movement direction determined by the first relative positional relationship; in response to the movement of the virtual character in the game scene, determining whether an obstacle exists within a preset search range centered on the current position of the virtual character; if an obstacle exists within that search range, determining a target position on the graphical user interface based on the current position of the touch point; and performing haptic feedback at the target position.
In a second aspect, the present application provides a game control apparatus that provides a graphical user interface through a terminal device, the graphical user interface including a movement control, the displayed content at least partially including a game scene containing a virtual character. The apparatus includes: a control module, configured to respond to a touch sliding operation on the movement control by acquiring a first relative positional relationship between the current position of the touch point on the graphical user interface and the origin position of the movement control, and to control the virtual character to move in the game scene in the movement direction determined by the first relative positional relationship; a first determining module, configured to respond to the movement of the virtual character in the game scene by determining whether an obstacle exists within a preset search range centered on the current position of the virtual character; a second determining module, configured to determine a target position on the graphical user interface based on the current position of the touch point if an obstacle exists within that search range; and a haptic feedback module, configured to perform haptic feedback at the target position.
In a third aspect, the present application provides an electronic device, comprising: a memory, a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to implement the method of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon computer-executable instructions for implementing the method according to the first aspect when executed by a processor.
In a fifth aspect, the present application provides a computer program product comprising a computer program which, when executed by a processor, implements the method of the first aspect.
With the game control method and device, electronic device and storage medium provided by the application, in response to a touch sliding operation on the movement control, a first relative positional relationship between the current position of the touch point on the graphical user interface and the origin position of the movement control is acquired, and the virtual character is controlled to move in the game scene in the movement direction determined by that relationship. In response to the movement of the virtual character in the game scene, it is determined whether an obstacle exists within a preset search range centered on the current position of the virtual character; if so, a target position is determined on the graphical user interface based on the current position of the touch point and haptic feedback is performed at the target position. During the movement of the virtual character, its surroundings are checked for obstacles in real time; when an obstacle is detected, haptic feedback is delivered based on the touch point. Prompted in this tactile way, the user learns about the road conditions ahead in advance and can adjust the movement direction of the virtual character in time, which improves operation efficiency.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic diagram of a graphical user interface provided by an embodiment of the present application;
FIG. 2 is a first flowchart of a game control method according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating a relationship between a direction of a target position relative to a center point of a preset control and a direction of an obstacle relative to a virtual character according to an embodiment of the present application;
fig. 4 is a schematic diagram of a preset search range according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a target location provided by an embodiment of the present application;
FIG. 6 is a schematic illustration of another target location provided by an embodiment of the present application;
FIG. 7 is a second flowchart of a game control method according to an embodiment of the present disclosure;
fig. 8 is a flowchart of a control method of a game according to an embodiment of the present application;
FIG. 9 is a first schematic view of a first predetermined length provided in an embodiment of the present application;
FIG. 10 is a second schematic view of a first predetermined length provided in accordance with an embodiment of the present application;
FIG. 11 is a first schematic view of a second predetermined length provided in an embodiment of the present application;
FIG. 12 is a second schematic view of a second predetermined length provided in the embodiments of the present application;
fig. 13 is a trend chart of the change of the friction degree of the preset control provided in the embodiment of the present application;
fig. 14 is a diagram illustrating an effect of a change in a friction degree of a preset control according to an embodiment of the present application;
fig. 15 is a schematic structural diagram of a control device of a game according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
In a game, game tasks are a means of purposefully guiding the player through gameplay and of granting the player virtual items used in the game. Game tasks may include killing monsters, collecting virtual items, and so on.
The player needs to find a target object while performing a game task. Taking tasks such as killing monsters or collecting items as examples, the player controls the virtual character through the terminal device to set out from its current position and search for the monster or item along a path in the game scene. The process of determining the path along which the virtual character moves is pathfinding, which includes manual pathfinding and automatic pathfinding. Pathfinding is described below with reference to the accompanying drawings:
fig. 1 is a schematic diagram of a graphical user interface provided in an embodiment of the present application. As shown in fig. 1, the method provided by the embodiment of the present application may be applied to a terminal device 11, where the terminal device 11 may be a smart phone, a tablet computer, or other devices.
The terminal device 11 can run a game application 12, and the graphical user interface of the terminal device 11 can display a game scene 13 and a movement control 14 of the game application 12. The game scene 13 includes a virtual character 131 and a non-player character (NPC) 132; the non-player character may be a monster or an item as described above.
The movement control 14 is a control for controlling the movement of the virtual character, for example a virtual joystick with "up", "down", "left" and "right" direction keys.
In a manual pathfinding scene, the player can control the virtual character to move up, down, left or right by operating the direction keys in order to search for the target object. The user usually determines the movement direction of the virtual character by visually observing the game scene and judging for themselves, and then moves the character through the movement control. However, dead ends are often encountered while searching, or, if the virtual character's equipment is bulky, the character gets stuck against the surrounding scenery or at a corner. The user must then operate the movement control to turn the virtual character back out of the dead end, and this part of the operation is effectively invalid. Game operation efficiency is therefore low.
In an automatic pathfinding scene, the user may select an NPC or specify a destination on the game map. The game then automatically plans a shortest path and controls the virtual character to move along it to the destination or the selected NPC.
In the automatic pathfinding scene, because the planned path is long, the surrounding scenery is dense, and the equipment on the virtual character is bulky, the character easily gets stuck against the surrounding scenery or at a corner. From the user's perspective, the virtual character in the game frame simply stops advancing at some position, although its feet may still be animating as if running. The user then has to adjust the character's movement direction manually, and because the route was found automatically, the user does not know in advance how the character reached the blocked position, so the map must be opened to inspect the travelled path. For the user the operation is cumbersome, operation efficiency is low, and the experience is poor.
In view of the above technical problems, the inventors of the present application propose the following technical idea: in the game scene, detect in advance any obstacles that may lie in the movement direction of the virtual character, and when an obstacle is detected, remind the user to adjust the movement direction so as to reduce invalid operations. Most existing in-game reminders are audio or visual, and both can interrupt gameplay, degrading the user experience. The application therefore proposes a reminder delivered through tactile feedback, prompting the user through the sense of touch.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 2 is a first flowchart of a game control method according to an embodiment of the present application. As shown in fig. 2, the method for controlling the game includes the following steps:
s201, responding to the touch sliding operation aiming at the mobile control, acquiring a first relative position relation between the current position of the touch point of the touch sliding operation on the graphical user interface and the original point position of the mobile control, and controlling the virtual character to move towards the moving direction determined by the first relative position relation in the game scene.
The execution subject of the method of the present embodiment may be a terminal device as shown in fig. 1.
In an alternative embodiment, the touch point acting on the graphical user interface may be produced by the contact of a finger, a stylus or any other touch medium with the screen of the terminal device presenting the graphical user interface. The current position of the touch point is controlled by the player's touch operation and may fall at the upper left, the upper right, or any other position of the graphical user interface.
In this embodiment, the origin position of the mobile control may be a center point position of the mobile control, and when a touch sliding operation is detected on the graphical user interface, a current position of a touch point of the touch sliding operation is obtained; determining a first relative position relation according to the current position of the touch point of the touch sliding operation and the position of the central point of the mobile control; determining the moving direction of the virtual character according to the first relative position relation; and controlling the virtual character to move towards the determined moving direction in the game scene according to the moving direction of the virtual character.
The center point of the mobile control is used as an origin, and the touch point of the touch sliding operation can be any touch point within a 360-degree range with the origin as a circle center. The direction of the current position of the touch point of the touch slide operation relative to the position of the center point of the movement control may be understood as a first relative positional relationship. In some optional embodiments, the first relative positional relationship may further include a distance between a current position of the touch point of the touch sliding operation and a position of the center point of the movement control.
Determining the movement direction of the virtual character according to the first relative positional relationship specifically includes: determining the movement direction according to the first relative positional relationship and a preset first correspondence, where the first correspondence represents the mapping between the sliding direction of the touch sliding operation (i.e. the relative positional relationship between the current position of the touch point and the origin position of the movement control) and the movement direction of the virtual character. The first correspondence may take the two different forms below:
in some optional embodiments, the first correspondence comprises: the direction of the current position of the touch point of the touch sliding operation relative to the original point position of the mobile control is the same as the moving direction of the virtual character.
For example, if the current position of the touch point of the touch sliding operation is located in the upper left direction of the origin position of the mobile control, the virtual character is controlled to move towards the upper left direction of the current position of the virtual character.
And if the current position of the touch point of the touch sliding operation is positioned in the upper right direction of the original position of the mobile control, controlling the virtual character to move towards the upper right direction of the current position of the virtual character.
And if the current position of the touch point of the touch sliding operation is positioned in the lower left direction of the original position of the mobile control, controlling the virtual character to move towards the lower left direction of the current position of the virtual character.
And if the current position of the touch point of the touch sliding operation is positioned in the lower right direction of the original position of the mobile control, controlling the virtual character to move towards the lower right direction of the current position of the virtual character.
For the reader's understanding, this step will be described below with reference to the accompanying drawings:
fig. 3 is a schematic diagram of a mapping relationship provided in an embodiment of the present application. As shown in fig. 3, in the game scene, a coordinate axis X1O1Y1 is established with the current position of the virtual character as the origin. In addition, in the graphical user interface, the coordinate axis X2O2Y2 is established with the center point of the movement control as the origin. In the coordinate axis, four regions formed by the horizontal axis and the vertical axis are divided into four quadrants, taking the coordinate axis X2O2Y2 as an example, the upper right quadrant is a first quadrant (i.e., a region formed by the positive half axis of X2 and the positive half axis of Y2), the upper left quadrant is a second quadrant (i.e., a region formed by the negative half axis of X2 and the positive half axis of Y2), the lower left quadrant is a third quadrant (i.e., a region formed by the negative half axis of X2 and the negative half axis of Y2), and the lower right quadrant is a fourth quadrant (i.e., a region formed by the positive half axis of X2 and the negative half axis of Y2).
Suppose the current position of the touch point of the touch sliding operation lies in the second quadrant of coordinate axis X2O2Y2, at an included angle of 45 degrees to the negative half axis of X2. The movement direction of the virtual character then lies in the second quadrant of coordinate axis X1O1Y1, at 45 degrees to the negative half axis of X1.
It should be noted that the coordinate axes are not visible in the game, and are not intended to limit the present application, and are exemplary descriptions for the convenience of the reader.
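As a minimal sketch (not from the patent; the function name is hypothetical), the same-direction mapping of step S201 can be expressed by normalizing the touch point's offset from the joystick origin into a unit movement vector. The sketch assumes math-style axes as in fig. 3 (positive Y upward); a real screen coordinate system may invert the Y axis.

```python
import math

def movement_direction(touch_pos, origin_pos):
    """Map the touch point's offset from the joystick origin to a unit
    movement vector for the virtual character (same-direction mapping)."""
    dx = touch_pos[0] - origin_pos[0]
    dy = touch_pos[1] - origin_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)  # touch point exactly at the origin: no movement
    return (dx / dist, dy / dist)

# Touch point 45 degrees into the upper-left (second) quadrant of the joystick:
dx, dy = movement_direction((-10.0, 10.0), (0.0, 0.0))
```

For the 45-degree example above, the character moves toward the upper left at the same 45-degree angle, mirroring the quadrant relationship described for fig. 3.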
In other alternative embodiments, the first correspondence includes: the direction of the current position of the touch point of the touch sliding operation relative to the original point position of the mobile control is different from the moving direction of the virtual character.
For example, in a parachute jumping game, if the current position of the touch point of the touch sliding operation is located right above the origin position of the mobile control, the virtual character is controlled to move towards the direction right below the current position of the virtual character; and if the current position of the touch point of the touch sliding operation is located right below the original position of the mobile control, controlling the virtual character to move towards the right above the current position of the virtual character.
S202, responding to the movement of the virtual character in the game scene, and determining whether an obstacle exists in a preset search range which takes the current position of the virtual character as a central point in the game scene.
In the step, whether an obstacle exists in the surrounding environment of the virtual character in the moving direction is determined in real time according to the current position of the virtual character in the moving process of the virtual character.
Specifically, in this step, in response to the movement of the virtual character in the game scene, it is determined whether an obstacle exists in a preset search range in the current movement direction in the game scene with the current position of the virtual character as a center point.
In an alternative embodiment, the preset search range may be a circular area with the current position of the virtual character as a center point in the game scene, that is, the periphery of the virtual character.
In another alternative embodiment, the preset search range may be a sector area centered on the current position of the virtual character, oriented toward the current movement direction, with a preset length as its radius and a central angle of 180 degrees.
Fig. 4 is a schematic diagram of a preset search range according to an embodiment of the present application. As shown in fig. 4, point a2 represents the current position of the virtual character, the pointing direction of the arrow represents the current moving direction of the virtual character, the preset length is R2, and a sector area with a center point of a2 and a radius of R2 and a central angle of 180 degrees is the preset search range.
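A hedged sketch of the sector test in step S202 (function and parameter names are illustrative, not from the patent): a point lies inside the 180-degree sector of fig. 4 if it is within radius R2 of the character and at most 90 degrees away from the movement direction, which reduces to a distance check plus a dot-product sign check.

```python
import math

def in_sector(char_pos, move_dir, obstacle_pos, radius):
    """Return True if obstacle_pos lies in a 180-degree sector of the given
    radius, centered on char_pos and opening toward move_dir."""
    dx = obstacle_pos[0] - char_pos[0]
    dy = obstacle_pos[1] - char_pos[1]
    dist = math.hypot(dx, dy)
    if dist > radius:
        return False
    if dist == 0:
        return True  # obstacle at the character's own position
    # Inside the 180-degree sector iff the angle to the movement direction
    # is at most 90 degrees, i.e. the dot product is non-negative.
    dot = dx * move_dir[0] + dy * move_dir[1]
    return dot >= 0

# Character at the origin moving along +X, search radius 5:
ahead = in_sector((0, 0), (1, 0), (3, 1), 5)    # in front, in range
behind = in_sector((0, 0), (1, 0), (-3, 1), 5)  # behind the character
far = in_sector((0, 0), (1, 0), (10, 0), 5)     # in front, out of range
```

For the circular search range of the other embodiment, the dot-product check is simply dropped and only the distance test remains.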
S203, if an obstacle exists in the preset search range with the current position of the virtual character as the center point, determining a target position on the graphical user interface based on the current position of the touch point.
The determination of the target position at least comprises the following two implementation modes:
in some optional embodiments, if an obstacle exists in a preset search range with the current position of the virtual character as a center point, the current position of the touch point is determined as the target position.
In this embodiment, when the virtual joystick is long-pressed on the graphical user interface to control the virtual character to move, a contact area is formed between the finger and the graphical user interface, and the contact area is the target position.
Fig. 5 is a schematic diagram of a target location according to an embodiment of the present disclosure. As shown in fig. 5, if a long-press operation is performed on the graphical user interface by a finger and a contact area (an area within a dotted circle) 51 is formed, the contact area 51 is set as a target position in the present embodiment.
In other optional embodiments, if an obstacle exists in the preset search range with the current position of the virtual character as the center point, the preset area range with the current position of the touch point as the center is determined as the target position.
In this embodiment, when the virtual joystick is slid on the graphical user interface to control the virtual character to move, the current position of the touch point and the preset area range in the preset direction relative to the current position of the touch point may be determined as the target position.
Fig. 6 is a schematic diagram of another target location provided in the embodiments of the present application. As shown in fig. 6, assuming that the sliding of the finger in the upper left direction in the game controls the virtual character to move in the upper left direction, when it is detected that the obstacle is located in the upper left direction of the virtual character, the target position 61 may be determined as a preset area range (i.e., an area indicated by several open circles in the figure) between the current touch point position and the upper left direction of the current touch point position.
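The two target-position embodiments of step S203 can be sketched as follows (a simplification under stated assumptions: the target region of fig. 6 is reduced to its center point, and the fixed offset of 30 pixels is a made-up value, not from the patent):

```python
def target_position(touch_pos, obstacle_dir=None, offset=30):
    """Determine the haptic-feedback target position on the GUI (S203).
    With no obstacle direction, the touch point itself is the target
    (first embodiment); otherwise the target extends from the touch point
    a fixed offset toward the obstacle's on-screen direction (second
    embodiment, reduced here to the region's center point)."""
    if obstacle_dir is None:
        return touch_pos
    return (touch_pos[0] + obstacle_dir[0] * offset,
            touch_pos[1] + obstacle_dir[1] * offset)

# Obstacle toward the upper left of the character (unit direction (-1, 1)):
p = target_position((100, 200), obstacle_dir=(-1, 1))
```

In the second embodiment this places the feedback on the side of the touch point facing the obstacle, so the direction of the sensation itself hints at where the obstacle is.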
And S204, performing tactile feedback at the target position.
The terminal device of this embodiment provides tactile feedback based on a conductive layer. The haptic feedback includes friction haptic feedback and/or vibration haptic feedback: friction haptic feedback lets the user's finger feel real friction on the touch screen of the terminal device, while vibration haptic feedback lets the user feel a vibration in the hand.
In this embodiment, the user may be prompted by a single friction touch sense or a vibration touch sense, or may be prompted by a combination of a friction touch sense and a vibration touch sense.
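The single-or-combined prompting of step S204 can be sketched as a small dispatcher. Everything here is hypothetical: the `driver` object and its `play` method stand in for a device-specific electrostatic/vibration haptics API that the patent does not name.

```python
def perform_haptic_feedback(target_pos, mode="both", driver=None):
    """Trigger haptic feedback at target_pos (S204). 'mode' selects
    friction haptics, vibration haptics, or both combined; 'driver' is a
    hypothetical placeholder for a device-specific haptics backend."""
    effects = []
    if mode in ("friction", "both"):
        effects.append(("friction", target_pos))
    if mode in ("vibration", "both"):
        effects.append(("vibration", target_pos))
    if driver is not None:
        for kind, pos in effects:
            driver.play(kind, pos)  # hypothetical backend call
    return effects

fx = perform_haptic_feedback((70, 230), mode="both")
```

Returning the effect list keeps the dispatch logic testable independently of any real haptics hardware.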
In this embodiment, in response to a touch sliding operation on the movement control, a first relative positional relationship between the current position of the touch point on the graphical user interface and the origin position of the movement control is acquired. While the virtual character moves in the game scene in the movement direction determined by that relationship, it is determined in response to the movement whether an obstacle exists within a preset search range centered on the current position of the virtual character. If so, a target position is determined on the graphical user interface based on the current position of the touch point, and haptic feedback is performed at the target position. During the movement of the virtual character, its surroundings are checked for obstacles in real time; when an obstacle is detected, haptic feedback is delivered based on the touch point. Prompted in this tactile way, the user learns about the road conditions ahead in advance and can adjust the movement direction of the virtual character in time, which improves operation efficiency.
Fig. 7 is a second flowchart of a game control method according to an embodiment of the present application. As shown in fig. 7, step S303 specifically includes:
S701, if an obstacle exists in a preset search range with the current position of the virtual character as a central point, determining a second relative position relationship between the position of the obstacle and the current position of the virtual character;
S702, determining the target position according to the current position of the touch point and the second relative position relationship.
Specifically, determining the target position according to the current position of the touch point and the second relative position relationship includes:
A1, converting the second relative position relationship into a third relative position relationship in a target coordinate system, wherein the target coordinate system is the graphical user interface coordinate system in which the current position of the touch point is located.
The second relative position relationship may include at least the following two implementations:
In a first optional implementation, the second relative position relationship includes a first direction, which is the direction of the obstacle relative to the current position of the virtual character; the third relative position relationship includes a second direction, which is the sliding direction of the touch sliding operation. Step A1 then specifically includes: determining the second direction according to the first direction and a preset second correspondence relationship, where the second correspondence relationship represents the mapping between the sliding direction of the touch sliding operation and the direction of the obstacle's position relative to the virtual character's current position. When the touch sliding operation slides in the second direction, the virtual character can be controlled to move toward the obstacle.
The second correspondence relationship may include the following two different implementations:
In some optional embodiments, the second correspondence relationship is that the first direction is the same as the second direction.
For example, if the obstacle is located in the upper-left direction of the virtual character's current position, the touch sliding operation is in the upper-left direction of the current touch position; if the obstacle is in the upper-right direction, the touch sliding operation is in the upper-right direction of the current touch position; if the obstacle is in the lower-left direction, the touch sliding operation is in the lower-left direction of the current touch position; and if the obstacle is in the lower-right direction, the touch sliding operation is in the lower-right direction of the current touch position.
In other optional embodiments, the second correspondence relationship is that the first direction differs from the second direction, e.g., the first direction is opposite to the second direction.
Still taking a parachute-jumping game as an example, if the sliding direction of the touch sliding operation is directly above the current touch position, the obstacle is directly below the virtual character's current position; if the sliding direction is directly below the current touch position, the obstacle is directly above the virtual character's current position.
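The two variants of the second correspondence relationship can be sketched as a small mapping function, assuming directions are represented as 2D unit vectors in GUI coordinates. Both the function and parameter names below are illustrative assumptions.

```python
def second_direction(first_direction, correspondence="same"):
    """Map the first direction (obstacle relative to the character) to the
    second direction (sliding direction on screen). The description allows
    the mapping to preserve the direction or, for example, reverse it."""
    ux, uy = first_direction
    if correspondence == "same":
        return (ux, uy)
    if correspondence == "opposite":
        return (-ux, -uy)
    raise ValueError("unknown correspondence")
```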
In a second optional implementation, the second relative position relationship may further include a first distance, which is the distance between the obstacle and the current position of the virtual character, i.e., the straight-line distance between the obstacle's position and the virtual character's current position; the third relative position relationship may further include a second distance, which is the sliding distance of the touch sliding operation. Step A1 then further includes: determining the second distance according to the first distance and a preset third correspondence relationship, where the third correspondence relationship represents the ratio of the sliding distance of the touch sliding operation to the distance between the obstacle's position and the virtual character's current position. When the touch sliding operation slides the second distance in the second direction, the virtual character collides with the obstacle.
It should be noted that, since the screen area available for the touch sliding operation in the game is small, the ratio in the third correspondence relationship may be set to less than 1.
It should be noted that the second relative position relationship may include only the first direction, only the first distance, or both the first direction and the first distance.
A2, determining an offset centered on the current position of the touch point according to the third relative position relationship in the target coordinate system.
A3, determining the target position according to the current position of the touch point and the offset.
On the basis of the implementations of step A1, the offset in step A2 may likewise include the following two different implementations:
In an optional implementation, the offset includes an offset direction, and the second direction may be determined as the offset direction; that is, the offset centered on the current position of the touch point in step A2 has the same direction as the second direction.
For example, if the second direction is the upper-left direction of the current touch position, the target position includes the current position of the touch point and an area extending a preset distance in the upper-left direction from it. That area may be a sector centered on the current position of the touch point with the preset distance as its radius; the sector may take the straight line along the second direction as its center line, and its central angle may be 180 degrees.
In another optional implementation, the offset further includes an offset distance, and the second distance may be determined as the offset distance.
Building on the example of the first optional implementation, it can be understood that the preset distance in that example is the second distance.
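Steps A1 through A3 can be sketched end to end: convert the second relative position relationship into the GUI coordinate system, then offset the touch point by it. This is a sketch under stated assumptions — `ratio` stands in for the third correspondence relationship (screen distance over scene distance, below 1 because the touch area is small), and all names are illustrative.

```python
def target_position(touch_pos, first_direction, first_distance,
                    ratio=0.2, correspondence="same"):
    """Compute the target position from the current touch point and the
    obstacle's relative direction/distance (steps A1-A3)."""
    # A1: second direction and second distance in GUI coordinates.
    if correspondence == "same":
        ux, uy = first_direction
    else:                                   # e.g. the opposite mapping
        ux, uy = -first_direction[0], -first_direction[1]
    second_distance = first_distance * ratio
    # A2: offset centered on the current touch point.
    offset = (ux * second_distance, uy * second_distance)
    # A3: target position = current touch point + offset.
    return (touch_pos[0] + offset[0], touch_pos[1] + offset[1])
```

For instance, an obstacle 50 scene units to the right of the character, with a 0.2 ratio, places the target 10 screen units to the right of the touch point.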
Fig. 8 is a flowchart of a game control method according to an embodiment of the present application. As shown in fig. 8, the game control method includes the steps of:
S801, emitting a plurality of rays into the preset search range with the current position of the virtual character as the starting point, the length of each of the plurality of rays being equal to a preset length.
As described in the above embodiment, if the preset search range is a circular area centered on the virtual character's current position, step S801 emits a plurality of rays in all 360 degrees around the virtual character in the game scene, with the virtual character's current position as the starting point.
If the preset search range is a sector area in the game scene centered on the virtual character's current position with the preset length as its radius, step S801 emits a plurality of rays in the virtual character's current moving direction in the game scene.
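The two ray-emission modes of step S801 can be sketched as follows. This is an illustrative 2D sketch (a real engine would use its physics raycast API); `sector_center` and the other names are assumptions.

```python
import math

def cast_rays(character_pos, preset_length, num_rays=36,
              sector_center=None, sector_angle=math.pi):
    """Emit rays of preset length from the character's current position.
    With no sector given the rays cover 360 degrees; otherwise they cover a
    sector (e.g. 180 degrees) around the current moving direction
    (`sector_center`, an angle in radians)."""
    if sector_center is None:
        start, span = 0.0, 2.0 * math.pi
    else:
        start, span = sector_center - sector_angle / 2.0, sector_angle
    rays = []
    for i in range(num_rays):
        angle = start + span * i / num_rays
        end = (character_pos[0] + preset_length * math.cos(angle),
               character_pos[1] + preset_length * math.sin(angle))
        rays.append((character_pos, end))
    return rays
```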
The preset length includes a first preset length and/or a second preset length; the first preset length is determined according to the height of the virtual character, and the second preset length is determined according to the width of the virtual character's equipment.
The first preset length is used to determine the minimum height through which the virtual character can pass on the path ahead. For example, if the virtual character wears no equipment and an obstacle such as a branch or a cave roof exists on the path ahead, the distance between the obstacle and the virtual character's feet should be greater than the virtual character's height, so that passage in the height direction is ensured. If the virtual character wears equipment, the distance between such obstacles and the feet of the equipped virtual character should be greater than the height of the equipped virtual character.
Fig. 9 is a first schematic diagram of the first preset length provided in the embodiment of the present application. As shown in fig. 9, with the ground on which the virtual character 61 currently stands as the reference, the height of the virtual character 61 relative to the ground 62 is h1 and the minimum height of the surrounding scenery 63 relative to the ground 62 is h2; when h2 > h1, the surrounding scenery 63 does not hinder the movement of the virtual character.
Fig. 10 is a second schematic diagram of the first preset length provided in the embodiment of the present application. As shown in fig. 10, with the ground on which the virtual character 61 currently stands as the reference, and with the virtual character 61 wearing a helmet 64, the height of the helmeted virtual character relative to the ground 62 is h3 and the minimum height of the surrounding scenery 63 relative to the ground 62 is h2. When h2 ≤ h3, the surrounding scenery may hinder the movement of the virtual character; when h2 > h3, it does not.
Similarly, the second preset length is used to determine the minimum width through which the virtual character can pass on the path ahead. For example, if an obstacle such as a canyon exists on the virtual character's path ahead, the width of the canyon should be greater than half the sum of the virtual character's width and the equipment's width, so that passage in the width direction is ensured.
Fig. 11 is a first schematic diagram of the second preset length provided in the embodiment of the present application. As shown in fig. 11, with the body width of the virtual character 71 being w1 and the shortest distance between the near edge of the surrounding scenery 72 and the center of the virtual character's body being w2, the surrounding scenery does not hinder the movement of the virtual character when w2 > w1/2.
Fig. 12 is a second schematic diagram of the second preset length provided in the embodiment of the present application. As shown in fig. 12, with the body width of the virtual character 71 being w1, the width of the virtual character's armor being w3, and the shortest distance between the near edge of the surrounding scenery 73 and the center of the virtual character's body being w2, the surrounding scenery does not hinder the movement of the virtual character when w2 > w3/2.
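The clearance tests of figs. 9 through 12 reduce to two comparisons: overhead clearance h2 must exceed the character's effective height (h1 bare, h3 with a helmet), and lateral clearance w2 must exceed half the effective width (w1/2 bare, w3/2 with armor). A minimal sketch, with all parameter names as illustrative assumptions:

```python
def can_pass(character_height, character_width,
             clearance_height, clearance_width,
             equipment_extra_height=0.0, equipment_width=None):
    """Return True if both the height and width clearances described in
    figs. 9-12 are satisfied."""
    effective_height = character_height + equipment_extra_height   # h1 or h3
    effective_width = equipment_width if equipment_width is not None else character_width
    height_ok = clearance_height > effective_height                # h2 > h1 (or h3)
    width_ok = clearance_width > effective_width / 2               # w2 > w1/2 (or w3/2)
    return height_ok and width_ok
```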
S802, if reflected rays of at least some of the plurality of rays are received, determining that an obstacle exists in the preset search range with the current position of the virtual character as a central point in the game scene.
For example, if an object exists within the preset search range in the game scene, the object reflects the emitted rays, forming reflected rays. The length of each emitted ray is the preset length, which indicates the minimum distance at which the virtual character will not collide with surrounding objects while moving. If reflected rays of at least some of the plurality of rays are received, it can be determined that an obstacle exists in the preset search range with the current position of the virtual character as a central point in the game scene.
On the basis of the above embodiment, if the terminal device supports friction haptic feedback, performing haptic feedback at the target position includes: performing friction haptic feedback at the target position with a first degree of friction; or performing friction haptic feedback at the target position according to a preset friction-degree change rule, where the degree of friction at the target position is inversely related to the distance between the virtual character and the obstacle.
Performing friction haptic feedback at the target position with a first degree of friction is understood to mean feedback with a fixed degree of friction: the frictional resistance at the target position is static friction, i.e., a resistance to the finger's tendency to slide, and the roughness felt by the user's finger is a constant degree of friction that does not change over time.
Performing friction haptic feedback at the target position according to a preset friction-degree change rule means performing feedback with a degree of friction that changes over time: the frictional resistance at the target position is dynamic, i.e., the roughness felt by the user's finger changes with time. That is, the user controls the virtual character to move through the touch sliding operation; as the user's finger slides on the graphical user interface in the offset direction of the offset, the closer the virtual character is to the obstacle, the greater the degree of friction at the target position and the stronger the roughness felt by the user's finger; the farther the virtual character is from the obstacle, the smaller the degree of friction at the target position and the weaker the roughness felt by the user's finger. The friction-degree change rule includes a linear change, a nonlinear change, or an abrupt change.
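The three change rules can be sketched as a function mapping the character-obstacle distance to a degree of friction. All names and constants below are illustrative assumptions, not values from the description.

```python
import math

def friction_degree(distance, max_distance, f_max=1.0, law="linear"):
    """Degree of friction at the target position: inversely related to the
    character-obstacle distance, varying linearly, nonlinearly
    (exponentially), or abruptly (step jump)."""
    d = max(0.0, min(distance, max_distance))
    closeness = 1.0 - d / max_distance   # 1 when touching the obstacle, 0 at range edge
    if law == "linear":
        return f_max * closeness
    if law == "nonlinear":               # exponential ramp-up near the obstacle
        return f_max * math.expm1(closeness) / math.expm1(1.0)
    if law == "abrupt":                  # step jump once the obstacle is close
        return f_max if closeness > 0.5 else 0.0
    raise ValueError("unknown law")
```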
Fig. 13 is a trend graph of a change in friction degree of a movement control provided in the embodiment of the present application.
Fig. 14 is a diagram illustrating an effect of a change in a degree of friction of a movement control according to an embodiment of the present application.
As shown in fig. 13 and fig. 14, the diagram to the left of the arrow in fig. 14 shows the friction effect of the preset control at time t1: the degree of friction is f1, and f1 = 0, i.e., the preset control is in a smooth state at time t1. To the right of the arrow in fig. 14 is the friction effect diagram of the preset control at time t2: the degree of friction is f2, and f2 > 0, i.e., the preset control is in a rough state at time t2. The roughness can be felt when the user touches the control with a finger.
On the basis of the above embodiments, the haptic feedback is realized by a haptic feedback assembly provided inside the terminal device. The haptic feedback assembly includes a plurality of haptic feedback units, each of which has its own degree of friction.
The plurality of haptic feedback units are shown as a granular pattern in fig. 14, and the change rule of the degree of friction includes a change rule of the number of haptic feedback units and/or a change rule of the degree of friction of each of the plurality of haptic feedback units.
The change rule of the degree of friction of each haptic feedback unit may be linear, nonlinear, or abrupt.
The change rule of the number of haptic feedback units includes gradual decrease and gradual increase. Referring again to fig. 13, there are 50 haptic feedback units at time t3, decreasing to 30 haptic feedback units at time t4.
On the basis of the above embodiment, a pattern may also be displayed according to the target position, the pattern indicating the position at which the haptic feedback is performed.
When the target position is the current position of the touch point, the pattern may be displayed in a circular area centered on the current position of the touch point with a preset distance as its radius, or in an annular area centered on the current position of the touch point with the preset distance as its radius, where the area enclosed by the inner ring of the annulus should include the contact area corresponding to the touch point. For example, on the basis of fig. 5, an annular area whose inner ring is the circle corresponding to the contact area 51 is determined outside the contact area 51, and the pattern is displayed in that annular area.
When the target position is determined according to the current position of the touch point and the offset, the pattern may be displayed in the area indicated by the offset. For example, on the basis of fig. 6, the pattern is displayed in the upper-left direction of the current touch point position.
Optionally, the pattern may be annular, circular, striped, or the like. The friction-degree change rule includes a linear change, a nonlinear change, or an abrupt change, where the nonlinear change may be an exponential change and the abrupt change may be a step jump.
In an automatic way-finding scenario, the user is usually idle after setting automatic way-finding, so friction haptic feedback at the target position cannot remind the user in time. Therefore, after the haptic feedback at the target position, the method of this embodiment may further include: if an obstacle still exists in the preset search range with the current position of the virtual character as a central point in the game scene after a preset time has elapsed, re-planning a path for the virtual character according to a way-finding algorithm, and controlling the virtual character to move along the re-planned path. That is, if friction haptic feedback has been performed at the target position for longer than the preset time and an obstacle still exists in that preset search range, a path is re-planned for the virtual character according to the way-finding algorithm, and the virtual character is controlled to move along it. In this way, the virtual character automatically returns to the normal route without the user having to open the map.
Optionally, the way-finding algorithm includes the breadth-first search algorithm, the greedy algorithm, the Dijkstra algorithm, the A* (A-star) search algorithm, and the B* (B-star) search algorithm.
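As a minimal sketch of the re-planning step, the simplest of the listed options, breadth-first search, can be written over a grid of walkable cells. The grid representation and all names are illustrative assumptions; a production game would more likely use A* over its navigation mesh.

```python
from collections import deque

def replan_path(grid, start, goal):
    """Breadth-first search over a set of walkable (x, y) cells.
    Returns the list of cells from start to goal, or None if unreachable."""
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = []
            while cur is not None:          # walk parents back to the start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in grid and nxt not in came_from:
                came_from[nxt] = cur
                frontier.append(nxt)
    return None
```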
On the basis of the above method embodiments, fig. 15 is a schematic structural diagram of a game control apparatus provided in the embodiment of the present application. As shown in fig. 15, the game control apparatus, in which a graphical user interface is provided through a terminal device, the graphical user interface includes a mobile control, the content displayed by the graphical user interface at least partially includes a game scene of the game, and the game scene includes a virtual character, includes: a control module 150, a first determination module 151, a second determination module 152, and a haptic feedback module 153.
Optionally, when an obstacle exists in a preset search range with the current position of the virtual character as a central point, the second determining module 152 determining a target position on the graphical user interface based on the current position of the touch point specifically includes: determining the current position of the touch point as the target position.
Optionally, when an obstacle exists in a preset search range with the current position of the virtual character as a central point, the second determining module 152 determining a target position on the graphical user interface based on the current position of the touch point specifically includes: determining a second relative position relationship between the position of the obstacle and the current position of the virtual character; and determining the target position according to the current position of the touch point and the second relative position relationship.
Optionally, the second determining module 152 determining the target position according to the current position of the touch point and the second relative position relationship specifically includes: converting the second relative position relationship into a third relative position relationship in a target coordinate system, the target coordinate system being the graphical user interface coordinate system in which the current position of the touch point is located; determining an offset centered on the current position of the touch point according to the third relative position relationship in the target coordinate system; and determining the target position according to the current position of the touch point and the offset.
Optionally, the first determining module 151 determining whether an obstacle exists in a preset search range with the current position of the virtual character as a central point in the game scene specifically includes: determining whether an obstacle exists in a preset search range in the current moving direction, with the current position of the virtual character as a central point in the game scene.
Optionally, the first determining module 151 determining whether an obstacle exists in a preset search range with the current position of the virtual character as a central point in the game scene specifically includes: emitting a plurality of rays into the preset search range with the current position of the virtual character as a starting point, the length of each of the plurality of rays being equal to a preset length; and if reflected rays of at least some of the plurality of rays are received, determining that an obstacle exists in the preset search range with the current position of the virtual character as a central point in the game scene.
Optionally, the preset length includes a first preset length and/or a second preset length; the first preset length is determined according to the height of the virtual character; the second preset length is determined according to the width of equipment of the virtual character.
Optionally, the preset search range is a sector area that takes the current position of the virtual character as a central point in a game scene and takes the preset length as a radius, and a central angle of the sector area is 180 degrees.
Optionally, if the terminal device supports friction haptic feedback, the haptic feedback module 153 performing haptic feedback at the target position specifically includes: performing friction haptic feedback at the target position with a first degree of friction; or performing friction haptic feedback at the target position according to a preset friction-degree change rule, where the degree of friction at the target position is inversely related to the distance between the virtual character and the obstacle.
Optionally, the apparatus further comprises: a display module 154 for displaying a pattern according to the target position, the pattern being used for prompting a position for tactile feedback.
Optionally, the friction degree variation law includes a linear variation, a non-linear variation or an abrupt variation.
Optionally, the apparatus further comprises: a path planning module 155, configured to re-plan a path for the virtual character according to a way-finding algorithm if an obstacle still exists in the preset search range with the current position of the virtual character as a central point in the game scene after a preset time has elapsed; and a control module 156, configured to control the virtual character to move along the re-planned path.
The game control apparatus provided in the embodiment of the present application can be used to execute the technical solution of the game control method in the above embodiments; its implementation principle and technical effects are similar and are not repeated here.
It should be noted that the division of the above apparatus into modules is only a logical division; in an actual implementation, the modules may be wholly or partially integrated into one physical entity, or may be physically separate. These modules may all be implemented as software invoked by a processing element, all be implemented as hardware, or partly as software invoked by a processing element and partly as hardware. For example, the second determining module 152 may be a separately established processing element, may be integrated into a chip of the apparatus, or may be stored in the memory of the apparatus in the form of program code that a processing element of the apparatus calls to execute the functions of the second determining module 152. The other modules are implemented similarly. In addition, all or some of these modules may be integrated together or implemented independently. The processing element here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
Fig. 16 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 16, the electronic device may include: transceiver 161, processor 162, memory 163.
The processor 162 executes computer-executable instructions stored in the memory, causing the processor 162 to perform the technical solutions of the embodiments described above. The processor 162 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The memory 163 is connected to the processor 162 via a system bus, through which they communicate with each other; the memory 163 stores computer program instructions.
The transceiver 161 may be used to receive operating instructions.
The system bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus. The transceiver is used to enable communication between the electronic device and other devices. The memory may include random access memory (RAM) and may also include non-volatile memory.
The electronic device provided by the embodiment of the application may be the terminal device of the above embodiment.
The embodiment of the application also provides a chip for running the instructions, and the chip is used for executing the technical scheme of the control method of the game in the embodiment.
The embodiment of the present application further provides a computer-readable storage medium, where a computer instruction is stored in the computer-readable storage medium, and when the computer instruction runs on a computer, the computer is enabled to execute the technical solution of the control method of the game according to the above embodiment.
The embodiment of the present application further provides a computer program product, where the computer program product includes a computer program, the computer program is stored in a computer-readable storage medium, at least one processor can read the computer program from the computer-readable storage medium, and when the computer program is executed by the at least one processor, the technical solution of the control method for a game in the foregoing embodiment can be implemented.
The game control method in the embodiment of the application can be operated on terminal equipment or a cloud interaction system.
The cloud interaction system includes a cloud server and user equipment and is used for running cloud applications. A cloud application runs with its execution and presentation separated.
In an optional embodiment, cloud gaming refers to a gaming mode based on cloud computing. In the cloud-gaming running mode, the entity that runs the game program and the entity that presents the game picture are separated: storage and execution of the game control method are completed on a cloud game server, while a cloud game client receives and sends data and presents the game picture. For example, the cloud game client may be a display device close to the user side with a data transmission function, such as a mobile terminal, a television, a computer, or a handheld computer; the terminal device that performs the game data processing, however, is the cloud game server in the cloud. When playing, the user operates the cloud game client to send operation instructions to the cloud game server; the cloud game server runs the game according to the operation instructions, encodes and compresses data such as game pictures, and returns the data to the cloud game client through the network; finally the cloud game client decodes the data and outputs the game picture.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (15)

1. A game control method, characterized in that a graphical user interface is provided through a terminal device, the graphical user interface comprises a mobile control, content displayed by the graphical user interface at least partially comprises a game scene of the game, and the game scene comprises a virtual character, the method comprising:
in response to a touch sliding operation on the mobile control, acquiring a first relative position relationship between the current position of the touch point of the touch sliding operation on the graphical user interface and the origin position of the mobile control, and controlling the virtual character to move in the game scene in the movement direction determined by the first relative position relationship;
in response to the movement of the virtual character in the game scene, determining whether an obstacle exists in a preset search range with the current position of the virtual character as a central point in the game scene;
if an obstacle exists in a preset search range with the current position of the virtual character as a central point, determining a target position on the graphical user interface based on the current position of the touch point;
haptic feedback is performed at the target location.
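The direction computation in claim 1 can be sketched as follows. The 2-D coordinates, the normalization step, and the `speed` parameter are illustrative assumptions; the claim only fixes that the direction is determined by the relative position of the touch point and the movement control's origin:

```python
import math

def movement_direction(origin, touch):
    """First relative position relationship: vector from the movement
    control's origin to the touch point, normalized to a unit direction."""
    dx, dy = touch[0] - origin[0], touch[1] - origin[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return (0.0, 0.0)  # touch point sits on the origin: no movement
    return (dx / length, dy / length)

def step(position, origin, touch, speed=1.0):
    """Advance the virtual character one step in the derived direction."""
    ux, uy = movement_direction(origin, touch)
    return (position[0] + ux * speed, position[1] + uy * speed)

# Touch point 3 right / 4 down of the joystick origin -> a 3-4-5 direction.
print(step((0.0, 0.0), origin=(100, 100), touch=(103, 104), speed=5.0))
```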
2. The method according to claim 1, wherein the determining a target position on the graphical user interface based on the current position of the touch point if an obstacle exists within a preset search range centered on the current position of the virtual character comprises:
if an obstacle exists within the preset search range centered on the current position of the virtual character, determining the current position of the touch point as the target position.
3. The method according to claim 1, wherein the determining a target position on the graphical user interface based on the current position of the touch point if an obstacle exists within a preset search range centered on the current position of the virtual character comprises:
if an obstacle exists within the preset search range centered on the current position of the virtual character, determining a second relative position relationship between the position of the obstacle and the current position of the virtual character; and
determining the target position according to the current position of the touch point and the second relative position relationship.
4. The method according to claim 3, wherein the determining the target position according to the current position of the touch point and the second relative position relationship comprises:
converting the second relative position relationship into a third relative position relationship in a target coordinate system, wherein the target coordinate system is the graphical user interface coordinate system in which the current position of the touch point is located;
determining, in the target coordinate system, an offset centered on the current position of the touch point according to the third relative position relationship; and
determining the target position according to the current position of the touch point and the offset.
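A minimal sketch of the conversion in claims 3-4, assuming a uniform scale factor between game-scene and interface coordinates and a flipped y-axis (screen y grows downward). Both assumptions are illustrative; the claims leave the concrete coordinate mapping open:

```python
UI_PER_WORLD = 10.0  # assumed interface pixels per game-scene unit

def target_position(touch, character_pos, obstacle_pos):
    # Second relative position relationship, in game-scene coordinates:
    dx = obstacle_pos[0] - character_pos[0]
    dy = obstacle_pos[1] - character_pos[1]
    # Third relative position relationship: the same offset expressed in the
    # interface coordinate system (scaled, y-axis flipped):
    offset = (dx * UI_PER_WORLD, -dy * UI_PER_WORLD)
    # Target position: the offset applied around the touch point.
    return (touch[0] + offset[0], touch[1] + offset[1])

# Obstacle one scene unit to the character's right -> target 10 px right of the touch point.
print(target_position(touch=(200, 300), character_pos=(5, 5), obstacle_pos=(6, 5)))
```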
5. The method according to claim 1, wherein the determining whether an obstacle exists within a preset search range centered on the current position of the virtual character in the game scene comprises:
determining whether an obstacle exists within a preset search range in the current movement direction, centered on the current position of the virtual character in the game scene.
6. The method according to any one of claims 1-5, wherein the determining whether an obstacle exists within a preset search range centered on the current position of the virtual character in the game scene comprises:
casting a plurality of rays into the preset search range with the current position of the virtual character as a starting point, wherein the length of each of the plurality of rays is equal to a preset length; and
if reflections of at least some of the plurality of rays are received, determining that an obstacle exists within the preset search range centered on the current position of the virtual character in the game scene.
7. The method according to claim 6, wherein the preset length comprises a first preset length and/or a second preset length;
the first preset length is determined according to the height of the virtual character; and
the second preset length is determined according to the width of the virtual character's equipment.
8. The method according to claim 6, wherein the preset search range is a sector area centered on the current position of the virtual character and oriented in the current movement direction of the virtual character in the game scene, the radius of the sector area being the preset length and the central angle of the sector area being 180 degrees.
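The ray search of claims 6 and 8 might be sketched as follows. Modeling obstacles as circles, sampling points along each ray, and the ray count are all illustrative assumptions standing in for a real physics engine's raycast:

```python
import math

def _ray_hits_circle(origin, angle, length, center, radius):
    """Sample points along one ray of the given length; report a hit if any
    sample falls inside the circular obstacle (a simple raycast stand-in)."""
    steps = 50
    for i in range(steps + 1):
        t = length * i / steps
        x = origin[0] + t * math.cos(angle)
        y = origin[1] + t * math.sin(angle)
        if math.hypot(x - center[0], y - center[1]) <= radius:
            return True
    return False

def obstacle_in_sector(position, move_angle, preset_length, obstacles, rays=9):
    """Cast rays of the preset length across a 180-degree sector centered on
    the current movement direction; True if any ray 'reflects' off an obstacle."""
    for i in range(rays):
        angle = move_angle - math.pi / 2 + math.pi * i / (rays - 1)
        for center, radius in obstacles:
            if _ray_hits_circle(position, angle, preset_length, center, radius):
                return True
    return False

obstacles = [((3.0, 0.0), 0.5)]  # one circular obstacle ahead of the character
print(obstacle_in_sector((0.0, 0.0), move_angle=0.0, preset_length=4.0, obstacles=obstacles))
print(obstacle_in_sector((0.0, 0.0), move_angle=math.pi, preset_length=4.0, obstacles=obstacles))
```

Moving toward the obstacle detects it; moving away does not, since the sector only covers the half-plane ahead of the character.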
9. The method according to any one of claims 1-5, wherein the terminal device is a terminal device with friction haptic feedback, and the performing haptic feedback at the target position comprises:
performing friction haptic feedback at the target position at a first friction degree;
or,
performing friction haptic feedback at the target position according to a preset friction degree change rule, wherein the friction degree at the target position is inversely related to the distance between the virtual character and the obstacle.
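Claim 9's inverse relation between friction degree and obstacle distance could, under an assumed linear change rule and assumed minimum/maximum friction values, look like:

```python
MIN_FRICTION, MAX_FRICTION = 0.1, 1.0  # assumed device-specific bounds

def friction_degree(distance, search_radius):
    """Map a distance in [0, search_radius] to a friction degree, inversely
    related: distance 0 gives MAX_FRICTION, the search-range edge gives
    MIN_FRICTION (a linear change rule; claim 11 also allows non-linear
    or abrupt rules)."""
    distance = max(0.0, min(distance, search_radius))
    ratio = 1.0 - distance / search_radius
    return MIN_FRICTION + (MAX_FRICTION - MIN_FRICTION) * ratio

print(friction_degree(0.0, 4.0))  # obstacle touching the character: strongest friction
print(friction_degree(4.0, 4.0))  # obstacle at the search-range edge: weakest friction
```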
10. The method according to claim 9, further comprising:
displaying a pattern according to the target position, wherein the pattern is used for indicating the position at which the haptic feedback is performed.
11. The method according to claim 9, wherein the friction degree change rule comprises a linear change, a non-linear change, or an abrupt change.
12. The method according to any one of claims 1-5, wherein after the performing haptic feedback at the target position, the method further comprises:
if an obstacle still exists within the preset search range centered on the current position of the virtual character in the game scene after a preset time has elapsed, re-planning a path for the virtual character according to a path-finding algorithm; and
controlling the virtual character to move along the re-planned path.
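As a stand-in for the unspecified path-finding algorithm in claim 12, a breadth-first search over a small occupancy grid sketches the re-planning step; the grid representation of the game scene is an illustrative assumption:

```python
from collections import deque

def replan_path(grid, start, goal):
    """Return a shortest list of (row, col) cells from start to goal,
    avoiding cells marked 1 (obstacles); None if no path exists."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk the predecessor chain back to start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]  # 1 = obstacle column blocking the direct route
print(replan_path(grid, (0, 0), (0, 2)))
```

The returned path detours below the obstacle column, which is exactly the behavior the claim's re-planning step is meant to produce.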
13. A game control apparatus, wherein a graphical user interface is provided through a terminal device, the graphical user interface comprises a movement control, content displayed by the graphical user interface at least partially comprises a game scene of the game, and the game scene comprises a virtual character, the apparatus comprising:
a control module, configured to, in response to a touch sliding operation on the movement control, acquire a first relative position relationship between the current position of a touch point of the touch sliding operation on the graphical user interface and the origin position of the movement control, and control the virtual character to move in the game scene in a movement direction determined by the first relative position relationship;
a first determining module, configured to, in response to the movement of the virtual character in the game scene, determine whether an obstacle exists within a preset search range centered on the current position of the virtual character in the game scene;
a second determining module, configured to determine a target position on the graphical user interface based on the current position of the touch point if an obstacle exists within the preset search range centered on the current position of the virtual character; and
a haptic feedback module, configured to perform haptic feedback at the target position.
14. An electronic device, comprising: a memory and a processor;
the memory is configured to store instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the method according to any one of claims 1-12.
15. A computer-readable storage medium having computer-executable instructions stored thereon, wherein the computer-executable instructions, when executed by a processor, implement the method according to any one of claims 1-12.
CN202110976432.6A 2021-08-24 2021-08-24 Game control method and device, electronic equipment and storage medium Pending CN113663333A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110976432.6A CN113663333A (en) 2021-08-24 2021-08-24 Game control method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113663333A 2021-11-19

Family

ID=78545784

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110976432.6A Pending CN113663333A (en) 2021-08-24 2021-08-24 Game control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113663333A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114253401A (en) * 2021-12-27 2022-03-29 郑州捷安高科股份有限公司 Method and device for determining position in virtual scene, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107789830A (en) * 2017-09-15 2018-03-13 网易(杭州)网络有限公司 Information processing method, device, electronic equipment and storage medium
CN111760268A (en) * 2020-07-06 2020-10-13 网易(杭州)网络有限公司 Path finding control method and device in game
CN111888762A (en) * 2020-08-13 2020-11-06 网易(杭州)网络有限公司 Method for adjusting visual angle of lens in game and electronic equipment
CN112221118A (en) * 2020-11-09 2021-01-15 腾讯科技(深圳)有限公司 Human-computer interaction perception processing method and device and electronic equipment
CN112807681A (en) * 2021-02-25 2021-05-18 腾讯科技(深圳)有限公司 Game control method, device, electronic equipment and storage medium
CN113168228A (en) * 2018-12-31 2021-07-23 佳殿玻璃有限公司 Systems and/or methods for parallax correction in large area transparent touch interfaces
CN113244610A (en) * 2021-06-02 2021-08-13 网易(杭州)网络有限公司 Method, device, equipment and storage medium for controlling virtual moving object in game

Similar Documents

Publication Publication Date Title
US11565181B2 (en) Virtual object control method and apparatus, computer device, and storage medium
US9001062B2 (en) Method for controlling computer that is held and operated by user using a re-touch determination area
KR20240011871A (en) Method for controlling virtual object, and related apparatus
US9535493B2 (en) Apparatus, method, computer program and user interface
US11266904B2 (en) Game system, game control device, and information storage medium
WO2022247592A1 (en) Virtual prop switching method and apparatus, terminal, and storage medium
JP2014083395A (en) Game-providing device
CN112245908A (en) Method and device for controlling game virtual character, storage medium and electronic equipment
US20140375560A1 (en) Storage medium storing information processing program, information processing device, information processing system, and method for calculating specified position
CN113663333A (en) Game control method and device, electronic equipment and storage medium
US20140111551A1 (en) Information-processing device, storage medium, information-processing method, and information-processing system
CN110743161B (en) Virtual object control method, device, terminal and storage medium
US20230142566A1 (en) System and method for precise positioning with touchscreen gestures
CN115193042A (en) Display control method, display control device, electronic equipment and storage medium
CN114404944A (en) Method and device for controlling player character, electronic device and storage medium
JP5933069B2 (en) Game providing device
CN113694514A (en) Object control method and device
KR102557808B1 (en) Gaming service system and method for sharing memo therein
CN117717780B (en) Virtual character control method, device, electronic equipment and storage medium
US20220379210A1 (en) Game scene processing method, apparatus, storage medium, and electronic device
CN112402967B (en) Game control method, game control device, terminal equipment and medium
KR101983696B1 (en) Apparatus for interfacing of game
KR20230159090A (en) Method and apparatus for providing virtual input pad on touch screen
CN111973984A (en) Coordinate control method and device for virtual scene, electronic equipment and storage medium
CN116115996A (en) Picture display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination