CN110613933A - Skill release control method and device in game, storage medium and processor - Google Patents


Info

Publication number
CN110613933A
Authority
CN
China
Prior art keywords
virtual
touch point
skill
game
virtual camera
Prior art date
Legal status (the status listed is an assumption and is not a legal conclusion)
Pending
Application number
CN201910907422.XA
Other languages
Chinese (zh)
Inventor
饶峰
Current Assignee (the listed assignees may be inaccurate)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (the date listed is an assumption and is not a legal conclusion)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201910907422.XA
Publication of CN110613933A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an in-game skill release control method and device, a storage medium, and a processor. The method comprises the following steps: determining the orientation of a virtual camera in the game according to the orientation of a virtual character in the game scene, wherein the virtual camera controls the presented field of view of the game scene; in response to a sliding touch operation acting on a skill icon, determining the position of the touch point of the sliding touch operation and controlling the release direction of the virtual skill according to the touch point; and, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character. The invention solves the technical problem that, in third-person action games, the skill release situation in the blind area of the field of view cannot be previewed.

Description

Skill release control method and device in game, storage medium and processor
Technical Field
The invention relates to the field of computers, and in particular to an in-game skill release control method and device, a storage medium, and a processor.
Background
An ARPG (Action Role-Playing Game) is a game of the action role-playing genre. Most ARPGs use a third-person over-the-shoulder camera perspective (i.e., they are third-person action games), such as the mobile game known in Chinese as 刺激战场 (literally "stimulating battlefield", released internationally as PUBG Mobile).
Fig. 1 is a schematic diagram of a third-person action game according to the prior art. As shown in Fig. 1, in an over-the-shoulder third-person action game, the player's field of view covers mainly the area in front of the game character; the area behind the character cannot be seen. Consequently, in such games the game character can only release skills within the forward field of view.
In the third person action game, the manner of game character skill release is as follows:
Fig. 2 is a first schematic diagram of a skill release manner according to the prior art. As shown in Fig. 2, when the player taps a skill icon in the game, the skill is released directly in front of the game character.
However, with this skill release manner, the player can only release the skill in the direction the game character is facing; the player can neither select the release direction of the virtual skill nor preview it.
Fig. 3 is a second schematic diagram of a skill release manner according to the prior art. As shown in Fig. 3, the release direction of the virtual skill coincides with the facing direction of the game character: the player changes the game view angle by sliding up, down, left, or right, thereby turning the game character and changing the release direction of its virtual skill.
However, this skill release manner is inefficient and easily misses the key moment: to release a skill behind the game character, the player must either slide to turn the character around and then aim (i.e., preview the release direction of the virtual skill), or aim first and then turn the character around. Either way, the operation is slow, and the key opportunity for releasing the skill is easily missed.
Fig. 4 and Fig. 5 are third and fourth schematic diagrams of a skill release manner according to the prior art. As shown in Figs. 4 and 5, after a skill button is pressed (or triggered), the game view angle remains unchanged (i.e., the view angle is locked); the player controls the release direction of the virtual skill by sliding up, down, left, or right, and then lifts the finger from the skill button to complete the release.
However, with this skill release manner the skill preview is incomplete. For example, when the preview is shown through an over-the-shoulder lens (i.e., a third-person perspective) with the game character facing forward, and a direction behind the character is selected at release time, the view behind the character is incomplete and the area behind the character cannot be seen.
No effective solution has yet been proposed for the problem that the skill release situation in the blind area of the field of view cannot be previewed in third-person action games.
Disclosure of Invention
Embodiments of the invention provide an in-game skill release control method and device, a storage medium, and a processor, so as to at least solve the technical problem that the skill release situation in the blind area of the field of view cannot be previewed in third-person action games.
According to one aspect of the embodiments of the present invention, there is provided a method for adjusting a virtual camera. The game includes at least one virtual character, the virtual character executes virtual actions according to operation instructions received by a game terminal device, and a graphical user interface of the game terminal device includes at least one skill icon corresponding to a virtual skill of the virtual character. The method includes: determining the orientation of a virtual camera in the game according to the orientation of the virtual character in the game scene, wherein the virtual camera controls the presented field of view of the game scene; in response to a sliding touch operation acting on the skill icon, determining the position of the touch point of the sliding touch operation and controlling the release direction of the virtual skill according to the touch point; and, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character.
Further, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes: adjusting the relative position between the virtual camera and the virtual character according to the position of the touch point.
Further, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes: adjusting both the relative position between the virtual camera and the virtual character and the orientation of the virtual camera.
Further, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes: adjusting the relative height between the virtual camera and the virtual character.
Further, the sliding touch operation acting on the skill icon comprises: a sliding touch operation starting from the skill icon.
Further, before determining the position of the touch point of the sliding touch operation in response to the sliding touch operation acting on the skill icon, the method further includes: establishing a mapping relationship between the position of the touch point and the release direction of the virtual skill.
Further, establishing the mapping relationship between the position of the touch point and the release direction of the virtual skill comprises: establishing a two-dimensional rectangular coordinate system with the initial position of the touch point as the origin, and recording the position of the touch point as (x1, y1); establishing a three-dimensional rectangular coordinate system with the position of the virtual character in the game scene as the origin, the horizontal facing direction of the virtual character as the y-axis, and the vertically upward direction in the game scene as the z-axis, and recording the release direction of the virtual skill as (x0, y0, z0); wherein x0 in the release direction of the virtual skill is a mapping of x1 in the position of the touch point, and y0 is a mapping of y1.
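This mapping can be sketched in Python as follows. The patent only states that x0 maps x1 and y0 maps y1; treating z0 as a fixed parameter and normalizing the resulting direction vector are assumptions of this sketch, not requirements of the patent.

```python
import math

def touch_to_release_direction(x1, y1, z0=0.0):
    """Map a 2-D touch-point offset (x1, y1), measured from the initial
    touch position on the skill icon, to a 3-D release direction
    (x0, y0, z0) in the character-centred coordinate system, where the
    y-axis is the character's horizontal facing and the z-axis points up.
    """
    # x0 mirrors the horizontal touch offset and y0 the vertical one;
    # z0 is left open by the patent, so it is taken as a fixed parameter.
    x0, y0 = float(x1), float(y1)
    length = math.sqrt(x0 * x0 + y0 * y0 + z0 * z0)
    if length == 0:
        # No displacement yet: default to the character's facing direction.
        return (0.0, 1.0, 0.0)
    return (x0 / length, y0 / length, z0 / length)
```

For example, dragging the touch point straight up yields the character's facing direction, while any other offset is normalized into a unit direction vector.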
Further, before adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface, the method further includes: judging whether y1 in the position of the touch point is less than or equal to zero; and, when y1 is less than or equal to zero, determining that the touch point is located in the predetermined area of the graphical interface.
Further, before adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface, the method further includes: establishing a polar coordinate system with the initial position of the touch point as the pole, and recording the position of the touch point as (ρ, θ), where ρ is the distance between the position of the touch point and the initial position, and θ is the polar angle between the touch point and the polar axis; judging, according to the polar angle θ, whether the position of the touch point is located in the predetermined area of the graphical interface; determining that the touch point is not located in the predetermined area when θ is in the interval (0°, 180°), and determining that it is located in the predetermined area when θ is in the interval (180°, 360°).
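A minimal Python sketch of this polar-coordinate test follows. The atan2-based conversion with θ normalized into [0°, 360°) is an assumption about how the angle is measured; the patent only defines ρ, θ, and the two angular intervals.

```python
import math

def touch_point_polar(x, y):
    """Convert the touch point's offset (x, y) from its initial position
    into polar coordinates (rho, theta), theta in degrees in [0, 360)."""
    rho = math.hypot(x, y)
    theta = math.degrees(math.atan2(y, x)) % 360.0
    return rho, theta

def in_predetermined_area(theta):
    """A polar angle in (180, 360) degrees, i.e. a touch point dragged
    below the polar axis, falls inside the predetermined area."""
    return 180.0 < theta < 360.0
```

With this convention, a downward drag (negative y) produces an angle in (180°, 360°) and therefore lands in the predetermined area, matching the y1 ≤ 0 test of the previous embodiment.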
Further, if the touch point is located in the predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes: when the polar angle θ is in the interval (180°, 270°), increasing the relative height between the virtual camera and the virtual character in the vertical direction of the game scene, reducing their relative distance in the horizontal direction, and increasing the top-down angle of the virtual camera; and, when the polar angle θ is in the interval (270°, 360°), reducing the relative height between the virtual camera and the virtual character in the vertical direction, increasing their relative distance in the horizontal direction, and reducing the top-down angle of the virtual camera.
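The two branches above can be sketched as follows. Only the direction of each adjustment comes from the text; the base values, the adjustment ranges, and the linear ramp across each interval are illustrative assumptions.

```python
def adjust_camera(theta, base_height=2.0, base_distance=5.0, base_pitch=20.0,
                  height_range=1.5, distance_range=2.0, pitch_range=15.0):
    """Return (height, distance, pitch) of the virtual camera relative to
    the virtual character as a function of the touch point's polar angle
    theta (degrees). The numeric defaults are illustrative only."""
    if 180.0 < theta < 270.0:
        t = (theta - 180.0) / 90.0  # ramps 0 -> 1 across (180, 270)
        # Raise the camera, pull it in, and pitch it further downward.
        return (base_height + t * height_range,
                base_distance - t * distance_range,
                base_pitch + t * pitch_range)
    if 270.0 <= theta < 360.0:
        t = (theta - 270.0) / 90.0  # ramps 0 -> 1 across (270, 360)
        # Lower the camera, pull it back, and reduce the downward pitch.
        return (base_height - t * height_range,
                base_distance + t * distance_range,
                base_pitch - t * pitch_range)
    # Touch point outside the predetermined area: keep the defaults.
    return (base_height, base_distance, base_pitch)
```

A drag toward the lower-left half (θ between 180° and 270°) thus produces a higher, closer, more top-down camera, while a drag toward the lower-right half produces a lower, farther, flatter one.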
Further, if the touch point is located in the predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes at least one of: adjusting, according to the position of the touch point, the relative height between the virtual camera and the virtual character in the vertical direction of the game scene; adjusting, according to the position of the touch point, the relative distance between the virtual camera and the virtual character in the horizontal direction of the game scene; and adjusting, according to the position of the touch point, the top-down angle of the virtual camera in the game scene.
Further, the method also includes: if the touch point is not located in the predetermined area of the graphical interface, keeping the relative position between the virtual camera and the virtual character unchanged.
According to another aspect of the embodiments of the present invention, there is also provided an in-game skill release control apparatus. The game includes at least one virtual character, the virtual character executes virtual actions according to operation instructions received by a game terminal device, and a graphical user interface of the game terminal device includes at least one skill icon corresponding to a virtual skill of the virtual character. The apparatus includes: a determining unit, configured to determine the orientation of a virtual camera in the game according to the orientation of the virtual character in the game scene, wherein the virtual camera controls the presented field of view of the game scene; a response unit, configured to determine, in response to a sliding touch operation acting on the skill icon, the position of the touch point of the sliding touch operation, and to control the release direction of the virtual skill according to the touch point; and an adjusting unit, configured to adjust the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium including a stored program, wherein when the program runs, a device on which the storage medium is located is controlled to execute the in-game skill release control method described above.
According to another aspect of the embodiment of the present invention, there is further provided a processor, configured to execute a program, where the program executes the in-game skill release control method described above.
In the embodiment of the invention, the game includes at least one virtual character, the virtual character executes virtual actions according to operation instructions received by the game terminal device, and the graphical user interface of the game terminal device includes at least one skill icon corresponding to a virtual skill of the virtual character. The orientation of the virtual camera in the game is determined according to the orientation of the virtual character in the game scene, the virtual camera controlling the presented field of view of the game scene; in response to a sliding touch operation acting on the skill icon, the position of the touch point of the sliding touch operation is determined and the release direction of the virtual skill is controlled according to the touch point; and, if the touch point is located in the predetermined area of the graphical interface, the relative position between the virtual camera and the virtual character is adjusted so that the virtual camera captures an image of the blind area of the virtual character's field of view. This achieves the purpose of capturing an image of the blind area, producing the technical effect of previewing the skill release situation in the blind area of the field of view, and thereby solving the technical problem that this situation cannot be previewed in third-person action games.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a third-person action game according to the prior art;
FIG. 2 is a first schematic diagram of a virtual skill release manner according to the prior art;
FIG. 3 is a second schematic diagram of a virtual skill release manner according to the prior art;
FIG. 4 is a third schematic diagram of a virtual skill release manner according to the prior art;
FIG. 5 is a fourth schematic diagram of a virtual skill release manner according to the prior art;
FIG. 6 is a flow chart of an in-game skill release control method according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a two-dimensional rectangular coordinate system established on a skill icon according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of a touch point at a first position according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of the release direction of a virtual skill and the position of the virtual camera with the touch point at the first position according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of the game effect with the touch point at the first position according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of a touch point at a second position according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of the release direction of a virtual skill and the position of the virtual camera with the touch point at the second position according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of the game effect with the touch point at the second position according to an embodiment of the present invention;
FIG. 14 is a schematic diagram of a touch point at position A1 according to an embodiment of the present invention;
FIG. 15 is a schematic diagram of the release direction of a virtual skill and the position of the virtual camera with the touch point at position A1 according to an embodiment of the present invention;
FIG. 16 is a schematic diagram of the game effect with the touch point at position A1 according to an embodiment of the present invention;
FIG. 17 is a schematic diagram of a touch point at position A2 according to an embodiment of the present invention;
FIG. 18 is a schematic diagram of the release direction of a virtual skill and the position of the virtual camera with the touch point at position A2 according to an embodiment of the present invention;
FIG. 19 is a schematic diagram of the game effect with the touch point at position A2 according to an embodiment of the present invention;
FIG. 20 is a schematic diagram of a touch point at position A3 according to an embodiment of the present invention;
FIG. 21 is a schematic diagram of the release direction of a virtual skill and the position of the virtual camera with the touch point at position A3 according to an embodiment of the present invention;
FIG. 22 is a schematic diagram of the game effect with the touch point at position A3 according to an embodiment of the present invention;
FIG. 23 is a schematic diagram of a touch point at position A4 according to an embodiment of the present invention;
FIG. 24 is a schematic diagram of the release direction of a virtual skill and the position of the virtual camera with the touch point at position A4 according to an embodiment of the present invention;
FIG. 25 is a schematic diagram of the game effect with the touch point at position A4 according to an embodiment of the present invention;
FIG. 26 is a schematic diagram of a touch point at position A5 according to an embodiment of the present invention;
FIG. 27 is a schematic diagram of the release direction of a virtual skill and the position of the virtual camera with the touch point at position A5 according to an embodiment of the present invention;
FIG. 28 is a schematic diagram of the game effect with the touch point at position A5 according to an embodiment of the present invention;
FIG. 29 is a schematic diagram of a touch point at position A6 according to an embodiment of the present invention;
FIG. 30 is a schematic diagram of the release direction of a virtual skill and the position of the virtual camera with the touch point at position A6 according to an embodiment of the present invention;
FIG. 31 is a schematic diagram of the game effect with the touch point at position A6 according to an embodiment of the present invention;
FIG. 32 is a schematic diagram of a polar coordinate system established on a skill icon according to an embodiment of the present invention;
FIG. 33 is a schematic diagram of an in-game skill release control apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, there is provided an embodiment of a method for adjusting a virtual camera, where the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as a set of computer-executable instructions, and where a logical order is illustrated in the flowchart, in some cases, the steps illustrated or described may be performed in an order different than that illustrated herein.
Fig. 6 is a flowchart of a skill release control method in a game according to an embodiment of the present invention, where as shown in fig. 6, the game includes at least one virtual character, the virtual character executes a virtual action according to an operation instruction received by a game terminal device, a graphical user interface of the game terminal device includes at least one skill icon, and the skill icon corresponds to a virtual skill of the virtual character, where the method includes the following steps:
step S602, determining the orientation of a virtual camera in the game according to the orientation of the virtual character in the game scene of the game, wherein the virtual camera is used for controlling the presenting visual field of the game scene;
step S604, responding to the sliding touch operation acting on the skill icon, determining the position of a touch point of the sliding touch operation, and controlling the release direction of the virtual skill according to the touch point;
in step S606, if the touch point is located in the predetermined area of the graphical interface, the relative position between the virtual camera and the virtual character is adjusted.
In the embodiment of the invention, the game includes at least one virtual character, the virtual character executes virtual actions according to operation instructions received by the game terminal device, and the graphical user interface of the game terminal device includes at least one skill icon corresponding to a virtual skill of the virtual character. The orientation of the virtual camera in the game is determined according to the orientation of the virtual character in the game scene, the virtual camera controlling the presented field of view of the game scene; in response to a sliding touch operation acting on the skill icon, the position of the touch point of the sliding touch operation is determined and the release direction of the virtual skill is controlled according to the touch point; and, if the touch point is located in the predetermined area of the graphical interface, the relative position between the virtual camera and the virtual character is adjusted so that the virtual camera captures an image of the blind area of the virtual character's field of view. This achieves the purpose of capturing an image of the blind area, producing the technical effect of previewing the skill release situation in the blind area of the field of view, and thereby solving the technical problem that this situation cannot be previewed in third-person action games.
It should be noted that the adjustment of the virtual camera may be applied to a game. For example, in an action role-playing game (ARPG), the user may control a virtual character in the game to release a virtual skill through a skill icon, determine the release direction of the virtual skill according to the position of a touch point acting on the skill icon, and adjust the position and posture of the virtual camera in the game. When the user controls the virtual character to release a skill into the blind area of the field of view, adjusting the position and posture of the camera makes it possible to preview the skill release situation in that blind area.
Alternatively, the skill icon may be displayed on a touch screen of the touch device, and the user may generate the touch point by performing a sliding touch operation on the touch screen of the touch device.
Optionally, the touch points at different positions on the skill icon may control different virtual skills to release, and the game character may be instructed to release the virtual skills in the direction corresponding to the position of the touch point according to the touch points.
Alternatively, the touch device may be a touch joystick, or a touch screen (such as a screen of a cell phone or tablet computer).
Optionally, the user may preview the release direction or the release position of the virtual skill before the virtual character releases the virtual skill by operating the skill icon; the virtual skill release effect can also be previewed during the process of releasing the virtual skill of the virtual character or after the virtual skill is released.
It should be noted that the virtual character may be a game character operated by the user, such as a person, an animal, a racing car, a tank, an airplane, or a battleship. The virtual skill released by the virtual character may be an attack capability given to the game character in the game; this may be the character's own striking ability (such as an attack skill of the game character), or the character may control an in-game prop to attack (such as a gun, a grenade, or a cannon).
It should be noted that the blind field of view may be the back or the side of the virtual character, for example, the user controls the virtual character to release the skill to the back or the side.
In the scheme provided in step S606, the virtual camera is configured to capture images of the virtual character in the game scene from a third-person perspective; the virtual camera is disposed behind the virtual character and captures the images in a top-down posture.
As an alternative embodiment, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes: and if the touch point is positioned in the preset area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character according to the position of the touch point.
As an alternative embodiment, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes: and if the touch point is located in the preset area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character and the orientation of the virtual camera.
As an alternative embodiment, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes: and if the touch point is positioned in the preset area of the graphical interface, adjusting the relative height between the virtual camera and the virtual character.
In the above embodiment of the present invention, adjusting the relative position between the virtual camera and the virtual character includes: adjusting the relative position between the virtual camera and the virtual character according to the position of the touch point; adjusting both the relative position and the orientation of the virtual camera; or adjusting the relative height between the virtual camera and the virtual character. The virtual camera can thus be adjusted in all directions, so that it can capture the view blind area of the virtual character in the game scene, and a skill preview in the view blind area can be realized.
As an alternative embodiment, the sliding touch operation applied to the skill icon includes: a sliding touch operation starting from the skill icon.
It should be noted that, since the release direction of the virtual skill of the virtual character is controlled by the position of the touch point on the skill icon, before the virtual character is controlled to release the virtual skill, a mapping relationship between the position of the touch point and the release direction of the virtual skill needs to be established in advance.
Optionally, the graphic interface of the skill icon is a circle, the position of the center of the circle represents the current position of the virtual character, and the relative direction between the position of the touch point and the center of the circle represents the release direction of the virtual skill of the virtual character.
As an alternative embodiment, before determining the position of the touch point of the sliding touch operation in response to the sliding touch operation applied to the skill icon, the embodiment may further include: and establishing a mapping relation between the position of the touch point and the release direction of the virtual skill.
In the above embodiment of the present invention, by establishing the mapping relationship between the position of the touch point and the release direction of the virtual skill, the release direction of the virtual skill can be controlled based on the mapping relationship, so that the user can control the release direction of the virtual skill by controlling the position of the touch point.
As an alternative embodiment, establishing a mapping relationship between the position of the touch point and the release direction of the virtual skill includes: establishing a two-dimensional rectangular coordinate system by taking the initial position of the touch point as an origin, and marking the position of the touch point as (x1, y 1); establishing a three-dimensional rectangular coordinate system by taking the position of the virtual character in the game scene as an origin, the horizontal direction of the virtual character as a y-axis and the vertical upward direction in the game scene as a z-axis, and marking the release direction of the virtual skill as (x0, y0, z 0); wherein x0 in the release direction of the virtual skill is the mapping of x1 in the position of the touch point, and y0 in the release direction of the virtual skill is the mapping of y1 in the position of the touch point.
It should be noted that the initial position of the touch point is the position of the skill icon or the center of the skill icon.
Alternatively, the initial position of the touch point may represent the current position of the virtual character in the game scene, and the two-dimensional rectangular coordinate system of the skill icon is established with the initial position of the touch point as the origin, such that the y-axis direction in the two-dimensional rectangular coordinate system indicates the horizontal direction directly in front of the virtual character in the game scene (or the facing direction of the virtual character), and the x-axis direction indicates the horizontal side of the virtual character.
Alternatively, the game scene may be a three-dimensional space, a three-dimensional rectangular coordinate system may be established with the position of the virtual character as the origin, and the release direction of the virtual skill is marked as (x0, y0, z 0).
Optionally, the plane in which the x-axis and the y-axis of the three-dimensional rectangular coordinate system lie is the horizontal plane in the game scene, the y-axis points in the horizontal facing direction of the virtual character, and the z-axis points vertically upward in the game scene; after the y-axis and the z-axis are determined, the x-axis of the three-dimensional rectangular coordinate system can be determined according to the right-hand rule.
Optionally, after the two-dimensional rectangular coordinate system of the skill icon and the three-dimensional rectangular coordinate system of the game scene are established, the x-axis of the two-dimensional rectangular coordinate system in the skill icon and the x-axis of the three-dimensional rectangular coordinate system in the game scene may be mapped; and mapping the y axis of the two-dimensional rectangular coordinate system in the skill icon with the y axis of the three-dimensional rectangular coordinate system in the game scene, so that the release direction (x0, y0, z0) of the virtual skill of the virtual character can be determined according to the positions (x1, y1) of the touch points.
Alternatively, the ground coordinates in the game scene may be determined based on the x-axis coordinate x0 and the y-axis coordinate y0 in the three-dimensional rectangular coordinate system, and the height of the skill release may be determined according to the height of the target at that ground position. For example, suppose a target A is located at (x0, y0) in the game scene. If the target A is above the horizontal plane of the virtual character and the height difference between them is z0, the release direction of the virtual skill is (x0, y0, z0); if the target A is below the horizontal plane of the virtual character and the height difference between them is z0, the release direction is (x0, y0, -z0); if the target A is at the same level as the virtual character, the release direction is (x0, y0, 0).
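The two-dimensional-to-three-dimensional mapping described above can be sketched as follows. This is an illustrative example only: the `scale` factor and the `terrain_height` helper (returning the signed height of the target at a ground position, relative to the virtual character) are assumptions not specified in the embodiment.

```python
def release_direction(x1, y1, scale=1.0, terrain_height=lambda x, y: 0.0):
    """Map a 2-D touch offset (x1, y1) on the skill icon to a 3-D
    release direction (x0, y0, z0) in the game scene.

    x0 and y0 mirror the touch offset via the x/y axis mapping; z0 is
    the signed height of the target at the mapped ground position,
    relative to the virtual character's horizontal plane.
    """
    x0 = x1 * scale              # icon x-axis maps to scene x-axis
    y0 = y1 * scale              # icon y-axis maps to scene y-axis
    z0 = terrain_height(x0, y0)  # > 0 above the character, < 0 below, 0 level
    return (x0, y0, z0)
```

For a target A level with the character, this yields (x0, y0, 0), matching the third case above.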
Alternatively, the relative position between the virtual camera and the virtual character may be determined in the same three-dimensional rectangular coordinate system used for the virtual skill; for example, the position of the virtual camera may be expressed in that coordinate system.
Fig. 7 is a schematic diagram of a two-dimensional rectangular coordinate system for creating a skill icon according to an embodiment of the present invention, as shown in fig. 7, the skill icon is disposed on a touch device such as a smart phone or a tablet computer, an x-axis of the two-dimensional rectangular coordinate system is set according to a length direction of the touch device, and a y-axis of the two-dimensional rectangular coordinate system is set according to a width direction of the touch device.
Optionally, the length direction and the width direction of the touch device may be determined according to the gesture of the user holding the touch device.
Optionally, after determining the two-dimensional coordinates of the skill icon, the position of the touch point may be determined based on the coordinate position of the touch point in the two-dimensional rectangular coordinate system, so as to determine whether the position of the touch point is in the predetermined area of the graphical interface.
As an optional embodiment, before adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface, the method further includes: judging whether y1 in the position of the touch point is less than or equal to 0; and determining that the touch point is located in the predetermined area of the graphical interface in the case that y1 in the position of the touch point is less than or equal to 0.
In the above embodiment of the present invention, if the y-axis coordinate y1 in the position of the touch point is not greater than 0, it indicates that the y-axis coordinate y0 in the release direction of the virtual skill of the virtual character is not greater than 0, that is, it indicates that the virtual character releases the skill to the blind area of the back or side of the body, and it can be determined that the touch point is located in the predetermined area of the graphical interface when the y-axis coordinate y1 in the position of the touch point is not greater than 0.
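A minimal sketch of this check (the function name is hypothetical, for illustration only):

```python
def touch_in_predetermined_area(x1, y1):
    """The touch point lies in the predetermined area of the graphical
    interface when its y-coordinate y1 is not greater than 0, i.e. the
    mapped release direction points to the back or side of the virtual
    character (the view blind area)."""
    return y1 <= 0
```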
As an alternative embodiment, the embodiment further comprises: if the touch point is not located in the predetermined area of the graphical interface, keeping the relative position between the virtual camera and the virtual character unchanged.
For example, in a case where the touch point is not in the predetermined area, the relative position between the virtual camera and the virtual character and the posture of the virtual camera are maintained.
Optionally, when the y-axis coordinate y1 of the touch point is greater than 0, it is determined that the touch point is not located in the predetermined area of the graphical interface, and the release direction of the virtual skill in the game scene may be mapped directly from the position of the touch point; otherwise, the position of the touch point not only maps the release direction of the virtual skill in the game scene, but also simultaneously controls the height and position of the virtual camera and adjusts the posture of the virtual camera.
Fig. 8 is a schematic diagram of a touch point at a first position according to an embodiment of the present invention. As shown in fig. 8, when y1>0 for the touch point position (x1, y1), the current position of the touch point is recorded as the first position, and the skill release direction and the virtual camera position corresponding to the first position are then determined.
Since y1>0 in the first position, it indicates that the user is controlling the virtual character to perform skill release to the front of the virtual character.
Fig. 9 is a schematic diagram of a release direction of a virtual skill with a touch point at a first position and a position of a virtual camera according to an embodiment of the present invention, as shown in fig. 9, the virtual camera is generally disposed behind a virtual character and slightly higher than the virtual character, so that when the virtual character releases the skill forward, the virtual camera at the current position can acquire an image of the release of the skill, and the position and the posture of the virtual camera do not change with the position of the touch point; the release direction of the virtual skill corresponds to the position mapping of the touch point.
Fig. 10 is a schematic view of a game effect with a touch point at a first position according to an embodiment of the present invention, and as shown in fig. 10, a release direction of a virtual skill corresponds to a first position mapping of the touch point, and a virtual camera does not change with a change in the position of the touch point.
FIG. 11 is a schematic diagram of a touch point at a second position according to an embodiment of the present invention. As shown in FIG. 11, when y1 ≤ 0 for the touch point position (x1, y1), the current position of the touch point is recorded as the second position, and the release direction of the virtual skill and the position of the virtual camera corresponding to the second position are then determined.
Since y1 ≤ 0 in the second position of the touch point, it indicates that the user controls the virtual character to release the skill to the back or the side of the virtual character.
Fig. 12 is a schematic diagram of a release direction of a virtual skill and a position of a virtual camera with a touch point at a second position according to an embodiment of the present invention. As shown in fig. 12, the virtual camera is generally disposed behind the virtual character and slightly higher than the virtual character; when the virtual character releases the skill toward the back or the side, the virtual camera at its current position cannot capture an image of the skill release. Therefore, in order to capture a picture of the skill release, the position and the posture of the virtual camera can be adjusted according to the second position, wherein the height of the virtual camera increases as the y-coordinate y1 of the touch point decreases; the release direction of the virtual skill corresponds to the position mapping of the touch point.
Fig. 13 is a schematic view of a game effect with a touch point at a second position according to an embodiment of the present invention, and as shown in fig. 13, a release direction of a virtual skill corresponds to a second position mapping of the touch point, a height of a virtual camera is increased, and a top view angle of the virtual camera is increased.
It should be noted that, in the game scene, the position of the virtual camera and the release direction of the virtual skill may be represented by using the same set of three-dimensional rectangular coordinate systems, that is, the position of the virtual camera may also be labeled as (x0, y0, z 0).
To distinguish the direction of release of the virtual skill from the position of the virtual camera, the direction of release of the virtual skill may be labeled J (x0a, y0a, z0 a); the position of the virtual camera is marked as S (x0b, y0b, z0 b).
Alternatively, if the image of the virtual character is captured from a third-person perspective, the virtual camera should be located directly behind or directly above the virtual character, and the x-axis coordinate x0b of the virtual camera is 0.
Optionally, the z-axis coordinate z0b of the virtual camera increases as the y-axis coordinate y1 decreases in the location of the touch point.
Alternatively, in the case where the y-axis coordinate y1<0 in the position of the touch point, the virtual camera is located above the virtual character and the horizontal distance between the virtual camera and the virtual character is shorter; that is, the y-axis coordinate y0b of the virtual camera increases as the y-axis coordinate y1 in the position of the touch point decreases.
Optionally, the spatial distance between the virtual camera and the virtual character may be a fixed value, and when the height of the virtual camera is increased, the horizontal distance between the virtual camera and the virtual character is decreased; similarly, when the height of the virtual camera is reduced, the horizontal distance between the virtual camera and the virtual character is increased, that is, the virtual camera can move on a spherical cambered surface taking the virtual character as the center of sphere, and the radius of the center of sphere of the spherical cambered surface is the space distance between the virtual camera and the virtual character.
When the virtual camera is at the initial default position, the position of the virtual camera is marked as S(0, y_preset, z_preset); at this moment the x-axis coordinate of the virtual camera is 0, i.e., the camera is directly behind the character; y_preset is the y-axis distance from the virtual camera to the virtual character; z_preset is the height of the virtual camera at the default position. The specific values of y_preset and z_preset are adjusted by the developer according to the game feel.
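The spherical-arc constraint and the default pose S(0, y_preset, z_preset) can be sketched as follows. The concrete values of Y_PRESET and Z_PRESET, and the use of an elevation angle as the driving parameter, are assumptions for illustration.

```python
import math

# Assumed default pose; the embodiment leaves y_preset / z_preset to the
# developer, tuned according to the game feel.
Y_PRESET, Z_PRESET = 6.0, 3.0
RADIUS = math.hypot(Y_PRESET, Z_PRESET)  # fixed camera-to-character distance

def camera_position(elevation):
    """Place the camera on a spherical arc centred on the virtual
    character (x = 0, i.e. directly behind/above the character).

    A larger elevation angle (radians above the horizontal plane)
    raises the camera and shortens the horizontal distance, while the
    spatial distance to the character stays constant."""
    y = RADIUS * math.cos(elevation)  # horizontal distance to the character
    z = RADIUS * math.sin(elevation)  # camera height
    return (0.0, y, z)
```

The default pose corresponds to elevation = atan2(Z_PRESET, Y_PRESET); raising the elevation toward 90° moves the camera directly overhead.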
As an alternative embodiment, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes at least one of the following: adjusting the relative height of the virtual camera and the virtual character in the vertical direction in the game scene according to the position of the touch point; adjusting the relative distance between the virtual camera and the virtual character in the horizontal direction in the game scene according to the position of the touch point; and adjusting the top-view angle of the virtual camera in the game scene according to the position of the touch point.
To describe in detail the mapping relationship between the position of the touch point and the release direction of the virtual skill, as well as the relationship between the position of the touch point and the position and posture of the virtual camera, the touch point is now moved counterclockwise around the center of the skill icon, and the corresponding release direction of the virtual skill, position of the virtual camera, and posture of the virtual camera are described below.
Fig. 14 is a schematic diagram illustrating that a touch point is located at a position a1 according to an embodiment of the present invention, and as shown in fig. 14, in response to a sliding touch operation applied to a skill icon, a position a1(x1, y1) of the touch point is determined, where x1<0, y1> 0.
Fig. 15 is a schematic diagram of a release direction of a virtual skill with a touch point at a position a1 and a position of a virtual camera according to an embodiment of the present invention, where as shown in fig. 15, the release direction of the virtual skill corresponds to a position map of the touch point, and the position of the virtual camera does not change with a change in the position of the touch point.
Fig. 16 is a schematic diagram of a game effect of a touch point at a position a1 according to an embodiment of the present invention, and as shown in fig. 16, a release direction of a virtual skill mapped by a position a1 of the touch point is previewed without changing the position and posture of a virtual camera.
Fig. 17 is a schematic diagram illustrating that a touch point is located at a position A2 according to an embodiment of the present invention, and as shown in fig. 17, in response to a sliding touch operation applied to a skill icon, the position A2(x1, y1) of the touch point is determined, where x1<0 and y1<0.
Fig. 18 is a schematic diagram of a release direction of a virtual skill with a touch point at a position a2 and a position of a virtual camera according to an embodiment of the present invention, where, as shown in fig. 18, the release direction of the virtual skill corresponds to a position mapping of the touch point, and the height of the virtual camera increases with a decrease in y-axis coordinate y1 in the position of the touch point; the horizontal distance between the virtual camera and the virtual character decreases as the y-axis coordinate y1 in the position of the touch point decreases.
Fig. 19 is a schematic diagram of a game effect in which the touch point is at the position a2 according to an embodiment of the present invention, and as shown in fig. 19, the height of the virtual camera is increased, the horizontal distance between the virtual camera and the virtual character is shortened, the depression angle of the virtual camera is increased, and the release direction of the virtual skill mapped by the position a2 of the touch point is previewed.
Fig. 20 is a schematic diagram of a touch point at a position A3 according to an embodiment of the present invention, and as shown in fig. 20, in response to a sliding touch operation applied to a skill icon, the position A3(x1, y1) of the touch point is determined, where x1 = 0 and y1 = y1min, that is, the y-axis coordinate of the position of the touch point is at its minimum.
Fig. 21 is a schematic diagram of a release direction of a virtual skill with a touch point at a position a3 and a position of a virtual camera according to an embodiment of the present invention, where, as shown in fig. 21, the release direction of the virtual skill corresponds to a position mapping of the touch point, and a height of the virtual camera increases with a decrease of a y-axis coordinate y1 in the position of the touch point to reach a maximum height; the horizontal distance between the virtual camera and the virtual character decreases as the y-axis coordinate y1 in the position of the touch point decreases, i.e., the virtual camera is located directly above the virtual character.
Fig. 22 is a schematic diagram of a game effect of the touch point at the position A3 according to the embodiment of the present invention, as shown in fig. 22, the height of the virtual camera is increased, the horizontal distance between the virtual camera and the virtual character is shortened, the depression angle of the virtual camera is increased, the virtual camera is positioned right above the virtual character, the depression angle of the virtual camera is vertically downward, and the release direction of the virtual skill mapped by the position A3 of the touch point is previewed.
Fig. 23 is a schematic diagram illustrating that a touch point is located at a position A4 according to an embodiment of the present invention, and as shown in fig. 23, in response to a sliding touch operation applied to a skill icon, the position A4(x1, y1) of the touch point is determined, where x1>0 and y1<0.
Fig. 24 is a schematic diagram of a release direction of a virtual skill with a touch point at a position a4 and a position of a virtual camera according to an embodiment of the present invention, where, as shown in fig. 24, the release direction of the virtual skill corresponds to a position map of the touch point, and the height of the virtual camera decreases with an increase of a y-axis coordinate y1 in the position of the touch point; the horizontal distance between the virtual camera and the virtual character increases as the y-axis coordinate y1 in the position of the touch point increases.
Fig. 25 is a schematic diagram of a game effect of the touch point at the position A4 according to the embodiment of the present invention; as shown in fig. 25, the height of the virtual camera is lowered, the horizontal distance between the virtual camera and the virtual character is increased, the depression angle of the virtual camera is reduced, and the release direction of the virtual skill mapped by the position A4 of the touch point is previewed.
Fig. 26 is a schematic diagram illustrating that a touch point is located at a position A5 according to an embodiment of the present invention, and as shown in fig. 26, in response to a sliding touch operation applied to a skill icon, the position A5(x1, y1) of the touch point is determined, where y1 = 0.
Fig. 27 is a schematic diagram of a release direction of a virtual skill with a touch point at a position a5 and a position of a virtual camera according to an embodiment of the present invention, where, as shown in fig. 27, the release direction of the virtual skill corresponds to a position mapping of the touch point, and a height of the virtual camera decreases with an increase of a y-axis coordinate y1 in the position of the touch point to reach a minimum height; the horizontal distance between the virtual camera and the virtual character increases as the y-axis coordinate y1 in the position of the touch point increases, reaching the farthest distance.
FIG. 28 is a schematic diagram of a game effect with a touch point at position A5 according to an embodiment of the present invention. As shown in FIG. 28, the height of the virtual camera is reduced to the minimum height; the horizontal distance between the virtual camera and the virtual character is increased to the farthest distance, the depression angle of the virtual camera is reduced, and the release direction of the virtual skill mapped by the position A5 of the touch point is previewed.
Fig. 29 is a schematic diagram illustrating that a touch point is located at a position A6 according to an embodiment of the present invention, and as shown in fig. 29, in response to a sliding touch operation applied to a skill icon, the position A6(x1, y1) of the touch point is determined, where x1>0 and y1>0.
Fig. 30 is a schematic diagram of a release direction of a virtual skill and a position of a virtual camera, where a touch point is at a position a6 according to an embodiment of the present invention, and as shown in fig. 30, the release direction of the virtual skill corresponds to a position map of the touch point, and the position of the virtual camera does not change with a change in the position of the touch point.
Fig. 31 is a schematic diagram of a game effect of a touch point at a position a6 according to an embodiment of the present invention, and as shown in fig. 31, a release direction of a virtual skill mapped by a position a6 of the touch point is previewed without changing the position and posture of a virtual camera.
As an optional embodiment, before adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface, the embodiment may further include: establishing a polar coordinate system with the initial position of the touch point as the origin, and marking the position of the touch point as (ρ, θ), where ρ is the distance between the position of the touch point and the initial position, and θ is the polar angle between the touch point and the polar axis; judging, according to the polar angle θ, whether the position of the touch point is located in the predetermined area of the graphical interface; determining that the position of the touch point is not located in the predetermined area of the graphical interface in the case that the polar angle θ is between (0°, 180°]; and determining that the position of the touch point is located in the predetermined area of the graphical interface in the case that the polar angle θ is between (180°, 360°).
It should be noted that θ is a polar angle between the touch point and the polar axis, that is, an included angle formed between a straight line where the touch point and the origin point are located and the polar axis.
Optionally, the position of the touch point in the skill icon is represented by a polar coordinate system, and whether the current position of the touch point is in the predetermined area is determined according to a polar angle between the touch point and the polar axis in the polar coordinate system.
Fig. 32 is a schematic diagram of establishing a polar coordinate system of a skill icon according to an embodiment of the present invention, and as shown in fig. 32, the polar coordinate system of the skill icon is established with the initial position of the touch point as the origin, and the position of the touch point is labeled (ρ, θ), where ρ is the distance between the touch point and the initial position, and θ is the polar angle between the touch point and the polar axis, θ = atan2(y1, x1).
Alternatively, the polar coordinate system of the skill icon may be determined based on a planar rectangular coordinate system of the skill icon.
For example, the touch point may be labeled (x1, y1) in the rectangular plane coordinate system of the skill icon and converted to polar coordinates as ρ = √(x1² + y1²) and θ = atan2(y1, x1), where atan2 is the arctangent function that takes the quadrant into account.
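The conversion and the polar-angle region test can be sketched together as follows (illustrative only; θ is normalised into [0°, 360°) so the interval tests above apply directly, and the function names are hypothetical):

```python
import math

def to_polar(x1, y1):
    """Convert the touch point's plane coordinates (x1, y1) to polar
    coordinates (rho, theta), with theta in degrees, normalised
    into [0°, 360°)."""
    rho = math.hypot(x1, y1)                          # rho = sqrt(x1^2 + y1^2)
    theta = math.degrees(math.atan2(y1, x1)) % 360.0  # quadrant-aware angle
    return rho, theta

def theta_in_predetermined_area(theta):
    """theta in (180°, 360°) corresponds to y1 < 0, i.e. the touch
    point lies in the predetermined (blind-area) region."""
    return 180.0 < theta < 360.0
```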
As an alternative embodiment, if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character includes: when the polar angle theta is between (180 DEG, 270 DEG), the relative height between the virtual camera and the virtual object virtual character in the vertical direction is increased in the game scene, the relative distance between the virtual camera and the virtual object virtual character in the horizontal direction is decreased, and the top view angle of the virtual camera is increased, when the polar angle theta is between (270 DEG, 360 DEG), the relative height between the virtual camera and the virtual character in the vertical direction is decreased in the game scene, the relative distance between the virtual camera and the virtual character in the horizontal direction is increased, and the top view angle of the virtual camera is decreased.
Alternatively, when the polar angle θ is between (0°, 180°], the position of the virtual camera does not change with the position of the touch point, nor does the orientation angle of the virtual camera.
Alternatively, when the polar angle θ is between (180 °, 270 ° ]), as the polar angle θ increases, the ordinate value y0 of the virtual camera increases (i.e., the horizontal distance between the virtual camera and the virtual character decreases), the virtual camera height value z0 increases (i.e., the height of the virtual camera increases), and the depression angle of the virtual camera increases and is oriented more vertically downward, whereas as the polar angle θ decreases, the ordinate value y0 of the virtual camera decreases (i.e., the horizontal distance between the virtual camera and the virtual character increases), the virtual camera height value z0 decreases (i.e., the height of the virtual camera decreases), and the depression angle of the virtual camera decreases and is oriented more horizontally forward.
Alternatively, when the polar angle θ is between (270 °, 360 ° ], as the polar angle θ increases, the ordinate value y0 of the virtual camera decreases (i.e., the horizontal distance between the virtual camera and the virtual character increases), the virtual camera height value z0 decreases (i.e., the height of the virtual camera decreases), and the top view angle of the virtual camera becomes smaller and more horizontal toward the front, whereas as the polar angle θ decreases, the ordinate value y0 of the virtual camera increases (i.e., the horizontal distance between the virtual camera and the virtual character decreases), the virtual camera height value z0 increases (i.e., the height of the virtual camera increases), and the top view angle of the virtual camera becomes larger and more vertical toward the bottom.
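The three polar-angle cases above can be sketched as a single piecewise rule. The linear interpolation and the numeric ranges below are assumptions for illustration; the embodiment only states the monotonic trends.

```python
# Assumed tuning ranges (chosen for illustration, not given in the embodiment).
Z_MIN, Z_MAX = 3.0, 9.0   # camera height: default pose .. directly overhead
Y_FAR, Y_NEAR = 6.0, 1.0  # horizontal distance: default pose .. nearly overhead

def camera_for_theta(theta):
    """theta in (0°, 180°] keeps the default pose; over (180°, 270°]
    the camera rises and closes in, peaking at theta = 270° (touch
    point straight down, camera directly overhead); over (270°, 360°)
    the camera returns toward the default pose."""
    if theta <= 180.0:
        t = 0.0                      # pose independent of the touch point
    elif theta <= 270.0:
        t = (theta - 180.0) / 90.0   # ramp toward the overhead view
    else:
        t = (360.0 - theta) / 90.0   # ramp back toward the default pose
    y0 = Y_FAR + (Y_NEAR - Y_FAR) * t  # horizontal distance to the character
    z0 = Z_MIN + (Z_MAX - Z_MIN) * t   # camera height
    return y0, z0
```

In this sketch the top-view angle follows the height implicitly: a higher, closer camera looks more vertically downward, a lower, farther camera more horizontally forward.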
According to still another embodiment of the present invention, there is also provided a storage medium including a stored program, wherein the program executes the in-game skill release control method according to any one of the above.
According to yet another embodiment of the present invention, there is also provided a processor for executing a program, wherein the program is executed to execute any one of the in-game skill release control methods described above.
According to an embodiment of the present invention, there is also provided an embodiment of an in-game skill release control device, where the in-game skill release control device may be configured to execute the in-game skill release control method in the embodiment of the present invention, and the in-game skill release control method in the embodiment of the present invention may be executed in the in-game skill release control device.
Fig. 33 is a schematic diagram of an in-game skill release control apparatus according to an embodiment of the present invention. As shown in Fig. 33, the game includes at least one virtual character, the virtual character performs a virtual action according to an operation instruction received by the game terminal device, the graphical user interface of the game terminal device includes at least one skill icon, and the skill icon corresponds to a virtual skill of the virtual character. The apparatus may include: a first determining unit 332, a response unit 334 and an adjusting unit 336.
The first determining unit 332 is configured to determine, according to the orientation of the virtual character in the game scene of the game, an orientation of a virtual camera in the game, where the virtual camera is used to control a rendering field of view of the game scene; a response unit 334, configured to determine, in response to a sliding touch operation applied to a skill icon, a position of a touch point of the sliding touch operation, and control a release direction of the virtual skill according to the touch point; the adjusting unit 336 is configured to adjust a relative position between the virtual camera and the virtual character if the touch point is located in a predetermined area of the graphical interface.
It should be noted that the first determining unit 332 in this embodiment may be configured to execute step S602, the response unit 334 may be configured to execute step S604, and the adjusting unit 336 may be configured to execute step S606. The above units and their corresponding steps implement the same examples and application scenarios, but are not limited to the disclosure of the above embodiments.
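For illustration, the cooperation of the three units might be sketched as below; the class name, method names, the rear-half-plane (y1 ≤ 0) test and the concrete camera offsets are all assumptions, not an API disclosed by the patent.

```python
import math

class SkillReleaseController:
    """Sketch of the device above: a determining unit (camera follows the
    character's facing), a response unit (touch -> release direction) and an
    adjusting unit (reposition the camera for rear drags)."""

    def __init__(self, character_facing_deg=0.0):
        self.character_facing_deg = character_facing_deg
        self.camera_yaw = None
        self.camera_offset = (0.0, -6.0, 4.0)  # (x, y, z) behind and above the character

    def determine_camera_orientation(self):
        # First determining unit 332: orient the camera with the character.
        self.camera_yaw = self.character_facing_deg
        return self.camera_yaw

    def on_slide(self, x1, y1):
        # Response unit 334: map the touch offset to a ground-plane direction.
        n = math.hypot(x1, y1) or 1.0
        direction = (x1 / n, y1 / n, 0.0)
        # Adjusting unit 336: raise/approach the camera only for rear drags.
        if y1 <= 0:  # touch point in the predetermined (rear) area
            self.camera_offset = (0.0, -3.0, 8.0)
        return direction
```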
In the embodiment of the invention, the game includes at least one virtual character, which executes a virtual action according to an operation instruction received by the game terminal device; the graphical user interface of the game terminal device includes at least one skill icon, which corresponds to a virtual skill of the virtual character. The orientation of the virtual camera in the game is determined according to the orientation of the virtual character in the game scene, the virtual camera being used to control the rendering field of view of the game scene. In response to a sliding touch operation acting on the skill icon, the position of the touch point of the sliding touch operation is determined, and the release direction of the virtual skill is controlled according to the touch point. If the touch point is located in the predetermined area of the graphical interface, the relative position between the virtual camera and the virtual character is adjusted and the virtual camera is controlled to capture an image of the blind spot in the virtual character's field of view. This achieves the purpose of capturing the blind-spot image, producing the technical effect of previewing how a skill would be released into the blind spot and solving the technical problem that, in third-person action games, the release of a skill into the visual blind spot cannot be previewed.
As an alternative embodiment, the first adjusting unit includes: a first adjusting module, configured to adjust the relative position between the virtual camera and the virtual character according to the position of the touch point if the touch point is located in the predetermined area of the graphical interface.
As an alternative embodiment, the first adjusting unit includes: a second adjusting module, configured to adjust the relative position between the virtual camera and the virtual character and the orientation of the virtual camera if the touch point is located in the predetermined area of the graphical interface.
As an alternative embodiment, the first adjusting unit includes: a third adjusting module, configured to adjust the relative height between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface.
As an alternative embodiment, the sliding touch operation applied to the skill icon includes: a sliding touch operation whose starting point is located on the skill icon.
As an alternative embodiment, the embodiment may further include: the establishing unit is used for establishing a mapping relation between the position of the touch point and the release direction of the virtual skill before determining the position of the touch point of the sliding touch operation in response to the sliding touch operation acting on the skill icon.
As an alternative embodiment, the establishing unit includes: the first establishing module is used for establishing a two-dimensional rectangular coordinate system by taking the initial position of the touch point as an origin, and marking the position of the touch point as (x1, y 1); a second establishing module, configured to establish a three-dimensional rectangular coordinate system with the position of the virtual character in the game scene as an origin, the horizontal orientation of the virtual character as a y-axis, and the vertical upward direction in the game scene as a z-axis, and mark a release direction of the virtual skill as (x0, y0, z 0); wherein x0 in the release direction of the virtual skill is the mapping of x1 in the position of the touch point, and y0 in the release direction of the virtual skill is the mapping of y1 in the position of the touch point.
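A minimal sketch of this mapping, assuming the release direction lies on the ground plane (z0 = 0) and is normalized to unit length in the character-relative frame; the patent only states that x0 maps from x1 and y0 maps from y1, so the normalization and the zero-drag default are assumptions.

```python
import math

def release_direction(x1, y1):
    """Map the 2-D touch offset (x1, y1), measured from the drag origin, to a
    release direction (x0, y0, z0) in the character-relative frame, where the
    y-axis is the character's horizontal facing and the z-axis points up."""
    norm = math.hypot(x1, y1)
    if norm == 0.0:
        return (0.0, 1.0, 0.0)   # no drag: default to the character's facing
    return (x1 / norm, y1 / norm, 0.0)
```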
As an alternative embodiment, the embodiment may further include: the judging unit is used for judging whether y1 in the positions of the touch points is less than or equal to zero or not before adjusting the relative positions between the virtual camera and the virtual character if the touch points are located in the preset area of the graphical interface; and the second determination unit is used for determining that the touch point is positioned in the preset area of the graphical interface under the condition that y1 in the position of the touch point is less than or equal to 0.
As an alternative embodiment, the embodiment may further include: a third establishing module, configured to establish a polar coordinate system with the initial position of the touch point as the origin and mark the position of the touch point as (ρ, θ) before adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface, where ρ is the distance between the position of the touch point and the initial position, and θ is the polar angle between the touch point and the polar axis; and a judging module, configured to judge, according to the polar angle θ, whether the position of the touch point is located in the predetermined area of the graphical interface: the touch point is determined not to be located in the predetermined area when θ lies in (0°, 180°], and determined to be located in the predetermined area when θ lies in (180°, 360°].
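The polar test can be sketched as follows, assuming the polar axis is the +x axis of the touch coordinate system with θ measured counterclockwise (so the rear half-plane y1 < 0 maps to θ in (180°, 360°)); the half-open interval choice at the boundaries is an assumption, since the patent leaves the endpoints ambiguous.

```python
import math

def touch_polar(x1, y1):
    """Convert a touch offset from the drag origin to polar form (rho, theta),
    with theta in degrees in (0, 360], measured from the assumed +x polar axis."""
    rho = math.hypot(x1, y1)
    theta = math.degrees(math.atan2(y1, x1)) % 360.0
    if theta == 0.0 and rho > 0.0:
        theta = 360.0
    return rho, theta

def in_predetermined_area(theta_deg):
    """Rear half of the drag circle: theta in (180, 360] -> predetermined area."""
    return 180.0 < theta_deg <= 360.0
```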
As an alternative embodiment, the first adjusting unit includes: a fourth adjusting module, configured to, when the polar angle θ lies in (180°, 270°], increase the relative height of the virtual camera and the virtual character in the vertical direction in the game scene as θ increases, and decrease their relative distance in the horizontal direction, so as to increase the depression angle of the virtual camera; and a fifth adjusting module, configured to, when the polar angle θ lies in (270°, 360°], decrease the relative height of the virtual camera and the virtual character in the vertical direction in the game scene as θ increases, and increase their relative distance in the horizontal direction, so as to decrease the depression angle of the virtual camera.
As an alternative embodiment, the first adjusting unit includes at least one of: a sixth adjusting module, configured to adjust the relative height of the virtual camera and the virtual character in the vertical direction in the game scene according to the position of the touch point; a seventh adjusting module, configured to adjust the relative distance between the virtual camera and the virtual character in the horizontal direction in the game scene according to the position of the touch point; and an eighth adjusting module, configured to adjust the depression angle of the virtual camera in the game scene according to the position of the touch point.
As an alternative embodiment, the embodiment may further include: a second adjusting unit, configured to maintain, rather than adjust, the relative position between the virtual camera and the virtual character if the touch point is not located in the predetermined area of the graphical interface.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that, for those skilled in the art, various modifications and refinements can be made without departing from the principle of the present invention, and these modifications and refinements should also fall within the protection scope of the present invention.

Claims (15)

1. A skill release control method in a game, wherein the game comprises at least one virtual character, the virtual character executes a virtual action according to an operation instruction received by a game terminal device, a graphic user interface of the game terminal device comprises at least one skill icon, and the skill icon corresponds to a virtual skill of the virtual character, and the method is characterized by comprising the following steps:
determining the orientation of a virtual camera in the game according to the orientation of the virtual character in the game scene of the game, wherein the virtual camera is used for controlling the presentation visual field of the game scene;
responding to sliding touch operation acting on a skill icon, determining the position of a touch point of the sliding touch operation, and controlling the release direction of the virtual skill according to the touch point;
and if the touch point is positioned in a preset area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character.
2. The method of claim 1, wherein the adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface comprises:
and if the touch point is located in a preset area of the graphical interface, adjusting the relative position between a virtual camera and the virtual character according to the position of the touch point.
3. The method of claim 1, wherein the adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface comprises:
and if the touch point is located in a preset area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character and the orientation of the virtual camera.
4. The method of claim 1, wherein the adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface comprises:
and if the touch point is positioned in a preset area of the graphical interface, adjusting the relative height between the virtual camera and the virtual character.
5. The method of claim 1, wherein the sliding touch operation on the skill icon comprises:
and starting from the sliding touch operation of the skill icon.
6. The method according to claim 1, wherein before determining the position of the touch point of the sliding touch operation in response to the sliding touch operation acting on the skill icon, the method further comprises:
and establishing a mapping relation between the position of the touch point and the release direction of the virtual skill.
7. The method of claim 6, wherein establishing a mapping between the location of the touch point and the direction of release of the virtual skill comprises:
establishing a two-dimensional rectangular coordinate system by taking the initial position of the touch point as an origin, and marking the position of the touch point as (x1, y 1);
establishing a three-dimensional rectangular coordinate system by taking the position of the virtual character in the game scene as an origin, the horizontal orientation of the virtual character as a y-axis and the vertical upward direction in the game scene as a z-axis, and marking the release direction of the virtual skill as (x0, y0, z 0);
wherein x0 in the release direction of the virtual skill is a mapping of x1 in the position of the touch point, and y0 in the release direction of the virtual skill is a mapping of y1 in the position of the touch point.
8. The method of claim 7, wherein before adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface, the method further comprises:
judging whether y1 in the position of the touch point is less than or equal to zero;
and under the condition that y1 in the positions of the touch points is less than or equal to 0, determining that the touch points are located in a preset area of the graphical interface.
9. The method of claim 1, wherein before adjusting the relative position between the virtual camera and the virtual character if the touch point is located in the predetermined area of the graphical interface, the method further comprises:
establishing a polar coordinate system by taking the initial position of the touch point as an origin, and marking the position of the touch point as (rho, theta), wherein rho is the distance between the position of the touch point and the initial position, and theta is a polar angle between the touch point and a polar axis;
judging whether the position of the touch point is located in a preset area of the graphical interface according to a polar angle theta between the touch point and a polar axis;
and determining that the position of the touch point is not located in the predetermined area of the graphical interface when the polar angle θ lies in (0°, 180°], and determining that the position of the touch point is located in the predetermined area of the graphical interface when the polar angle θ lies in (180°, 360°].
10. The method of claim 9, wherein if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character comprises:
in a case where the polar angle θ lies in (180°, 270°], as the polar angle θ increases, a relative height of the virtual camera and the virtual character in a vertical direction is increased in the game scene, a relative distance of the virtual camera and the virtual character in a horizontal direction is decreased, and a top view angle of the virtual camera is increased;
in a case where the polar angle θ lies in (270°, 360°], as the polar angle θ increases, a relative height of the virtual camera and the virtual character in a vertical direction in the game scene is decreased, a relative distance of the virtual camera and the virtual character in a horizontal direction is increased, and a top view angle of the virtual camera is decreased.
11. The method of claim 1, wherein if the touch point is located in a predetermined area of the graphical interface, adjusting the relative position between the virtual camera and the virtual character comprises at least one of:
according to the position of the touch point, adjusting the relative height of the virtual camera and the virtual character in the vertical direction in the game scene; and/or
According to the position of the touch point, adjusting the relative distance between the virtual camera and the virtual character in the horizontal direction in the game scene; and/or
and adjusting the top view angle of the virtual camera in the game scene according to the position of the touch point.
12. The method of claim 1, further comprising:
and if the touch point is not located in the predetermined area of the graphical interface, maintaining the relative position between the virtual camera and the virtual character.
13. A skill release control device in a game, the game including at least one virtual character, the virtual character executing a virtual action according to an operation instruction received by a game terminal device, a graphical user interface of the game terminal device including at least one skill icon, the skill icon corresponding to a virtual skill of the virtual character, the device comprising:
a determining unit, configured to determine an orientation of a virtual camera in a game according to an orientation of the virtual character in a game scene of the game, where the virtual camera is used to control a rendering field of view of the game scene;
the response unit is used for responding to the sliding touch operation acting on the skill icon, determining the position of a touch point of the sliding touch operation, and controlling the release direction of the virtual skill according to the touch point;
and the adjusting unit is used for adjusting the relative position between the virtual camera and the virtual character if the touch point is positioned in a preset area of a graphical interface.
14. A storage medium characterized by comprising a stored program, wherein the program executes the in-game skill release control method according to any one of claims 1 to 12.
15. A processor for running a program, wherein the program is run to perform the in-game skill release control method of any of claims 1 to 12.
CN201910907422.XA 2019-09-24 2019-09-24 Skill release control method and device in game, storage medium and processor Pending CN110613933A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910907422.XA CN110613933A (en) 2019-09-24 2019-09-24 Skill release control method and device in game, storage medium and processor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910907422.XA CN110613933A (en) 2019-09-24 2019-09-24 Skill release control method and device in game, storage medium and processor

Publications (1)

Publication Number Publication Date
CN110613933A true CN110613933A (en) 2019-12-27

Family

ID=68924083

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910907422.XA Pending CN110613933A (en) 2019-09-24 2019-09-24 Skill release control method and device in game, storage medium and processor

Country Status (1)

Country Link
CN (1) CN110613933A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111467802A (en) * 2020-04-09 2020-07-31 腾讯科技(深圳)有限公司 Method, device, equipment and medium for displaying picture of virtual environment
CN111481934A (en) * 2020-04-09 2020-08-04 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, equipment and storage medium
CN113694514A (en) * 2021-09-03 2021-11-26 网易(杭州)网络有限公司 Object control method and device
WO2021244237A1 (en) * 2020-06-05 2021-12-09 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, computer device, and storage medium

Citations (1)

Publication number Priority date Publication date Assignee Title
CN109675308A (en) * 2019-01-10 2019-04-26 网易(杭州)网络有限公司 Display control method, device, storage medium, processor and terminal in game

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN109675308A (en) * 2019-01-10 2019-04-26 网易(杭州)网络有限公司 Display control method, device, storage medium, processor and terminal in game

Cited By (11)

Publication number Priority date Publication date Assignee Title
CN111467802A (en) * 2020-04-09 2020-07-31 腾讯科技(深圳)有限公司 Method, device, equipment and medium for displaying picture of virtual environment
CN111481934A (en) * 2020-04-09 2020-08-04 腾讯科技(深圳)有限公司 Virtual environment picture display method, device, equipment and storage medium
CN111467802B (en) * 2020-04-09 2022-02-22 腾讯科技(深圳)有限公司 Method, device, equipment and medium for displaying picture of virtual environment
US11847734B2 (en) 2020-04-09 2023-12-19 Tencent Technology (Shenzhen) Company Limited Method and apparatus for displaying virtual environment picture, device, and storage medium
US11878242B2 (en) 2020-04-09 2024-01-23 Tencent Technology (Shenzhen) Company Limited Method and apparatus for displaying virtual environment picture, device, and storage medium
WO2021244237A1 (en) * 2020-06-05 2021-12-09 腾讯科技(深圳)有限公司 Virtual object control method and apparatus, computer device, and storage medium
EP3939679A4 (en) * 2020-06-05 2022-08-31 Tencent Technology (Shenzhen) Company Limited Virtual object control method and apparatus, computer device, and storage medium
JP2022540278A (en) * 2020-06-05 2022-09-15 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 VIRTUAL OBJECT CONTROL METHOD, APPARATUS, COMPUTER DEVICE, AND COMPUTER PROGRAM
JP7384521B2 (en) 2020-06-05 2023-11-21 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 Virtual object control method, device, computer equipment and computer program
US12005360B2 (en) 2020-06-05 2024-06-11 Tencent Technology (Shenzhen) Company Ltd Virtual object control method and apparatus, computer device, and storage medium
CN113694514A (en) * 2021-09-03 2021-11-26 网易(杭州)网络有限公司 Object control method and device

Similar Documents

Publication Publication Date Title
US11376501B2 (en) Method and apparatus for displaying marker element in virtual scene, computer device, and computer-readable storage medium
US11305190B2 (en) Location indication information display method, electronic apparatus, and storage medium
JP7247350B2 (en) Method, apparatus, electronic device and computer program for generating mark information in virtual environment
CN110613933A (en) Skill release control method and device in game, storage medium and processor
CN107823882B (en) Information processing method, information processing device, electronic equipment and storage medium
WO2018177170A1 (en) Display control method and apparatus for game picture, storage medium and electronic device
US20220023761A1 (en) Virtual object control method and apparatus, device, and storage medium
CN113440846B (en) Game display control method and device, storage medium and electronic equipment
WO2020244421A1 (en) Method and apparatus for controlling movement of virtual object, and terminal and storage medium
TWI669635B (en) Method and device for displaying barrage and non-volatile computer readable storage medium
CN110833694B (en) Display control method and device in game
CN111481934B (en) Virtual environment picture display method, device, equipment and storage medium
US20230360343A1 (en) Method for observing virtual environment, device, and storage medium
CN112717392B (en) Mark display method, device, terminal and storage medium
CN113117332B (en) Lens visual angle adjusting method and device, electronic equipment and storage medium
CN111437604A (en) Game display control method and device, electronic equipment and storage medium
WO2022227915A1 (en) Method and apparatus for displaying position marks, and device and storage medium
WO2022252927A1 (en) Virtual object location prompting method and apparatus, terminal, and storage medium
CN112791410A (en) Game control method and device, electronic equipment and storage medium
CN110737326A (en) Virtual object display method and device, terminal equipment and storage medium
CN107092348A (en) A kind of vision positioning mark layout method in immersive VR roaming system
JP2022540278A (en) VIRTUAL OBJECT CONTROL METHOD, APPARATUS, COMPUTER DEVICE, AND COMPUTER PROGRAM
CN113289336A (en) Method, apparatus, device and medium for tagging items in a virtual environment
WO2024045776A1 (en) Game skill cast method and apparatus, electronic device, and readable storage medium
CN114225398A (en) Virtual lens control method, device, equipment and storage medium of game

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination