CN108434731B - Virtual object control method and device, storage medium and electronic equipment


Info

Publication number: CN108434731B
Application number: CN201810246492.0A
Authority: CN (China)
Other versions: CN108434731A (Chinese-language publication)
Inventor: 郑贤钢
Applicant/Assignee: Netease Hangzhou Network Co Ltd
Legal status: Active (application granted)
Prior art keywords: jump, touch point, virtual object, continuous, sliding operation
Note: The legal status and listed assignees are assumptions made by Google Patents, not legal conclusions; Google has not performed a legal analysis.

Classifications

    • A63F13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/533: Controlling the output signals based on the game progress, involving additional visual information provided to the game scene for prompting the player, e.g. by displaying a game menu
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F2300/303: Output arrangements for receiving control signals generated by the game device, for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/6045: Methods for processing data by generating or executing the game program, for mapping control signals received from the input arrangement into game commands


Abstract

The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a virtual object control method and apparatus, a storage medium, and an electronic device. The method responds to a trigger operation acting on the area where a virtual object is located and monitors a sliding operation continuous with the trigger operation; when it is detected that the sliding operation continuous with the trigger operation satisfies a preset condition, a jump activation position is determined; and, in response to a release operation continuous with the sliding operation, the virtual object is controlled to perform a continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation. The method and apparatus increase interaction between the user and the scene, enlarge the operation space so that the user experiences the enjoyment of competition and skilled operation in the game, and provide a new way of controlling a virtual object to jump continuously multiple times on a touch terminal.

Description

Virtual object control method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a virtual object control method and apparatus, a storage medium, and an electronic device.
Background
With the rapid development of mobile communication technology, more and more mobile game applications run on touch terminals. While such an application is running, the virtual object is mainly controlled through a virtual joystick. Controlling the virtual object through a virtual joystick fits the way a touch terminal is operated and simplifies the control scheme, so users can pick the game up more easily; however, it also reduces interaction between the user and the scene and lowers the operational difficulty. In particular, for highly competitive games that emphasize operational depth and challenging control, operation is a key element of victory or defeat, so controlling the virtual object through a virtual joystick largely deprives users of the enjoyment of competition and skilled operation in the game.
For example, in existing mobile game applications on touch terminals, jumping of the virtual object is mainly controlled as follows: the virtual joystick located at the origin of the virtual joystick area is triggered and dragged within that area, and the jump direction of the virtual object is determined by the vector from the origin of the virtual joystick area to the position where the joystick is released.
Although this approach lowers the operation threshold for controlling the virtual object to jump, making it easier for users to master, the simplified operation also deprives users of the enjoyment of competition and skilled operation in the game and reduces interaction between the user and the scene.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide a virtual object control method and apparatus, a storage medium, and an electronic device, thereby overcoming, at least to some extent, one or more problems due to limitations and disadvantages of the related art.
According to an aspect of the present disclosure, there is provided a virtual object control method applied to a touch terminal capable of presenting a virtual scene and at least one virtual object, the virtual object control method including:
responding to a trigger operation acting on the area where the virtual object is located, and monitoring a sliding operation continuous with the trigger operation;
determining a jump activation position when it is detected that the sliding operation continuous with the trigger operation satisfies a preset condition; and
in response to a release operation continuous with the sliding operation, controlling the virtual object to perform a continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation.
In an exemplary embodiment of the present disclosure, determining the jump activation position when it is detected that the sliding operation continuous with the trigger operation satisfies the preset condition includes:
determining in real time whether the touch point of the sliding operation continuous with the trigger operation moves out of the jump operation region to which the touch point currently belongs; and
when it is determined that the touch point of the sliding operation continuous with the trigger operation has moved out of the jump operation region to which it currently belongs, determining the current position of the touch point as the jump activation position.
In an exemplary embodiment of the present disclosure, determining the jump activation position when it is detected that the sliding operation continuous with the trigger operation satisfies the preset condition includes:
determining in real time whether the touch point of the sliding operation continuous with the trigger operation is located in the area occupied by a preset type of virtual resource in the virtual scene; and
when it is determined that the touch point of the sliding operation continuous with the trigger operation is located in the area occupied by the preset type of virtual resource in the virtual scene, determining the current position of the touch point as the jump activation position.
In an exemplary embodiment of the present disclosure, the preset type of virtual resource is a collision body;
determining in real time whether the touch point of the sliding operation continuous with the trigger operation is located in the area occupied by the preset type of virtual resource in the virtual scene includes:
determining in real time whether the touch point of the sliding operation continuous with the trigger operation collides with the collision body in the virtual scene; and
determining the current position of the touch point as the jump activation position when it is determined that the touch point of the sliding operation continuous with the trigger operation is located in the area occupied by the preset type of virtual resource in the virtual scene includes:
when it is determined that the touch point of the sliding operation continuous with the trigger operation collides with the collision body in the virtual scene, determining the collision position of the touch point and the collision body as the jump activation position.
In an exemplary embodiment of the present disclosure, when it is determined that the touch point of the sliding operation continuous with the trigger operation collides with the collision body in the virtual scene, the method further includes:
displaying collision prompt information based on the collision position of the touch point and the collision body.
In an exemplary embodiment of the present disclosure, the method further includes:
while the touch point of the sliding operation is moving, emitting a ray through the touch point of the sliding operation, with the jump activation position or the position of the trigger operation as the starting point; and
displaying a collidable prompt at the position where the ray collides with the collision body in the virtual scene.
In an exemplary embodiment of the present disclosure, the method further includes:
while the touch point of the sliding operation is moving, dynamically displaying in real time, based on the current position of the touch point, the direction and/or distance of the touch point relative to the position of the trigger operation or the jump activation position.
In an exemplary embodiment of the present disclosure, there is at least one jump activation position.
According to an aspect of the present disclosure, there is provided a virtual object control apparatus applied to a touch terminal capable of presenting a virtual scene and at least one virtual object, the virtual object control apparatus including:
a trigger module, configured to respond to a trigger operation acting on the area where the virtual object is located and monitor a sliding operation continuous with the trigger operation;
a determining module, configured to determine a jump activation position when it is detected that the sliding operation continuous with the trigger operation satisfies a preset condition; and
a control module, configured to respond to a release operation continuous with the sliding operation and control the virtual object to perform a continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the virtual object control method of any one of the above.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the virtual object control method of any of the above via execution of the executable instructions.
The present disclosure provides a virtual object control method and apparatus, a storage medium, and an electronic device. The method monitors, in response to a trigger operation acting on the area where a virtual object is located, a sliding operation continuous with the trigger operation; determines a jump activation position when it is detected that the sliding operation continuous with the trigger operation satisfies a preset condition; and, in response to a release operation continuous with the sliding operation, controls the virtual object to perform a continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation. On the one hand, the starting point of the continuous jump is determined by the trigger operation, the jump relay position (i.e., the jump activation position) is determined by the sliding operation continuous with the trigger operation, and the end point of the continuous jump is determined by the release operation. The jump relay position is thus determined precisely while the user controls the virtual object to jump continuously; interaction between the user and the scene is increased, the operation space is enlarged, the user experiences the enjoyment of competition and skilled operation in the game, and a new way of controlling a virtual object to jump continuously multiple times on a touch terminal is provided. On the other hand, since the jump activation position is determined by monitoring the sliding operation, the user can quickly set the direction and distance of each jump through the sliding operation, which improves the user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 is a flowchart of a virtual object control method of the present disclosure;
FIG. 2 is a flowchart of determining a jump activation position provided in an exemplary embodiment of the present disclosure;
FIG. 3 is a first schematic diagram of determining a jump activation position provided in an exemplary embodiment of the present disclosure;
FIG. 4 is a second schematic diagram of determining a jump activation position provided in an exemplary embodiment of the present disclosure;
FIG. 5 is a flowchart of determining a jump activation position provided in another exemplary embodiment of the present disclosure;
FIG. 6 is a third schematic diagram of determining a jump activation position provided in an exemplary embodiment of the present disclosure;
FIG. 7 is a fourth schematic diagram of determining a jump activation position provided in an exemplary embodiment of the present disclosure;
FIG. 8 is a block diagram of a virtual object control apparatus of the present disclosure;
FIG. 9 is a block diagram of an electronic device in an exemplary embodiment of the disclosure;
FIG. 10 is a schematic diagram of a program product in an exemplary embodiment of the disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals denote the same or similar parts in the drawings, and thus, a repetitive description thereof will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the embodiments of the disclosure can be practiced without one or more of the specific details, or with other methods, components, materials, devices, steps, and so forth. In other instances, well-known structures, methods, devices, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. That is, these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different network and/or processor devices and/or microcontroller devices.
This exemplary embodiment first discloses a virtual object control method applied to a touch terminal capable of presenting a virtual scene and at least one virtual object. The touch terminal may be, for example, a mobile phone, a tablet computer, a notebook computer, a game machine, a PDA, or another electronic device with a touch screen. A game application may present the virtual scene, virtual objects, virtual battle scenes, virtual natural environments, and the like on the touch screen through an application program interface of the touch terminal. Referring to FIG. 1, the virtual object control method may include the following steps:
step S110, responding to a trigger operation acted on the area where the virtual object is located, and monitoring a sliding operation continuous to the trigger operation;
step S120, when the situation that the sliding operation continuous with the triggering operation meets the preset condition is monitored, determining a jump activation position;
step S130, responding to the release operation continuous with the sliding operation, and controlling the virtual object to execute continuous jumping action according to the position of the virtual object, the jump activation position and the position of the release operation.
According to the virtual object control method in this exemplary embodiment, on the one hand, the starting point of the continuous jump is determined by the trigger operation, the jump relay position (i.e., the jump activation position) is determined by the sliding operation continuous with the trigger operation, and the end point of the continuous jump is determined by the release operation. The jump relay position is thus determined precisely while the user controls the virtual object to jump continuously; interaction between the user and the scene is increased, the operation space is enlarged, the user experiences the enjoyment of competition and skilled operation in the game, and a new way of controlling a virtual object to jump continuously multiple times on a touch terminal is provided. On the other hand, since the jump activation position is determined by monitoring the sliding operation, the user can quickly set the direction and distance of each jump through the sliding operation, which improves the user experience.
Next, the virtual object control method in the present exemplary embodiment will be further explained with reference to fig. 1.
In step S110, in response to a trigger operation applied to the area where the virtual object is located, a sliding operation that is continuous with the trigger operation is monitored.
In this exemplary embodiment, the virtual object is a virtual character or virtual item controlled by the user. The area where the virtual object is located may be, for example, a circular or square area centered on the virtual object, or the area within the outline of the virtual object; this is not particularly limited in this exemplary embodiment. The trigger operation may be, for example, a light press operation or a heavy press operation, which is likewise not particularly limited here.
The user may perform the trigger operation in the area where the virtual object is located with a finger, a stylus, or the like. When the touch terminal receives the trigger operation, it responds by taking the position of the virtual object as the jump starting point, monitors in real time, through a monitoring module, the sliding operation continuous with the trigger operation, and activates the jump function corresponding to the trigger operation. To guard against accidental operation, a preset trigger time may be set: the jump function corresponding to the trigger operation is activated, and the sliding operation continuous with the trigger operation is monitored, only when the area where the virtual object is located has been pressed for longer than the preset trigger time. The preset trigger time may be 3 seconds, 4 seconds, or the like, which is not particularly limited in this exemplary embodiment. With the preset trigger time in place, a user who notices an accidental press can cancel the trigger operation on the area where the virtual object is located within the preset trigger period, so that the accidental operation takes no effect.
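As a minimal sketch of this trigger-and-monitor flow (an illustration, not the patent's own implementation), the Python fragment below models a hold-to-arm touch handler; the names `HOLD_THRESHOLD`, `Circle`, and the handler methods are assumptions introduced here.

```python
import time
from dataclasses import dataclass

HOLD_THRESHOLD = 3.0  # assumed preset trigger time; the text suggests e.g. 3 or 4 seconds

@dataclass
class Circle:
    """Circular area centered on the virtual object (one of the shapes named above)."""
    cx: float
    cy: float
    radius: float

    def contains(self, x: float, y: float) -> bool:
        return (x - self.cx) ** 2 + (y - self.cy) ** 2 <= self.radius ** 2

class JumpTrigger:
    """Arms slide monitoring once the virtual object's area has been pressed long enough."""

    def __init__(self, object_area: Circle):
        self.object_area = object_area
        self.press_started_at = None   # set while a press on the object is in progress
        self.monitoring_slide = False  # True once the jump function is activated

    def on_touch_down(self, x: float, y: float) -> None:
        # Only a press inside the virtual object's area can start a jump.
        if self.object_area.contains(x, y):
            self.press_started_at = time.monotonic()

    def on_touch_move(self, x: float, y: float) -> None:
        # Activate the jump function only after the preset trigger time has
        # elapsed, so an accidental tap can still be cancelled by lifting early.
        if self.press_started_at is None:
            return
        if time.monotonic() - self.press_started_at >= HOLD_THRESHOLD:
            self.monitoring_slide = True

    def on_touch_up(self) -> None:
        # Before the threshold this cancels the trigger; after it, the lift
        # would be the release operation handled elsewhere (step S130).
        self.press_started_at = None
        self.monitoring_slide = False
```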
In step S120, a jump activation position is determined when it is detected that the sliding operation continuous with the trigger operation satisfies the preset condition.
In this exemplary embodiment, the sliding track of the sliding operation may be a straight line or a curve; this is not particularly limited here. There is at least one jump activation position, that is, there may be 1, 2, 3, or more jump activation positions, which is likewise not particularly limited in this exemplary embodiment. When there is one jump activation position, only the jump function corresponding to that position is triggered. When there are several jump activation positions, the jump function corresponding to each jump activation position is triggered as that position is determined. It should be noted that triggering the jump function corresponding to a jump activation position means that the user can determine the direction and distance of a jump operation based on that position; it does not mean that the virtual object is controlled to perform the jump operation at that moment. In this exemplary embodiment, the current position of the touch point of the sliding operation may be acquired in real time while the touch point is moving, and when that current position satisfies the preset condition, it may be determined as a jump activation position. It should be noted that a jump activation position is a point in the sliding track of the sliding operation that satisfies the preset condition.
Next, two ways of determining the jump activation position will be explained.
Fig. 2 shows a first way of determining the jump activation position, which may comprise step S210 and step S220, wherein,
in step S210, it is determined in real time whether the touch point of the sliding operation continuous with the trigger operation moves out of the currently-associated jump operation area.
In the present exemplary embodiment, the currently-belonging skip operation region refers to a skip operation region to which a touch point of a slide operation currently belongs. The jump operation area to which the touch point of the sliding operation currently belongs may be a preset area centered on the position of the trigger operation or a preset area centered on the jump activation position. Specifically, when the position of the trigger operation is determined, the skip operation area to which the touch point of the slide operation currently belongs is a preset area centered on the position of the trigger operation, when the first skip activation position is determined, the skip operation area to which the touch point of the slide operation currently belongs is a preset area centered on the first skip activation position, when the second skip activation position is determined, the skip operation area to which the touch point of the slide operation currently belongs is a preset area centered on the second skip activation position, and so on, the skip operation area to which the touch point of the slide operation currently belongs can be determined according to the above principle. The preset area may be a circular area, a square area, an irregular area, or the like, the size of the preset area may be set by a developer according to specific requirements, and the preset area centered on the position of the trigger operation and centered on each jump activation position may be the same or different, and this is not particularly limited in this exemplary embodiment.
In step S220, when it is determined that the touch point of the sliding operation continuous with the trigger operation moves outside the jump operation region to which the touch point currently belongs, the current position of the touch point is determined as the jump activation position. In the present exemplary embodiment, when it is determined that the position of the touch point of the slide operation moves outside the jump operation region to which the touch point currently belongs, the current position of the touch point is determined as the jump activation position.
Next, step S210 and step S220 will be described by way of example.
For example, when there is one jump activation position, as shown in FIG. 3: when the user performs a trigger operation in the area where the virtual object 301 is located, the touch terminal activates the jump function corresponding to the trigger operation and monitors whether the touch point of the sliding operation continuous with the trigger operation moves out of the preset region 302 centered on the trigger operation (i.e., out of the jump operation region to which the touch point currently belongs). When the touch point moves out of the preset region 302, its current position is determined as the jump activation position 303 (shown enlarged by a dotted line in FIG. 3) and the jump function corresponding to that position is triggered; from this moment, the preset region 304 centered on the jump activation position 303 is the jump operation region to which the touch point currently belongs.
After the jump activation position 303 has been determined, the user continues the sliding operation within the preset region 304 centered on the jump activation position 303 and, having settled on the end point 305 of the sliding operation (shown enlarged by a dotted line in FIG. 3), performs the release operation there. The touch terminal then controls the virtual object 301 to jump from its own position to the jump activation position 303, and from the jump activation position 303 to the end point 305 of the sliding operation (i.e., the position of the release operation). Note that the route indicated by the arrows in the figure is the movement route of the touch point of the sliding operation.
For another example, when there are two jump activation positions, a first jump activation position and a second jump activation position, as shown in FIG. 4: when the user performs a trigger operation in the area where the virtual object 401 is located, the touch terminal activates the jump function corresponding to the trigger operation and monitors whether the touch point of the sliding operation continuous with the trigger operation moves out of the preset region 402 centered on the trigger operation (i.e., out of the jump operation region to which the touch point currently belongs). When the touch point moves out of the preset region 402, its current position is determined as the first jump activation position 403 (shown enlarged by a dotted line in FIG. 4) and the jump function corresponding to the first jump activation position 403 is triggered; the preset region 404 centered on the first jump activation position 403 is now the jump operation region to which the touch point currently belongs.
After the first jump activation position 403 has been determined, the user continues the sliding operation, and the touch terminal continues to monitor whether the touch point moves out of the preset region 404 centered on the first jump activation position 403. When it does, the current position of the touch point is determined as the second jump activation position 405 (shown enlarged by a dotted line in FIG. 4) and the jump function corresponding to the second jump activation position 405 is triggered; the preset region 406 centered on the second jump activation position 405 is now the jump operation region to which the touch point currently belongs.
After the second jump activation position 405 has been determined, the user continues the sliding operation within the preset region 406 centered on the second jump activation position 405 and, having settled on the end point 407 of the sliding operation, performs the release operation at the end point 407. The touch terminal can then control the virtual object 401 to jump from its own position to the first jump activation position 403, from the first jump activation position 403 to the second jump activation position 405, and finally from the second jump activation position 405 to the end point 407 of the sliding operation (i.e., the position of the release operation). Note that the route indicated by the arrows in the figure is the movement route of the touch point of the sliding operation. The release operation is described further below and is therefore not detailed here.
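As a compact sketch of steps S210 and S220 (an illustration under assumed names, not the patent's implementation), the fragment below re-centers a circular jump operation region each time the touch point leaves it; `JUMP_REGION_RADIUS` and the sampling of the sliding track are assumptions.

```python
import math

JUMP_REGION_RADIUS = 80.0  # assumed radius of the preset (circular) jump operation region

def track_jump_activations(touch_points, trigger_pos):
    """Collect jump activation positions while the slide is in progress.

    touch_points: (x, y) samples along the sliding track, in order.
    trigger_pos:  (x, y) of the trigger operation, the first region's center.
    """
    center = trigger_pos
    activations = []
    for point in touch_points:
        # Step S210: has the touch point left the region it currently belongs to?
        if math.dist(point, center) > JUMP_REGION_RADIUS:
            # Step S220: its current position becomes a jump activation position,
            # and the jump operation region re-centers on that position.
            activations.append(point)
            center = point
    return activations

# With two region exits along the track, the virtual object would later jump
# trigger position -> first activation -> second activation -> release position,
# matching the FIG. 4 example.
```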
Fig. 5 shows a second way of determining the jump activation position, which may include step S510 and step S520. Wherein:
in step S510, it is determined in real time whether a touch point of the sliding operation that is continuous with the trigger operation is located in an area where a preset type of virtual resource is located in the virtual scene.
In the present exemplary embodiment, the virtual resource may be any object or virtual object in a virtual scene, and the preset type may be a person, an object, a non-collision body, a collision body, and the like, which is not limited in this exemplary embodiment.
Whether the touch point of the sliding operation is located in the area where the virtual resource of the preset type is located can be judged by judging whether the coordinate of the touch point of the sliding operation is in the range of the area where the virtual resource of the preset type is located in real time.
In step S520, when it is determined that the touch point of the sliding operation continuous with the trigger operation is located in the area where the virtual resource of the preset type is located in the virtual scene, the current position of the touch point is determined as the jump activation position.
In this exemplary embodiment, when the coordinate of the touch point of the sliding operation is in the area where the virtual resource of the preset type is located, it is described that the touch point of the sliding operation is located in the area where the virtual resource of the preset type is located, and the current position of the touch point is determined as the jump activation position.
Next, steps S510 and S520 are described taking a collision body as the preset type of virtual resource.
In this case, step S510 may include: determining in real time whether the touch point of the sliding operation continuous with the trigger operation collides with a collision body in the virtual scene.
In this exemplary embodiment, the collision body may be, for example, a virtual character or a virtual item in the virtual scene; this is not particularly limited here. Whether the touch point of the sliding operation collides with the collision body can be determined by checking whether the position of the touch point coincides with the position of the collision body: if the two coincide, the touch point collides with the collision body; if not, it does not. Since the collision body has a certain volume, its position here may be any position within the collision body.
Step S520 may then include: when it is determined that the touch point of the sliding operation continuous with the trigger operation collides with the collision body in the virtual scene, determining the collision position of the touch point and the collision body as the jump activation position.
The above-described process will be described below by way of example.
For example, when there is one jump activation position, as shown in FIG. 6: when the user performs a trigger operation in the area where the virtual object 601 is located, the touch terminal activates the jump function corresponding to the trigger operation. As the user performs the sliding operation continuous with the trigger operation, the touch terminal checks in real time whether the moving touch point collides with the collision body 602; when it does, the collision position of the touch point and the collision body is determined as the jump activation position 603 (shown enlarged by a dotted line in FIG. 6). After the jump activation position 603 has been determined, the user continues the sliding operation and, having settled on the end point 604 of the touch point, performs the release operation at the position of the end point 604. The touch terminal can then control the virtual object 601 to jump from its own position to the jump activation position 603, and from the jump activation position 603 to the end point 604 of the sliding operation (i.e., the position of the release operation). Note that the route indicated by the arrows in the figure is the movement route of the touch point of the sliding operation.
For another example, when there are two jump activation positions, a first jump activation position and a second jump activation position, as shown in FIG. 7: when the user performs a trigger operation in the area where the virtual object 701 is located, the touch terminal activates the jump function corresponding to the trigger operation. As the user performs the sliding operation continuous with the trigger operation, the touch terminal checks in real time whether the moving touch point collides with the first collision body 702; when it does, the collision position of the touch point and the first collision body 702 is determined as the first jump activation position 703 and the jump function corresponding to the first jump activation position 703 is triggered. As the user continues the sliding operation, the touch terminal checks in real time whether the moving touch point collides with the second collision body 704; when it does, the collision position of the touch point and the second collision body 704 is determined as the second jump activation position 705 and the jump function corresponding to the second jump activation position 705 is triggered. After the second jump activation position 705 has been determined, the user continues the sliding operation and, having settled on the end point 706 of the touch point, performs the release operation at the position of the end point 706. The touch terminal can then control the virtual object 701 to jump from its own position to the first jump activation position 703, from the first jump activation position 703 to the second jump activation position 705, and from the second jump activation position 705 to the end point 706 of the sliding operation (i.e., the position of the release operation). Note that the route indicated by the arrows in the figure is the movement route of the touch point of the sliding operation. The release operation is described further below and is therefore not detailed here.
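A minimal sketch of this collider-based variant (steps S510/S520) follows, under assumed shapes and names; the patent does not prescribe how collision bodies are represented, so an axis-aligned rectangle stands in for any point-in-volume test.

```python
from dataclasses import dataclass

@dataclass
class CollisionBody:
    """Assumed axis-aligned screen-space footprint of a collision body."""
    left: float
    top: float
    right: float
    bottom: float

    def hit(self, x: float, y: float) -> bool:
        # The touch point "coincides with" the body when it lies inside this box.
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def collider_activations(touch_points, bodies):
    """The first touch sample inside each collision body becomes a jump
    activation position (the collision position), in the order the bodies
    are hit, as in the FIG. 6 and FIG. 7 examples."""
    activations = []
    remaining = list(bodies)
    for x, y in touch_points:
        for body in remaining:
            if body.hit(x, y):
                activations.append((x, y))
                remaining.remove(body)  # each body yields at most one activation
                break
    return activations
```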
To prompt the user more intuitively that the touch point has collided with a collision body, when the touch point collides with a collision body in the virtual scene, collision prompt information is displayed based on the collision position of the touch point and the collision body. The collision prompt information may be text, a picture, or an animation; this is not particularly limited in this exemplary embodiment.
In order to prompt the user of a location where a collision may currently occur, the method may further include: in the process of moving the touch point of the sliding operation, a ray is emitted from the touch point of the sliding operation by taking the jump activation position or the position of the triggering operation as a starting point; displaying a collisionable prompt at a location where the ray collides with the collider in the virtual scene.
In this exemplary embodiment, after performing the trigger operation on the virtual object, when the user performs the sliding operation continuous with it, a ray is emitted from the position of the trigger operation through the touch point of the sliding operation, and collidable prompt information is displayed at the position where the ray collides with a collision body in the virtual scene. When the touch point of the sliding operation reaches the position where the ray collides with the collision body, the display of the ray emitted from the position of the trigger operation through the touch point may be cancelled.
When a jump activation position has already been determined and the user continues the sliding operation, a ray may instead be emitted from that jump activation position through the touch point of the sliding operation, with collidable prompt information displayed at the position where the ray collides with a collision body, so that the user can conveniently and quickly determine the next jump activation position.
It should be noted that the collidable prompt information may be text, a picture, or an animation, which is not particularly limited in this exemplary embodiment.
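The ray itself can be sketched as a simple march from the starting point through the touch point until a collision body is hit (a coarse sampling approach chosen here for brevity; the patent does not fix the ray-casting method). `CollisionBody` is reused from the previous sketch.

```python
import math

def collidable_prompt_position(start, touch, bodies, max_len=2000.0, step=4.0):
    """Return where the ray from `start` through `touch` first hits a collision
    body; the collidable prompt would be drawn there. `start` is the trigger
    position before any activation, or the latest jump activation position."""
    sx, sy = start
    tx, ty = touch
    dx, dy = tx - sx, ty - sy
    length = math.hypot(dx, dy)
    if length == 0.0:
        return None  # no direction yet: the touch point is still at the start
    dx, dy = dx / length, dy / length  # unit direction through the touch point
    travelled = 0.0
    while travelled <= max_len:
        x, y = sx + dx * travelled, sy + dy * travelled
        for body in bodies:
            if body.hit(x, y):
                return (x, y)  # show the collidable prompt here
        travelled += step
    return None
```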
In step S130, in response to a release operation that is continuous with the slide operation, the virtual object is controlled to perform a continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation.
In this exemplary embodiment, the release operation may be, for example, a long press at the end point of the sliding operation, a heavy press, or the finger leaving the touch screen; this is not particularly limited here. The continuous jump action includes at least two jump operations, and the jump direction and jump distance of the individual jump operations may be the same or different, which is likewise not particularly limited in this exemplary embodiment.
Controlling the virtual object to perform the continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation may include: calculating the jump parameters of each jump operation from the position of the virtual object, the jump activation position, and the position of the release operation, and controlling the virtual object, according to those jump parameters, to jump from its own position to the jump activation position and from the jump activation position to the position of the release operation. When the virtual object is controlled to perform the continuous jump action, it starts from its own position, passes through the jump activation positions in the order in which they were determined, and finally lands at the position of the release operation. It should be noted that the jump parameters of each jump operation may include a jump direction and a jump distance.
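A minimal sketch of the jump-parameter calculation (my own illustration; the patent names the parameters but not a formula), treating each hop as a straight segment between consecutive waypoints in screen space:

```python
import math

def jump_parameters(object_pos, activations, release_pos):
    """Per-hop jump direction and distance derived from the virtual object's
    position, the ordered jump activation positions, and the release position."""
    waypoints = [object_pos, *activations, release_pos]
    hops = []
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        dx, dy = x1 - x0, y1 - y0
        hops.append({
            "direction": math.atan2(dy, dx),  # heading in radians, screen space
            "distance": math.hypot(dx, dy),
        })
    return hops

# Example: one activation position, as in the FIG. 3 / FIG. 6 examples.
hops = jump_parameters((100, 400), [(220, 320)], (360, 260))
# The object would execute hops[0] (start -> activation), then hops[1]
# (activation -> release position), in that order.
```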
In summary, the starting point of the continuous jump is determined by the trigger operation, the jump relay position (i.e., the jump activation position) is determined by the sliding operation continuous with the trigger operation, and the end point of the continuous jump is determined by the release operation. The jump relay position is thus determined precisely while the user controls the virtual object to jump continuously; interaction between the user and the scene is increased, the operation space is enlarged, the user experiences the enjoyment of competition and skilled operation in the game, and a new way of controlling a virtual object to jump continuously multiple times on a touch terminal is provided. In addition, since the jump activation position is determined by monitoring the sliding operation, the user can quickly set the direction and distance of each jump through the sliding operation, which improves the user experience.
In addition, to let the user grasp the current direction and position of the sliding operation more intuitively, the method may further include: while the touch point of the sliding operation is moving, dynamically displaying in real time, based on the current position of the touch point, the direction and/or distance of the touch point relative to the position of the trigger operation or the jump activation position.
In this exemplary embodiment, the direction and/or distance of the touch point relative to the position of the trigger operation or the jump activation position may be calculated in real time from the current position of the touch point of the sliding operation and the position of the trigger operation or the jump activation position, and may be dynamically displayed in real time anchored to the position of the trigger operation (e.g., above, below, to the left of, or to the right of that position) or to the jump activation position (e.g., above, below, to the left of, or to the right of that position).
For example, while the user performs the sliding operation continuous with the trigger operation, the direction and distance of the touch point relative to the position of the trigger operation are displayed above the current position of the touch point. Once a jump activation position has been determined and the sliding operation continues, the display of the direction and/or distance relative to the position of the trigger operation is cancelled, and the direction and/or distance of the touch point relative to the jump activation position is displayed above the current position of the sliding operation.
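A small sketch of the readout text (an assumption about presentation; the patent leaves the display form open), using the same anchoring rule: the trigger position before the first activation, the latest jump activation position afterwards.

```python
import math

def slide_readout(anchor, touch):
    """Direction/distance string drawn near the touch point, relative to `anchor`."""
    ax, ay = anchor
    tx, ty = touch
    distance = math.hypot(tx - ax, ty - ay)
    bearing = math.degrees(math.atan2(ty - ay, tx - ax))
    return f"{bearing:.0f} deg / {distance:.0f} px"

print(slide_readout((100, 400), (220, 320)))  # e.g. "-34 deg / 144 px"
```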
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
In an exemplary embodiment of the present disclosure, there is also provided a virtual object control apparatus, which is applied to a touch terminal capable of presenting a virtual scene and at least one virtual object, as shown in fig. 8, where the virtual object control apparatus 800 may include: a triggering module 801, a determining module 802, and a control module 803, wherein:
a triggering module 801, configured to respond to a trigger operation acting on the area where the virtual object is located and monitor a sliding operation continuous with the trigger operation;
a determining module 802, configured to determine a jump activation position when it is detected that the sliding operation continuous with the trigger operation satisfies a preset condition;
a control module 803, configured to respond to a release operation continuous with the sliding operation and control the virtual object to perform a continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation.
The specific details of each virtual object control device module are already described in detail in the corresponding virtual object control method, and therefore are not described herein again.
It should be noted that although several modules or units of the apparatus for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by a plurality of modules or units.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 900 according to this embodiment of the invention is described below with reference to fig. 9. The electronic device 900 shown in fig. 9 is only an example and should not bring any limitations to the function and scope of use of the embodiments of the present invention.
As shown in fig. 9, the electronic device 900 is embodied in the form of a general purpose computing device. Components of electronic device 900 may include, but are not limited to: the at least one processing unit 910, the at least one storage unit 920, a bus 930 connecting different system components (including the storage unit 920 and the processing unit 910), and a display unit 940.
The storage unit stores program code that is executable by the processing unit 910, causing the processing unit 910 to perform the steps according to various exemplary embodiments of the present invention described in the "Exemplary Methods" section above. For example, the processing unit 910 may execute step S110 shown in FIG. 1: in response to a trigger operation acting on the area where the virtual object is located, monitoring a sliding operation continuous with the trigger operation; step S120: determining a jump activation position when it is detected that the sliding operation continuous with the trigger operation satisfies a preset condition; and step S130: in response to a release operation continuous with the sliding operation, controlling the virtual object to perform a continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation.
The storage unit 920 may include a readable medium in the form of a volatile storage unit, such as a random access memory unit (RAM) 9201 and/or a cache memory unit 9202, and may further include a read-only memory unit (ROM) 9203.
Storage unit 920 may also include a program/utility 9204 having a set (at least one) of program modules 9205, such program modules 9205 including but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 930 can be any of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 900 may also communicate with one or more external devices 970 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 900, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 900 to communicate with one or more other computing devices. Such communication may occur via the input/output (I/O) interface 950. Also, the electronic device 900 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) via the network adapter 960. As shown, the network adapter 960 communicates with the other modules of the electronic device 900 via the bus 930. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, aspects of the invention may also be implemented in the form of a program product comprising program code means for causing a terminal device to carry out the steps according to various exemplary embodiments of the invention described in the above section "exemplary methods" of the present description, when said program product is run on the terminal device.
Referring to fig. 10, a program product 1000 for implementing the above method according to an embodiment of the present invention is described; it may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the present invention is not limited thereto: in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. Where a remote computing device is involved, it may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computing device (for example, through the Internet using an Internet service provider).
Furthermore, the above-described figures are merely schematic illustrations of the processing included in methods according to exemplary embodiments of the invention and are not intended to be limiting. It will be readily appreciated that the processing shown in the figures neither indicates nor limits the chronological order of these steps. It is also readily understood that the processing may be performed synchronously or asynchronously, for example, in multiple modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (9)

1. A virtual object control method, applied to a touch terminal capable of presenting a virtual scene and at least one virtual object, the method comprising:
in response to a trigger operation acting on the area where the virtual object is located, monitoring a sliding operation continuous with the trigger operation;
when it is monitored that the sliding operation continuous with the trigger operation satisfies a preset condition, determining a jump activation position; and
in response to a release operation continuous with the sliding operation, controlling the virtual object to execute a continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation;
wherein determining the jump activation position when it is monitored that the sliding operation continuous with the trigger operation satisfies the preset condition comprises:
determining in real time whether a touch point of the sliding operation continuous with the trigger operation has moved out of the current jump operation area; and
when it is determined that the touch point of the sliding operation continuous with the trigger operation has moved out of the jump operation area to which the touch point belongs, determining the current position of the touch point as the jump activation position;
or, determining the jump activation position when it is monitored that the sliding operation continuous with the trigger operation satisfies the preset condition comprises:
determining in real time whether a touch point of the sliding operation continuous with the trigger operation is located in an area where a virtual resource of a preset type is located in the virtual scene; and
when it is determined that the touch point of the sliding operation continuous with the trigger operation is located in the area where the virtual resource of the preset type is located in the virtual scene, determining the current position of the touch point as the jump activation position;
and, after each determination of a jump activation position, triggering a jump function corresponding to that jump activation position.
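The two alternative preset conditions recited in claim 1 can be sketched as a single check; this is a non-limiting illustration, and jump_area_contains, resource_regions, and trigger_jump are hypothetical helpers:

    from typing import Callable, Iterable, Optional, Tuple

    Point = Tuple[float, float]

    def update_activation(touch: Point,
                          jump_area_contains: Callable[[Point], bool],
                          resource_regions: Iterable[Callable[[Point], bool]],
                          trigger_jump: Callable[[Point], None]) -> Optional[Point]:
        # Alternative A: the touch point has moved out of the jump operation area.
        left_area = not jump_area_contains(touch)
        # Alternative B: the touch point lies in the area where a preset-type
        # virtual resource is located in the virtual scene.
        on_resource = any(region(touch) for region in resource_regions)
        if left_area or on_resource:
            trigger_jump(touch)  # trigger the jump function for this position
            return touch         # the touch point becomes the activation position
        return None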
2. The virtual object control method according to claim 1, wherein the virtual resource of the preset type is a collider;
the determining in real time whether the touch point of the sliding operation continuous with the trigger operation is located in the area where the virtual resource of the preset type is located in the virtual scene comprises:
determining in real time whether the touch point of the sliding operation continuous with the trigger operation collides with the collider in the virtual scene; and
the determining the current position of the touch point as the jump activation position when it is determined that the touch point is located in the area where the virtual resource of the preset type is located in the virtual scene comprises:
when it is determined that the touch point of the sliding operation continuous with the trigger operation collides with the collider in the virtual scene, determining the collision position of the touch point and the collider as the jump activation position.
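A toy version of the collider test in claim 2, with colliders reduced to circles; a real engine would use its own collision queries, and all names here are illustrative:

    import math
    from typing import Iterable, Optional, Tuple

    Point = Tuple[float, float]
    Circle = Tuple[float, float, float]  # (center_x, center_y, radius)

    def collision_position(touch: Point,
                           colliders: Iterable[Circle]) -> Optional[Point]:
        # If the touch point falls inside any collider, take the contact point
        # as the jump activation position; here the touch point itself serves
        # as the collision position.
        for cx, cy, r in colliders:
            if math.hypot(touch[0] - cx, touch[1] - cy) <= r:
                return touch
        return None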
3. The virtual object control method according to claim 2, wherein, after it is determined that the touch point of the sliding operation continuous with the trigger operation collides with the collider in the virtual scene, the method further comprises:
displaying collision prompt information based on the collision position of the touch point and the collider.
4. The virtual object control method according to claim 2, further comprising:
during movement of the touch point of the sliding operation, emitting a ray toward the touch point of the sliding operation, with the jump activation position or the position of the trigger operation as the starting point; and
displaying a collidable prompt at the position where the ray collides with the collider in the virtual scene.
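The ray of claim 4 can be sketched as a ray-circle intersection, again as a stand-in for an engine raycast; the function name and the circular geometry are illustrative assumptions:

    import math
    from typing import Optional, Tuple

    Point = Tuple[float, float]
    Circle = Tuple[float, float, float]  # (center_x, center_y, radius)

    def raycast(origin: Point, through: Point, circle: Circle) -> Optional[Point]:
        # origin: the jump activation position or the trigger position;
        # through: the current touch point; returns the first hit, if any,
        # which is where the collidable prompt would be displayed.
        ox, oy = origin
        dx, dy = through[0] - ox, through[1] - oy
        length = math.hypot(dx, dy)
        if length == 0.0:
            return None
        dx, dy = dx / length, dy / length   # unit direction toward the touch point
        cx, cy, r = circle
        fx, fy = ox - cx, oy - cy
        b = fx * dx + fy * dy
        disc = b * b - (fx * fx + fy * fy - r * r)
        if disc < 0.0:
            return None                      # the ray misses the collider
        t = -b - math.sqrt(disc)
        if t < 0.0:
            t = -b + math.sqrt(disc)         # origin lies inside the collider
        if t < 0.0:
            return None                      # the collider is behind the origin
        return (ox + t * dx, oy + t * dy)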
5. The virtual object control method according to claim 1, further comprising:
during movement of the touch point of the sliding operation, dynamically displaying in real time, based on the current position of the touch point of the sliding operation, the direction and/or distance of the touch point relative to the position of the trigger operation or the jump activation position.
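The direction-and-distance readout of claim 5 is plain vector arithmetic; a minimal sketch, with the angle convention chosen arbitrarily:

    import math
    from typing import Tuple

    Point = Tuple[float, float]

    def direction_and_distance(touch: Point, reference: Point) -> Tuple[float, float]:
        # Direction (degrees, counter-clockwise from the +x axis) and distance
        # of the touch point relative to the position of the trigger operation
        # or the jump activation position; recomputed each frame as the finger moves.
        dx, dy = touch[0] - reference[0], touch[1] - reference[1]
        return math.degrees(math.atan2(dy, dx)), math.hypot(dx, dy)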
6. The virtual object control method according to any one of claims 1 to 5, wherein there is at least one jump activation position.
7. A virtual object control apparatus, applied to a touch terminal capable of presenting a virtual scene and at least one virtual object, the virtual object control apparatus comprising:
a triggering module, configured to monitor, in response to a trigger operation acting on the area where the virtual object is located, a sliding operation continuous with the trigger operation;
a determining module, configured to determine a jump activation position when it is monitored that the sliding operation continuous with the trigger operation satisfies a preset condition; and
a control module, configured to control, in response to a release operation continuous with the sliding operation, the virtual object to execute a continuous jump action according to the position of the virtual object, the jump activation position, and the position of the release operation;
wherein the determining module is configured to determine in real time whether a touch point of the sliding operation continuous with the trigger operation has moved out of the current jump operation area, and to determine the current position of the touch point as the jump activation position when it is determined that the touch point has moved out of the jump operation area to which it belongs;
or, the determining module is configured to determine in real time whether a touch point of the sliding operation continuous with the trigger operation is located in an area where a virtual resource of a preset type is located in the virtual scene, and to determine the current position of the touch point as the jump activation position when it is determined that the touch point is located in that area;
and the triggering module is further configured to trigger, after each determination of a jump activation position, a jump function corresponding to that jump activation position.
8. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the virtual object control method according to any one of claims 1 to 6.
9. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the virtual object control method according to any one of claims 1 to 6 via execution of the executable instructions.
CN201810246492.0A 2018-03-23 2018-03-23 Virtual object control method and device, storage medium and electronic equipment Active CN108434731B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810246492.0A CN108434731B (en) 2018-03-23 2018-03-23 Virtual object control method and device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN108434731A CN108434731A (en) 2018-08-24
CN108434731B (en) 2022-02-11

Family

ID=63196869

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810246492.0A Active CN108434731B (en) 2018-03-23 2018-03-23 Virtual object control method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108434731B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110193198B (en) * 2019-05-23 2023-02-10 腾讯科技(深圳)有限公司 Object jump control method, device, computer equipment and storage medium
CN111729311B (en) * 2020-06-22 2024-05-10 苏州幻塔网络科技有限公司 Climbing and jumping method, climbing and jumping device, computer equipment and computer readable storage medium
CN111773681B (en) * 2020-08-03 2024-07-09 网易(杭州)网络有限公司 Method and device for controlling virtual game roles
CN116764215A (en) * 2022-03-09 2023-09-19 腾讯科技(深圳)有限公司 Virtual object control method, device, equipment, storage medium and program product

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1561496A1 (en) * 2004-02-09 2005-08-10 Nintendo Co., Limited Game apparatus and storage medium having game program stored therein
JP5701440B1 (en) * 2014-08-29 2015-04-15 株式会社Cygames Method to improve user input operability
JP2016179408A (en) * 2016-07-22 2016-10-13 株式会社タイトー Block game operation program and block game machine
JP2017035214A (en) * 2015-08-07 2017-02-16 株式会社あかつき Information processor, information processing system, and character movement control program
CN106938142A (en) * 2016-12-06 2017-07-11 任天堂株式会社 Games system, game processing method, game device and recording medium
WO2018042466A1 (en) * 2016-08-31 2018-03-08 任天堂株式会社 Game program, game processing method, game system, and game device

Similar Documents

Publication Publication Date Title
CN108434731B (en) Virtual object control method and device, storage medium and electronic equipment
CN107019909B (en) Information processing method, information processing device, electronic equipment and computer readable storage medium
CN108579089B (en) Virtual item control method and device, storage medium and electronic equipment
CN108465238B (en) Information processing method in game, electronic device and storage medium
CN109460179B (en) Virtual object control method and device, electronic equipment and storage medium
CN109260713B (en) Virtual object remote assistance operation method and device, storage medium and electronic equipment
CN107656620B (en) Virtual object control method and device, electronic equipment and storage medium
CN109960558B (en) Virtual object control method and device, computer storage medium and electronic equipment
CN108037888B (en) Skill control method, skill control device, electronic equipment and storage medium
CN109857303B (en) Interaction control method and device
US20170242578A1 (en) Method and a device for controlling a moving object, and a mobile apparatus
CN108595022B (en) Virtual character advancing direction adjusting method and device, electronic equipment and storage medium
CN107967096A (en) Destination object determines method, apparatus, electronic equipment and storage medium
CN108579077B (en) Information processing method, device, storage medium and electronic equipment
CN110624241A (en) Information processing method and device, electronic equipment and storage medium
CN109316745B (en) Virtual object motion control method and device, electronic equipment and storage medium
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
CN111481923B (en) Rocker display method and device, computer storage medium and electronic equipment
CN108355352B (en) Virtual object control method and device, electronic device and storage medium
CN108245889B (en) Free visual angle orientation switching method and device, storage medium and electronic equipment
CN107426412B (en) Information processing method and device, storage medium and electronic equipment
CN115944919A (en) Virtual role control method and device, computer storage medium and electronic equipment
CN110215709B (en) Object selection method and device, storage medium and electronic equipment
CN111530065A (en) Game control method and device, computer storage medium and electronic equipment
CN114011062A (en) Information processing method, information processing device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant