CN111921201A - Method and device for generating frame data, storage medium and computer equipment - Google Patents


Publication number
CN111921201A
Authority
CN
China
Prior art date
Legal status
Granted
Application number
CN202010993082.XA
Other languages
Chinese (zh)
Other versions
CN111921201B
Inventor
徐华龙
何文辉
董星辰
冯越宇
李聪
罗天成
Current Assignee
Chengdu Perfect Tianzhiyou Technology Co ltd
Original Assignee
Chengdu Perfect Tianzhiyou Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Perfect Tianzhiyou Technology Co ltd filed Critical Chengdu Perfect Tianzhiyou Technology Co ltd
Priority to CN202011439747.9A (published as CN112426717A)
Priority to CN202010993082.XA (published as CN111921201B)
Publication of CN111921201A
Application granted
Publication of CN111921201B
Status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a method and device for generating frame data, a storage medium, and computer equipment. The method comprises: acquiring target moving path frame data corresponding to a target object, and determining a synchronous following data frame corresponding to a following object according to a preset following frame number difference corresponding to the following object and the target moving path frame data, wherein the following frame time of the synchronous following data frame is the same as the target frame time of the corresponding target data frame, the following frame position of the synchronous following data frame is determined based on the position of the target following frame corresponding to that target data frame, and the target following frame is the data frame that precedes the target data frame by the preset following frame number difference; performing interpolation based on the positions in the target moving path frame data, and determining an asynchronous following data frame corresponding to the following object; and generating following moving path frame data corresponding to the following object according to the synchronous following data frame and the asynchronous following data frame.

Description

Method and device for generating frame data, storage medium and computer equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for generating frame data, a storage medium, and a computer device.
Background
In a conventional MMORPG (massively multiplayer online role-playing game), there are many multi-player cooperative play modes (such as multi-player dungeons, multi-player missions, and large battlefields), and these modes share a common point: they must be started by multiple players arriving at a starting point at the same time. Traveling to that starting point often covers a long distance, and the journey is usually tedious; this tedium greatly reduces the player's game experience, yet such journeys are common in games. Therefore, how to improve the player's experience while traveling to each play mode has become very important.
With the rapid growth of players' social needs and need for personal recognition in games, most existing games adopt a following function to improve the player experience, and following-based play modes have grown explosively in recent years. The currently popular solution for movement following is as follows: when a second character needs to follow a first character, the first character's client determines a following path for the second character according to the first character's moving path and sends the following path to the server; the server forwards it to the second character's client, which controls the second character to move along the received path, thereby presenting the effect of the second character following the first character. This solution has three drawbacks: first, the following path is computed by the first character's client, which puts performance pressure on the client and easily causes stuttering and frame drops; second, the second character can only exactly retrace the first character's trajectory, so if the first character stops and starts or speeds up and slows down, the second character's following effect is poorly expressive; third, the player cannot follow an NPC character in the game.
In conclusion, the defects of the above schemes urgently need to be solved.
Disclosure of Invention
In view of the above, the present application provides a method and apparatus for generating frame data, a storage medium, and a computer device.
According to an aspect of the present application, there is provided a method for generating frame data, applied to a server, including:
acquiring target moving path frame data corresponding to a target object, and determining a synchronous following data frame corresponding to the following object according to a preset following frame number difference corresponding to the following object and the target moving path frame data, wherein the following frame time corresponding to the synchronous following data frame is the same as the target frame time corresponding to the corresponding target data frame, the following frame position corresponding to the synchronous following data frame is determined based on the position of the target following frame corresponding to that target data frame, and the target following frame is the data frame that precedes the target data frame by the preset following frame number difference;
performing interpolation processing based on the position corresponding to the target moving path frame data, and determining an asynchronous following data frame corresponding to the following object;
and generating following moving path frame data corresponding to the following object according to the synchronous following data frame and the asynchronous following data frame.
Specifically, the asynchronous following data frame comprises an initial following data frame; the determining of the asynchronous following data frame corresponding to the following object specifically includes:
acquiring a first target frame position and a first target frame time corresponding to a first target data frame in the target moving path frame data, and acquiring a first following frame position of the following object, wherein the first target data frame is the first frame in the target moving path frame data;
performing interpolation based on the first target frame position and the first following frame position to obtain initial following frame positions whose quantity matches the preset following frame number difference;
and marking the initial following frame time corresponding to each initial following frame position, and determining the initial following data frame in the following moving path frame data, wherein the first initial following frame time is the first target frame time.
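The initial following step above can be illustrated with a short sketch. This is a minimal, non-authoritative illustration: the claim does not specify the interpolation method, so linear interpolation is assumed here, and the function name and 2D position tuples are illustrative only.

```python
from typing import List, Tuple

Vec = Tuple[float, float]  # assumed 2D (x, y) scene position

def initial_following_positions(first_target_pos: Vec,
                                first_follow_pos: Vec,
                                frame_diff: int) -> List[Vec]:
    """Produce exactly frame_diff positions that carry the follower from its
    current position onto the first target frame position, by linear
    interpolation (an assumption; the patent only requires that the number
    of positions match the preset following frame number difference)."""
    return [tuple(f + (t - f) * k / frame_diff
                  for f, t in zip(first_follow_pos, first_target_pos))
            for k in range(1, frame_diff + 1)]
```

With a preset following frame number difference of 5, a follower at (0, 0) walking onto a first target frame at (10, 0) receives five evenly spaced positions ending exactly on the target's first frame position.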
Specifically, the asynchronous following data frame comprises a start following data frame; the determining of the asynchronous following data frame corresponding to the following object specifically includes:
acquiring a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target moving path frame data, wherein the second target data frame is the data frame that follows the first target data frame by the preset following frame number difference;
performing interpolation based on the first target frame position and the second target frame position to obtain start following frame positions whose quantity matches the preset following frame number difference, wherein the start following frame positions include the second target frame position;
and marking the start following frame time corresponding to each start following frame position according to the first target frame time and the second target frame time, and determining the start following data frame in the following moving path frame data.
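A rough sketch of this start phase, under the same assumptions as before (linear interpolation, 2D positions; the even spreading of the times between the two target frame times is also an assumption, since the claim only says the times are marked "according to" them):

```python
from typing import List, Tuple

Vec = Tuple[float, float]  # assumed 2D (x, y) scene position

def start_following_frames(first_pos: Vec, second_pos: Vec,
                           first_time: float, second_time: float,
                           frame_diff: int) -> List[Tuple[float, Vec]]:
    """Produce frame_diff (time, position) pairs moving from just after the
    first target frame position up to and including the second target frame
    position, with times spread evenly between the two target frame times."""
    frames = []
    for k in range(1, frame_diff + 1):
        t = first_time + (second_time - first_time) * k / frame_diff
        pos = tuple(a + (b - a) * k / frame_diff
                    for a, b in zip(first_pos, second_pos))
        frames.append((t, pos))
    return frames
```

Note that the last produced position coincides with the second target frame position, matching the requirement that the start following frame positions include it.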
Specifically, before performing interpolation based on the first target frame position and the second target frame position, the method further includes:
acquiring all start target frame positions corresponding to the first target data frame and the second target data frame;
and if any of the start target frame positions repeat, performing interpolation based on the first target frame position and the second target frame position.
Specifically, each target data frame included in the target moving path frame data further corresponds to a target frame state, and the following frame state corresponding to the synchronous following data frame is the target frame state of the target data frame that precedes the time-synchronized target data frame by the preset following frame number difference.
Specifically, each target data frame included in the target moving path frame data further corresponds to a target frame speed, and the asynchronous following data frame comprises a stop following data frame; the determining of the asynchronous following data frame corresponding to the following object specifically includes:
when the target frame speed corresponding to any third target data frame among the target data frames is zero, acquiring the stop target frame state corresponding to the stop target data frame that precedes the third target data frame by the preset following frame number difference;
when there are multiple stop target frame states, acquiring a fourth target data frame at which the state changes, and determining a fourth target frame position corresponding to the fourth target data frame;
acquiring, according to a third target frame time corresponding to the third target data frame, a second following frame position corresponding to the time-synchronized synchronous following data frame;
and determining the stop following data frame corresponding to the following object based on the distance between the second following frame position and the fourth target frame position.
Specifically, the determining the stop following data frame corresponding to the following object specifically includes:
if the distance between the second following frame position and the fourth target frame position is less than or equal to a preset stop following distance, taking the fourth target frame position as a first stop position of the following object, and determining the stop following data frame according to the second following frame position and the first stop position.
Specifically, the determining the stop following data frame corresponding to the following object specifically includes:
if the distance between the second following frame position and the fourth target frame position is greater than the preset stop following distance, determining a second stop position of the following object based on a third target frame position corresponding to the third target data frame, and determining the stop following data frame according to the second following frame position and the second stop position.
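The two branches form a simple distance-threshold decision. A minimal sketch, assuming 2D positions and Euclidean distance; the text does not specify how the second stop position is derived from the third target frame position, so taking that position directly is a labeled simplification:

```python
import math
from typing import Tuple

Vec = Tuple[float, float]  # assumed 2D (x, y) scene position

def choose_stop_position(second_follow_pos: Vec,
                         fourth_target_pos: Vec,
                         third_target_pos: Vec,
                         stop_following_distance: float) -> Vec:
    """If the follower is within the preset stop following distance of the
    fourth target frame position, stop there (first stop position);
    otherwise derive the stop from the third target frame position
    (second stop position; used verbatim here as a simplification)."""
    if math.dist(second_follow_pos, fourth_target_pos) <= stop_following_distance:
        return fourth_target_pos
    return third_target_pos
```

For instance, a follower at the origin is 5 units from a fourth target frame position of (3, 4), so it stops there under a 5-unit threshold but falls through to the second stop position under a 4.9-unit threshold.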
Specifically, if the following object includes a plurality of objects, before the obtaining of the target moving path frame data corresponding to the target object, the method further includes:
determining a following order for each following object, and determining, according to the following order, the preset following frame number difference corresponding to each following object;
after generating the following movement path frame data corresponding to the following object, the method further includes:
and sending the following moving path frame data to following terminals corresponding to the following objects respectively.
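When several objects follow the same target, each receives its own preset following frame number difference by following order. A minimal sketch of one such assignment rule, using the 5-frame and 8-frame spacing given as an example elsewhere in this document; the base and step values are illustrative assumptions, not a prescribed scheme:

```python
from typing import Dict, List

def assign_frame_diffs(followers: List[str],
                       base: int = 5, step: int = 3) -> Dict[str, int]:
    """Give each follower, in following order, a strictly increasing preset
    following frame number difference: base, base + step, base + 2*step, ...
    (base=5, step=3 reproduce the 5-frame / 8-frame example)."""
    return {name: base + step * i for i, name in enumerate(followers)}
```

Strictly increasing differences keep the followers spaced out along the target's track, so they trail in a visible queue rather than overlapping.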
According to another aspect of the present application, there is provided an apparatus for generating frame data, applied to a server, including:
a synchronous following frame determining module, configured to obtain target moving path frame data corresponding to a target object, and determine a synchronous following data frame corresponding to the following object according to a preset following frame number difference corresponding to the following object and the target moving path frame data, where the following frame time corresponding to the synchronous following data frame is the same as the target frame time corresponding to the corresponding target data frame, the following frame position corresponding to the synchronous following data frame is determined based on the position of the target following frame corresponding to that target data frame, and the target following frame is the data frame that precedes the target data frame by the preset following frame number difference;
the asynchronous following frame determining module is used for performing interpolation processing on the basis of the position corresponding to the target moving path frame data and determining an asynchronous following data frame corresponding to the following object;
and the following frame data generation module is used for generating following moving path frame data corresponding to the following object according to the synchronous following data frame and the asynchronous following data frame.
Specifically, the asynchronous following data frame comprises an initial following data frame; the asynchronous following frame determining module specifically includes:
an initial frame time obtaining unit, configured to obtain a first target frame position and a first target frame time corresponding to a first target data frame in the target movement path frame data, and obtain a first following frame position of the following object, where the first target data frame is a first frame in the target movement path frame data;
an initial frame position determining unit, configured to perform interpolation based on the first target frame position and the first following frame position to obtain initial following frame positions whose quantity matches the preset following frame number difference;
and an initial following frame determining unit, configured to mark an initial following frame time corresponding to the initial following frame position, and determine the initial following data frame in the following movement path frame data, where a first initial following frame time is the first target frame time.
Specifically, the asynchronous following data frame comprises a start following data frame; the asynchronous following frame determining module specifically includes:
a first start frame data obtaining unit, configured to obtain a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target moving path frame data, where the second target data frame is the data frame that follows the first target data frame by the preset following frame number difference;
a start frame position determining unit, configured to perform interpolation based on the first target frame position and the second target frame position to obtain start following frame positions whose quantity matches the preset following frame number difference, where the start following frame positions include the second target frame position;
and a start frame data determining unit, configured to mark the start following frame time corresponding to each start following frame position according to the first target frame time and the second target frame time, and determine the start following data frame in the following moving path frame data.
Specifically, the asynchronous following frame determining module further includes:
a second start frame data obtaining unit, configured to obtain all start target frame positions corresponding to the first target data frame and the second target data frame before the interpolation based on the first target frame position and the second target frame position is performed;
and a start judging unit, configured to perform the interpolation based on the first target frame position and the second target frame position if any of the start target frame positions repeat.
Specifically, each target data frame included in the target moving path frame data further corresponds to a target frame state, and the following frame state corresponding to the synchronous following data frame is the target frame state of the target data frame that precedes the time-synchronized target data frame by the preset following frame number difference.
Specifically, each target data frame included in the target moving path frame data further corresponds to a target frame speed, and the asynchronous following data frame comprises a stop following data frame; the asynchronous following frame determining module specifically includes:
a stop frame state obtaining unit, configured to obtain, when the target frame speed corresponding to any third target data frame among the target data frames is zero, the stop target frame state corresponding to the stop target data frame that precedes the third target data frame by the preset following frame number difference;
a state switching frame obtaining unit, configured to obtain, when there are multiple stop target frame states, a fourth target data frame at which the state changes, and determine a fourth target frame position corresponding to the fourth target data frame;
a stop frame position obtaining unit, configured to obtain, according to a third target frame time corresponding to the third target data frame, a second following frame position corresponding to the time-synchronized synchronous following data frame;
and a stop following frame determining unit, configured to determine the stop following data frame corresponding to the following object based on the distance between the second following frame position and the fourth target frame position.
Specifically, the stop following frame determining unit specifically includes:
a first stop following frame determining subunit, configured to, if the distance between the second following frame position and the fourth target frame position is less than or equal to a preset stop following distance, take the fourth target frame position as a first stop position of the following object, and determine the stop following data frame according to the second following frame position and the first stop position.
Specifically, the stop following frame determining unit specifically includes:
and a second stop following frame determining subunit, configured to, if the distance between the second following frame position and the fourth target frame position is greater than the preset stop following distance, determine a second stop position of the following object based on a third target frame position corresponding to the third target data frame, and determine the stop following data frame according to the second following frame position and the second stop position.
Specifically, the apparatus further comprises:
a following frame number difference determining module, configured to determine a following order of each following object before obtaining target moving path frame data corresponding to a target object if the following object includes a plurality of following objects, and determine a preset following frame number difference corresponding to each following object according to the following order;
and the following data frame sending module is used for sending the following moving path frame data to the following terminals corresponding to the following objects after the following moving path frame data corresponding to the following objects are generated.
According to yet another aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described method of generating frame data.
According to yet another aspect of the present application, there is provided a computer device comprising a storage medium, a processor, and a computer program stored on the storage medium and executable on the processor, the processor implementing the above method of generating frame data when executing the program.
By means of the above technical solution, the server obtains the target moving path frame data corresponding to the target object, determines the synchronous following data frame and the asynchronous following data frame corresponding to the following object based on that data and the preset following frame number difference, and generates the following moving path frame data, so that the following object exhibits a following effect on the target object when moving according to the following moving path frame data. Compared with the prior art, the complex process of determining the following moving path frame data is handled by a high-performance server, which greatly improves processing efficiency and avoids the stuttering, frame loss, degraded performance, and even loss of tracking caused by excessive performance pressure on the client. Interpolation based on the positions of the target object's movement track determines the following track positions of the following object, which improves the expressiveness of the following movement. Furthermore, this solves the prior-art problem that NPC characters in a game scene cannot be followed, provides technical support for adding play modes during game development, allows players to experience more play modes, and improves both the players' game experience and the competitiveness of the game product.
The foregoing description is only an overview of the technical solutions of the present application. To make the technical means of the present application clearer, so that it can be implemented according to the content of the specification, and to make the above and other objects, features, and advantages of the present application more comprehensible, a detailed description of the present application follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a flowchart illustrating a method for generating frame data according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart illustrating another method for generating frame data according to an embodiment of the present application;
fig. 3 is a schematic flowchart illustrating another method for generating frame data according to an embodiment of the present application;
fig. 4 is a schematic flowchart illustrating another method for generating frame data according to an embodiment of the present application;
fig. 5 is a schematic structural diagram illustrating an apparatus for generating frame data according to an embodiment of the present application;
fig. 6 shows a schematic structural diagram of another apparatus for generating frame data according to an embodiment of the present application.
Detailed Description
The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
In this embodiment, a method for generating frame data is provided, and is applied to a server, as shown in fig. 1, and the method includes:
step 101, acquiring target moving path frame data corresponding to a target object, and determining a synchronous following data frame corresponding to the following object according to a preset following frame number difference corresponding to the following object and the target moving path frame data, wherein the following frame time corresponding to the synchronous following data frame is the same as the target frame time corresponding to the corresponding target data frame, the following frame position corresponding to the synchronous following data frame is determined based on the position of the target following frame corresponding to that target data frame, and the target following frame is the data frame that precedes the target data frame by the preset following frame number difference;
step 102, performing interpolation based on the positions in the target moving path frame data, and determining an asynchronous following data frame corresponding to the following object;
and step 103, generating following moving path frame data corresponding to the following object according to the synchronous following data frame and the asynchronous following data frame.
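The synchronous part of the steps above (step 101) amounts to replaying the target's own positions at a fixed frame offset. A minimal sketch, assuming a simple `Frame` record with a time and a 2D position; this is an illustration of the offset idea, not the patented implementation:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Frame:
    time: float                # frame time in seconds
    pos: Tuple[float, float]   # (x, y) position in the scene

def synchronous_following_frames(target: List[Frame],
                                 frame_diff: int) -> List[Frame]:
    """At each target frame time (from frame_diff onward), place the
    follower at the position the target held frame_diff frames earlier,
    so the follower retraces the target's track with a fixed frame lag."""
    return [Frame(time=target[i].time, pos=target[i - frame_diff].pos)
            for i in range(frame_diff, len(target))]
```

With the 10 Hz frame rate and 5-frame difference used as an example in this document, the follower trails the target by 0.5 seconds along the identical track.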
The embodiment of the application is mainly applied to a server. In a game-server scenario, the motion following instruction may specifically be a motion following instruction for a game character: after the game server receives a character motion following instruction from a game client, it parses the instruction to determine the target object terminal and the following terminal, and thus the followed object indicated by the instruction (the target object) and the following object. There may be one or more following objects. The motion following instruction may come from the target object terminal: for example, in a multi-player cooperative game, a player forms a team in the game scene, and the team leader initiates a team following instruction instructing one or more team members to follow the leader. The instruction may also come from a following terminal: for example, player A clicks the avatar of player B in the game scene to initiate following, and player A then follows player B as B moves through the scene. In addition, the instruction may describe a player following a player (player A follows player B), a player following an NPC in the game scene (player A follows the NPC character "small fishing village"), or an NPC following a player (the NPC character "town monster tower guard" follows player A). The embodiment of the present application explains the technical solution using a game server as an example, but the application is not limited to game servers and can be applied to motion following in any scene, for example making a following object track a target object in animation production.
The server obtains the target moving path frame data corresponding to the target object from the target terminal. The target moving path frame data is the moving path of the target object recorded in frame units from the moment the motion following instruction is created. For example, if the frame rate in the game scene is 10 Hz, that is, one second is represented by 10 frames, and the motion following instruction is created at 1 o'clock, then the data frames of the character at and after 1 o'clock are obtained, that is, the data frames corresponding to 1:00:00.0, 1:00:00.1, 1:00:00.2, and so on. Each data frame at least includes the position data of the target object and an identifier indicating the time to which the frame corresponds. The identifier may take the form of a number: specifically, the data frame of the target object at 1:00:00.0 may be identified as the 1st frame, the data frame at 1:00:00.1 as the 2nd frame, and so on; alternatively, the identifier may be the actual time corresponding to the data frame, or data encoding that time. In addition, on the target terminal, the client may record all moving path frame data of the target object in real time, so that the server can retrieve the target object's historical data at any time; it may record the moving path frame data of the target object only after the motion following instruction is created, providing data support for other characters to follow the target object; or it may record only the most recent moving path frame data generated by the target object, for example keeping the data within the last minute and deleting earlier data.
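The frame indexing and the rolling one-minute record described above can be sketched briefly. The 10 Hz rate comes from the example in this document; the function name, the `deque` buffer, and the 1-based indexing are illustrative assumptions:

```python
from collections import deque

FRAME_RATE = 10  # frames per second, as in the 10 Hz example above

def frame_time(start_seconds: float, frame_index: int) -> float:
    """Time of the Nth recorded frame (1-based), given the instruction
    creation time in seconds: the 1st frame is at the creation time, the
    2nd frame 0.1 s later, and so on."""
    return start_seconds + (frame_index - 1) / FRAME_RATE

# "Record only the most recent minute" can be a bounded buffer that
# silently discards the oldest (time, position) entries:
recent_frames = deque(maxlen=FRAME_RATE * 60)
```

Appending more than 600 frames to `recent_frames` keeps only the newest 600, matching the behavior of deleting data older than one minute.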
After acquiring the target moving path frame data corresponding to the target object, the server determines the moving path frame data of the following object based on that data and a preset following frame number difference. The preset following frame number difference is the number of frames by which the moving path of the following object lags the moving path of the target object. When there are multiple following objects, the following frame number difference corresponding to each following object may be determined in turn according to the following order, with the differences increasing along that order; for example, the preset following frame number difference corresponding to the first following object is 5 frames, that corresponding to the second following object is 8 frames, and so on. Assuming the target object corresponds to one following object whose preset following frame number difference is 5 frames, then in the following state, if the current position of the target object is A, the following object arrives at A after 0.5 seconds (preset following frame number difference / frame rate = 5/10 = 0.5). After the following moving path frame data is determined, the server sends it to the following object's client, which can control the following object to move according to the received data, achieving the effect that the following object moves along with the target object; the player of the following object thus moves along with the target object without needing to operate manually.
In the above embodiment, each target data frame included in the target moving path frame data corresponds to a target frame time and a target frame position. The synchronous following stage refers to a stage in which the following object moves strictly according to the movement track of the target object; the following process may also include asynchronous following stages. For example, in the starting stage of the target object, the target object may exhibit a stop-and-go phenomenon; to improve the visual effect of following, the following object may move not in the stop-and-go manner of the target object but in a smoother manner, and in that case the following object does not move according to the target frame positions of the corresponding target data frames.
For the synchronous following stage, synchronous following data frames are generated in the following moving path frame data; in an actual game scene, most of the following process belongs to synchronous following. In the embodiments of the application, each target data frame corresponds to a target frame time reflecting the actual time or game-world time of that frame data, and the following frame time corresponding to a synchronous following data frame is the same as the target frame time corresponding to the matching target data frame.
For example, suppose the following object starts following the target object at 1:00:00 and the target object's moving path from 1:00:00.5 onward is followed synchronously; if the preset following frame number difference is 5 and the game frame rate is 10 Hz, the following object lags the target object by 0.5 seconds, so at 1:00:01 the following object should reproduce the movement of the target object at 1:00:00.5. The synchronous following start time of the following object is determined to be 1:00:01; the data frame corresponding to 1:00:01 is found in the target moving path frame data, and 1:00:01 is taken as the following frame time of this synchronous following frame. The frame index is then shifted back by 5 frames to obtain the target frame position of the target data frame at 1:00:00.5, and that position is taken as the position of the synchronous following data frame whose following frame time is 1:00:01. Similarly, for the synchronous following frame whose following frame time is 1:00:01.1, the following frame position is the position of the target frame at 1:00:00.6. By analogy, all synchronous following frames are determined, and the following moving path frame data is thereby determined.
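The synchronous-following mapping in this example — same frame time, position taken from the target frame that is the preset following frame number difference earlier — can be sketched as follows. The function name and the flat list of target frames are illustrative assumptions:

```python
FRAME_RATE = 10   # Hz
FOLLOW_DIFF = 5   # preset following frame number difference

# Target moving path frame data: index i holds the target's position at
# start + i / FRAME_RATE seconds (index 0 is the frame where following begins).
target_frames = [(float(i), 0.0) for i in range(20)]


def sync_follow_frame(i):
    """Synchronous following: the follow frame at index i keeps the same
    frame time as target frame i, but takes the position of the target
    frame FOLLOW_DIFF frames earlier."""
    if i < FOLLOW_DIFF:
        return None  # not yet in the synchronous following stage
    return {"time_index": i, "pos": target_frames[i - FOLLOW_DIFF]}


# 10 frames after the start (1:00:01 in the example), the follower occupies
# the position the target had 5 frames earlier (1:00:00.5).
print(sync_follow_frame(10)["pos"])  # (5.0, 0.0)
```

The first `FOLLOW_DIFF` indices return `None` here; the initial-stage interpolation described below is what fills those frames in practice.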
In addition, in some special following stages, the following moving path frame data of the following object can be determined by interpolation based on positions in the target moving path frame data and the position of the following object. For example, in the initial following stage, the motion following instruction has been issued, but because of the preset following frame number difference the following object has not yet started following the target object; interpolation may be performed between the position of the following object and the initial target position in the target moving path frame data of the target object, and the initial following data frames of the following object are determined from the interpolation result. For another example, the target object may move at alternately fast and slow speeds over several consecutive frames, for example moving 10 meters from the first frame to the second, 1 meter from the second to the third, 8 meters from the third to the fourth, and so on; positions for following data frames that exhibit a smooth moving effect can then be obtained by interpolating between the positions of the first and last frames of those several frames of data. The following moving path frames for such special following stages can be determined by this interpolation, achieving a smooth following effect for the following object.
By applying the technical solution of this embodiment, the server acquires the target moving path frame data corresponding to the target object, determines the synchronous following data frames and asynchronous following data frames corresponding to the following object based on that data and the preset following frame number difference, and generates the following moving path frame data, so that the following object moves based on the following moving path frame data and exhibits the effect of following the target object. Compared with the prior art, the complex process of determining the following moving path frame data is handled by a high-performance server, which greatly improves processing efficiency and avoids the stuttering, frame loss, and performance loss caused by excessive performance pressure on the client. Even if tracking loss occurs during following, interpolation based on positions on the target object's movement track determines the positions of the following object's track, improving the visual effect of the following motion. On the other hand, the solution solves the prior-art problem that NPC characters in a game scene cannot be followed, provides technical support for adding gameplay during game development and a guarantee for players to experience more gameplay during the game experience stage, and improves both the players' game experience and the competitiveness of the game product.
Further, as a refinement and an extension of the specific implementation of the foregoing embodiment, in order to fully illustrate the specific implementation process of the present embodiment, another method for generating frame data is provided, as shown in fig. 2, the method includes:
in order to improve the performance of the following function, embodiments of the present application provide multiple ways of determining the following moving path frame data, and the multiple implementations may be used in different stages of target following. Specifically, step 102 may include:
in a first mode, in an initial following stage of following an object, as shown in fig. 2, step 102 specifically includes:
102-1-1, acquiring a first target frame position and a first target frame time corresponding to a first target data frame in the target moving path frame data, and acquiring a first following frame position of the following object, wherein the first target data frame is the first frame in the target moving path frame data;
102-1-2, performing interpolation based on the first target frame position and the first following frame position to obtain initial following frame positions whose number matches the preset following frame number difference;
and 102-1-3, marking the initial following frame times corresponding to the initial following frame positions, and determining the initial following data frames in the following moving path frame data, wherein the first initial following frame time is the first target frame time.
In the above embodiment, the initial following stage of the following object refers to a stage in which the motion following instruction has been issued but the following object has not yet started following the target object because of the preset following frame number difference. For example, if the preset following frame number difference is 5, the following object must wait until the target object has moved for 5 frames before starting to move. To avoid the poor display effect of the following object standing in a waiting state during those 5 frames, initial following data frames are set for the initial following stage so that the following object moves according to them instead of waiting in place. Specifically, the first frame in the target moving path frame data is acquired as the first target data frame, together with its first target frame position, namely the initial position of the target object, and its first target frame time. The first target frame time is taken as the following start time of the following object: the following object enters the following state from that moment, and the following object data frame corresponding to that moment is the first following frame of the following object. The position of the following object at the following start time, namely the first following frame position, is then acquired, interpolation is performed between the first target frame position and the first following frame position, and the initial following data frames of the following object are determined according to the interpolation result, where the interpolation results serve as the position information of the initial following data frames and the time information of the initial following data frames is synchronized with the target data frames.
For example, if the preset following frame number difference is 5, the first target data frame of the target object corresponds to time 1:00:00 and position A, and the position of the following object at 1:00:00 is B, then positions A and B are interpolated to obtain 4 interpolated positions. Position B and interpolated positions 1 to 4 are then labeled with time information in order of decreasing distance from position A: position B corresponds to 1:00:00, interpolated position 1 to 1:00:00.1, interpolated position 2 to 1:00:00.2, and so on. The initial following data frames are thereby determined; the following object moves according to them and appears, during the first 5 frames at the beginning of following, to be moving toward the target object, preventing the following object's client from displaying a stand-and-wait effect and improving the expressiveness of the following state in the game scene.
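The initial-stage interpolation in this example — position B plus four interpolated positions approaching position A, one per frame — might look roughly like this. Linear interpolation and the function name are illustrative assumptions; the patent does not prescribe the interpolation formula:

```python
FOLLOW_DIFF = 5  # preset following frame number difference


def initial_follow_positions(follower_pos, target_start_pos, n=FOLLOW_DIFF):
    """Interpolate between the follower's own position (B) and the target's
    first frame position (A), yielding n frame positions that start at B
    and approach A, so the follower moves instead of waiting in place."""
    bx, by = follower_pos
    ax, ay = target_start_pos
    # k = 0 gives B itself; k = 1..n-1 gives the interpolated positions,
    # ordered from farthest to nearest to A (matching the time labeling).
    return [(bx + (ax - bx) * k / n, by + (ay - by) * k / n)
            for k in range(n)]


# A at (10, 0), B at (0, 0): frame 1 is B, frames 2-5 are the four
# interpolated positions.
positions = initial_follow_positions((0.0, 0.0), (10.0, 0.0))
print(positions)
# [(0.0, 0.0), (2.0, 0.0), (4.0, 0.0), (6.0, 0.0), (8.0, 0.0)]
```

Note the list stops short of A itself: by the time the fifth frame of the difference has elapsed, synchronous following takes over at the target's first recorded position.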
In a second mode, in a starting following stage of following an object, as shown in fig. 3, step 102 specifically includes:
102-2-1, acquiring a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target moving path frame data, wherein the second target data frame is the data frame that differs from the first target data frame by the preset following frame number difference;
102-2-2, performing interpolation based on the first target frame position and the second target frame position to obtain starting following frame positions whose number matches the preset following frame number difference, wherein the starting following frame positions include the second target frame position;
and 102-2-3, marking the starting following frame times corresponding to the starting following frame positions according to the first target frame time and the second target frame time, and determining the starting following data frames in the following moving path frame data.
In the above embodiment, the starting following stage of the following object refers to the stage in which the following object follows the starting portion of the target object's moving path. For example, after the motion following instruction is issued, the target object changes from its original stopped state to a moving state, and for a short time after it begins to move the target object may be in a halting, stop-and-go state.
Specifically, the first frame in the target moving path frame data is acquired as the first target data frame, together with the second target data frame that differs from it by the preset following frame number difference, and the first target frame position and second target frame position corresponding to these frames are acquired. For example, if the preset following frame number difference is 5, the first target data frame is the data frame corresponding to 1:00:00 and the second target data frame is the data frame corresponding to 1:00:00.5. Interpolation is then performed based on the first and second target frame positions to obtain 5 interpolated positions (5 being the preset following frame number difference), including the second target frame position, as the starting following frame positions. Corresponding time information is then marked on each starting following frame position in turn to obtain complete starting following frames, which are sent to the following terminal as part of the following moving path frame data to guide the following object's moving path, so that the movement of the following object in the starting stage tends to be smooth and the following effect is improved.
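The starting-stage smoothing — five interpolated positions between the first and second target frame positions, including the second — can be sketched as follows; again, linear interpolation and the function name are illustrative assumptions:

```python
FOLLOW_DIFF = 5  # preset following frame number difference


def start_follow_positions(p1, p2, n=FOLLOW_DIFF):
    """Smooth a possibly stop-and-go start: interpolate n positions
    between the first target frame position p1 and the second target
    frame position p2 (the frame n frames later). The result includes
    p2 itself but not p1, matching the description above."""
    x1, y1 = p1
    x2, y2 = p2
    return [(x1 + (x2 - x1) * k / n, y1 + (y2 - y1) * k / n)
            for k in range(1, n + 1)]


# Target at (0, 0) at 1:00:00 and at (5, 0) at 1:00:00.5: the follower
# gets five evenly spaced positions regardless of how unevenly the
# target actually moved between those two frames.
positions = start_follow_positions((0.0, 0.0), (5.0, 0.0))
print(positions)
# [(1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (4.0, 0.0), (5.0, 0.0)]
```

The even spacing is exactly what replaces the target's stop-and-go positions with a smooth start for the follower.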
In addition, in the above embodiment, if, after the character initiates following, the target object does not exhibit a stop-and-go starting state, the following data frames of the following object are determined directly from the target data frames corresponding to the target object, without smoothing the position information in the target data frames. Specifically, after step 102-2-1 and before step 102-2-2, the method may further include: acquiring all starting target frame positions corresponding to the frames from the first target data frame to the second target data frame; and if any starting target frame position is repeated, executing step 102-2-2. A repeated starting target frame position indicates that the target object exhibited a stop-and-go starting state after the following instruction was initiated, and the starting state is then smoothed according to the method above.
In a third mode, in the stopping following stage of the following object, as shown in fig. 4, step 102 specifically includes:
102-3-1, when the target frame speed corresponding to any third target data frame among the target data frames is zero, acquiring the stopping target frame states corresponding to the stopping target data frames within the preset following frame number difference before the third target data frame;
102-3-2, when the stopping target frame states include a plurality of different states, acquiring a fourth target data frame at which the state changes, and determining a fourth target frame position corresponding to the fourth target data frame;
102-3-3, acquiring, according to a third target frame time corresponding to the third target data frame, a second following frame position corresponding to the time-synchronized synchronous following data frame;
and 102-3-4, determining the stopping following data frames of the following object based on the distance between the second following frame position and the fourth target frame position.
In the above embodiment, it should be noted that each target data frame included in the target moving path frame data also corresponds to a target frame state, and the following frame state corresponding to a synchronous following data frame is the target frame state of the target data frame that precedes the time-synchronized target data frame by the preset following frame number difference. That is, a following data frame follows not only the position but also the state of the corresponding target data frame, where the target frame state may include a flying state, a walking state, and so on. If the target object is in the flying state at position A, the following object should also be in the flying state when it follows to position A; if the target object switches from the flying state to the walking state at position B, the following object should likewise switch from flying to walking when it follows to position B. On this basis there are special cases: for example, the target object may stop at position D soon after switching from the flying state to the walking state at position C. If the following object followed synchronously, it would also stop at position D after switching states at position C, but stopping at the same position as the target object would cause multiple characters to overlap in the game scene, with a poor display effect. Therefore, when the target object stops moving, the following object should stop at a certain distance from it; and if the state of the target object changes during the stopping stage, the stopping following data frames of the following object can be determined through the above embodiment.
Specifically, first, the data frames corresponding to the target object in the stopping stage are acquired: if the speed corresponding to a target data frame is zero, the target object stops moving at the position of that frame, so the third target data frame whose target frame speed is zero and the 5 data frames before and including it (5 being the preset following frame number difference) are acquired as the stopping target data frames. Second, whether the target object switched state in the stopping stage is judged by acquiring the stopping target frame states corresponding to the stopping target data frames; if the stopping target frame states include a plurality of different states, the target object switched state in the stopping stage. Then, after it is determined that a state switch occurred in the stopping stage, the fourth target data frame at which the state switched and its fourth target frame position, namely the position at which the target object switched state, are acquired. Next, the position of the following object at the third target frame time (the time at which the target object stopped moving), namely the second following frame position, is acquired. Finally, the stopping following data frames of the following object are set according to the distance between the second following frame position and the fourth target frame position, that is, according to the distance between the position at which the target object switched state and the position of the following object at the moment the target object stopped moving.
In step 102-3-4 of the above embodiment, specifically, if the distance between the second following frame position and the fourth target frame position is less than or equal to a preset stopping following distance, the fourth target frame position is used as the first stopping position of the following object, and the stopping following data frames are determined according to the second following frame position and the first stopping position. If the distance between the second following frame position and the fourth target frame position is greater than the preset stopping following distance, a second stopping position of the following object is determined based on the third target frame position corresponding to the third target data frame, and the stopping following data frames are determined according to the second following frame position and the second stopping position.
In this embodiment, suppose the target object switches from the flying state to the walking state in the stopping stage: the target object stops at position D soon after switching from flying to walking at position C, and the following object has moved to position E when the target object stops at D. Since the target object does not move after stopping at D, the stopping following data frames of the following object are determined according to the distance between E and D. If E is close to D (less than or equal to the preset stopping following distance), the following object is already near D when the target object stops moving; to still express the state switch of the following object, the following object is moved to position C and then stops, that is, the fourth target frame position (the state-switch position of the target object) is used as the first stopping position of the following object, and the stopping following data frames are set based on it (ensuring that the position of the last stopping following data frame is the fourth target frame position). Because the state of the following object follows that of the target object, the following object, moving according to the stopping following data frames, appears to move to position C and stop, shown in the flying state before position C and in the walking state at position C.
If E is far from D (greater than the preset stopping following distance), the target object is far from the following object when it stops moving. To express the following of the target object in both position and state while preventing the following object from overlapping the target object when it stops, the stopping position of the following object is determined based on position D (the third target frame position), that is, the stopping following data frames of the following object are determined according to the position at which the target object stopped moving. For example, based on the target object's moving direction, a position on the target object's path at a certain distance from D is determined as the second stopping position F (for example, the position 0.5 meters from D in the game scene), and the stopping following data frames are set based on the second stopping position. Because the state of the following object follows that of the target object, the following object, moving according to the stopping following data frames, appears to move to position F and then stop following the target object, shown in the flying state before position C and in the walking state at and after position C.
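The stopping-position decision in the last two paragraphs can be sketched as follows. The threshold value, the 0.5-meter offset (taken from the example above), and all names are illustrative assumptions:

```python
import math

STOP_FOLLOW_DIST = 3.0  # preset stopping following distance (assumed value)
OFFSET = 0.5            # gap kept before the target's stop point, as in the example


def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])


def stop_position(follower_pos_e, switch_pos_c, target_stop_pos_d, move_dir):
    """Decide where the follower stops when the target halts.

    follower_pos_e:    follower position E when the target stops
    switch_pos_c:      position C where the target's state switched
    target_stop_pos_d: position D where the target stopped
    move_dir:          unit vector of the target's movement direction
    """
    if dist(follower_pos_e, target_stop_pos_d) <= STOP_FOLLOW_DIST:
        # Close by: stop at the state-switch position C so the state
        # change is still expressed without overlapping the target.
        return switch_pos_c
    # Far away: stop OFFSET meters before D along the target's path so
    # the two characters do not overlap when both are stopped.
    return (target_stop_pos_d[0] - move_dir[0] * OFFSET,
            target_stop_pos_d[1] - move_dir[1] * OFFSET)


print(stop_position((9.0, 0.0), (8.0, 0.0), (10.0, 0.0), (1.0, 0.0)))  # (8.0, 0.0)
print(stop_position((0.0, 0.0), (8.0, 0.0), (10.0, 0.0), (1.0, 0.0)))  # (9.5, 0.0)
```

The first call shows the near case (stop at C); the second shows the far case (stop 0.5 m short of D).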
The following data frames determined through the above three embodiments enable the following object, in the initial stage, to move toward the target object instead of staying in place; in the starting stage, to start smoothly and avoid a stuttering, stop-and-go state; and in the synchronous stage, to follow the position and state of the target object exactly, while in the stopping stage the state-switching process of the target object is fully expressed and the following object stops at a position that does not overlap the target object. The three embodiments thus achieve the technical effect of improving the expressiveness of the following object during following and improve the player's game experience; the computation is performed by a high-performance server, which guarantees computational accuracy without putting performance pressure on the client.
In addition, in any embodiment of the present application, specifically, if there are multiple following objects, the following order of the following objects is determined, and the preset following frame number difference corresponding to each following object is determined according to the following order. Accordingly, the following moving path frame data are sent to the following terminals corresponding to the respective following objects.
In this embodiment, one target object may correspond to multiple following objects. For example, a game team in a game scene includes 3 people: the captain initiates following within the team, and the other two team members follow the captain. To avoid the team members overlapping one another during following, multiple preset following frame number differences may be set for this case, for example a following frame number difference of 5 for following object 1 and 10 for following object 2. On this basis, according to the method for determining following moving path frame data described above, the following moving path frame data corresponding to each following object are determined separately and sent to the corresponding following terminals, so that the characters do not overlap when multiple characters follow the target object, improving the following display effect.
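The per-follower frame-difference assignment can be sketched minimally as follows; the linear step is just one possible assignment (the patent's examples use 5 and 10 here, and 5 and 8 elsewhere), and all names are illustrative:

```python
BASE_DIFF = 5  # frame difference assigned to the first follower (assumed)


def assign_frame_diffs(follower_ids, step=5):
    """Assign each follower, in following order, an increasing preset
    following frame number difference so that followers trail the target
    at staggered delays and do not overlap one another."""
    return {fid: BASE_DIFF + i * step for i, fid in enumerate(follower_ids)}


# A 3-person team: the captain is followed by the other two members.
print(assign_frame_diffs(["member_1", "member_2"]))
# {'member_1': 5, 'member_2': 10}
```

Each follower's moving path frame data is then computed independently with its own difference and sent to its own terminal, as described above.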
Further, as a specific implementation of the method in fig. 1, an embodiment of the present application provides an apparatus for generating frame data, which is applied to a server, and as shown in fig. 5, the apparatus includes:
the synchronous following frame determining module 61 is configured to acquire target moving path frame data corresponding to a target object, and determine synchronous following data frames corresponding to the following object according to a preset following frame number difference corresponding to the following object and the target moving path frame data, where the following frame time corresponding to a synchronous following data frame is the same as the target frame time corresponding to the matching target data frame, the following frame position corresponding to the synchronous following data frame is determined based on the position of the target following frame corresponding to that target data frame, and the target following frame is the data frame that precedes the target data frame by the preset following frame number difference;
an asynchronous following frame determining module 62, configured to perform interpolation processing based on a position corresponding to the target moving path frame data, and determine an asynchronous following data frame corresponding to a following object;
and a following frame data generating module 63, configured to generate following moving path frame data corresponding to the following object according to the synchronous following data frame and the asynchronous following data frame.
In a specific application scenario, as shown in fig. 6, the asynchronous following frame determining module 62 specifically includes:
an initial frame time obtaining unit 6201, configured to obtain a first target frame position and a first target frame time corresponding to a first target data frame in target movement path frame data, and obtain a first following frame position of a following object, where the first target data frame is a first frame in the target movement path frame data;
an initial frame position determining unit 6202, configured to perform interpolation based on the first target frame position and the first following frame position to obtain an initial following frame position matching a difference number of preset following frames;
an initial following frame determining unit 6203, configured to mark an initial following frame time corresponding to an initial following frame position, and determine an initial following data frame in the following movement path frame data, where a first initial following frame time is a first target frame time.
In a specific application scenario, as shown in fig. 6, the asynchronous following frame determining module 62 specifically includes:
a first start frame data obtaining unit 6204, configured to obtain a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target moving path frame data, where the second target data frame is the data frame that follows the first target data frame by the preset following frame number difference;
a starting frame position determining unit 6205, configured to perform interpolation based on the first target frame position and the second target frame position to obtain a starting following frame position that matches a preset following frame difference number, where the starting following frame position includes the second target frame position;
a starting frame data determining unit 6206, configured to mark, according to the first target frame time and the second target frame time, the starting following frame times corresponding to the starting following frame positions, and determine the starting following data frames in the following moving path frame data.
In a specific application scenario, as shown in fig. 6, the asynchronous following frame determining module 62 further includes:
a second starting frame data obtaining unit 6207, configured to obtain all starting target frame positions corresponding to the first target data frame and the second target data frame before performing interpolation based on the first target frame position and the second target frame position;
a starting judgment unit 6208, configured to perform interpolation based on the first target frame position and the second target frame position if any of the starting target frame positions are repeated.
In a specific application scenario, each target data frame included in the target moving path frame data further corresponds to a target frame state, and the following frame state corresponding to a synchronous following data frame is the target frame state of the target data frame that precedes the time-synchronized target data frame by the preset following frame number difference.
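By way of a non-limiting sketch of this state reuse (the function name and the fallback for the earliest frames are assumptions), the follower at frame `index` replays the target's state from `frame_gap` frames earlier:

```python
def sync_following_state(target_states, index, frame_gap):
    """The synchronous following frame at `index` reuses the state (e.g.
    'walk', 'run', 'idle') of the target data frame `frame_gap` frames
    earlier; before enough target frames exist, fall back to the earliest
    known state."""
    return target_states[max(0, index - frame_gap)]
```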
In a specific application scenario, as shown in fig. 6, each target data frame included in the target movement path frame data further corresponds to a target frame speed; the asynchronous following frame determining module 62 specifically includes:
a stopping frame state obtaining unit 6209, configured to, when the target frame speed corresponding to any third target data frame among the target data frames is zero, obtain the stopping target frame states corresponding to the stopping target data frames that precede the third target data frame by the preset following frame number difference;
a state switching frame acquiring unit 6210, configured to acquire, when the stopping target frame states include a plurality of states, a fourth target data frame at which the state changes, and determine a fourth target frame position corresponding to the fourth target data frame;
a stop frame position obtaining unit 6211, configured to obtain, according to a third target frame time corresponding to a third target data frame, a second following frame position corresponding to a synchronous following data frame of time synchronization;
a stopping following frame determining unit 6212, configured to determine a stopping following data frame corresponding to the following object based on a distance between the second following frame position and the fourth target frame position.
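As an illustrative sketch of locating the state-change frame (not part of the patent; the dictionary layout and function name are assumptions), the window of target frames preceding the stop is scanned for the first frame whose state differs from its predecessor:

```python
def find_state_change_frame(frames, stop_index, frame_gap):
    """Scan the `frame_gap` target frames before the frame at `stop_index`
    (where the target's speed dropped to zero); if more than one state appears
    in that window, return the first frame whose state differs from its
    predecessor, otherwise None."""
    window = frames[max(0, stop_index - frame_gap):stop_index]
    states = {f['state'] for f in window}
    if len(states) <= 1:
        return None  # only one state in the window: no state change occurred
    for prev, cur in zip(window, window[1:]):
        if cur['state'] != prev['state']:
            return cur
    return None
```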
In a specific application scenario, not shown in the figure, the stopping following frame determining unit 6212 specifically includes:
a first stopping following frame determining subunit 62121, configured to, if the distance between the second following frame position and the fourth target frame position is less than or equal to the preset stopping following distance, take the fourth target frame position as the first stopping position of the following object, and determine the stopping following data frame according to the second following frame position and the first stopping position.
In a specific application scenario, not shown in the figure, the stopping following frame determining unit 6212 specifically includes:
a second stopping following frame determining subunit 62122, configured to determine, if the distance between the second following frame position and the fourth target frame position is greater than the preset stopping following distance, a second stopping position of the following object based on a third target frame position corresponding to the third target data frame, and determine the stopping following data frame according to the second following frame position and the second stopping position.
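The two distance cases above can be sketched together as follows (illustrative only; the patent does not specify the distance metric, so Euclidean distance is assumed here, and how the second stopping position is derived from the third target frame position is simplified to using that position directly):

```python
import math

def choose_stop_position(second_follow_pos, fourth_target_pos, third_target_pos,
                         stop_follow_distance):
    """If the follower is already within the preset stop-following distance of
    the frame where the target's state changed, stop at that position;
    otherwise derive a stop position from the frame where the target actually
    halted (simplified here to that frame's position)."""
    d = math.dist(second_follow_pos, fourth_target_pos)
    if d <= stop_follow_distance:
        return fourth_target_pos  # first stopping position
    return third_target_pos       # second stopping position (sketch)
```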
In a specific application scenario, as shown in fig. 6, the apparatus further includes:
a following frame number difference determining module 64, configured to, if there are a plurality of following objects, determine a following order of each following object before the target moving path frame data corresponding to the target object is obtained, and determine the preset following frame number difference corresponding to each following object according to the following order;
and the following data frame sending module 65, configured to, after the following movement path frame data corresponding to each following object is generated, send the following movement path frame data to the following terminal corresponding to that following object.
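One plausible way to derive per-follower frame differences from the following order (an assumption for illustration; the patent only states that the difference is determined according to the order) is to space followers evenly behind the target:

```python
def assign_frame_gaps(followers, base_gap):
    """Assign each follower a following frame number difference by its order
    in line: the i-th follower trails the target by (i + 1) * base_gap frames,
    so the queue of followers stays evenly spaced along the path."""
    return {name: (i + 1) * base_gap for i, name in enumerate(followers)}
```

Each follower's movement path frame data would then be generated with its own gap and sent to its terminal.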
It should be noted that other corresponding descriptions of the functional units involved in the apparatus for generating frame data provided in the embodiment of the present application may refer to corresponding descriptions in the methods in fig. 1 to fig. 4, and are not described herein again.
Based on the method shown in fig. 1 to 4, correspondingly, the present application also provides a storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the method for generating frame data shown in fig. 1 to 4.
Based on such understanding, the technical solution of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, etc.) to execute the method according to the implementation scenarios of the present application.
Based on the method shown in fig. 1 to 4 and the virtual device embodiment shown in fig. 5 to 6, in order to achieve the above object, the present application further provides a computer device, which may specifically be a personal computer, a server, a network device, and the like, where the computer device includes a storage medium and a processor; a storage medium for storing a computer program; a processor for executing a computer program to implement the above-described method of generating frame data as shown in fig. 1 to 4.
Optionally, the computer device may also include a user interface, a network interface, a camera, radio frequency (RF) circuitry, sensors, audio circuitry, a Wi-Fi module, and so forth. The user interface may include a display screen (Display), an input unit such as a keyboard (Keyboard), etc.; optionally, the user interface may also include a USB interface, a card reader interface, etc. The network interface may optionally include a standard wired interface, a wireless interface (e.g., a Bluetooth interface or Wi-Fi interface), etc.
It will be appreciated by those skilled in the art that the computer device architecture provided in this embodiment does not limit the computer device, which may include more or fewer components, combine certain components, or arrange the components differently.
The storage medium may further include an operating system and a network communication module. The operating system is a program that manages and maintains the hardware and software resources of the computer device, supporting the operation of information handling programs as well as other software and/or programs. The network communication module is used to implement communication among the components within the storage medium, as well as communication with other hardware and software in the physical device.
Through the above description of the embodiments, those skilled in the art can clearly understand that the present application may be implemented by software plus a necessary general hardware platform, or by hardware. A server obtains target movement path frame data corresponding to a target object, determines, according to a preset following frame number difference, the synchronous following data frames and asynchronous following data frames corresponding to a following object on the basis of that data, and thereby generates following movement path frame data, so that the following object moves based on it and exhibits a following effect on the target object. Compared with the prior art, the complex process of determining the following movement path frame data is handled by a high-performance server, which greatly improves processing efficiency and avoids the stutter, frame loss, and performance degradation caused by excessive performance pressure on the client. Even if tracking is lost during following, interpolation based on the movement track positions of the target object determines the following track positions of the following object, improving the visual performance of the following motion. In addition, the solution addresses the inability of the prior art to follow NPC characters in a game scene, provides technical support for adding gameplay during the game development stage, guarantees more gameplay for players during the game experience stage, and improves player experience and the competitiveness of game products.
Those skilled in the art will appreciate that the figures are merely schematic representations of one preferred implementation scenario, and that the blocks or flows in the figures are not necessarily required to practice the present application. Those skilled in the art will also appreciate that the modules in the devices of the implementation scenario may be distributed among the devices as described, or may be located, with corresponding changes, in one or more devices different from those of the present implementation scenario. The modules of the above implementation scenario may be combined into one module, or further split into a plurality of sub-modules.
The above application serial numbers are for description purposes only and do not represent the superiority or inferiority of the implementation scenarios. The above disclosure is only a few specific implementation scenarios of the present application, but the present application is not limited thereto, and any variations that can be made by those skilled in the art are intended to fall within the scope of the present application.

Claims (12)

1. A method for generating frame data, applied to a server, includes:
acquiring target moving path frame data corresponding to a target object, and determining a synchronous following data frame corresponding to a following object according to a preset following frame number difference corresponding to the following object and the target moving path frame data, wherein the following frame time corresponding to the synchronous following data frame is the same as the target frame time corresponding to the target data frame, the following frame position corresponding to the synchronous following data frame is determined based on the position of the target following frame corresponding to the corresponding target data frame, and the target following frame is the data frame that precedes the target data frame by the preset following frame number difference;
performing interpolation processing based on the position corresponding to the target moving path frame data, and determining an asynchronous following data frame corresponding to the following object;
and generating following moving path frame data corresponding to the following object according to the synchronous following data frame and the asynchronous following data frame.
2. The method of claim 1, wherein the asynchronous following data frame comprises an initial following data frame; the determining of the asynchronous following data frame corresponding to the following object specifically includes:
acquiring a first target frame position and a first target frame time corresponding to a first target data frame in the target moving path frame data, and acquiring a first following frame position of the following object, wherein the first target data frame is a first frame in the target moving path frame data;
performing interpolation based on the first target frame position and the first following frame position to obtain initial following frame positions matching the preset following frame number difference;
marking initial following frame time corresponding to the initial following frame position, and determining the initial following data frame in the following movement path frame data, wherein the first initial following frame time is the first target frame time.
3. The method of claim 1, wherein the asynchronous following data frame comprises a starting following data frame; the determining of the asynchronous following data frame corresponding to the following object specifically includes:
acquiring a first target frame position corresponding to a first target data frame and a second target frame position corresponding to a second target data frame in the target moving path frame data, wherein the second target data frame is the data frame that follows the first target data frame by the preset following frame number difference;
interpolating based on the first target frame position and the second target frame position to obtain starting following frame positions matching the preset following frame number difference, wherein the starting following frame positions include the second target frame position;
marking the starting following frame times corresponding to the starting following frame positions according to the first target frame time and the second target frame time, and determining the starting following frame data in the following moving path frame data.
4. The method of claim 3, wherein prior to said interpolating based on said first target frame position and said second target frame position, said method further comprises:
acquiring all starting target frame positions corresponding to the first target data frame and the second target data frame;
and if any of the starting target frame positions are repeated, performing interpolation based on the first target frame position and the second target frame position.
5. The method according to claim 1, wherein each target data frame included in the target movement path frame data further corresponds to a target frame status, and the following frame status corresponding to the synchronous following data frame is a target frame status corresponding to a target data frame that differs by the preset following frame number difference before the target data frame that is time-synchronized.
6. The method of claim 5, wherein each target data frame included in the target movement path frame data further corresponds to a target frame speed, and the asynchronous following data frame includes a stopping following data frame; the determining of the asynchronous following data frame corresponding to the following object specifically includes:
when the target frame speed corresponding to any third target data frame among the target data frames is zero, acquiring the stopping target frame states corresponding to the stopping target data frames that precede the third target data frame by the preset following frame number difference;
when the stopping target frame states include a plurality of states, acquiring a fourth target data frame at which the state changes, and determining a fourth target frame position corresponding to the fourth target data frame;
acquiring a second following frame position corresponding to the synchronous following data frame with synchronous time according to a third target frame time corresponding to the third target data frame;
determining the stopping following data frame corresponding to the following object based on a distance between the second following frame position and the fourth target frame position.
7. The method according to claim 6, wherein the determining the stopping following data frame corresponding to the following object specifically includes:
and if the distance between the second following frame position and the fourth target frame position is less than or equal to a preset stopping following distance, taking the fourth target frame position as a first stopping position of the following object, and determining the stopping following data frame according to the second following frame position and the first stopping position.
8. The method according to claim 6, wherein the determining the stopping following data frame corresponding to the following object specifically includes:
and if the distance between the second following frame position and the fourth target frame position is greater than the preset stopping following distance, determining a second stopping position of the following object based on a third target frame position corresponding to the third target data frame, and determining the stopping following data frame according to the second following frame position and the second stopping position.
9. The method according to claim 1, wherein if the following object includes a plurality of objects, before the obtaining target moving path frame data corresponding to the target object, the method further includes:
respectively determining the following sequence of each following object, and respectively determining the preset following frame number difference corresponding to each following object according to the following sequence;
after generating the following movement path frame data corresponding to the following object, the method further includes:
and sending the following moving path frame data to following terminals corresponding to the following objects respectively.
10. An apparatus for generating frame data, applied to a server, comprising:
a synchronous following frame determining module, configured to obtain target moving path frame data corresponding to a target object, and determine a synchronous following data frame corresponding to the following object according to a preset following frame number difference corresponding to the following object and the target moving path frame data, where a following frame time corresponding to the synchronous following data frame is the same as a target frame time corresponding to the target data frame, a following frame position corresponding to the synchronous following data frame is determined based on a position of a target following frame corresponding to the corresponding target data frame, and the target following frame is the data frame that precedes the target data frame by the preset following frame number difference;
the asynchronous following frame determining module is used for performing interpolation processing on the basis of the position corresponding to the target moving path frame data and determining an asynchronous following data frame corresponding to the following object;
and the following frame data generation module is used for generating following moving path frame data corresponding to the following object according to the synchronous following data frame and the asynchronous following data frame.
11. A storage medium having stored thereon a computer program, characterized in that the computer program, when being executed by a processor, implements the method of generating frame data according to any one of claims 1 to 9.
12. A computer device comprising a storage medium, a processor and a computer program stored on the storage medium and executable on the processor, wherein the processor implements the method of generating frame data according to any one of claims 1 to 9 when executing the computer program.
CN202010993082.XA 2020-09-21 2020-09-21 Method and device for generating frame data, storage medium and computer equipment Active CN111921201B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011439747.9A CN112426717A (en) 2020-09-21 2020-09-21 Method and device for generating frame data, storage medium and computer equipment
CN202010993082.XA CN111921201B (en) 2020-09-21 2020-09-21 Method and device for generating frame data, storage medium and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010993082.XA CN111921201B (en) 2020-09-21 2020-09-21 Method and device for generating frame data, storage medium and computer equipment

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202011439747.9A Division CN112426717A (en) 2020-09-21 2020-09-21 Method and device for generating frame data, storage medium and computer equipment

Publications (2)

Publication Number Publication Date
CN111921201A true CN111921201A (en) 2020-11-13
CN111921201B CN111921201B (en) 2021-01-08

Family

ID=73334883

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010993082.XA Active CN111921201B (en) 2020-09-21 2020-09-21 Method and device for generating frame data, storage medium and computer equipment
CN202011439747.9A Pending CN112426717A (en) 2020-09-21 2020-09-21 Method and device for generating frame data, storage medium and computer equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202011439747.9A Pending CN112426717A (en) 2020-09-21 2020-09-21 Method and device for generating frame data, storage medium and computer equipment

Country Status (1)

Country Link
CN (2) CN111921201B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113144620B (en) * 2021-05-20 2024-07-12 北京字节跳动网络技术有限公司 Method, device, platform, readable medium and equipment for detecting frame synchronous game

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180243653A1 (en) * 2016-02-01 2018-08-30 Tencent Technology (Shenzhen) Company Limited Method for determining movement track, and user equipment
CN110292773A (en) * 2019-07-04 2019-10-01 珠海西山居移动游戏科技有限公司 A kind of role movement follower method and device calculate equipment and storage medium
CN111389004A (en) * 2020-03-25 2020-07-10 网易(杭州)网络有限公司 Control method of virtual role, storage medium and processor
CN111437605A (en) * 2020-03-27 2020-07-24 腾讯科技(深圳)有限公司 Method for determining virtual object behaviors and hosting virtual object behaviors
CN111450531A (en) * 2020-03-30 2020-07-28 腾讯科技(深圳)有限公司 Virtual character control method, virtual character control device, electronic equipment and storage medium

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8016672B2 (en) * 2008-03-27 2011-09-13 Sony Computer Entertainment Inc. Device and method to control game where characters are moved on map
JP5614956B2 (en) * 2009-08-11 2014-10-29 株式会社バンダイナムコゲームス Program, image generation system
JP6379077B2 (en) * 2015-09-03 2018-08-22 株式会社カプコン GAME PROGRAM AND GAME DEVICE
US10791285B2 (en) * 2015-10-05 2020-09-29 Woncheol Choi Virtual flying camera system
CN106302679B (en) * 2016-08-08 2018-10-02 腾讯科技(深圳)有限公司 A kind of virtual objects movement synchronous method, client and server
JP7058034B2 (en) * 2017-09-29 2022-04-21 グリー株式会社 Game processing program, game processing method, and game processing device
WO2020028603A2 (en) * 2018-08-01 2020-02-06 Sony Interactive Entertainment LLC Player induced counter-balancing of loads on a character in a virtual environment
CN109316745B (en) * 2018-10-12 2022-05-31 网易(杭州)网络有限公司 Virtual object motion control method and device, electronic equipment and storage medium
CN109529338B (en) * 2018-11-15 2021-12-17 腾讯科技(深圳)有限公司 Object control method, device, electronic design and computer readable medium
CN110694266B (en) * 2019-10-23 2023-07-18 网易(杭州)网络有限公司 Game state synchronization method, game state display method and game state synchronization device
CN111921201B (en) * 2020-09-21 2021-01-08 成都完美天智游科技有限公司 Method and device for generating frame data, storage medium and computer equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI JINGSONG et al.: "Improved Genetic Algorithm for Game NPC Path Planning", Sensors and Microsystems, 2017 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112426717A (en) * 2020-09-21 2021-03-02 成都完美天智游科技有限公司 Method and device for generating frame data, storage medium and computer equipment
CN112203073A (en) * 2020-12-07 2021-01-08 南京爱奇艺智能科技有限公司 Asynchronous frame extrapolation pipeline method and system suitable for VR real-time rendering application
CN112203073B (en) * 2020-12-07 2021-03-05 南京爱奇艺智能科技有限公司 Asynchronous frame extrapolation pipeline method and system suitable for VR real-time rendering application

Also Published As

Publication number Publication date
CN112426717A (en) 2021-03-02
CN111921201B (en) 2021-01-08

Similar Documents

Publication Publication Date Title
CN111921201B (en) Method and device for generating frame data, storage medium and computer equipment
US11484802B2 (en) Interactive gameplay playback system
CN110180168B (en) Game picture display method and device, storage medium and processor
CN112107862A (en) Game character movement following method and device, storage medium and computer equipment
CN110062271A (en) Method for changing scenes, device, terminal and storage medium
KR101954010B1 (en) Method and terminal for implementing virtual character turning
CN111888766B (en) Information processing method and device in game, electronic equipment and storage medium
US10888771B2 (en) Method and device for object pointing in virtual reality (VR) scene, and VR apparatus
CN113209618B (en) Virtual character control method, device, equipment and medium
US20230051703A1 (en) Gesture-Based Skill Search
CN111760286A (en) Switching method and device of mirror operation mode, storage medium and electronic device
JP2022503919A (en) Establishing and managing multiplayer sessions
CN111643903A (en) Control method and device of cloud game, electronic equipment and storage medium
CN112604286B (en) Game skill synchronous execution method and device
CN111729312A (en) Position synchronization method, device and equipment
CN114887329A (en) Virtual object control method, device, equipment and storage medium
CN113975802A (en) Game control method, device, storage medium and electronic equipment
JP2023547721A (en) Screen display methods, devices, equipment, and programs in virtual scenes
JP6679054B1 (en) Game device, game system, program, and game control method
CN108415749B (en) Display processing method, medium, device and computing equipment
WO2024055811A1 (en) Message display method and apparatus, device, medium, and program product
CN111135578B (en) Game character moving method and device
CN113457155B (en) Method and device for controlling display in game, electronic equipment and readable storage medium
CN115366093A (en) Robot task scheduling method, equipment and system
JP6813324B2 (en) Screen control program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant