CN117899473B - Image frame display method, device, computer equipment and storage medium - Google Patents

Image frame display method, device, computer equipment and storage medium

Info

Publication number
CN117899473B
CN117899473B (application CN202410280714.6A)
Authority
CN
China
Prior art keywords
priority
display area
display
rendering
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410280714.6A
Other languages
Chinese (zh)
Other versions
CN117899473A (en)
Inventor
徐士立
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202410280714.6A
Publication of CN117899473A
Application granted
Publication of CN117899473B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/53 Querying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/54 Browsing; Visualisation therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/02 Non-photorealistic rendering
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses an image frame display method and apparatus, a computer device, and a storage medium, belonging to the field of computer technology. The method comprises the following steps: determining priorities of a plurality of display areas in a display interface based on a control instruction for a virtual object in a virtual scene and the image frame currently displayed in the display interface; when the priority of a display area satisfies a first rendering condition, generating the next frame picture of the display area based on picture data of the next frame of that display area; when the priority of a display area satisfies a second rendering condition, predicting the next frame picture of the display area based on already-displayed image frames; and displaying, in the display interface, the image frame composed of the next frame pictures of the plurality of display areas. The application reduces the computation required to render the next image frame as much as possible while ensuring the accuracy of the rendered next frame picture, thereby lowering the demands on the device.

Description

Image frame display method, device, computer equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to an image frame display method, an image frame display device, computer equipment and a storage medium.
Background
With the development of computer technology, games are increasingly favored by users. While a device runs a game, it renders image frames containing the virtual scene for the user to view. At present, each image frame is rendered from the virtual scene data corresponding to that frame, but this rendering approach involves a large amount of computation and places high demands on the device.
Disclosure of Invention
The embodiments of the application provide an image frame display method and apparatus, a computer device, and a storage medium, which can reduce as much as possible the computation required to render the next image frame, thereby lowering the demands on the device. The technical scheme is as follows.
In one aspect, there is provided an image frame display method, the method including:
Determining priorities of a plurality of display areas in a display interface based on a control instruction for a virtual object in a virtual scene and the image frame currently displayed in the display interface, wherein the currently displayed image frame comprises a scene picture of the virtual scene, and the priority indicates the degree to which a picture change in the display area affects the virtual match;
Generating a next frame picture of the display area based on picture data of the next frame of the display area under the condition that the priority of the display area meets a first rendering condition;
predicting a next frame picture of the display area based on the displayed image frame under the condition that the priority of the display area meets a second rendering condition;
And displaying, in the display interface, an image frame formed by the next frame pictures of the plurality of display areas.
In another aspect, there is provided an image frame display apparatus, the apparatus including:
The determining module is used for determining priorities of a plurality of display areas in the display interface based on a control instruction for a virtual object in a virtual scene and the image frame currently displayed in the display interface, wherein the currently displayed image frame comprises a scene picture of the virtual scene, and the priority indicates the degree to which a picture change in a display area affects the virtual match;
a generating module, configured to generate a next frame picture of the display area based on picture data of a next frame of the display area if the priority of the display area meets a first rendering condition;
a prediction module, configured to predict a next frame of the display area based on the displayed image frame if the priority of the display area satisfies a second rendering condition;
And the display module is used for displaying an image frame formed by a next frame picture of the display areas in the display interface.
In one possible implementation, the determining module is configured to determine priorities of the plurality of display areas based on the control instruction and a virtual element in the currently displayed image frame, where the virtual element includes at least one of a virtual object, a virtual control, or a virtual map.
In another possible implementation manner, the determining module is configured to query a prioritization policy based on the control instruction and the virtual elements in the currently displayed image frame, where the prioritization policy indicates a priority of a display area including each virtual element under the action of each control instruction; under the condition that a first priority is inquired, setting the priority of a display area containing the virtual element as the first priority, wherein the first priority is the priority of the display area containing the virtual element in the priority division strategy under the action of the control instruction; and setting the priority of the rest display areas in the display interface as the lowest priority.
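As a concrete illustration, the prioritization-policy lookup described above can be sketched as a small table keyed by (control instruction, virtual element). All table entries, priority values, and names below are invented for illustration and are not taken from the patent:

```python
# Hypothetical prioritization policy: maps (control instruction, virtual
# element) pairs to a priority level (lower number = higher priority).
PRIORITY_POLICY = {
    ("attack", "virtual_object"): 1,
    ("attack", "virtual_control"): 2,
    ("move", "virtual_map"): 1,
}
LOWEST_PRIORITY = 9  # assigned to all remaining display areas

def assign_priorities(instruction, regions):
    """regions maps a display-area name to the virtual element it contains
    (or None). Areas not covered by the policy get the lowest priority."""
    return {
        region: PRIORITY_POLICY.get((instruction, element), LOWEST_PRIORITY)
        for region, element in regions.items()
    }
```

Under an "attack" instruction, for instance, the area containing the virtual object would be ranked first while an empty background area falls to the lowest priority.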
In another possible implementation manner, the determining module is further configured to determine a rendering condition that is met by the priority of each display area based on the running state parameter of the local device, the priorities of the multiple display areas, and a rendering policy, where the rendering condition includes the first rendering condition or the second rendering condition, and the rendering policy indicates a rendering condition that is met by each priority for different running state parameters of the device.
In another possible implementation manner, the rendering policy includes sub-policies corresponding to multiple status parameter intervals, where the sub-policies corresponding to the status parameter intervals indicate rendering conditions satisfied by each priority when an operation status parameter of the device belongs to the status parameter interval; the determining module is used for inquiring the rendering strategy based on the running state parameters of the local terminal equipment to obtain a target sub-strategy, wherein the running state parameters belong to a state parameter interval corresponding to the target sub-strategy; and inquiring the target sub-strategy based on the priority of the display area to obtain rendering conditions met by the priority of the display area.
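A minimal sketch of this tiered rendering policy: the local device's running state parameter (modeled here as a normalized load in [0, 1]) selects a sub-policy, which is then queried with the display area's priority. The interval bounds and mappings are assumptions for illustration only:

```python
# Hypothetical rendering policy: each state-parameter interval owns a
# sub-policy mapping priority -> rendering condition.
RENDERING_POLICY = [
    ((0.0, 0.5), {1: "render", 2: "render", 3: "predict"}),   # light load
    ((0.5, 0.8), {1: "render", 2: "predict", 3: "predict"}),  # medium load
    ((0.8, 1.01), {1: "render", 2: "predict", 3: "predict"}), # heavy load
]

def condition_for(load, priority):
    # Find the target sub-policy whose interval contains the load, then
    # query it with the display area's priority.
    for (lo, hi), sub_policy in RENDERING_POLICY:
        if lo <= load < hi:
            return sub_policy.get(priority, "predict")
    return "predict"
```

Note how raising the load shifts mid-priority areas from actual rendering to prediction, which is the intended load-shedding behavior.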
In another possible implementation, the target sub-policy further indicates a rendering condition that the second priority satisfies during display of consecutive k image frames, k being an integer greater than 1; the determining module is configured to determine, based on target information and the target sub-policy, a rendering condition currently satisfied by the second priority, where the target information indicates a case of the rendering condition satisfied by the second priority in a process of generating k-1 image frames that are displayed, the k-1 image frames including the currently displayed image frame and the k-1 image frames being consecutive.
In another possible implementation, the apparatus further includes:
The fusion module is used for fusing the plurality of first display areas to obtain a fusion display area under the condition that the plurality of first display areas have overlapping areas, wherein the first display area is any one of the plurality of display areas;
The determining module is further configured to determine that the priority of the fused display area meets the first rendering condition when the priority of any one of the first display areas meets the first rendering condition;
the determining module is further configured to determine that the priority of the fused display area satisfies the second rendering condition when the priorities of the plurality of first display areas satisfy the second rendering condition.
In another possible implementation manner, the determining module is further configured to determine that, when there are overlapping areas in the plurality of first display areas and the rendering conditions that are satisfied by priorities of the plurality of first display areas are different, the priority of the second display area satisfies the second rendering condition; the second display area is a display area except a third display area in the first display areas, and the third display area is a display area with priority meeting the first rendering condition in the first display areas.
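The fusion rule above can be sketched with axis-aligned rectangles; the rectangle representation and helper names are assumptions, not the patent's actual data structures:

```python
def overlaps(a, b):
    """Axis-aligned rectangles given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

def fuse(rects):
    # Fused display area: bounding box of the overlapping first display areas.
    xs1, ys1, xs2, ys2 = zip(*rects)
    return (min(xs1), min(ys1), max(xs2), max(ys2))

def fused_condition(conditions):
    # If any constituent area satisfies the first rendering condition, the
    # fused area is rendered; only when all satisfy the second is it predicted.
    return "render" if "render" in conditions else "predict"
```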
In another possible implementation manner, the prediction module is configured to predict, based on a history frame of a fourth display area and a next frame of a fifth display area, a next frame of the fourth display area when a priority of the fourth display area satisfies a second rendering condition, where the fourth display area is any one of the plurality of display areas whose priority satisfies the second rendering condition, and the fifth display area is a display area of the plurality of display areas whose priority satisfies the first rendering condition and is adjacent to the fourth display area.
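A toy sketch of this prediction step, reducing each display area to a one-dimensional row of scalar pixel values: the fourth area's next frame is extrapolated from its two most recent history frames, and its border is blended toward the actually rendered adjacent (fifth) area for continuity. The linear-motion assumption and blending weight are illustrative stand-ins, not the patent's actual predictor:

```python
def extrapolate(prev, curr):
    # Linear motion assumption: next ≈ curr + (curr - prev), per pixel.
    return [2 * c - p for p, c in zip(prev, curr)]

def blend_border(predicted, neighbor_edge, alpha=0.5):
    # Pull the boundary pixel toward the rendered neighbouring area so the
    # predicted region joins the actually rendered one without a seam.
    out = list(predicted)
    out[-1] = alpha * out[-1] + (1 - alpha) * neighbor_edge
    return out
```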
In another possible implementation manner, the generating module is configured to generate, based on picture data of a next frame of the display area, a next frame picture of the display area if it is determined that the next image frame is a non-key frame and the priority of the display area satisfies the first rendering condition.
In another possible implementation manner, the generating module is further configured to generate, when determining that a next image frame is a key frame, the next image frame based on picture data of a next frame of the display interface;
the display module is further configured to display the next image frame in the display interface.
In another possible implementation manner, the determining module is further configured to determine that the next image frame is the key frame if a target ratio is greater than a threshold, where the target ratio is a ratio of a number of sixth display areas to a total number of the plurality of display areas, or a ratio of a sum of areas of the sixth display areas to an area of the display interface, and the sixth display area is a display area, where priorities of the plurality of display areas satisfy the first rendering condition; or alternatively
The determining module is further configured to determine that the next image frame is the key frame if the displayed n image frames are obtained based on only the second rendering condition, where the n image frames include the currently displayed image frame and the n image frames are continuous, and n is an integer greater than 0.
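The two key-frame triggers described in the last two implementations can be sketched as follows; the threshold values are placeholders, not values from the patent:

```python
def is_key_frame(num_rendered_areas, total_areas, predicted_streak,
                 ratio_threshold=0.5, max_streak=4):
    # Trigger 1: the share of areas that need actual rendering anyway is
    # already large, so rendering the whole frame costs little extra.
    if num_rendered_areas / total_areas > ratio_threshold:
        return True
    # Trigger 2: the last n displayed frames were produced purely by
    # prediction; force a full render to stop prediction error accumulating.
    return predicted_streak >= max_streak
```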
In another possible implementation, the apparatus further includes:
the system comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring a current frame rate and a maximum frame rate under the condition that an operation state parameter of local equipment belongs to a first state parameter interval, wherein the current frame rate is a frame rate according to which an image frame is currently rendered;
and the adjusting module is used for increasing the current frame rate under the condition that the current frame rate is smaller than the maximum frame rate.
In another possible implementation manner, the obtaining module is further configured to obtain a first duration and a second duration when the running state parameter belongs to a second state parameter interval, where the first duration is a duration of currently rendering one image frame, the second duration is a duration of rendering one image frame according to the current frame rate, and a minimum running state parameter in the second state parameter interval is greater than a maximum running state parameter in the first state parameter interval;
The adjusting module is further configured to reduce the current frame rate when the first time period is longer than the second time period.
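The frame-rate controller in the last two implementations can be sketched as below; the interval bounds and step size are assumptions for illustration:

```python
def adjust_frame_rate(load, current_fps, max_fps, frame_time, step=5):
    """frame_time is the measured duration of rendering one image frame;
    the budget implied by the current frame rate is 1 / current_fps."""
    if load < 0.5:                        # first state-parameter interval
        if current_fps < max_fps:         # headroom left: raise the frame rate
            return min(current_fps + step, max_fps)
    elif load >= 0.8:                     # second interval (higher load)
        if frame_time > 1 / current_fps:  # rendering slower than the budget
            return max(current_fps - step, 1)
    return current_fps
```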
In another aspect, a computer device is provided, the computer device including a processor and a memory, the memory storing at least one computer program, the at least one computer program loaded and executed by the processor to implement the operations performed by the image frame display method as described in the above aspects.
In another aspect, there is provided a computer-readable storage medium having stored therein at least one computer program loaded and executed by a processor to implement the operations performed by the image frame display method of the above aspect.
In yet another aspect, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the operations performed by the image frame display method as described in the above aspects.
In the scheme provided by the embodiments of the application, the degree to which picture changes in a plurality of display areas of the display interface affect the virtual match is determined based on the control instruction for the virtual object and the currently displayed image frame; that is, the priorities of the plurality of display areas are determined. Then, according to the rendering condition satisfied by each priority, the next frame picture of a display area satisfying the first rendering condition is generated from the next-frame picture data corresponding to that display area, while the next frame picture of a display area satisfying the second rendering condition is predicted from already-rendered image frames. In this way, only the display areas whose picture changes strongly affect the virtual match have their next frame picture generated from picture data, and the next frame pictures of the other display areas are predicted from rendered image frames, so that the computation required to render the next image frame is reduced as much as possible while the accuracy of the rendered next frame picture is ensured, lowering the demands on the device.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application;
Fig. 2 is a flowchart of an image frame display method according to an embodiment of the present application;
FIG. 3 is a flowchart of another image frame display method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an image frame provided by an embodiment of the present application;
FIG. 5 is a schematic illustration of another implementation environment provided by an embodiment of the present application;
FIG. 6 is a flow chart of a game initialization provided by an embodiment of the present application;
FIG. 7 is a flowchart of yet another image frame display method according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of an image frame display device according to an embodiment of the present application;
Fig. 9 is a schematic diagram of a structure of another image frame display device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the following detailed description of the embodiments of the present application will be given with reference to the accompanying drawings.
The terms "first," "second," "third," "fourth," "fifth," "sixth," and the like as used herein may be used to describe various concepts, but are not limited by these terms unless otherwise specified. These terms are only used to distinguish one concept from another. For example, a first display region can be referred to as a second display region, and similarly, a second display region can be referred to as a first display region without departing from the scope of the application.
The terms "at least one", "a plurality of", "each", and "any" used herein are understood as follows: "at least one" includes one, two, or more; "a plurality of" includes two or more; "each" refers to every one of the corresponding plurality; and "any" refers to any one of the plurality. For example, if the plurality of display areas includes 3 display areas, "each" refers to every one of the 3 display areas, and "any" means that any one of the 3 display areas can be the first display area, the second display area, or the third display area.
It should be noted that, the information (including but not limited to user equipment information, user personal information, etc.), data (including but not limited to data for analysis, stored data, presented data, etc.), and signals related to the present application are all authorized by the user or are fully authorized by the parties, and the collection, use, and processing of the related data is required to comply with the relevant laws and regulations and standards of the relevant countries and regions. For example, image frames, control instructions, picture data, and the like, which are referred to in the present application, are acquired with sufficient authorization.
The nouns involved in the embodiments of the present application are briefly described:
Virtual scene: a scene that the application displays (or provides) while running on the terminal, i.e. the scene the terminal displays while running a game, also called a world scene. The virtual scene is a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene is any one of a two-dimensional, 2.5-dimensional, or three-dimensional virtual scene, which is not limited in the present application. For example, a virtual scene includes sky, land, sea, etc.; the land includes environmental elements such as deserts and cities; and a user can control a virtual object to move in the virtual scene. Of course, the virtual scene also includes virtual props, such as throwables, buildings, and vehicles, and can also simulate real environments in different weather, such as sunny, rainy, foggy, or night conditions. The variety of scene elements enhances the diversity and realism of virtual scenes.
Virtual object: refers to a character that is movable in a virtual scene; the movable object may be a virtual character, a virtual animal, a cartoon character, or the like. The virtual object is a virtual avatar in the virtual scene that represents a user. The virtual scene comprises a plurality of virtual objects, and each virtual object has its own shape and volume in the virtual scene and occupies a part of the space therein. Optionally, the virtual object is a character controlled by operating a client, an artificial intelligence (AI) set in the virtual environment through training, or a non-player character (NPC) set in the virtual scene. Optionally, the virtual object is a virtual character competing in the virtual scene.
Virtual prop: refers to props that can be used with virtual objects in a virtual scene. For example, the virtual prop is a virtual armor or a virtual weapon, etc.
Virtual match (virtual game): a match in which at least two virtual objects fight in the virtual scene; optionally, the fight is conducted by at least two virtual objects within a single round. Each virtual match corresponds to a match duration or a participant count. When the virtual match corresponds to a match duration, the virtual object whose survival time reaches the match duration wins; when the virtual match corresponds to a participant count, the last surviving virtual object or team wins. Optionally, the virtual match may be in a solo mode (every virtual object in the match fights alone), a duo mode (virtual objects may fight in two-person teams or alone), or a four-person squad mode (teams of up to four virtual objects). When the mode is duo or squad, a first virtual object may be teamed with a second virtual object with which it has a friend relationship, or with a third virtual object with which it has no friend relationship.
The image frame display method provided by the embodiments of the present application can be executed by a computer device. Optionally, the computer device is a terminal, such as a smartphone, tablet computer, notebook computer, desktop computer, smart speaker, smart watch, smart voice interaction device, smart home appliance, or vehicle-mounted terminal, but is not limited thereto. Optionally, the image frame display method provided by the embodiments of the application is performed jointly by the terminal and a server, where the server provides the terminal with the picture data for generating image frames so that the terminal can generate and display them. Optionally, the server is an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs (Content Delivery Networks), big data, and artificial intelligence platforms.
In some embodiments, the computer program according to the embodiments of the present application may be deployed to be executed on one computer device or on multiple computer devices located at one site or on multiple computer devices distributed across multiple sites and interconnected by a communication network, where the multiple computer devices distributed across multiple sites and interconnected by the communication network form a blockchain system.
In some embodiments, the computer device is provided as a terminal. FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application. Referring to fig. 1, the implementation environment includes a terminal 101 and a server 102, and the terminal 101 and the server 102 are connected through a wireless or wired network.
The terminal 101 is provided with an application served by the server 102, and through the application the terminal 101 can realize functions such as data transmission and image frame display. Optionally, the application is an application in the operating system of the terminal 101 or an application provided by a third party. For example, the application is a game application having a game function; of course, the game application can also have other functions, such as a comment function, a shopping function, or a navigation function.
The terminal 101 is configured to log in to the application based on an account number and to obtain picture data from the server 102 by interacting with the application, so that the terminal 101, through the application, displays image frames containing a scene picture of the virtual scene based on the picture data.
Fig. 2 is a flowchart of an image frame display method according to an embodiment of the present application. Taking execution by a terminal as an example, as shown in fig. 2, the method includes the following steps.
201. The terminal determines priorities of a plurality of display areas in the display interface based on a control instruction for a virtual object in the virtual scene and the image frame currently displayed in the display interface, wherein the currently displayed image frame comprises a scene picture of the virtual scene, and the priority indicates the degree to which a picture change in a display area affects the virtual match.
In the embodiment of the application, the terminal can display image frames in the display interface, and over time the terminal sequentially displays a plurality of image frames so that the displayed scene picture of the virtual scene changes with time. While one image frame is currently displayed in the display interface, the terminal assigns priorities to a plurality of display areas of the display interface based on the currently displayed image frame and the control instruction for the virtual object in the virtual scene, so that the next frame picture of each display area can be rendered in a different way according to its priority, and the next image frame, composed of the next frame pictures of the plurality of display areas, can then be displayed in the display interface.
Wherein the virtual object is any type of virtual object, for example a virtual character or a virtual animal. The control instruction for the virtual object can be any type of instruction, for example a move instruction, an attack instruction, an equipment-exchange instruction, or a skill-release instruction. The virtual object is controlled by the terminal, by another terminal, or by artificial intelligence; that is, the control instruction for the virtual object is triggered by the terminal, by another terminal, or automatically by the server. Each display area is a partial area of the display interface; the plurality of display areas may together constitute the whole display interface, or may cover only part of it. The display areas may or may not overlap one another, and a display area can have an arbitrary shape, for example a rectangular or circular area. The priority of a display area is related to the degree to which a picture change in that area affects the virtual match: the higher the priority of a display area, the larger the influence of its picture changes on the virtual match; the lower the priority, the smaller that influence.
202. When the priority of a display area satisfies the first rendering condition, the terminal generates the next frame picture of the display area based on the picture data of the next frame of the display area.
In the embodiment of the application, the rendering conditions are of two types: a first rendering condition and a second rendering condition. Display areas whose priorities satisfy different rendering conditions have their pictures generated in different ways, and a priority satisfying the first rendering condition is higher than a priority satisfying the second rendering condition. For a display area whose priority satisfies the first rendering condition, the next frame picture of the display area is generated from the next-frame picture data corresponding to that display area; for a display area whose priority satisfies the second rendering condition, the next frame picture of the display area is predicted from already-rendered image frames.
Here, picture data refers to data used for rendering a picture; for example, the picture data includes data for generating the avatar of a virtual object, picture background data, and the like. The next frame picture of a display area is the picture shown in that display area when the display interface displays the next image frame.
For example, the first rendering condition is an actual rendering condition and the second rendering condition is an intelligent prediction condition. For a display area whose priority satisfies the actual rendering condition, the next frame picture of the display area is generated from the next-frame picture data corresponding to that display area; for a display area whose priority satisfies the intelligent prediction condition, the next frame picture of the display area is predicted from already-rendered image frames.
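The split between the two rendering conditions can be sketched as a simple dispatch. This is an illustrative sketch only: the class and function names (`Region`, `render_from_data`, `predict_from_history`) are hypothetical stand-ins, not part of the claimed method.

```python
from dataclasses import dataclass


@dataclass
class Region:
    """One display area (hypothetical representation)."""
    name: str
    satisfies_first_condition: bool  # True: actual rendering; False: prediction


def render_from_data(next_frame_data):
    # Stand-in for invoking the computing unit with next-frame picture data.
    return f"rendered({next_frame_data})"


def predict_from_history(rendered_history):
    # Stand-in for predicting from the most recent already-rendered picture.
    return f"predicted({rendered_history[-1]})"


def generate_next_picture(region, next_frame_data, rendered_history):
    """Dispatch by rendering condition, as steps 202/203 describe."""
    if region.satisfies_first_condition:
        return render_from_data(next_frame_data)
    return predict_from_history(rendered_history)
```

High-priority areas thus pay the full rendering cost, while low-priority areas reuse history.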
In the embodiment of the application, a picture change in a display area whose priority satisfies the first rendering condition has a large influence on the virtual game. Therefore, for such a display area, the next frame picture is generated from the next-frame picture data corresponding to the display area, so that the next frame picture of the area is sufficiently accurate.
203. When the priority of a display area satisfies the second rendering condition, the terminal predicts the next frame picture of the display area based on displayed image frames.
In the embodiment of the application, a picture change in a display area whose priority satisfies the second rendering condition has a small influence on the virtual game. Therefore, for such a display area, the next frame picture is predicted from already-rendered image frames, which reduces the amount of computation needed to generate the next frame picture while keeping the picture sufficiently accurate.
Wherein a displayed image frame refers to an image frame that the terminal has already displayed in the display area; the displayed image frames include the currently displayed image frame and, optionally, image frames preceding it.
204. The terminal displays, on the display interface, the image frame formed from the next frame pictures of the plurality of display areas.
In the embodiment of the application, once the next frame picture of each display area has been obtained, the next image frame is composed from the next frame pictures of the plurality of display areas and is then displayed in the display interface.
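As a minimal sketch of this compositing step, the following hypothetical helper places each display area's next frame picture at its rectangle in the frame. A real implementation would blit pixel buffers on the GPU; the representation here (rectangles as `(x, y, w, h)` keys, pictures as nested lists) is an assumption for illustration.

```python
def compose_next_frame(region_pictures):
    """Compose the next image frame from per-region next-frame pictures.

    region_pictures maps a region's (x, y, w, h) rectangle to its pixel
    block; the composed frame places each block at its rectangle. The
    frame is returned as a dict from (x, y) coordinates to pixel values.
    """
    frame = {}
    for (x, y, w, h), block in region_pictures.items():
        for dy in range(h):
            for dx in range(w):
                frame[(x + dx, y + dy)] = block[dy][dx]
    return frame
```

Overlapping regions, which the embodiment allows, would here resolve in iteration order; a real compositor would order them explicitly.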
In the scheme provided by the embodiment of the application, the degree to which picture changes in a plurality of display areas of the display interface influence the virtual game, that is, the priorities of the plurality of display areas, is determined based on the control instruction for the virtual object and the currently displayed image frame. According to the rendering condition satisfied by each priority, the next frame picture of a display area satisfying the first rendering condition is generated from the next-frame picture data corresponding to that area, while the next frame picture of a display area satisfying the second rendering condition is predicted from already-rendered image frames. In this way, only display areas whose picture changes strongly influence the virtual game are rendered from picture data, and the next frame pictures of the other display areas are predicted from rendered image frames, so the amount of computation for rendering the next image frame is reduced as much as possible while the accuracy of the rendered next frame pictures is preserved, lowering the demands on the device.
Based on the embodiment shown in fig. 2, the embodiment of the present application determines the priority of each display area based on the virtual elements in the currently displayed image frame, and determines the rendering condition satisfied by each priority in combination with the running state parameters of the local device; the specific process is described in the following embodiment.
Fig. 3 is a flowchart of another image frame display method according to an embodiment of the present application. Taking the method being performed by a terminal as an example, as shown in fig. 3, the method includes the following steps.
301. The terminal determines the priorities of a plurality of display areas in the display interface based on a control instruction for a virtual object in the virtual scene and the virtual elements in the image frame currently displayed in the display interface, wherein the currently displayed image frame includes a scene picture of the virtual scene, a priority indicates the degree to which a picture change in a display area influences the virtual game, and a virtual element includes at least one of a virtual object, a virtual control, or a virtual map.
In the embodiment of the application, while the terminal displays the virtual scene, the virtual elements that influence the displayed picture include not only the virtual object controlled by the user through the terminal but also other virtual objects, virtual controls, virtual maps, and the like in the virtual scene. The other virtual objects in the virtual scene are virtual objects controlled by other terminals, artificial intelligence virtual objects, and so on. A virtual control is any type of control, for example a control for moving the virtual object, a control for exchanging virtual props for the virtual object, or a control for making the virtual object attack. The virtual map is a thumbnail map or full map of the virtual scene and can indicate the positions of the virtual objects in the scene. Therefore, based on the virtual elements in the currently displayed image frame and the control instruction for the virtual object in the virtual scene, it is determined which virtual elements the control instruction may affect, and in turn how strongly a picture change in the display area containing each virtual element influences the virtual game, from which the priorities of the plurality of display areas in the display interface are determined.
In the embodiment of the application, during a virtual game the picture displayed in the display interface changes under many influences, and what the user pays most attention to is the change of virtual elements caused by control instructions the user triggers through the terminal. Therefore, the degree to which picture changes in the plurality of display areas influence the virtual game, that is, the priorities of the plurality of display areas, is determined from the virtual elements in the currently displayed image frame and the control instruction for the virtual object in the virtual scene, which reduces the computation for generating image frames while preserving the display effect of the displayed image frames.
For example, in a virtual scene, a virtual object may move or fight other virtual objects, so the picture of the area where the virtual object is located may change in the next image frame; when a virtual control in the virtual scene is triggered or is in countdown cooling, the picture of the area where the control is located may change in the next image frame; and when the positions of virtual objects in the virtual scene change, the virtual map may change to indicate their real-time positions.
In one possible implementation, the display area includes at least one virtual element, or the display area is an area of influence of the control instruction.
In the embodiment of the application, the image frame currently displayed in the display interface includes one or more virtual elements, the pictures of the areas where those elements are located may differ between image frames, and a control instruction may affect the picture of a certain area when it controls the virtual object, for example by making the virtual object jump to that area or release a skill toward it. Therefore, the display interface is divided into a plurality of display areas according to the position of each virtual element and the influence area of the control instruction, so that each virtual element in the image frame is contained in a display area. Priorities are then determined only for display areas that contain an affected virtual element or that are affected by the control instruction, so that priorities need not be determined for an excessive number of display areas, reducing the amount of computation.
In the embodiment of the application, only display areas containing a virtual element, or display areas affected by the control instruction, are determined from the display interface. Each determined display area is effectively a key display area, which includes the area where a virtual element is located, the influence area of the control instruction, an operable area, and so on; therefore the plurality of display areas determined from the display interface may cover only part of the display interface.
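The collection of key display areas can be sketched as follows, assuming rectangles are `(x, y, w, h)` tuples; the function name and the representation are hypothetical illustrations, not taken from the patent.

```python
def key_display_areas(element_rects, instruction_rect=None):
    """Collect the key display areas: the rectangles of on-screen
    virtual elements plus the influence area of the control
    instruction, if any. Duplicates are kept once, order preserved,
    so priorities are computed only for these areas rather than for
    the whole interface.
    """
    areas = []
    for rect in element_rects:
        if rect not in areas:
            areas.append(rect)
    if instruction_rect is not None and instruction_rect not in areas:
        areas.append(instruction_rect)
    return areas
```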
In one possible implementation, the control instruction for the virtual object is triggered by the user through a virtual control displayed by the terminal, or is triggered automatically by the game system. A control instruction triggered automatically by the game system is, for example, an attack instruction, a move instruction, or an instruction to update a virtual object in the virtual scene.
For example, when the user clicks a virtual control displayed by the terminal, the terminal effectively receives the control instruction corresponding to that control; when other users click virtual controls displayed on other terminals, those terminals receive the control instructions and interact with the server, which sends the control instructions to each terminal. The instruction may also be an instruction from the server controlling an artificial intelligence virtual object to perform an action while supporting the running game, or a periodic update instruction from the server, such as an instruction to update the artificial intelligence virtual objects displayed in the virtual scene.
In the embodiment of the application, when such control instructions are triggered, one or more virtual elements in the virtual scene are affected; the effects include virtual object movement, virtual object deformation, virtual control countdown, virtual map movement, and the like, all of which cause the game picture displayed by the terminal to change.
In one possible implementation, this step 301 includes the following steps 3011-3013.
3011. Based on the control instructions and the virtual elements in the currently displayed image frame, a prioritization policy is queried, which indicates the priority of the display area containing each virtual element under the action of each control instruction.
In the embodiment of the application, a prioritization policy is preset. The policy indicates the virtual elements the terminal may display while showing a picture of the virtual scene and the display areas in the display interface where each virtual element may be located, and also indicates the control instructions that may be triggered in the virtual scene. For each combination of virtual element, display area, and control instruction, the policy indicates the priority of the display area where the virtual element is located under the influence of that control instruction. Therefore, querying the prioritization policy with the control instruction and the virtual elements in the currently displayed image frame determines the priority of each display area and ensures the accuracy of the determined priorities.
For example, if a virtual element may appear in 3 display areas and 3 control instructions may be triggered in the virtual scene, then for each of the 3 display areas the prioritization policy indicates the priority of that display area under each control instruction when the virtual element is in it; that is, the policy indicates 3 priorities for each display area.
Where the prioritization policy can be expressed in any form, for example as a table or as text.
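In table form, the prioritization policy reduces to a lookup keyed by element and instruction. The entries below are invented for illustration (the actual policy contents are set by the developer, as noted next), and the element-type and instruction names are hypothetical.

```python
# Hypothetical table-form prioritization policy: maps
# (virtual element type, control instruction) -> priority, 1 being highest.
PRIORITIZATION_POLICY = {
    ("controlled_object", "attack"): 1,
    ("target_object", "attack"): 2,
    ("virtual_map", "attack"): 3,
    ("controlled_object", "move"): 1,
    ("virtual_map", "move"): 2,
}

# Fallback for display areas absent from the policy (step 3013).
LOWEST_PRIORITY = 99


def area_priority(element_type, instruction):
    """Query the policy; unlisted areas get the lowest priority."""
    return PRIORITIZATION_POLICY.get((element_type, instruction), LOWEST_PRIORITY)
```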
In one possible implementation, the prioritization policy is set by a developer.
In the embodiment of the application, when a control instruction is triggered, the state of a virtual element changes with it, so the display picture of the area where the element is located changes. Because different virtual elements and control instructions influence the game result to different degrees, with some changes directly affecting the result and others having little effect, the priority of a picture area is set according to the degree to which its picture change influences the virtual game, yielding the prioritization policy.
In one possible implementation, the prioritization policy indicates the priority of the display area containing a first virtual element under the action of each control instruction, and, once the priority of that display area is determined, the priority of the display area containing a second virtual element.
In the embodiment of the application, the first virtual element is a virtual element directly related to the control instruction, such as the virtual object indicated by the instruction or the virtual control that triggered it. The second virtual element is a virtual element affected by the control instruction acting on the first virtual element, for example another virtual object, another virtual control, or the virtual map. Since a control instruction for one virtual object can affect one or more virtual elements, the prioritization policy specifies the priority of each display area under each of these situations, so the priority of each display area can be determined from the policy with assured accuracy.
Alternatively, in the case where a plurality of virtual elements are affected by a control instruction to a first virtual element, the virtual element that is most affected is determined as a second virtual element.
For example, in response to an attack instruction on a first virtual object, the plurality of virtual objects are attacked by the first virtual object, and then the most damaged virtual object among the plurality of virtual objects is determined as a second virtual object.
Optionally, the prioritization policy includes a plurality of sub-policies, each sub-policy indicating a priority of a display area containing the first virtual element under a control instruction, and also indicating a priority of a display area containing the second virtual element under the control instruction in case the priority of the display area containing the first virtual element is determined.
For example, the prioritization policy is expressed in the form of a table and includes multiple sub-policies, as shown in table 1: the first virtual element identifier indicates the first virtual element, the control instruction identifier indicates the control instruction, the position of the first virtual element indicates its position in the virtual scene, the region occupied by the first virtual element is the extent of the display area containing it, the second virtual element identifier indicates the second virtual element affected by the control instruction on the first virtual element, the affected position is the position of the second virtual element in the virtual scene, and the affected region is the extent of the display area containing the second virtual element. Table 1 also indicates the field types, such as INT (integer type), List, and enumeration type. In the embodiment of the application, the first virtual element identifier and the second virtual element identifier in the same sub-policy are different.
TABLE 1
3012. If a first priority is found by the query, set the priority of the display area containing the virtual element to the first priority, where the first priority is the priority that the prioritization policy assigns to the display area containing the virtual element under the action of the control instruction.
In the embodiment of the application, the prioritization policy is queried based on the control instruction and the virtual elements in the currently displayed image frame; if a first priority is found, the priority of the display area containing the virtual element is set to that first priority.
For example, the prioritization policy includes multiple sub-policies; referring to table 1 and taking the first virtual element as a first virtual object and the second virtual element as a second virtual object, in response to a control instruction for the first virtual object, querying the prioritization policy based on the first virtual object determines that the priority of the display area containing the first virtual object is priority 1 and the priority of the display area containing the second virtual object is priority 2.
3013. The priority of the remaining display areas in the display interface is set to the lowest priority.
In the embodiment of the application, for a display area whose priority is not found in the prioritization policy, the picture change of that area has little influence on the virtual game, so its priority is set to the lowest priority.
In the embodiment of the application, the prioritization policy specifies the priority of each display area. During a virtual game, the picture displayed in the display interface changes under many influences, and what the user pays most attention to is the change of virtual elements caused by control instructions the user triggers through the terminal. Therefore, the degree to which picture changes in the plurality of display areas influence the virtual game, that is, the priorities of the plurality of display areas, is determined by querying the prioritization policy with the virtual elements in the currently displayed image frame and the control instruction for the virtual object, which ensures the accuracy of the determined priorities.
It should be noted that the embodiment of the present application takes determining the priority of each display area in combination with the virtual elements in the currently displayed image frame as an example. In another embodiment, step 301 need not be performed; instead, the priorities of the plurality of display areas are determined based on the control instruction for the virtual object in the virtual scene and the image frame currently displayed in the display interface.
302. The terminal determines the rendering condition satisfied by the priority of each display area based on the running state parameters of the local device, the priorities of the plurality of display areas, and a rendering policy, wherein the rendering condition includes the first rendering condition or the second rendering condition, and the rendering policy indicates, for different device running state parameters, the rendering condition satisfied by each priority.
In the embodiment of the application, the local device is the terminal, and the running state parameters of the terminal indicate its running state. The amount of computation performed while generating image frames affects the terminal's running state, and this amount differs when generating the next frame pictures of display areas whose priorities satisfy different rendering conditions. Since the rendering policy indicates the rendering condition satisfied by each priority under different device running state parameters, determining the rendering conditions from the running state parameters of the local device, the priorities of the plurality of display areas, and the rendering policy reduces computation as much as possible while avoiding harm to the terminal's running state, so the terminal's performance is preserved and it can display image frames normally.
Wherein the rendering policy can be represented in any form, for example as text or as a table.
In one possible implementation, the operating state parameter includes at least one of temperature or load rate.
Wherein the temperature refers to the temperature of the device, and the load rate is the load rate of the device's CPU (Central Processing Unit) or GPU (Graphics Processing Unit).
In the embodiment of the application, generating image frames raises the temperature of the terminal and increases its load rate, and as either rises the terminal's performance degrades. For example, an excessively high temperature can lower the operating frequency of the terminal's CPU and GPU, increasing the time required to generate an image frame and causing anomalies in the displayed image frames. Therefore, when generating image frames, different rendering modes are chosen in consideration of the terminal's temperature or load rate, so that frame generation does not drive the temperature or load rate up, and the quality of the generated image frames is preserved as far as possible while the terminal's performance is guaranteed.
In one possible implementation, the rendering policy includes sub-policies corresponding to multiple status parameter intervals, where the sub-policies corresponding to the status parameter intervals indicate rendering conditions that are met by each priority if an operational status parameter of the device belongs to the status parameter interval.
In the embodiment of the present application, the state parameter intervals are non-overlapping and contiguous. For example, with 3 intervals, the maximum running state parameter of the first interval equals the minimum of the second interval, and the maximum of the second interval equals the minimum of the third; that is, the 3 state parameter intervals are contiguous.
For example, if the running state parameter of the terminal is temperature, the state parameter intervals are temperature intervals T1, T2, and T3, where the maximum temperature of T1 equals the minimum temperature of T2 and the maximum temperature of T2 equals the minimum temperature of T3; that is, the 3 temperature intervals are contiguous. When the temperature of the terminal falls in T1, the computing unit used for generating image frames performs well, i.e. the terminal can generate image frames in a high-performance mode. When the temperature falls in T2, the performance of the computing unit declines, and it runs in a balanced mode to prevent the temperature from rising so high that the terminal enters a frequency-limited (throttled) mode, i.e. to prevent the terminal's performance from becoming too poor. When the temperature falls in T3, the computing unit must run in a low-power mode, i.e. its operating frequency may be low and the terminal's performance may be poor. Setting a sub-policy for each temperature interval therefore allows the rendering mode of each display area to be determined for whichever interval the terminal's temperature falls in, taking the terminal's performance into account while generating image frames.
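The interval lookup can be sketched as follows. The temperature bounds and mode names below are hypothetical examples, chosen only so that the intervals are contiguous as the embodiment requires.

```python
# Hypothetical temperature intervals (degrees Celsius) and compute modes;
# each interval's upper bound equals the next interval's lower bound.
INTERVALS = [
    ((0.0, 40.0), "high_performance"),   # T1
    ((40.0, 45.0), "balanced"),          # T2
    ((45.0, 100.0), "low_power"),        # T3
]


def compute_mode(temperature):
    """Return the compute mode for the interval containing temperature.

    Half-open intervals [lo, hi) make the contiguous bounds unambiguous:
    a temperature equal to T1's maximum falls into T2.
    """
    for (lo, hi), mode in INTERVALS:
        if lo <= temperature < hi:
            return mode
    raise ValueError("temperature outside all intervals")
```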
Optionally, the target sub-policy further indicates a rendering condition that the second priority satisfies during display of consecutive k image frames, k being an integer greater than 1.
In the embodiment of the present application, the target sub-policy is any sub-policy in the rendering policy and the second priority is any priority; the target sub-policy indicates that the rendering condition satisfied by the second priority may differ across the display of k consecutive image frames. For example, with k equal to 2, the target sub-policy indicates that of 2 consecutive image frames, one is actually rendered and the other is intelligently predicted. Actual rendering means invoking the computing unit to render according to normal rendering logic; intelligent prediction means predicting from previously rendered frame pictures and the actually rendered pictures of adjacent areas.
For example, the rendering policy is represented as a table. As shown in table 2, the rendering policy covers 3 temperature intervals and 3 priorities and indicates, for each temperature interval, the rendering condition satisfied by each priority when the device temperature falls in that interval. The rendering condition includes the first rendering condition or the second rendering condition, the first rendering condition being the actual rendering condition and the second rendering condition being the intelligent prediction condition.
TABLE 2
Wherein "25% of frames are actually rendered" means that after the picture of a display area in one image frame is obtained by actual rendering, its pictures in the next 3 image frames are obtained by intelligent prediction. "50% of frames are actually rendered" means that after the picture of a display area in one image frame is obtained by actual rendering, its picture in the next image frame is obtained by intelligent prediction. "75% of frames are actually rendered" means that after the pictures of a display area in the first 3 image frames are obtained by actual rendering, its picture in the 4th image frame is obtained by intelligent prediction.
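These cadences can be expanded into explicit per-frame schedules. The repeating periods below follow the descriptions above (25%: 1 render then 3 predictions; 50%: alternate; 75%: 3 renders then 1 prediction); the function itself is an illustrative assumption, not part of the patent.

```python
def render_schedule(actual_fraction, total_frames):
    """Expand a 'fraction of frames actually rendered' cadence into a
    per-frame schedule: True = actual render, False = intelligent
    prediction.
    """
    period = {
        0.25: [True, False, False, False],
        0.50: [True, False],
        0.75: [True, True, True, False],
    }[actual_fraction]
    return [period[i % len(period)] for i in range(total_frames)]
```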
Optionally, when the rendering policy includes sub-policies corresponding to multiple state parameter intervals, step 302 includes: querying the rendering policy based on the running state parameters of the local device to obtain a target sub-policy, the running state parameters belonging to the state parameter interval corresponding to the target sub-policy; and querying the target sub-policy based on the priority of the display area to obtain the rendering condition satisfied by that priority.
In the embodiment of the application, the state parameter intervals are queried with the running state parameters of the local device to determine which interval the parameters fall in, and the sub-policy corresponding to that interval is taken as the target sub-policy. The target sub-policy is then queried with the priority of the display area to determine the rendering condition that the priority satisfies when the next image frame is displayed, ensuring the accuracy of the determined rendering condition.
Optionally, the target sub-policy further indicates the rendering condition satisfied by the second priority during the display of k consecutive image frames, k being an integer greater than 1. The process of determining the rendering condition satisfied by the priority of a display area based on the target sub-policy includes: determining the rendering condition currently satisfied by the second priority based on target information and the target sub-policy, where the target information indicates which rendering conditions the second priority satisfied while the displayed k-1 image frames were generated, the k-1 image frames including the currently displayed image frame and being consecutive.
In the embodiment of the application, the target sub-policy indicates the rendering condition satisfied by the second priority during the display of k consecutive image frames. The rendering conditions that the second priority satisfied while the displayed k-1 image frames were generated are obtained and combined with the target sub-policy, so the rendering condition that the second priority satisfies when rendering the next image frame can be determined. This keeps the rendering conditions satisfied by the second priority over k consecutive image frames consistent with the target sub-policy and ensures the accuracy of the currently determined rendering condition.
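A sketch of this bookkeeping, assuming the target sub-policy's cadence is given as a list of conditions over k frames and the target information as the list of conditions already used for displayed frames; all names are hypothetical.

```python
def next_condition(history, schedule):
    """Decide the rendering condition for the next frame of one priority.

    `schedule` is the target sub-policy's cadence over k consecutive
    frames (e.g. ["actual", "predict"] for k = 2); `history` lists the
    conditions used for the frames already generated. The next frame
    takes the schedule entry at the next position in the cadence, so
    every window of k consecutive frames matches the sub-policy.
    """
    k = len(schedule)
    pos = len(history) % k  # position of the next frame within the cadence
    return schedule[pos]
```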
It should be noted that, in the embodiment of the present application, the rendering condition satisfied by the priority of each display area is determined based on the running state parameters of the local device; in another embodiment, the above step 302 need not be performed, and the rendering condition satisfied by the priority of each display area is determined in another manner.
303. And under the condition that the priority of the display area meets the first rendering condition, the terminal generates a next frame picture of the display area based on picture data of the next frame of the display area.
In one possible implementation, this step 303 includes: and under the condition that the priority of the display area meets the first rendering condition, calling a computing unit to render the picture data of the next frame of the display area, and obtaining the picture of the next frame of the display area.
In the embodiment of the application, the computing unit is configured to load picture data and render pictures from it, so the computing unit is called to load the picture data of the next frame of the display area and generate the next frame picture of the display area, ensuring the accuracy of the generated picture.
In one possible implementation, the process of acquiring the picture data of the next frame of the display area includes: based on the virtual element contained in the display area in the currently displayed image frame, element data of the next frame of the virtual element is obtained from the server, and the element data of the next frame of the virtual element contained in the display area is determined as picture data of the next frame of the display area.
The element data is used for rendering out the virtual element, for example, the virtual element is a virtual control, and the element data of the virtual element includes the position, the size, the color, the shape and the like of the virtual control.
In the embodiment of the application, while the terminal displays the current image frame in the display interface, the virtual element contained in any display area may change in the next image frame, and the server stores the data of the virtual element at each moment. The terminal therefore interacts with the server to obtain the next-frame element data of the virtual elements contained in the display area, and uses that element data as the next-frame picture data of the display area, ensuring the accuracy of the picture data.
Optionally, if the picture data of the next frame of the display area further includes background data, the process of acquiring it includes: based on the virtual element contained in the display area in the currently displayed image frame, acquiring from the server the next-frame element data of the virtual element and the background data of the region where the virtual element is located, and determining both as the next-frame picture data of the display area.
The background data of the region where the virtual element is located is used to generate the background image of that region. Obtaining both the next-frame element data of the virtual elements and the background data of their regions as the next-frame picture data ensures the accuracy of the subsequently generated next-frame picture of the display area.
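The client/server exchange above can be sketched as follows. The function names, the field names, and the in-process "server" table are hypothetical stand-ins; the real exchange is a network request to the game server.

```python
def fetch_element_data(server, element_id, frame_index):
    # Stand-in for the request to the game server; here we simply read a
    # preloaded table on a dictionary that plays the server's role.
    return server[(element_id, frame_index)]

def next_frame_picture_data(server, area_elements, frame_index,
                            include_background=False):
    """Collect next-frame element data (position, size, color, shape, ...)
    for every virtual element in the display area; optionally also the
    background data of the region each element occupies."""
    data = {"elements": {}, "background": {}}
    for element_id in area_elements:
        data["elements"][element_id] = fetch_element_data(
            server, element_id, frame_index)
        if include_background:
            data["background"][element_id] = server[
                (element_id, frame_index, "bg")]
    return data
```

The returned dictionary is the "picture data of the next frame of the display area" that step 303 hands to the computing unit.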
In one possible implementation, this step 303 includes: and generating a next frame picture of the display area based on picture data of the next frame of the display area under the condition that the next image frame is determined to be a non-key frame and the priority of the display area meets the first rendering condition.
In the embodiment of the application, as the terminal sequentially displays image frames in the display interface, some image frames may be key frames and others non-key frames. The content of a key frame is important content, so a key frame is rendered based on the picture data of the whole image frame, while a non-key frame is rendered according to the priority of each display area. Therefore, when the next image frame is a non-key frame and the priority of the display area satisfies the first rendering condition, the next frame picture of the display area is generated based on the picture data of the next frame of the display area, ensuring the accuracy of rendering.
304. And under the condition that the priority of the display area meets the second rendering condition, the terminal predicts the next frame picture of the display area based on the displayed image frame.
In one possible implementation, this step 304 includes: when the priority of the fourth display area satisfies the second rendering condition, predicting the next frame of the fourth display area based on the history frame of the fourth display area and the next frame of the fifth display area, wherein the fourth display area is any display area of the plurality of display areas, the priority of which satisfies the second rendering condition, and the fifth display area is a display area of the plurality of display areas, the priority of which satisfies the first rendering condition, and which is adjacent to the fourth display area.
In the embodiment of the application, the image frames displayed by the terminal have continuity: the pictures of the same display area across multiple image frames are continuous, and the pictures of adjacent display areas in the next image frame influence each other. The next frame picture of the fourth display area is therefore predicted based on the history frame pictures of the fourth display area and the next frame picture of the fifth display area, ensuring the accuracy of the prediction.
The history frame pictures of the fourth display area include the picture displayed in the fourth display area in the image frame currently displayed by the terminal and, optionally, the pictures displayed in the fourth display area in earlier image frames. The next frame picture of the fifth display area is obtained according to step 303 above.
Optionally, the process of predicting the next frame of picture of the fourth display area includes: and predicting the next frame picture of the fourth display area based on the virtual element in the history frame picture of the fourth display area and the virtual element in the next frame picture of the fifth display area.
In the embodiment of the application, the display state of a virtual element is continuous across multiple image frames, so the next frame picture of the fourth display area is predicted based on the virtual elements in its history frame pictures and the virtual elements in the next frame picture of the fifth display area, ensuring the accuracy of the prediction.
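One way to exploit the two continuities above is sketched below: linearly extrapolate a virtual element's position from the fourth area's history frames, then blend in a hint from the adjacent fifth area's already rendered next frame. The linear model and the blend weight are illustrative assumptions, not the patent's prediction method.

```python
def predict_position(prev, curr):
    """Linear extrapolation: continue the motion observed between the two
    most recent history frames."""
    return tuple(c + (c - p) for p, c in zip(prev, curr))

def predict_fourth_area(history, neighbor_hint=None, blend=0.25):
    """history: element positions in the previous and current frames of the
    fourth display area. neighbor_hint: position influence from the
    adjacent fifth display area's next frame picture, if available."""
    prev, curr = history[-2], history[-1]
    pred = predict_position(prev, curr)
    if neighbor_hint is not None:
        # Pull the extrapolation toward the rendered neighbor so adjacent
        # areas stay mutually consistent in the next image frame.
        pred = tuple((1 - blend) * x + blend * h
                     for x, h in zip(pred, neighbor_hint))
    return pred
```

A real client would predict richer state (size, color, animation phase) the same way; position keeps the sketch short.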
305. The terminal displays an image frame formed by a next frame picture of the plurality of display areas on the display interface.
In one possible implementation, where the plurality of display areas together form the display interface, step 305 includes: composing an image frame from the next frame pictures of the plurality of display areas based on their positions in the display interface, and displaying the image frame in the display interface.
In the embodiment of the application, the display interface is divided into a plurality of display areas and the next frame picture of each display area is acquired according to its priority, so the next frame pictures of the plurality of display areas are stitched into a complete image frame for display, ensuring the accuracy of the displayed image frame.
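The stitching in step 305 amounts to pasting each area's next frame picture at the area's position. The pixel-grid model below is an illustrative assumption; a real client would composite GPU surfaces rather than nested lists.

```python
def compose_frame(width, height, areas, background=0):
    """areas: list of (x, y, picture) where picture is a 2D list of pixels
    and (x, y) is the area's top-left position in the display interface.
    Returns the full image frame as a 2D list."""
    frame = [[background] * width for _ in range(height)]
    for x, y, picture in areas:
        for row, line in enumerate(picture):
            for col, pixel in enumerate(line):
                frame[y + row][x + col] = pixel
    return frame
```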
In one possible implementation, the process of acquiring the next image frame when the plurality of display areas are part of the display interface includes: based on the currently displayed image frame, predicting a background image in the next image frame, and forming the background image of the next image frame and the next frame picture of a plurality of display areas.
In the embodiment of the application, when the plurality of display areas are only part of the display interface, the next frame picture of a display area shows only the virtual elements contained in that area; those pictures are either rendered based on the element data of the virtual elements contained in the display area or predicted based on the displayed image frames. Since the picture content of consecutive image frames displayed by the terminal is continuous, the background image of the next image frame can be predicted based on the currently displayed image frame, and the background image is then combined with the pictures of the virtual elements in each display area to obtain the next image frame, ensuring its accuracy.
It should be noted that, in the embodiment of the present application, the background image of the next image frame is predicted based on the currently displayed image frame; in another embodiment, background data of the next image frame can also be obtained from the server, and the background image of the next image frame is rendered based on that background data.
In one possible implementation, the process of displaying the next image frame when the plurality of display areas are part of the display interface includes: the next image frame is constructed based on the next frame picture of the plurality of display areas and the next frame background picture.
In the embodiment of the application, each display area contains one or more virtual elements, so the acquired next frame pictures cover the display areas that contain virtual elements, while the next-frame background picture is shown in the remaining regions that contain no virtual elements. The next image frame is therefore composed of the next frame pictures of the plurality of display areas and the next-frame background picture, ensuring its accuracy.
In the scheme provided by the embodiment of the application, the degree to which picture changes in the plurality of display areas of the display interface influence the virtual game is determined based on the control instruction for the virtual object and the currently displayed image frame; that is, the priorities of the plurality of display areas are determined. For a display area whose priority satisfies the first rendering condition, the next frame picture is generated from the next-frame picture data corresponding to that area; for a display area whose priority satisfies the second rendering condition, the next frame picture is predicted from the rendered image frames. In this way, only the display areas whose picture changes strongly influence the virtual game are rendered from picture data, while the next frame pictures of the other display areas are predicted, which reduces the computation required to render the next image frame as much as possible while ensuring the accuracy of the rendered pictures, thereby lowering the requirements on the device.
It should be noted that, on the basis of the embodiment shown in fig. 3, there may be overlapping areas in the multiple display areas in the display interface, and before the next frame of picture in each display area is acquired, the display area with overlapping areas is further processed, so that the next frame of picture in each display area is generated by using the priority of the processed display area. In one possible implementation, the process of processing the display area where there is overlap includes the following two ways.
The first way is: under the condition that a plurality of first display areas have overlapping areas, the plurality of first display areas are fused to obtain a fused display area, wherein the first display area is any one of the plurality of display areas; determining that the priority of the fusion display area meets the first rendering condition under the condition that the priority of any first display area meets the first rendering condition; and under the condition that the priorities of the plurality of first display areas meet the second rendering condition, determining that the priorities of the fusion display areas meet the second rendering condition.
In the embodiment of the application, after the plurality of display areas are determined from the display interface, whether the display areas overlap can be determined. When a plurality of first display areas are determined to overlap, they are fused into one display area, i.e. the fused display area, and the rendering condition satisfied by the priority of the fused display area is determined from the rendering conditions satisfied by the priorities of the first display areas. The next frame picture can then be generated once for the fused display area according to that rendering condition, so a next frame picture need not be generated separately for each first display area, overlap between the next frame pictures of the first display areas is avoided, and the accuracy of the subsequently generated picture is ensured.
Here, the plurality of first display areas having overlapping areas means that the first display areas overlap one another. For example, among 3 overlapping first display areas, the 1st may overlap the 2nd and the 2nd overlap the 3rd; alternatively, the 1st may overlap both the 2nd and the 3rd, with the 2nd and the 3rd also overlapping each other.
The second way is: determining that the priority of the second display area meets the second rendering condition when the plurality of first display areas have overlapping areas and the rendering conditions met by the priorities of the plurality of first display areas are different; the second display area is a display area except a third display area in the plurality of first display areas, and the third display area is a display area with priority meeting the first rendering condition in the plurality of first display areas.
The sum of the areas occupied by the plurality of first display areas in the display interface is equal to the sum of the second display area and the third display area, and the third display area and the second display area do not have an overlapping area.
In the embodiment of the application, when the plurality of first display areas overlap and their priorities satisfy different rendering conditions, and considering that different rendering conditions correspond to different rendering modes, the display areas whose priority satisfies the first rendering condition are first determined from the plurality of first display areas, and the remaining first display areas are taken as the second display area, whose priority is determined to satisfy the second rendering condition. When the next image frame is generated, this ensures the accuracy of the next frame pictures while reducing the computation of the next image frame as much as possible, so that the picture rendering process demands fewer computing resources, which lightens the burden on the device, lowers the requirements on it, and improves its picture rendering performance.
As shown in fig. 4, the display interface includes display area 1, display area 2 and display area 3, in which the virtual elements all change under the influence of the control instruction for the virtual object: the attack state of virtual object 1 in display area 1 changes, the state of virtual object 2 in display area 2 changes under the attack of virtual object 1, and virtual object 3 in display area 3 runs. When display area 1 and display area 2 overlap, the priority of display area 1 satisfies the first rendering condition, and the priorities of display area 2 and display area 3 satisfy the second rendering condition, display area 4 is determined based on display areas 1 and 2 as the part of display area 2 that does not overlap display area 1. When the next frame picture of each display area is rendered, the next frame picture of display area 1 is generated in the rendering mode corresponding to the first rendering condition, and the next frame pictures of display areas 3 and 4 are generated in the rendering mode corresponding to the second rendering condition.
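The two overlap-handling strategies can be sketched with axis-aligned rectangles `(x, y, w, h)`. This is an illustrative simplification: the geometric subtraction that produces display area 4 in fig. 4 is omitted, and only the priority bookkeeping is shown.

```python
def overlaps(a, b):
    """Whether two rectangles (x, y, w, h) have an overlapping area."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def fuse_priority(conditions):
    """First way: the fused display area satisfies the first rendering
    condition if any overlapping member does; otherwise the second."""
    return "first" if "first" in conditions else "second"

def split_second_area(first_areas, conditions):
    """Second way: when overlapping areas satisfy different conditions,
    the areas whose priority meets the first condition form the third
    display area, and the rest form the second display area (rendered by
    prediction)."""
    third = [a for a, c in zip(first_areas, conditions) if c == "first"]
    second = [a for a, c in zip(first_areas, conditions) if c != "first"]
    return third, second
```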
It should be noted that the embodiment shown in fig. 3 is described taking the next image frame as a non-key frame as an example; in another embodiment, when the next image frame is a key frame, it is displayed in another manner.
In one possible implementation, the process of displaying the next image frame includes: generating a next image frame based on picture data of the next frame of the display interface under the condition that the next image frame is determined to be a key frame; in the display interface, the next image frame is displayed.
In the embodiment of the application, the content contained in the key frame is important content, and the key frame is rendered according to the rendering mode corresponding to the first rendering condition, namely, the rendering is performed based on the whole frame of picture data corresponding to the key frame, so that the accuracy of the next image frame is ensured, and the accuracy of the rendering is ensured.
Optionally, the process of determining whether the next image frame is a key frame includes the following two ways.
The first way is: and under the condition that the target proportion is larger than the threshold value, determining the next image frame as a key frame, wherein the target proportion is the ratio of the number of the sixth display areas to the total number of the plurality of display areas or the ratio of the sum of the areas of the sixth display areas to the area of the display interface, and the sixth display area is the display area with the priority meeting the first rendering condition in the plurality of display areas.
In the embodiment of the application, once the priorities of the plurality of display areas contained in the display interface are determined, the rendering condition satisfied by each priority can be determined, and thus the display areas whose priority satisfies the first rendering condition can be identified, together with their number or total area. When the ratio of the number of such display areas to the total number of display areas in the display interface is greater than the threshold, or the ratio of their total area to the area of the display interface is greater than the threshold, the next image frame is determined to be a key frame, so that it can be rendered directly as a whole frame to ensure the rendering efficiency of the image frame.
For example, the display interface includes a plurality of display areas, and when a ratio of the number of display areas in the display interface having a priority that satisfies the first rendering condition to the total number of display areas in the display interface is greater than 90%, or when a ratio of the area of the display area in the display interface having a priority that satisfies the first rendering condition to the area of the display interface is greater than 90%, the next image frame is determined to be a key frame.
The second way is: in the case where the n image frames that have been displayed are obtained based on only the second rendering condition, it is determined that the next image frame is a key frame, the n image frames include the image frame that is currently displayed and the n image frames are consecutive, and n is an integer greater than 0.
In the embodiment of the application, when the displayed n image frames were obtained based only on the second rendering condition, i.e. all n frames were predicted, the next image frame is determined to be a key frame so that it can be rendered from the whole-frame rendering data of a key frame. This avoids displaying predicted image frames for a long time, which would make the displayed image frames inaccurate, and thus ensures their accuracy.
The n image frames include the image frame currently displayed in the display interface. For any one of the n image frames, the plurality of display areas were determined from the display interface in the manner described above, and the image frame was predicted because the priorities of all of those display areas satisfied the second rendering condition.
In one possible implementation, the rendering mode of each image frame is recorded while the plurality of image frames are displayed, and whether an image frame is a key frame can be determined by querying the recorded rendering modes.
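The two key-frame tests can be sketched as follows. The 90% threshold matches the example above; the history window `n` and the data shapes are assumptions.

```python
def is_key_frame_by_ratio(areas, threshold=0.9):
    """First way: areas is a list of (area_size, condition). The next
    image frame is a key frame when either the count ratio or the area
    ratio of first-condition display areas exceeds the threshold."""
    first = [(s, c) for s, c in areas if c == "first"]
    count_ratio = len(first) / len(areas)
    area_ratio = sum(s for s, _ in first) / sum(s for s, _ in areas)
    return count_ratio > threshold or area_ratio > threshold

def is_key_frame_by_history(recent_modes, n=5):
    """Second way: force a key frame when the last n displayed frames
    were all obtained by prediction (second rendering condition only),
    reading the recorded rendering mode of each frame."""
    return len(recent_modes) >= n and all(
        m == "second" for m in recent_modes[-n:])
```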
It should be noted that, on the basis of the embodiments shown in figs. 2 to 3, while the terminal displays image frames, the frame rate at which the local device displays them is adjusted in combination with the running parameters of the local device.
In one possible implementation, the process of adjusting the current frame rate of the home device includes: under the condition that the running state parameters of the local terminal equipment belong to a first state parameter interval, acquiring a current frame rate and a maximum frame rate, wherein the current frame rate is the frame rate according to which the image frames are currently rendered; in the case that the current frame rate is less than the maximum frame rate, the current frame rate is increased.
In the embodiment of the application, the terminal displays image frames at a certain frame rate, i.e. it displays the corresponding number of image frames per unit time. Displaying image frames at a high frame rate affects the running state of the terminal, such as its temperature or load rate; therefore, the maximum frame rate is obtained during the display of image frames, and the frame rate used for display is adjusted in real time based on the running state parameters of the local device to ensure the display effect.
In the embodiment of the application, the running state parameters of the local device belonging to the first state parameter interval means that the local device can achieve good performance; therefore, when the current frame rate is less than the maximum frame rate, the current frame rate is increased to ensure the display effect of the terminal.
For example, the running state parameters of the local device include temperature, and the first state parameter interval is temperature interval 1. When the temperature of the local device belongs to temperature interval 1, the current running state of the device is good and a higher frame rate can be supported, so when the current frame rate is less than the maximum frame rate, the current frame rate is increased to ensure the display effect of the terminal.
In another possible implementation manner, the process of adjusting the current frame rate of the local device includes: under the condition that the running state parameters belong to a second state parameter interval, acquiring a first time length and a second time length, wherein the first time length is the time length for currently rendering one image frame, the second time length is the time length for rendering one image frame according to the current frame rate, and the smallest running state parameter in the second state parameter interval is larger than the largest running state parameter in the first state parameter interval; and in the case that the first time length is longer than the second time length, reducing the current frame rate.
In the embodiment of the application, when the running state parameters of the local device belong to the second state parameter interval, the time currently taken to render one image frame and the time available to render one image frame at the current frame rate are determined, in order to decide whether the local device, in its current running state, can keep displaying image frames at the current frame rate. When the first duration is longer than the second duration, the current running state cannot support the current frame rate, so the current frame rate is reduced to cut the amount of computation, avoid frame loss, and preserve the display effect.
The minimum running state parameter of the second state parameter interval is greater than the maximum running state parameter of the first state parameter interval; when the running state parameters belong to the second interval, the running state of the local device is worse than when they belong to the first interval, and the frame rate the device can support is lower.
For example, the running state parameters of the local device include temperature, and the second state parameter interval is temperature interval 2. When the temperature of the local device belongs to temperature interval 2, the current running state of the device is poor and only a lower frame rate can be supported, so the time actually required to render an image frame at the current moment is compared with the per-frame time budget at the current frame rate. If the current frame rate is 120 frames per second, the per-frame budget is 1000 ms / 120 ≈ 8.3 ms; if rendering an image frame currently takes longer than 8.3 ms, the current frame rate is reduced.
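The two adjustment rules can be combined into one control step, sketched below. The interval bounds, the step size, and the 120 fps cap are illustrative assumptions; only the 1000 ms / 120 ≈ 8.3 ms budget arithmetic comes from the example above.

```python
MAX_FRAME_RATE = 120
INTERVAL_1 = (0.0, 40.0)    # assumed "normal" temperature interval 1
INTERVAL_2 = (40.0, 100.0)  # assumed hotter temperature interval 2

def adjust_frame_rate(temp, current_fps, render_time_ms, step=10):
    """Raise the rate while the device is cool and below the cap; lower
    it while hot if rendering one frame currently takes longer than the
    per-frame budget at the current frame rate."""
    if INTERVAL_1[0] <= temp < INTERVAL_1[1]:
        if current_fps < MAX_FRAME_RATE:
            return min(current_fps + step, MAX_FRAME_RATE)
    elif INTERVAL_2[0] <= temp < INTERVAL_2[1]:
        budget_ms = 1000.0 / current_fps   # e.g. 1000 / 120 ≈ 8.3 ms
        if render_time_ms > budget_ms:
            return max(current_fps - step, step)
    return current_fps
```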
In the embodiment of the application, the pictures change in real time throughout the running of the game, but the user's focus differs between scenes. In a combat scene, for example, a stable frame rate and high-definition image quality directly affect the user's control experience and thus the final game result, and the combat scene carries the heaviest load of the whole game; therefore, while the terminal displays the image frames of a combat scene, the image frames are displayed according to the scheme provided by the embodiment of the application.
Based on the embodiments shown in fig. 2 to 4, the embodiment of the present application provides a schematic diagram of another implementation environment. As shown in fig. 5, the implementation environment includes a terminal, a game client installed on the terminal, and a game server. The game client comprises a main logic module, a behavior acquisition module, a rendering strategy control module and a state acquisition module; the game server comprises a storage module and a game logic module; the terminal comprises a temperature control module and a display module.
The temperature control module of the terminal is used for detecting the temperature of the terminal in the game running process, and is also used for responding to the state query request of the game client and feeding back the temperature of the terminal to the game client. The display module of the terminal is used for receiving the image frames of the game client and displaying the image frames on a terminal screen.
The main logic module of the game client is used for communicating with the game server; it can acquire the priority division strategy and the rendering strategy from the server and cache them locally. The behavior acquisition module of the game client is used for acquiring, during the game, a control instruction triggered by the player, a control instruction triggered by another virtual object, or a control instruction automatically triggered by the game server. The state acquisition module of the game client is used for interacting with the terminal during the game and acquiring the temperature of the terminal from the temperature control module. The rendering strategy control module of the game client is used for dividing the display interface into areas based on the information acquired by the behavior acquisition module and the state acquisition module and the strategies cached by the main logic module, performing differentiated rendering according to the priorities of the display areas to obtain an image frame, and transmitting the image frame to the display module of the terminal for display.
The game logic module in the game server is used for interacting with the game client, responding to a strategy acquisition request sent by the game client through the main logic module, acquiring a priority division strategy and a rendering strategy from the storage module and issuing the priority division strategy and the rendering strategy to the game client; and acquiring the uploaded priority division strategy and rendering strategy, and storing the priority division strategy and rendering strategy through a storage module. The storage module in the game server is used for storing the prioritization policy and the rendering policy.
On the basis of the embodiments shown in fig. 2 to 5, the embodiment of the present application further provides a flow for initializing a game, as shown in fig. 6, which includes the following steps.
601. After the terminal starts a virtual game through the game client, the initialization flow is started.
602. The terminal sends a policy acquisition request to the game server through the game client, wherein the policy acquisition request is used for requesting to acquire a priority division policy and a rendering policy.
603. And the server responds to the policy acquisition request, acquires the prioritizing policy and the rendering policy from the storage module and sends the prioritizing policy and the rendering policy to the terminal.
604. The terminal receives the priority division strategy and the rendering strategy sent by the game server through the game client, and caches the priority division strategy and the rendering strategy in the local memory.
605. The terminal detects the running state of the local terminal device in real time so as to monitor changes in the system temperature.
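The initialization flow above can be sketched as follows. This is a minimal illustration only; the class and method names (GameServer, GameClient, handle_policy_request) and the placeholder policy contents are assumptions, not names or data from the patent.

```python
# Hypothetical sketch of steps 601-605: the client requests both policies
# from the server on game start and caches them in local memory.
class GameServer:
    """Stands in for the game logic module plus the storage module."""
    def __init__(self):
        self._store = {
            "prioritization_policy": {("attack", "virtual_object"): 3},
            "rendering_policy": {"T1": "raise frame rate"},
        }

    def handle_policy_request(self):
        # Step 603: fetch both policies from storage and send them back.
        return dict(self._store)


class GameClient:
    def __init__(self, server):
        self.server = server
        self.cache = {}  # local in-memory cache

    def initialize(self):
        # Step 602: send the policy acquisition request to the server.
        policies = self.server.handle_policy_request()
        # Step 604: cache the received policies locally.
        self.cache.update(policies)
        return self.cache
```

After `initialize()` returns, the client can answer all later priority and rendering-condition queries from its cache without further round trips to the server.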
On the basis of the embodiments shown in fig. 2 to 6 described above, an embodiment of the present application provides a flowchart of still another image frame display method, as shown in fig. 7, which includes the following steps.
701. During the virtual game, the terminal periodically obtains its current temperature through the game client.
702. The terminal determines its current temperature level through the game client, that is, the temperature interval to which the current temperature belongs.
703. If the current temperature of the terminal belongs to temperature interval T1, i.e., the temperature is in the normal range, determine whether the current frame rate reaches the maximum frame rate; if so, perform step 705; otherwise, increase the current frame rate.
The maximum temperature of the temperature interval T1 is equal to the minimum temperature of the temperature interval T2, and the maximum temperature of the temperature interval T2 is equal to the minimum temperature of the temperature interval T3.
704. If the current temperature of the terminal belongs to temperature interval T3, determine whether rendering of the image frame can be completed within the target time; if so, perform step 705; otherwise, reduce the current frame rate. The target time is the duration of rendering one image frame at the current frame rate.
705. It is determined whether the next image frame is a key frame.
In the embodiment of the present application, whether the next image frame is a key frame can be determined based on the rendering modes of the displayed image frames, in the same way as the second mode of the key frame determination process described above, which will not be repeated here.
706. If the next image frame is a key frame, the entire image frame is directly rendered based on picture data of the next image frame.
707. If the next image frame is not the key frame, determining the priority of each display area based on the priority division strategy and the rendering strategy, and generating the next frame picture of each display area based on the rendering condition met by the priority of each display area, thereby obtaining the next image frame.
In the embodiment of the present application, when the rendering condition satisfied by the priority of each display area is determined, if the target ratio is greater than the threshold, the next image frame is determined to be a key frame, and the above-mentioned step 706 is executed instead.
The target ratio is a ratio of the number of the sixth display areas to the total number of the plurality of display areas, or a ratio of the sum of areas of the sixth display areas to the area of the display interface, and the sixth display area is a display area with priority meeting the first rendering condition in the plurality of display areas.
708. And the terminal sends the generated image frames to the display module for display through the game client.
709. The process flow of the next image frame is entered in the above manner.
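The per-frame loop of steps 701–709 can be sketched as follows. The interval bounds `T1_MAX`/`T2_MAX` and the one-fps adjustment step are assumed values for illustration; the patent only requires that T1 < T2 < T3 and that the frame rate is raised toward the maximum in T1 and lowered in T3 when a frame misses its time budget.

```python
# Hypothetical sketch of steps 701-709: temperature-driven frame rate
# adjustment followed by key-frame vs. differentiated rendering.
T1_MAX, T2_MAX = 40.0, 45.0  # upper bounds of intervals T1 and T2 (assumed, deg C)

def adjust_frame_rate(temp_c, current_fps, max_fps, last_render_ms):
    """Steps 703-704: tune the frame rate from the temperature interval."""
    target_ms = 1000.0 / current_fps  # target time: one frame at the current rate
    if temp_c <= T1_MAX:              # interval T1: temperature is normal
        if current_fps < max_fps:
            return current_fps + 1    # raise toward the maximum frame rate
    elif temp_c > T2_MAX:             # interval T3: device is too hot
        if last_render_ms > target_ms:
            return current_fps - 1    # frame misses the budget: slow down
    return current_fps                # interval T2, or no change needed

def render_next_frame(is_key_frame, render_full_frame, render_by_priority):
    """Steps 705-707: key frames are rendered whole, others differentiated."""
    return render_full_frame() if is_key_frame else render_by_priority()
```

In interval T2 (between the two bounds) the frame rate is intentionally left unchanged, which matches the flow above: only T1 raises the rate and only T3 lowers it.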
In the embodiment of the application, the terminal adopts a high frame rate to improve the display effect of image frames, making the user experience smoother. However, high-frame-rate display places an excessive load on the CPU and GPU, causing the device temperature to rise sharply, especially on mobile terminals, which seriously threatens the safe operation of the terminal. The scheme provided by the embodiment of the application keeps the load of the terminal within a reasonable range while preserving the user's game experience, thereby avoiding abnormal terminal temperature. During game running, the display areas of the game picture are divided, the key display areas are actually rendered, and the non-key areas are intelligently predicted from the previous frame image and the key areas to obtain the image frame, reducing the amount of computation needed to generate each image frame as much as possible.
According to the scheme provided by the embodiment of the application, display areas are classified by their influence on the user experience: the areas that directly affect the user experience are actually rendered, and the other areas are intelligently predicted. This ensures that the game runs at a high frame rate and the user has a good visual experience, while greatly reducing the computation for rendering each image frame, lowering the load of the computing unit, avoiding an excessively rapid temperature rise, and allowing the high frame rate to be sustained.
According to the scheme provided by the embodiment of the application, through a dynamic frame image prediction technology, the game experience of the user is preserved while the system load is kept within a reasonable range, so that abnormal terminal temperature is avoided. During game running, the image areas with the greatest influence on the user's visual experience are actually rendered to provide the best visual experience, and a complete picture is generated through intelligent prediction for the other non-key areas, so that the rendering computation is greatly reduced; the system load does not rise excessively while the game is displayed at a high frame rate, and the user always enjoys a smooth high-frame-rate game experience.
Fig. 8 is a schematic structural diagram of an image frame display device according to an embodiment of the present application, as shown in fig. 8, the device includes:
A determining module 801, configured to determine priorities of a plurality of display areas in a display interface based on a control instruction for a virtual object in a virtual scene and an image frame currently displayed in the display interface, where the currently displayed image frame includes a scene picture of the virtual scene, and the priorities indicate a degree of influence of picture changes of the display areas on a virtual game;
A generating module 802, configured to generate a next frame of picture of the display area based on picture data of a next frame of the display area if the priority of the display area satisfies the first rendering condition;
A prediction module 803, configured to predict a next frame of the display area based on the displayed image frame, if the priority of the display area satisfies the second rendering condition;
the display module 804 is configured to display, on the display interface, an image frame formed by a next frame of the plurality of display areas.
In one possible implementation, the determining module 801 is configured to determine priorities of the plurality of display areas based on the control instruction and a virtual element in the currently displayed image frame, where the virtual element includes at least one of a virtual object, a virtual control, or a virtual map.
In another possible implementation manner, the determining module 801 is configured to query a prioritization policy based on the control instruction and the virtual elements in the currently displayed image frame, where the prioritization policy indicates a priority of a display area including each virtual element under the action of each control instruction; under the condition that the first priority is inquired, setting the priority of the display area containing the virtual element as the first priority, wherein the first priority is the priority of the display area containing the virtual element in the priority division strategy under the action of the control instruction; the priority of the remaining display areas in the display interface is set to the lowest priority.
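The prioritization-policy lookup described above can be sketched as follows. The policy table, instruction names, and numeric priority values are illustrative assumptions, not the patent's actual data.

```python
# Hypothetical sketch: given a control instruction and the virtual element
# each display area contains, query the prioritization policy for a first
# priority; remaining areas get the lowest priority.
LOWEST_PRIORITY = 0

PRIORITIZATION_POLICY = {
    # (control instruction, virtual element) -> priority of its display area
    ("attack", "virtual_object"): 3,
    ("attack", "virtual_control"): 2,
    ("move", "virtual_map"): 3,
}

def assign_priorities(instruction, areas):
    """areas maps display-area id -> the virtual element it contains (or None)."""
    priorities = {}
    for area_id, element in areas.items():
        first = PRIORITIZATION_POLICY.get((instruction, element))
        # If a first priority is queried, set it; otherwise the lowest priority.
        priorities[area_id] = first if first is not None else LOWEST_PRIORITY
    return priorities
```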
In another possible implementation manner, the determining module 801 is further configured to determine, based on the operation state parameter of the local device, priorities of the multiple display areas, and a rendering policy, a rendering condition that is satisfied by the priority of each display area, where the rendering condition includes a first rendering condition or a second rendering condition, and the rendering policy indicates a rendering condition that is satisfied by each priority for different operation state parameters of the device.
In another possible implementation manner, the rendering policy includes sub-policies corresponding to multiple state parameter intervals, where the sub-policies corresponding to the state parameter intervals indicate rendering conditions satisfied by each priority when an operation state parameter of the device belongs to the state parameter interval; a determining module 801, configured to query a rendering policy based on an operation state parameter of the local device, to obtain a target sub-policy, where the operation state parameter belongs to a state parameter interval corresponding to the target sub-policy; and inquiring the target sub-strategy based on the priority of the display area to obtain rendering conditions met by the priority of the display area.
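The two-step query above can be sketched as follows: the running state parameter selects the target sub-policy, then the area's priority selects the rendering condition. The temperature intervals and the priority-to-condition mapping are illustrative assumptions.

```python
# Hypothetical sketch: a rendering policy as a list of sub-policies, one per
# state parameter interval; hotter intervals demote more priorities from the
# first (real rendering) condition to the second (prediction) condition.
RENDERING_POLICY = [
    # (interval low, interval high) -> sub-policy: priority -> condition
    ((0.0, 40.0), {3: "first", 2: "first", 1: "first", 0: "first"}),
    ((40.0, 45.0), {3: "first", 2: "first", 1: "second", 0: "second"}),
    ((45.0, 99.0), {3: "first", 2: "second", 1: "second", 0: "second"}),
]

def rendering_condition(state_param, priority):
    for (low, high), sub_policy in RENDERING_POLICY:
        if low <= state_param < high:  # the target sub-policy is found
            return sub_policy[priority]
    raise ValueError("running state parameter outside all intervals")
```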
In another possible implementation, the target sub-policy further indicates a rendering condition satisfied by the second priority during display of consecutive k image frames, k being an integer greater than 1; a determining module 801, configured to determine, based on target information and a target sub-policy, a rendering condition currently met by the second priority, where the target information indicates a situation of the rendering condition met by the second priority in generating k-1 image frames that are displayed, where the k-1 image frames include the currently displayed image frame and the k-1 image frames are consecutive.
In another possible implementation, as shown in fig. 9, the apparatus further includes:
The fusing module 805 is configured to fuse the plurality of first display areas to obtain a fused display area when the plurality of first display areas have overlapping areas, where the first display area is any one of the plurality of display areas;
The determining module 801 is further configured to determine that the priority of the fused display area meets the first rendering condition if the priority of any one of the first display areas meets the first rendering condition;
The determining module 801 is further configured to determine that the priority of the fused display area satisfies the second rendering condition when the priorities of the plurality of first display areas satisfy the second rendering condition.
In another possible implementation manner, the determining module 801 is further configured to determine that the priority of the second display area meets the second rendering condition when there is an overlapping area in the plurality of first display areas and the rendering conditions met by the priorities of the plurality of first display areas are different; the second display area is a display area except a third display area in the plurality of first display areas, and the third display area is a display area with priority meeting the first rendering condition in the plurality of first display areas.
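The fusion rules above can be sketched as follows: overlapping areas are merged into one bounding region, which satisfies the first rendering condition if ANY member does and the second only when ALL members do. Representing display areas as axis-aligned rectangles is an illustrative assumption.

```python
# Toy sketch of the fusing module: merge overlapping display areas into one
# bounding rectangle and combine their rendering conditions.
def fuse_areas(areas):
    """areas: list of (x0, y0, x1, y1, condition) display-area rectangles."""
    x0 = min(a[0] for a in areas)
    y0 = min(a[1] for a in areas)
    x1 = max(a[2] for a in areas)
    y1 = max(a[3] for a in areas)
    # Any first-condition member promotes the whole fused area to real rendering.
    cond = "first" if any(a[4] == "first" for a in areas) else "second"
    return (x0, y0, x1, y1, cond)
```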
In another possible implementation manner, the prediction module 803 is configured to predict, based on the history frame of the fourth display area and the next frame of the fifth display area, the next frame of the fourth display area if the priority of the fourth display area satisfies the second rendering condition, where the fourth display area is any one of the plurality of display areas whose priority satisfies the second rendering condition, and the fifth display area is a display area of the plurality of display areas whose priority satisfies the first rendering condition and is adjacent to the fourth display area.
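The prediction just described can be illustrated with a toy blend: the fourth area's next frame is estimated from its own history frame and the freshly rendered next frame of an adjacent fifth area. The weighted average is purely illustrative; the patent does not specify the prediction model.

```python
# Toy sketch of the prediction module: blend an area's previous frame with
# its rendered neighbor's next frame (flat lists stand in for pixel data).
def predict_area(history_pixels, neighbor_pixels, weight=0.75):
    """Weighted blend of the fourth area's history and the fifth area's frame."""
    return [
        weight * h + (1.0 - weight) * n
        for h, n in zip(history_pixels, neighbor_pixels)
    ]
```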
In another possible implementation manner, the generating module 802 is configured to generate, based on picture data of a next frame of the display area, a next frame picture of the display area if it is determined that the next image frame is a non-key frame and the priority of the display area satisfies the first rendering condition.
In another possible implementation manner, the generating module 802 is further configured to generate the next image frame based on the picture data of the next frame of the display interface if it is determined that the next image frame is a key frame;
The display module 804 is further configured to display a next image frame in the display interface.
In another possible implementation manner, the determining module 801 is further configured to determine that the next image frame is a key frame if the target proportion is greater than a threshold, where the target proportion is a ratio of a number of sixth display areas to a total number of the plurality of display areas, or a ratio of a sum of areas of the sixth display areas to an area of the display interface, and the sixth display area is a display area, where priorities of the plurality of display areas satisfy the first rendering condition; or alternatively
The determining module 801 is further configured to determine, when the displayed n image frames are obtained based on the second rendering condition only, that the next image frame is a key frame, the n image frames include a currently displayed image frame and the n image frames are consecutive, and n is an integer greater than 0.
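The two key-frame triggers above can be sketched as follows: either the target ratio of first-condition (sixth) display areas exceeds a threshold, or the last n displayed frames were all obtained under the second rendering condition. The threshold and n are assumed values.

```python
# Hypothetical sketch of the key-frame decision combining both triggers.
def is_key_frame(num_first_cond_areas, num_areas, recent_conditions,
                 threshold=0.5, n=8):
    target_ratio = num_first_cond_areas / num_areas
    if target_ratio > threshold:  # sixth-display-area ratio trigger
        return True
    last_n = recent_conditions[-n:]
    # n consecutive displayed frames obtained from prediction alone
    return len(last_n) == n and all(c == "second" for c in last_n)
```

Forcing a full key frame after n predicted-only frames bounds the drift that accumulates when areas are repeatedly predicted rather than rendered.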
In another possible implementation, as shown in fig. 9, the apparatus further includes:
An obtaining module 806, configured to obtain, when the operation state parameter of the local device belongs to the first state parameter interval, a current frame rate and a maximum frame rate, where the current frame rate is a frame rate according to which an image frame is currently rendered;
an adjustment module 807 for increasing the current frame rate in case the current frame rate is smaller than the maximum frame rate.
In another possible implementation manner, the obtaining module 806 is further configured to obtain a first duration and a second duration when the running state parameter belongs to the second state parameter interval, where the first duration is a duration of currently rendering one image frame, the second duration is a duration of rendering one image frame according to the current frame rate, and a minimum running state parameter in the second state parameter interval is greater than a maximum running state parameter in the first state parameter interval;
the adjusting module 807 is further configured to reduce the current frame rate if the first time period is longer than the second time period.
It should be noted that: the image frame display device provided in the above embodiment is only exemplified by the division of the above functional modules, and in practical application, the above functional allocation may be performed by different functional modules according to needs, that is, the internal structure of the computer device is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the image frame display device and the image frame display method provided in the foregoing embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments, which are not repeated herein.
The embodiment of the application also provides a computer device, which comprises a processor and a memory, wherein at least one computer program is stored in the memory, and the at least one computer program is loaded and executed by the processor to realize the operations executed by the image frame display method of the embodiment.
Optionally, the computer device is provided as a terminal. Fig. 10 shows a block diagram of a terminal 1000 according to an exemplary embodiment of the present application. Terminal 1000 includes: a processor 1001 and a memory 1002.
The processor 1001 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1001 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor. The main processor, also called a CPU, is a processor for processing data in the awake state; the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1001 may be integrated with a GPU responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 1001 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. Memory 1002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1002 is used to store at least one computer program for execution by processor 1001 to implement the image frame display method provided by the method embodiments of the present application.
In some embodiments, terminal 1000 can optionally further include: a peripheral interface 1003, and at least one peripheral. The processor 1001, the memory 1002, and the peripheral interface 1003 may be connected by a bus or signal line. The various peripheral devices may be connected to the peripheral device interface 1003 via a bus, signal wire, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, a display 1005, a camera assembly 1006, audio circuitry 1007, and a power supply 1008.
Peripheral interface 1003 may be used to connect I/O (Input/Output) related at least one peripheral to processor 1001 and memory 1002. In some embodiments, processor 1001, memory 1002, and peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 1001, memory 1002, and peripheral interface 1003 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1004 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1004 communicates with a communication network and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1004 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1004 may further include NFC (Near Field Communication) related circuits, which is not limited by the present application.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 1005 is a touch screen, it also has the ability to capture touch signals on or above its surface. The touch signal may be input to the processor 1001 as a control signal for processing. At this time, the display 1005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display 1005, disposed on the front panel of terminal 1000; in other embodiments, there may be at least two displays 1005, disposed on different surfaces of terminal 1000 or in a folded configuration; in still other embodiments, the display 1005 may be a flexible display disposed on a curved or folded surface of terminal 1000. The display 1005 may even be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display 1005 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. The front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth camera, a wide-angle camera, or a telephoto camera, so that the main camera can be fused with the depth camera to realize a background blurring function, or with the wide-angle camera to realize panoramic shooting, Virtual Reality (VR) shooting, or other fused shooting functions. In some embodiments, the camera assembly 1006 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and environments, converting the sound waves into electric signals, and inputting the electric signals to the processor 1001 for processing, or inputting the electric signals to the radio frequency circuit 1004 for voice communication. For purposes of stereo acquisition or noise reduction, the microphone may be multiple, each located at a different portion of terminal 1000. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuit 1007 may also include a headphone jack.
Power supply 1008 is used to power the various components in terminal 1000. The power supply 1008 may be an alternating current, a direct current, a disposable battery, or a rechargeable battery. When the power supply 1008 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1000 can further include one or more sensors 1009. The one or more sensors 1009 include, but are not limited to: a temperature sensor 1010. Temperature sensor 1010 is capable of detecting the temperature of terminal 1000.
Those skilled in the art will appreciate that the structure shown in fig. 10 is not limiting and that terminal 1000 can include more or fewer components than shown, or certain components can be combined, or a different arrangement of components can be employed.
Optionally, the computer device is provided as a server. Fig. 11 is a schematic structural diagram of a server according to an embodiment of the present application, where the server 1100 may have a relatively large difference due to different configurations or performances, and may include one or more processors (Central Processing Units, CPUs) 1101 and one or more memories 1102, where the memories 1102 store at least one computer program, and the at least one computer program is loaded and executed by the processors 1101 to implement the methods provided in the above-mentioned method embodiments. Of course, the server may also have a wired or wireless network interface, a keyboard, an input/output interface, and other components for implementing the functions of the device, which are not described herein.
The embodiment of the present application also provides a computer-readable storage medium having stored therein at least one computer program loaded and executed by a processor to implement the operations performed by the image frame display method of the above embodiment.
The embodiment of the present application also provides a computer program product, which includes a computer program that, when executed by a processor, implements the operations performed by the image frame display method of the above embodiment.
Those of ordinary skill in the art will appreciate that all or a portion of the steps implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the above storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing description of the embodiments of the application is merely illustrative of the principles of the embodiments of the present application, and various modifications, equivalents, improvements, etc. may be made without departing from the spirit and principles of the embodiments of the application.

Claims (29)

1. An image frame display method, the method comprising:
Determining priorities of a plurality of display areas in a display interface based on a control instruction for a virtual object in a virtual scene and an image frame currently displayed in the display interface, wherein the currently displayed image frame comprises a scene picture of the virtual scene, the priorities indicate the degree of influence of picture changes of the display areas on a virtual game, and, for any two of the plurality of display areas, the degree of influence of picture changes of the display area with the higher priority on the virtual game is greater than that of picture changes of the display area with the lower priority;
Determining a rendering condition met by the priority of each display area based on the running state parameters of the local terminal equipment, the priorities of the display areas and a rendering strategy, wherein the rendering condition comprises a first rendering condition or a second rendering condition, the rendering strategy indicates the rendering condition met by each priority of different running state parameters of the equipment, and the priority meeting the first rendering condition is higher than the priority meeting the second rendering condition;
Generating a next frame picture of the display area based on picture data of the next frame of the display area under the condition that the priority of the display area meets the first rendering condition;
predicting a next frame of picture of the display area based on the displayed image frame in case the priority of the display area satisfies the second rendering condition;
And displaying an image frame formed by a next frame picture of the display areas in the display interface.
2. The method of claim 1, wherein determining the priority of the plurality of display areas in the display interface based on the control command for the virtual object in the virtual scene and the image frame currently displayed in the display interface comprises:
And determining the priority of the display areas based on the control instruction and a virtual element in the currently displayed image frame, wherein the virtual element comprises at least one of a virtual object, a virtual control or a virtual map.
3. The method of claim 2, wherein the determining the priority of the plurality of display regions based on the control instruction and a virtual element in the currently displayed image frame comprises:
Inquiring a priority classification strategy based on the control instruction and the virtual elements in the currently displayed image frame, wherein the priority classification strategy indicates the priority of a display area containing each virtual element under the action of each control instruction;
Under the condition that a first priority is inquired, setting the priority of a display area containing the virtual element as the first priority, wherein the first priority is the priority of the display area containing the virtual element in the priority division strategy under the action of the control instruction;
and setting the priority of the rest display areas in the display interface as the lowest priority.
4. The method according to claim 1, wherein the rendering policy includes sub-policies corresponding to a plurality of status parameter intervals, the sub-policies corresponding to status parameter intervals indicating rendering conditions satisfied by each priority in a case where an operation status parameter of the device belongs to the status parameter interval; the determining, based on the running state parameter of the home terminal device, the priorities of the plurality of display areas, and the rendering policy, a rendering condition satisfied by the priority of each display area includes:
Inquiring the rendering strategy based on the running state parameters of the local terminal equipment to obtain a target sub-strategy, wherein the running state parameters belong to a state parameter interval corresponding to the target sub-strategy;
And inquiring the target sub-strategy based on the priority of the display area to obtain rendering conditions met by the priority of the display area.
5. The method according to claim 4, wherein the target sub-policy further indicates the rendering condition satisfied by the second priority during display of k consecutive image frames, k being an integer greater than 1; and the querying the target sub-policy based on the priority of the display area to obtain the rendering condition satisfied by the priority of the display area includes:
determining the rendering condition currently satisfied by the second priority based on target information and the target sub-policy, the target information indicating the rendering conditions satisfied by the second priority in generating the k-1 displayed image frames, the k-1 image frames including the currently displayed image frame and being consecutive.
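One plausible reading of claim 5's k-frame window is that an area of the second priority may not be obtained by prediction indefinitely: if the last k-1 displayed frames were all predicted, the next one must be fully rendered. The sketch below assumes that interpretation; the function name and the exact rule are hypothetical.

```python
# Hypothetical sketch of claim 5: within a window of k consecutive frames, an
# area of the second priority must be fully rendered at least once. If the
# k-1 most recently displayed frames were all predicted, render the next one.
def condition_for_second_priority(history, k):
    """history: conditions used for this priority in the most recently
    displayed frames (most recent last); k: window length, k > 1."""
    recent = history[-(k - 1):]
    if len(recent) == k - 1 and all(c == "predict" for c in recent):
        return "render"   # force a full render to bound prediction drift
    return "predict"

print(condition_for_second_priority(["predict", "predict"], 3))  # render
```

This kind of cap on consecutive predictions bounds how far a predicted region can drift from the true scene before it is resynchronized by a real render.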
6. The method according to claim 1, wherein, before the generating the next frame picture of the display area based on picture data of the next frame of the display area in a case where the priority of the display area satisfies the first rendering condition, the method further comprises:
fusing a plurality of first display areas to obtain a fused display area in a case where the plurality of first display areas have overlapping areas, wherein a first display area is any display area in the plurality of display areas;
determining that the priority of the fused display area satisfies the first rendering condition in a case where the priority of any first display area satisfies the first rendering condition;
and determining that the priority of the fused display area satisfies the second rendering condition in a case where the priorities of the plurality of first display areas all satisfy the second rendering condition.
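The fusion rule of claim 6 can be sketched with axis-aligned rectangles. The rectangle representation, the bounding-box fusion, and the condition labels are assumptions; the claim only fixes the decision rule: the fused area renders if any member renders, and is predicted only if all members are predicted.

```python
# Hypothetical sketch of claim 6: overlapping display areas (axis-aligned
# rectangles (x1, y1, x2, y2)) are merged into one fused area whose rendering
# condition is "render" if any member renders.
def overlaps(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

def fuse(areas):
    """areas: list of (rect, condition). Returns (bounding rect, condition)."""
    xs1, ys1, xs2, ys2 = zip(*(rect for rect, _ in areas))
    rect = (min(xs1), min(ys1), max(xs2), max(ys2))
    # First rendering condition wins if any member satisfies it.
    cond = "render" if any(c == "render" for _, c in areas) else "predict"
    return rect, cond

a = ((0, 0, 10, 10), "predict")
b = ((5, 5, 15, 15), "render")
if overlaps(a[0], b[0]):
    print(fuse([a, b]))   # ((0, 0, 15, 15), 'render')
```

Fusing toward the stronger condition avoids a visible seam where a rendered region overlaps a predicted one.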
7. The method according to claim 1, wherein, before the generating the next frame picture of the display area based on picture data of the next frame of the display area in a case where the priority of the display area satisfies the first rendering condition, the method further comprises:
determining that the priority of a second display area satisfies the second rendering condition in a case where a plurality of first display areas have overlapping areas and the rendering conditions satisfied by the priorities of the plurality of first display areas differ;
wherein the second display area is a display area, among the plurality of first display areas, other than a third display area, and the third display area is a display area, among the plurality of first display areas, whose priority satisfies the first rendering condition.
8. The method according to claim 1, wherein the predicting the next frame picture of the display area based on the displayed image frames in a case where the priority of the display area satisfies the second rendering condition includes:
predicting, in a case where the priority of a fourth display area satisfies the second rendering condition, the next frame picture of the fourth display area based on historical frame pictures of the fourth display area and the next frame picture of a fifth display area, wherein the fourth display area is any display area, among the plurality of display areas, whose priority satisfies the second rendering condition, and the fifth display area is a display area, among the plurality of display areas, whose priority satisfies the first rendering condition and that is adjacent to the fourth display area.
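A toy version of claim 8's prediction step: extrapolate the low-priority (fourth) area from its own history, then blend toward the already-rendered adjacent (fifth) area so the shared boundary stays consistent. Real implementations would use motion vectors or a learned model; the linear extrapolation, the flat float lists standing in for pixel data, and the `blend` weight are all simplifying assumptions.

```python
# Hypothetical sketch of claim 8 using plain per-pixel extrapolation.
# history: last two frames of the low-priority area (lists of floats);
# neighbor_next: already-rendered next frame of the adjacent high-priority area.
def predict_area(history, neighbor_next, blend=0.25):
    prev, curr = history[-2], history[-1]
    # Linear extrapolation from the two most recent frames of this area.
    extrapolated = [c + (c - p) for p, c in zip(prev, curr)]
    # Pull the prediction toward the rendered neighbor to keep the
    # boundary between the two areas consistent.
    return [(1 - blend) * e + blend * n
            for e, n in zip(extrapolated, neighbor_next)]

print(predict_area([[1.0, 2.0], [2.0, 3.0]], [4.0, 4.0]))
# [3.25, 4.0]
```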
9. The method according to claim 1, wherein the generating the next frame picture of the display area based on picture data of the next frame of the display area in the case where the priority of the display area satisfies the first rendering condition includes:
generating the next frame picture of the display area based on picture data of the next frame of the display area in a case where the next image frame is determined to be a non-key frame and the priority of the display area satisfies the first rendering condition.
10. The method according to claim 1, wherein the method further comprises:
generating the next image frame based on picture data of the next frame of the display interface in a case where the next image frame is determined to be a key frame;
and displaying the next image frame in the display interface.
11. The method according to claim 10, wherein the method further comprises:
determining that the next image frame is the key frame in a case where a target proportion is greater than a threshold, wherein the target proportion is the ratio of the number of sixth display areas to the total number of the plurality of display areas, or the ratio of the sum of the areas of the sixth display areas to the area of the display interface, and a sixth display area is a display area, among the plurality of display areas, whose priority satisfies the first rendering condition; or
determining that the next image frame is the key frame in a case where the n displayed image frames are obtained based on the second rendering condition only, the n image frames including the currently displayed image frame and being consecutive, n being an integer greater than 0.
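Claim 11's two key-frame triggers can be combined in a single check. The count-based variant of the target proportion is shown (the area-based variant would sum region areas instead); the function name, the threshold value, and tracking the prediction-only run as a counter are assumptions.

```python
# Hypothetical sketch of claim 11's key-frame decision.
# conditions: rendering condition per display area for the next frame;
# predicted_run: number of consecutive displayed frames obtained by
# prediction only; n: cap on that run before a key frame is forced.
def is_key_frame(conditions, threshold, predicted_run, n):
    rendered = sum(1 for c in conditions if c == "render")
    target_ratio = rendered / len(conditions)   # count-based target proportion
    return target_ratio > threshold or predicted_run >= n

print(is_key_frame(["render", "render", "predict"], 0.5, 0, 4))    # True
print(is_key_frame(["predict", "predict", "predict"], 0.5, 4, 4))  # True
```

The intuition: when most of the interface must be rendered anyway, a full key frame is barely more expensive than a patchwork frame, and it resets accumulated prediction error.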
12. The method according to claim 1, wherein the method further comprises:
obtaining a current frame rate and a maximum frame rate in a case where the running state parameter of the local device belongs to a first state parameter interval, wherein the current frame rate is the frame rate at which image frames are currently rendered;
and increasing the current frame rate in a case where the current frame rate is less than the maximum frame rate.
13. The method according to claim 12, wherein the method further comprises:
obtaining a first duration and a second duration in a case where the running state parameter belongs to a second state parameter interval, wherein the first duration is the duration currently taken to render one image frame, the second duration is the duration available for rendering one image frame at the current frame rate, and the smallest running state parameter in the second state parameter interval is greater than the largest running state parameter in the first state parameter interval;
and reducing the current frame rate in a case where the first duration is greater than the second duration.
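Claims 12 and 13 together describe a simple frame-rate governor. The sketch below assumes a load value in [0, 1] as the running state parameter, concrete interval bounds, and a fixed adjustment step; none of those specifics come from the claims.

```python
# Hypothetical sketch of claims 12-13: raise the frame rate toward the
# maximum under light load; lower it under heavy load when a frame takes
# longer to render than the current frame interval (the "second duration").
def adjust_frame_rate(load, current_fps, max_fps, render_time, step=5):
    if load < 0.5:                          # first state parameter interval
        if current_fps < max_fps:
            return min(current_fps + step, max_fps)
    elif load >= 0.8:                       # second (heavier) interval
        frame_budget = 1.0 / current_fps    # time available per frame
        if render_time > frame_budget:      # first duration > second duration
            return max(current_fps - step, 1)
    return current_fps

print(adjust_frame_rate(0.3, 55, 60, 0.010))  # 60
print(adjust_frame_rate(0.9, 60, 60, 0.020))  # 55
```

Comparing render time against the per-frame budget, rather than load alone, means the rate only drops when the device actually misses its frame deadline.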
14. An image frame display device, the device comprising:
a determining module, configured to determine the priorities of a plurality of display areas in a display interface based on a control instruction for a virtual object in a virtual scene and an image frame currently displayed in the display interface, wherein the currently displayed image frame includes a scene picture of the virtual scene, the priority indicates the degree of influence of picture changes of the display area on the virtual match, and, for any two display areas in the plurality of display areas, the degree of influence of picture changes of the higher-priority display area on the virtual match is higher than that of the lower-priority display area;
the determining module is further configured to determine, based on the running state parameter of the local device, the priorities of the plurality of display areas, and a rendering policy, the rendering condition satisfied by the priority of each display area, wherein the rendering condition includes a first rendering condition or a second rendering condition, the rendering policy indicates the rendering condition satisfied by each priority under different running state parameters of the device, and a priority satisfying the first rendering condition is higher than a priority satisfying the second rendering condition;
a generating module, configured to generate the next frame picture of the display area based on picture data of the next frame of the display area in a case where the priority of the display area satisfies the first rendering condition;
a prediction module, configured to predict the next frame picture of the display area based on the displayed image frames in a case where the priority of the display area satisfies the second rendering condition;
and a display module, configured to display, in the display interface, an image frame composed of the next frame pictures of the plurality of display areas.
15. The apparatus of claim 14, wherein the determining module is configured to determine the priorities of the plurality of display areas based on the control instruction and virtual elements in the currently displayed image frame, the virtual elements including at least one of a virtual object, a virtual control, or a virtual map.
16. The apparatus of claim 15, wherein the determining module is configured to: query a prioritization policy based on the control instruction and the virtual elements in the currently displayed image frame, the prioritization policy indicating the priority of the display area containing each virtual element under the action of each control instruction; set, in a case where a first priority is found, the priority of the display area containing the virtual element to the first priority, the first priority being the priority, in the prioritization policy, of the display area containing the virtual element under the action of the control instruction; and set the priority of the remaining display areas in the display interface to the lowest priority.
17. The apparatus of claim 14, wherein the rendering policy includes sub-policies corresponding to a plurality of state parameter intervals, the sub-policy corresponding to a state parameter interval indicating the rendering condition satisfied by each priority in a case where the running state parameter of the device belongs to that state parameter interval; and the determining module is configured to query the rendering policy based on the running state parameter of the local device to obtain a target sub-policy, the running state parameter belonging to the state parameter interval corresponding to the target sub-policy, and to query the target sub-policy based on the priority of the display area to obtain the rendering condition satisfied by the priority of the display area.
18. The apparatus of claim 17, wherein the target sub-policy further indicates a rendering condition that the second priority satisfies during display of consecutive k image frames, k being an integer greater than 1; the determining module is configured to determine, based on target information and the target sub-policy, a rendering condition currently satisfied by the second priority, where the target information indicates a case of the rendering condition satisfied by the second priority in a process of generating k-1 image frames that are displayed, the k-1 image frames including the currently displayed image frame and the k-1 image frames being consecutive.
19. The apparatus of claim 14, wherein the apparatus further comprises:
a fusion module, configured to fuse a plurality of first display areas to obtain a fused display area in a case where the plurality of first display areas have overlapping areas, wherein a first display area is any one of the plurality of display areas;
the determining module is further configured to determine that the priority of the fused display area satisfies the first rendering condition in a case where the priority of any first display area satisfies the first rendering condition;
and the determining module is further configured to determine that the priority of the fused display area satisfies the second rendering condition in a case where the priorities of the plurality of first display areas all satisfy the second rendering condition.
20. The apparatus of claim 14, wherein the determining module is further configured to determine that the priority of a second display area satisfies the second rendering condition in a case where a plurality of first display areas have overlapping areas and the rendering conditions satisfied by the priorities of the plurality of first display areas differ; the second display area is a display area, among the plurality of first display areas, other than a third display area, and the third display area is a display area, among the plurality of first display areas, whose priority satisfies the first rendering condition.
21. The apparatus of claim 14, wherein the prediction module is configured to predict, in a case where the priority of a fourth display area satisfies the second rendering condition, the next frame picture of the fourth display area based on historical frame pictures of the fourth display area and the next frame picture of a fifth display area, wherein the fourth display area is any display area, among the plurality of display areas, whose priority satisfies the second rendering condition, and the fifth display area is a display area, among the plurality of display areas, whose priority satisfies the first rendering condition and that is adjacent to the fourth display area.
22. The apparatus of claim 14, wherein the generating module is configured to generate the next frame picture of the display area based on picture data of the next frame of the display area in a case where it is determined that the next image frame is a non-key frame and the priority of the display area satisfies the first rendering condition.
23. The apparatus of claim 14, wherein the generating module is further configured to generate a next image frame based on picture data of a next frame of the display interface if the next image frame is determined to be a key frame;
the display module is further configured to display the next image frame in the display interface.
24. The apparatus of claim 23, wherein the determining module is further configured to determine that the next image frame is the key frame in a case where a target proportion is greater than a threshold, the target proportion being the ratio of the number of sixth display areas to the total number of the plurality of display areas, or the ratio of the sum of the areas of the sixth display areas to the area of the display interface, a sixth display area being a display area, among the plurality of display areas, whose priority satisfies the first rendering condition; or
the determining module is further configured to determine that the next image frame is the key frame in a case where the displayed n image frames are obtained based on the second rendering condition only, the n image frames including the currently displayed image frame and being consecutive, n being an integer greater than 0.
25. The apparatus of claim 14, wherein the apparatus further comprises:
an obtaining module, configured to obtain a current frame rate and a maximum frame rate in a case where the running state parameter of the local device belongs to a first state parameter interval, wherein the current frame rate is the frame rate at which image frames are currently rendered;
and the adjusting module is used for increasing the current frame rate under the condition that the current frame rate is smaller than the maximum frame rate.
26. The apparatus of claim 25, wherein the obtaining module is further configured to obtain a first duration and a second duration when the operation state parameter belongs to a second state parameter interval, the first duration being a duration for currently rendering one image frame, the second duration being a duration for rendering one image frame at the current frame rate, and a minimum operation state parameter in the second state parameter interval being greater than a maximum operation state parameter in the first state parameter interval;
The adjusting module is further configured to reduce the current frame rate when the first time period is longer than the second time period.
27. A computer device comprising a processor and a memory, wherein the memory has stored therein at least one computer program that is loaded and executed by the processor to perform the operations performed by the image frame display method of any of claims 1 to 13.
28. A computer readable storage medium having stored therein at least one computer program loaded and executed by a processor to implement the operations performed by the image frame display method of any one of claims 1 to 13.
29. A computer program product comprising a computer program which, when executed by a processor, performs the operations performed by the image frame display method of any one of claims 1 to 13.
CN202410280714.6A 2024-03-12 2024-03-12 Image frame display method, device, computer equipment and storage medium Active CN117899473B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410280714.6A CN117899473B (en) 2024-03-12 2024-03-12 Image frame display method, device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN117899473A CN117899473A (en) 2024-04-19
CN117899473B true CN117899473B (en) 2024-06-04

Family

ID=90692241

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410280714.6A Active CN117899473B (en) 2024-03-12 2024-03-12 Image frame display method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117899473B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109637406A (en) * 2019-01-04 2019-04-16 京东方科技集团股份有限公司 A kind of display methods of display device, display device and readable storage medium storing program for executing
CN111228797A (en) * 2020-01-13 2020-06-05 腾讯科技(深圳)有限公司 Data processing method, data processing device, computer and readable storage medium
CN111933038A (en) * 2020-08-31 2020-11-13 京东方科技集团股份有限公司 Display device, display control method, and control device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112445315A (en) * 2019-08-28 2021-03-05 北京小米移动软件有限公司 Control method and device for screen refresh frame rate and storage medium

Also Published As

Publication number Publication date
CN117899473A (en) 2024-04-19

Similar Documents

Publication Publication Date Title
JP7476109B2 (en) Method, device, terminal and computer program for controlling interaction between virtual objects and virtual scenes
CN110141859B (en) Virtual object control method, device, terminal and storage medium
CN111921197B (en) Method, device, terminal and storage medium for displaying game playback picture
CN112717396B (en) Interaction method, device, terminal and storage medium based on virtual pet
CN111760281B (en) Cutscene playing method and device, computer equipment and storage medium
EP4131973A1 (en) Method and apparatus for processing live-streaming data
CN113058264A (en) Virtual scene display method, virtual scene processing method, device and equipment
US20230070612A1 (en) Operation prompting method and apparatus, terminal, and storage medium
CN114415907A (en) Media resource display method, device, equipment and storage medium
JP2024509064A (en) Location mark display method, device, equipment and computer program
CN110833695B (en) Service processing method, device, equipment and storage medium based on virtual scene
CN113041613B (en) Method, device, terminal and storage medium for reviewing game
US20230347240A1 (en) Display method and apparatus of scene picture, terminal, and storage medium
CN111589117B (en) Method, device, terminal and storage medium for displaying function options
CN117899473B (en) Image frame display method, device, computer equipment and storage medium
CN113599810B (en) Virtual object-based display control method, device, equipment and medium
CN113577781B (en) Non-player character NPC control method, device, equipment and medium
CN112843703B (en) Information display method, device, terminal and storage medium
CN112750449B (en) Echo cancellation method, device, terminal, server and storage medium
CN112188268B (en) Virtual scene display method, virtual scene introduction video generation method and device
CN117771649A (en) Method, device, equipment and storage medium for controlling virtual character
CN115361292B (en) Resource packet transmitting method, device, equipment and storage medium
CN118059477A (en) Picture display method, device, computer equipment and storage medium
CN110349558B (en) Sound effect playing method, device, terminal and storage medium
CN113633978B (en) Virtual skill configuration method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant