CN113018848A - Game picture display method, related device, equipment and storage medium - Google Patents


Info

Publication number
CN113018848A
Authority
CN
China
Prior art keywords
game
behavior data
data
rendering
behavior
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110254911.7A
Other languages
Chinese (zh)
Other versions
CN113018848B (en)
Inventor
张富春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202110254911.7A
Publication of CN113018848A
Application granted
Publication of CN113018848B
Legal status: Active (granted)

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/55 Controlling game characters or game objects based on the game progress
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a game picture display method based on artificial intelligence technology. The method comprises: obtaining game state data corresponding to a first moment; obtaining N behavior data corresponding to a second moment according to the game state data; rendering each of the N behavior data to obtain N rendering results; and sending the N rendering results to the terminal device, so that the terminal device determines a target rendering result from the N rendering results according to target behavior data and displays the target rendering result, where the target rendering result comprises a game picture of the cloud game. The application also provides a related device, equipment and a storage medium. The cloud game server renders each behavior data in advance to obtain the corresponding game picture; when the user inputs the real target behavior data, the rendered target rendering result is pulled directly. Because the game picture is rendered in advance, the delay perceived by the user is shorter.

Description

Game picture display method, related device, equipment and storage medium
Technical Field
The present application relates to the field of industrial applications, and in particular, to a game screen display method, a related apparatus, a device, and a storage medium.
Background
A cloud game is a game mode based on cloud computing: in the cloud game running mode, the game itself runs on a corresponding server, and the rendered game pictures are compressed and then transmitted over the network to the terminal device used by the user. The terminal device therefore does not need a high-end processor or graphics card; it only needs basic video decompression capability.
At present, the terminal device displays the game picture by decompression: the user inputs a related operation through the terminal device, the terminal device sends the operation parameters to the server, the server renders the next frame of the game picture according to the operation parameters, compresses the picture content and feeds it back to the terminal device, and the terminal device then displays that frame.
However, the rendering of the game picture currently depends on the user's operation at the terminal device, so the rendered game picture lags behind the player's operation. Rendering the picture also requires certain computing resources, which causes delay in the game picture displayed on the terminal device side.
Disclosure of Invention
The embodiment of the application provides a game picture display method, a related device, equipment and a storage medium. The cloud game server can render each behavior data in advance to obtain a corresponding game picture; when the user inputs the real target behavior data, the rendered target rendering result is pulled directly, that is, the corresponding game picture is displayed. Since the game picture is rendered in advance, the delay perceived by the user is shorter.
In view of the above, an aspect of the present application provides a method for displaying a game screen, including:
acquiring game state data corresponding to a first moment;
acquiring N behavior data corresponding to a second moment according to the game state data, wherein the second moment occurs after the first moment, and N is an integer greater than or equal to 1;
rendering each behavior data in the N behavior data to obtain N rendering results, wherein the rendering results and the behavior data have corresponding relations;
and sending the N rendering results to the terminal equipment, so that the terminal equipment determines a target rendering result from the N rendering results according to the target behavior data, and displays the target rendering result, wherein the target behavior data represents data for controlling the behavior of the game role at the second moment, and the target rendering result comprises a game picture of the cloud game.
Another aspect of the present application provides a method for displaying a game screen, including:
sending game state data corresponding to the first moment to the cloud game server so that the cloud game server obtains N behavior data corresponding to the second moment according to the game state data, wherein the second moment occurs after the first moment, and N is an integer greater than or equal to 1;
receiving N rendering results sent by the cloud game server, wherein the N rendering results are obtained after the cloud game server renders each behavior data in the N behavior data, and the rendering results and the behavior data have a corresponding relation;
receiving target behavior data corresponding to a second moment, wherein the target behavior data represents data for controlling the behavior of the game role at the second moment;
and if the N behavior data comprise target behavior data, determining a target rendering result from the N rendering results, and displaying the target rendering result, wherein the target rendering result and the target behavior data have a corresponding relation, and the target rendering result comprises a game picture of the cloud game.
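For illustration only, the terminal-side selection step described above can be sketched as follows; this is a minimal Python sketch, and the data structures, the name pick_rendering_result, and the string encoding of behavior data are assumptions rather than part of the application:

```python
# Hypothetical sketch: the terminal has cached the N rendering results keyed
# by their behavior data and then receives the real target behavior data for
# the second moment.
from typing import Dict, Optional

def pick_rendering_result(cached: Dict[str, bytes],
                          target_behavior: str) -> Optional[bytes]:
    """Return the pre-rendered result whose behavior data matches the real
    input, or None so the caller can fall back to the normal flow."""
    return cached.get(target_behavior)

cached_results = {"turn_left": b"<frame-1>", "turn_right": b"<frame-2>"}
frame = pick_rendering_result(cached_results, "turn_left")
if frame is None:
    pass  # fall back: request a fresh render from the cloud game server
```

If the target behavior data matches none of the N behavior data, the cached results are simply discarded and the game continues along the normal flow.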
Another aspect of the present application provides a game screen display device, including:
the acquisition module is used for acquiring game state data corresponding to the first moment;
the acquisition module is further used for acquiring N behavior data corresponding to a second moment according to the game state data, wherein the second moment occurs after the first moment, and N is an integer greater than or equal to 1;
the rendering module is used for rendering each behavior data in the N behavior data to obtain N rendering results, wherein the rendering results and the behavior data have corresponding relations;
and the sending module is used for sending the N rendering results to the terminal equipment so that the terminal equipment determines a target rendering result from the N rendering results according to the target behavior data and displays the target rendering result, wherein the target behavior data represents data for controlling the behavior of the game role at the second moment, and the target rendering result comprises a game picture of the cloud game.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the game state data includes position data of the game character at a first time;
the acquisition module is specifically used for acquiring game map data;
determining N optional positions corresponding to the game role at a second moment according to the game map data and the position data of the game role at the first moment;
and generating N behavior data corresponding to the second moment according to the N optional positions corresponding to the game role at the second moment.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the game state data includes operation data of the game character at a first time;
the obtaining module is specifically used for obtaining N operation rules from an operation rule set according to operation data of the game role at a first moment, wherein the operation rule set comprises M operation rules, and M is an integer greater than or equal to 1;
and generating N behavior data corresponding to the second moment according to the N operation rules.
In one possible design, in another implementation of another aspect of the embodiments of the present application, the game state data includes a character type of the game character and operation data of the game character at a first time;
the obtaining module is specifically used for obtaining Q optional behavior data and the occurrence probability corresponding to each optional behavior data through a behavior prediction model based on the role type of the game role and the operation data of the game role at the first moment, wherein Q is an integer greater than or equal to 1;
based on the occurrence probability corresponding to each selectable behavior data, if the occurrence probability corresponding to the N selectable behavior data is greater than or equal to the probability threshold, the N selectable behavior data is determined as the N behavior data.
In one possible design, in another implementation manner of another aspect of the embodiment of the present application, the game screen display device includes a processing module and a training module;
the obtaining module is further configured to obtain a to-be-trained sample set before obtaining, based on the role type of the game role and operation data of the game role at the first time, the Q pieces of selectable behavior data and the occurrence probability corresponding to each piece of selectable behavior data through the behavior prediction model, where the to-be-trained sample set is derived from at least two operation objects, the to-be-trained sample set includes at least two to-be-trained samples, each to-be-trained sample includes the role type of the game role to be trained, labeled operation data of the game role to be trained at the third time, and labeled operation data of the game role to be trained at the fourth time, and the fourth time occurs after the third time;
the processing module is used for calling a behavior prediction model to be trained to process the character type of the game character to be trained and the labeled operation data of the game character to be trained at the third moment aiming at each sample to be trained in the sample set to be trained so as to obtain the predicted operation data of each sample to be trained at the fourth moment;
and the training module is used for updating the model parameters of the behavior prediction model to be trained according to the prediction operation data of each sample to be trained at the fourth moment and the marking operation data at the fourth moment until the model training conditions are met, so as to obtain the behavior prediction model.
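As an illustration of the training procedure described above, the following minimal sketch assumes a neural-network behavior prediction model trained with PyTorch; the framework, the network shape, and the numeric encoding of character types and operation data are all assumptions, since the application does not specify them:

```python
# Hypothetical training sketch: features encode the character type and the
# labeled operation data at the third moment; labels encode the labeled
# operation data at the fourth moment.
import torch
import torch.nn as nn

class BehaviorPredictor(nn.Module):
    def __init__(self, in_dim: int, num_behaviors: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, num_behaviors))  # logits over candidate behaviors

    def forward(self, x):
        return self.net(x)

def train(model: BehaviorPredictor, loader, epochs: int = 10, lr: float = 1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):              # a fixed epoch count stands in for the "model training condition"
        for features, labels in loader:  # one batch of samples to be trained
            opt.zero_grad()
            loss = loss_fn(model(features), labels)
            loss.backward()
            opt.step()                   # update the model parameters
    return model
```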
In one possible design, in another implementation manner of another aspect of the embodiment of the present application, the game state data includes an identifier of the target operation object, a character type of the game character, and operation data of the game character at the first time;
the acquisition module is specifically used for determining a behavior prediction model according to the identification of the target operation object;
based on the role type of the game role and the operation data of the game role at the first moment, Q optional behavior data and the occurrence probability corresponding to each optional behavior data are obtained through a behavior prediction model, wherein Q is an integer greater than or equal to 1;
based on the occurrence probability corresponding to each selectable behavior data, if the occurrence probability corresponding to the N selectable behavior data is greater than or equal to the probability threshold, the N selectable behavior data is determined as the N behavior data.
In one possible design, in another implementation manner of another aspect of the embodiment of the present application, the game screen display device includes a processing module and a training module;
the obtaining module is further configured to obtain a to-be-trained sample set before obtaining, based on the role type of the game role and operation data of the game role at the first time, the Q pieces of selectable behavior data and the occurrence probability corresponding to each piece of selectable behavior data through the behavior prediction model, where the to-be-trained sample set is derived from a target operation object, the to-be-trained sample set includes at least one to-be-trained sample, each to-be-trained sample includes the role type of the game role to be trained, labeled operation data of the game role to be trained at the third time, and labeled operation data of the game role to be trained at the fourth time, and the fourth time occurs after the third time;
the processing module is used for calling a behavior prediction model to be trained to process the character type of the game character to be trained and the labeled operation data of the game character to be trained at the third moment aiming at each sample to be trained in the sample set to be trained so as to obtain the predicted operation data of each sample to be trained at the fourth moment;
and the training module is used for updating the model parameters of the behavior prediction model to be trained according to the prediction operation data of each sample to be trained at the fourth moment and the marking operation data at the fourth moment until the model training conditions are met, so as to obtain the behavior prediction model.
In one possible design, in another implementation of another aspect of the embodiments of the present application, N is an integer greater than or equal to 2;
the rendering module is specifically configured to obtain behavior data corresponding to a maximum value of occurrence probability from the N behavior data according to the occurrence probability corresponding to each selectable behavior data, where each behavior data corresponds to one occurrence probability;
rendering the behavior data corresponding to the maximum occurrence probability to obtain a first rendering result;
acquiring behavior data corresponding to the second largest value of the occurrence probability from the N behavior data according to the occurrence probability corresponding to each optional behavior data;
rendering the behavior data corresponding to the occurrence probability secondary maximum value to obtain a second rendering result;
the sending module is specifically used for sending a first rendering result to the terminal equipment within a first time period;
and transmitting the second rendering result to the terminal equipment within a second time period, wherein the second time period is a time period after the first time period.
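A minimal sketch of the probability-ordered rendering and transmission described in this design; the render and send callables are placeholders for the rendering engine and the network path, and are assumptions:

```python
# Hypothetical sketch: render and send the candidate behaviors in descending
# order of occurrence probability, so the most likely branch arrives first.
from typing import Callable, List, Tuple

def render_and_send_by_probability(
        behaviors: List[Tuple[str, float]],      # (behavior data, occurrence probability)
        render: Callable[[str], bytes],
        send: Callable[[bytes], None]) -> None:
    for behavior, _prob in sorted(behaviors, key=lambda b: b[1], reverse=True):
        result = render(behavior)  # first iteration: maximum probability -> first rendering result
        send(result)               # later iterations fall into later time periods
```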
In one possible design, in another implementation manner of another aspect of the embodiment of the present application, the game screen display device includes a receiving module, a determining module, and a deleting module;
the receiving module is used for receiving a picture rendering response sent by the terminal equipment after the sending module sends the N rendering results to the terminal equipment and the terminal equipment determines the target rendering result from the N rendering results according to the target behavior data, wherein the picture rendering response comprises an identifier corresponding to the target rendering result;
a determining module, configured to determine a target rendering result from the N rendering results according to the screen rendering response;
and the deleting module is used for deleting (N-1) rendering results from the N rendering results, wherein the (N-1) rendering results are rendering results left after the target rendering result is removed from the N rendering results.
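For illustration, a sketch of the cleanup this design describes (names are hypothetical):

```python
# Hypothetical sketch: once the picture rendering response identifies the
# target rendering result, the remaining (N-1) cached results are discarded.
from typing import Dict

def keep_only_target(results: Dict[str, bytes], target_id: str) -> Dict[str, bytes]:
    return {target_id: results[target_id]} if target_id in results else {}
```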
Another aspect of the present application provides a game screen display device, including:
the sending module is used for sending game state data corresponding to a first moment to the cloud game server, so that the cloud game server acquires N behavior data corresponding to a second moment according to the game state data, wherein the second moment occurs after the first moment, and N is an integer greater than or equal to 1;
the receiving module is used for receiving N rendering results sent by the cloud game server, wherein the N rendering results are obtained after the cloud game server renders each behavior data in the N behavior data, and the rendering results and the behavior data have a corresponding relation;
the receiving module is also used for receiving target behavior data corresponding to a second moment, wherein the target behavior data represents data for controlling the behavior of the game role at the second moment;
and the display module is used for determining a target rendering result from the N rendering results and displaying the target rendering result if the N behavior data comprise target behavior data, wherein the target rendering result and the target behavior data have a corresponding relation, and the target rendering result comprises a game picture of the cloud game.
Another aspect of the present application provides a cloud game server, including: a memory, a processor, and a bus system;
wherein, the memory is used for storing programs;
the processor is used for executing the program in the memory, and the processor is used for executing the method provided by the aspects according to the instructions in the program code;
the bus system is used for connecting the memory and the processor so as to enable the memory and the processor to communicate.
Another aspect of the present application provides a terminal device, including: a memory, a processor, and a bus system;
wherein, the memory is used for storing programs;
the processor is used for executing the program in the memory, and the processor is used for executing the method provided by the aspects according to the instructions in the program code;
the bus system is used for connecting the memory and the processor so as to enable the memory and the processor to communicate.
Another aspect of the present application provides a computer-readable storage medium having stored therein instructions, which when executed on a computer, cause the computer to perform the method of the above-described aspects.
In another aspect of the application, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by the above aspects.
According to the technical scheme, the embodiment of the application has the following advantages:
the embodiment of the application provides a game picture display method, which includes the steps that firstly, a cloud game server obtains game state data corresponding to a first moment, then N behavior data corresponding to a second moment are obtained according to the game state data, then each behavior data in the N behavior data is rendered to obtain N rendering results, finally, the cloud game server sends the N rendering results to a terminal device, the terminal device determines a target rendering result from the N rendering results according to the target behavior data, the target rendering result is displayed, the target behavior data represent data for controlling game role behaviors at the second moment, and the target rendering result is a rendered game picture. Through the mode, the cloud game server predicts N behavior data which may appear at the future time by using the game state data at the current time, renders each behavior data in advance and obtains a corresponding game picture, namely, the game progress can be ahead of the real input of a user, and when the real target behavior data is input by the user, the rendered target rendering result is directly pulled, namely, the corresponding game picture is displayed. Since the game screen is rendered in advance, a shorter delay is brought to the user.
Drawings
FIG. 1 is a schematic diagram of an architecture of a game screen display system according to an embodiment of the present application;
FIG. 2 is an interaction diagram of displaying a game screen according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an embodiment of a game screen display method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of generating rendering results based on location data in an embodiment of the present application;
FIG. 5 is another diagram illustrating generation of rendering results based on location data in an embodiment of the present application;
FIG. 6 is another diagram illustrating generation of rendering results based on location data in an embodiment of the present application;
FIG. 7 is a schematic diagram of rendering results generated based on historical operations in an embodiment of the present application;
FIG. 8 is another diagram illustrating generation of rendering results based on historical operations in an embodiment of the present application;
FIG. 9 is a schematic diagram of another embodiment of a game screen display method in the embodiment of the present application;
FIG. 10 is a schematic view of a game screen display device according to an embodiment of the present application;
FIG. 11 is another schematic view of a game screen display device according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of a server in an embodiment of the present application;
fig. 13 is a schematic structural diagram of a terminal device in the embodiment of the present application.
Detailed Description
The embodiment of the application provides a game picture display method, a related device, equipment and a storage medium. The cloud game server can render each behavior data in advance to obtain a corresponding game picture; when the user inputs the real target behavior data, the rendered target rendering result is pulled directly, that is, the corresponding game picture is displayed. Since the game picture is rendered in advance, the delay perceived by the user is shorter.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "corresponding" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Cloud gaming may also be referred to as gaming on demand. Cloud game technology enables light-end devices (thin clients) with relatively limited graphics processing and data computing capabilities to run high-quality games. In a cloud game scenario, the game does not run on the terminal device used by the user but on the cloud game server, and the cloud game server renders the game scene into a video and audio stream that is transmitted to the terminal device through the network. The terminal device does not need strong graphics and data processing capabilities; it only needs basic streaming media playing capability and the ability to acquire the user's input instructions and send them to the cloud game server.
The cloud game is an online game technology based on cloud computing. In a narrow sense, cloud computing refers to a delivery and use mode of Internet technology (IT) infrastructure, namely obtaining the required resources through the network in an on-demand, easily expandable manner; in a broad sense, cloud computing refers to a delivery and use mode of services, namely obtaining the required services through the network in an on-demand, easily expandable manner. Such services may be IT and software, Internet related, or other services. Cloud computing is a product of the development and fusion of traditional computer and network technologies such as grid computing, distributed computing, parallel computing, utility computing, network storage, virtualization and load balancing. With the diversification of the Internet, real-time data streams and connected devices, and the growing demands of search services, social networks, mobile commerce and open collaboration, cloud computing has developed rapidly. Unlike earlier parallel distributed computing, the emergence of cloud computing is expected to drive revolutionary changes in the whole Internet model and in enterprise management models.
Cloud computing relies on cloud technology, a hosting technology that unifies a series of resources such as hardware, software and networks within a wide area network or a local area network to realize the computing, storage, processing and sharing of data. Cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology and the like applied in the cloud computing business model; it can form a resource pool that is used on demand and is flexible and convenient. Cloud computing technology will become an important support. Background services of technical network systems, such as video websites, picture websites and other web portals, require a large amount of computing and storage resources. With the rapid development and application of the Internet industry, each article may have its own identification mark that needs to be transmitted to a background system for logic processing; data of different levels are processed separately, and all kinds of industrial data need strong system background support, which can only be realized through cloud computing.
Because the rendering of the game picture is completed on the cloud game server side, the terminal device needs to feed back the user's input operation to the cloud game server (CGS); the cloud game server then renders the game picture based on the input operation, sends the rendering result to the terminal device after rendering is completed, and the terminal device directly displays the rendering result. The cloud game server may be a server node in a blockchain network, so that data can be stored on the blockchain network. Based on this, the application provides a game picture display method that renders the game picture in advance, which is equivalent to the game progress running ahead of the user's real input; when the user inputs the real target behavior data, the already rendered target rendering result is pulled directly, thereby reducing the delay.
Referring to fig. 1, fig. 1 is a schematic diagram of an architecture of a game screen display system according to an embodiment of the present application, where the game screen display system includes a server and a terminal device, and a client is disposed on the terminal device. The server related to the application can be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, and can also be a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, Network service, cloud communication, middleware service, domain name service, safety service, Content Delivery Network (CDN), big data and an artificial intelligence platform. The present application will be described with reference to a cloud game server as an example. The terminal device may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a palm computer, a personal computer, a smart television, a smart watch, and the like. The terminal device and the server may be directly or indirectly connected through wired or wireless communication, and the application is not limited herein. The number of servers and terminal devices is not limited.
Based on this, please refer to fig. 2 for easy understanding, fig. 2 is an interactive schematic diagram showing a game screen in the embodiment of the present application, and as shown in the figure, specifically:
in step S1, the user inputs an operation for the cloud game via the terminal device, and the terminal device generates game state data corresponding to the first time from the operation input by the user according to the network protocol.
In step S2, the terminal device transmits the game state data corresponding to the first time to an access proxy (proxy) service.
In step S3, the access proxy service forwards the game state data corresponding to the first time (i.e., the current time) to the behavior prediction module in the cloud game server, and N behavior data, for example, behavior data 1 and behavior data 2, are obtained through prediction by the behavior prediction module.
In step S4, copy-on-write (COW) is used to store the game state data corresponding to the first time, and if there are N pieces of behavior data, N pieces of behavior data are also stored, similar to splitting the entire game world into multiple parallel universes.
In step S5, the rendering engine simultaneously renders behavior data of a plurality of branches, for example, renders behavior data 1 to obtain rendering result 1, and renders behavior data 2 to obtain rendering result 2.
In step S6, the rendering engine encodes rendering result 1 and rendering result 2 into audio and video streams, and then sends to the access proxy service. When the access proxy service finds that the network condition is not good (for example, the delay is large or the packet loss rate is high), a certain video frame can be discarded, and smoothness is achieved by sacrificing the frame rate.
In step S7, the access proxy service transmits the audio/video stream to the terminal device, and the terminal device converts it into a game picture, thereby completing the process from input to generation of the game picture. In this way, the rendering engine simultaneously renders the multiple branches of game behavior provided by the prediction engine, and these game pictures and sounds are stored and sent in turn to the terminal device of the cloud game, waiting to be finally presented to the user.
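For orientation only, the S1-S7 flow can be outlined roughly as below; this is a Python sketch under assumed interfaces, where predict, render and send_stream stand in for the behavior prediction module, the rendering engine and the access proxy service, and deepcopy stands in for copy-on-write:

```python
# Hypothetical outline of the server-side flow: receive first-moment game
# state, predict candidate behavior data, fork the state per branch, render
# each branch, and stream the encoded results toward the terminal device.
import copy
from typing import Callable, Dict, List

def handle_game_state(state: Dict,
                      predict: Callable[[Dict], List[str]],
                      render: Callable[[Dict, str], bytes],
                      send_stream: Callable[[str, bytes], None]) -> None:
    branches = predict(state)                   # S3: behavior data 1, behavior data 2, ...
    for behavior in branches:
        branch_state = copy.deepcopy(state)     # S4: one "parallel universe" per branch
        frame = render(branch_state, behavior)  # S5: render each branch
        send_stream(behavior, frame)            # S6/S7: encode and send via the access proxy
```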
With reference to the above description, a method for displaying a game screen in the present application will be described below from the perspective of a cloud game server, and referring to fig. 3, an embodiment of the method for displaying a game screen in the present application includes:
101. the cloud game server acquires game state data corresponding to the first moment;
in this embodiment, a user starts a cloud game through a terminal device, so that the terminal device receives an input of the user at a first time, converts the input of the user into game state data corresponding to the first time based on a network protocol, and then sends the game state data to an access proxy service, and the access proxy service sends the game state data to a cloud game server. The access proxy service is usually deployed on the edge node to reduce the game delay, or the access proxy service can be directly deployed on the cloud game server.
Specifically, since the game runs in the cloud, the user's input needs to be converted into a data format that the network protocol can carry. For example, a mouse or keyboard is a Universal Serial Bus (USB) input device, and a gamepad may be a Bluetooth device. Since these devices are all on the terminal device side, the information they input needs to be converted into game state data that the cloud game server can simulate, so that the game scene can be reproduced in the cloud.
The operating systems of the terminal devices include, but are not limited to, Windows, iOS, Android, and macOS. The user's input methods include, but are not limited to, mouse, keyboard, gamepad, touch screen, voice, and the like.
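As a purely illustrative sketch of this conversion (the JSON message layout and field names are assumptions; the application only requires that the input be expressed as game state data the cloud game server can simulate):

```python
# Hypothetical sketch: serialize a local device event (mouse, keyboard,
# gamepad, touch, voice) into game state data for the access proxy service.
import json
import time

def input_event_to_game_state(device: str, action: str, value: float) -> bytes:
    message = {
        "timestamp_ms": int(time.time() * 1000),  # the "first moment"
        "device": device,                         # e.g. "keyboard", "gamepad"
        "action": action,                         # e.g. "key_down:W"
        "value": value,
    }
    return json.dumps(message).encode("utf-8")

packet = input_event_to_game_state("keyboard", "key_down:W", 1.0)
```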
102. The cloud game server acquires N behavior data corresponding to a second moment according to the game state data, wherein the second moment occurs after the first moment, and N is an integer greater than or equal to 1;
in this embodiment, the cloud game server may obtain, according to the game state data at the first time, N behavior data corresponding to the second time based on a preset rule or a neural network model. It should be noted that, if the first time is the current time, the second time is a future time. If the second time is the current time, then the first time is some time in the past. The first time and the second time may be separated by several seconds or several image frames, e.g. by 60 frames.
Specifically, in a specific game scenario, N pieces of behavior data represent the possibility that a user controls a game character to perform some kind of operation, for example, behavior data 1 indicates that the game character turns to the left, behavior data 2 indicates that the game character turns to the right, and behavior data 3 indicates that the game character goes straight.
103. The cloud game server renders each behavior data in the N behavior data to obtain N rendering results, wherein the rendering results and the behavior data have corresponding relations;
in this embodiment, after obtaining the N behavior data by prediction, the cloud game server may perform rendering processing on each behavior data to obtain a corresponding rendering result; that is, the rendering results and the behavior data have a one-to-one correspondence. For example, for the N behavior data described in step 102, rendering result 1 may be generated based on behavior data 1, rendering result 2 based on behavior data 2, and rendering result 3 based on behavior data 3.
It should be noted that rendering in computer graphics refers to the process of generating an image from a model using software. A model is a description of a three-dimensional object in a well-defined language or data structure that includes geometric, viewpoint, texture, and lighting information. It should be noted that the rendering process according to the present application is not limited to rendering a picture, but includes rendering sound, vibration feedback of a gamepad, case switching of a keyboard, and the like, and therefore, the rendering result includes not only a game picture but also one or more of audio data, vibration feedback data, and other data.
104. And the cloud game server sends the N rendering results to the terminal equipment, so that the terminal equipment determines a target rendering result from the N rendering results according to the target behavior data and displays the target rendering result, wherein the target behavior data represents data for controlling the behavior of the game role at the second moment, and the target rendering result comprises a game picture of the cloud game.
In this embodiment, the cloud game server sends the N rendering results obtained after the rendering processing to the terminal device, and the terminal device stores the N rendering results in advance. When the user triggers a real operation at the second moment, the target behavior data may be determined, for example, the behavior of the user controlling the game character hits a certain predicted result, i.e., the target behavior data hits one of the N behavior data, and then the predicted rendering result is played immediately, and the other predicted rendering results are discarded. If the target behavior data does not hit any of the N behavior data, all rendering results are discarded, and the game is continued in accordance with a normal flow.
Specifically, in combination with the contents described in step 102 and step 103, assuming that the user triggers a real operation at the second moment to control the game character to turn left, the target behavior data is matched successfully with behavior data 1, so rendering result 1 is determined as the target rendering result from the N rendering results and is displayed. The target rendering result at least includes the game picture obtained through rendering, and may further include audio data, gamepad vibration data, keyboard lock data, and the like, which is not limited here.
In the embodiment of the application, a display method of a game picture is provided. Through the mode, the cloud game server predicts N behavior data which may appear at the future time by using the game state data at the current time, renders each behavior data in advance and obtains a corresponding game picture, namely, the game progress can be ahead of the real input of a user, and when the real target behavior data is input by the user, the rendered target rendering result is directly pulled, namely, the corresponding game picture is displayed. Since the game screen is rendered in advance, a shorter delay is brought to the user.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the embodiments of the present application, the game state data includes position data of the game character at the first time;
the cloud game server obtains N behavior data corresponding to the second moment according to the game state data, and may specifically include:
the cloud game server acquires game map data;
the cloud game server determines N optional positions corresponding to the game role at a second moment according to the game map data and the position data of the game role at a first moment;
and the cloud game server generates N behavior data corresponding to the second moment according to the N optional positions corresponding to the game role at the second moment.
In this embodiment, a manner of generating a rendering result based on position data is described. In predicting the behavior data of the game character which is possible at the second moment, one or more positions which are possible to appear at the second moment can be determined based on the position data of the game character at the first moment, and then corresponding behavior data can be generated according to each possible position. Specifically, for ease of understanding, the manner in which the N behavior data are determined will be described below in conjunction with the figures.
For example, referring to fig. 4, fig. 4 is a schematic diagram illustrating a rendering result generated based on position data in an embodiment of the present application. As shown in (A) of fig. 4, in a game scene, the game character controlled by the user is located in the fourth row and the seventh column at the first moment, and based on the game map data, the game character has three directions in which it can move at the second moment. As shown in (B) of fig. 4, one selectable position is the fifth row and the seventh column, and based on this, one piece of behavior data corresponding to the second moment is generated as "walk down". Further, the behavior data may also be "walk down 2 grids". Furthermore, combined with the historical game data accumulated by a large number of users, the occurrence probability of the behavior data being "walk down" when the game character is in the fourth row and seventh column can be summarized; for example, the probability of occurrence of "walk down" is 20%.
For example, referring to fig. 5, fig. 5 is another schematic diagram illustrating the rendering result generated based on the position data in the embodiment of the present application, as shown in (a) of fig. 5, in a game scene, a game character controlled by a user is located in the fourth row and the seventh column at a first time, and based on game map data, the game character has three directions to move at a second time. As shown in fig. 5 (B), one of the selectable positions is the third row and the seventh column, and based on this, one of the behavior data corresponding to the second time is generated as "go up". Further, the behavior data may also be "go 3 grid up". Furthermore, it is also possible to summarize the probability of occurrence of the behavior data being "go up" when the game character is in the third row and the seventh column, for example, the probability of occurrence of "go up" is 30%, in combination with the historical game data accumulated by a large number of users.
Referring to fig. 6, fig. 6 is another schematic diagram of generating a rendering result based on position data in an embodiment of the present application, as shown in (a) of fig. 6, in a game scene, a game character controlled by a user is located in a fourth row and a seventh column at a first time, and based on game map data, the game character has three directions to move at a second time. As shown in fig. 6 (B), one of the selectable positions is the fourth row and the sixth column, and based on this, one of the behavior data corresponding to the second time is generated as "left go". Further, the behavior data may also be "walk 4 boxes to the left". Furthermore, it is also possible to summarize the probability of occurrence of behavior data being "left-going" when the game character is in the fourth row and sixth column, for example, the probability of occurrence of "left-going" is 50%, in combination with the historical game data accumulated by a large number of users.
It is understood that the possible behavior data of the game character at the next moment (or at a future moment) is estimated using the position data of the game character at the current moment together with the game map data; this approach is suitable for simple game types, such as "box-pushing" games and the like.
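A minimal sketch of this position-based estimation, assuming a grid map in which 0 marks a walkable cell (the map encoding and behavior names are illustrative only):

```python
# Hypothetical sketch: derive the selectable positions for the second moment
# from the character's position at the first moment and the game map data,
# and turn each reachable cell into one piece of behavior data.
from typing import List, Tuple

MOVES = {"walk_up": (-1, 0), "walk_down": (1, 0),
         "walk_left": (0, -1), "walk_right": (0, 1)}

def candidate_behaviors(grid: List[List[int]],
                        pos: Tuple[int, int]) -> List[str]:
    row, col = pos
    behaviors = []
    for name, (dr, dc) in MOVES.items():
        r, c = row + dr, col + dc
        if 0 <= r < len(grid) and 0 <= c < len(grid[0]) and grid[r][c] == 0:
            behaviors.append(name)  # one behavior data per reachable cell
    return behaviors

game_map = [[0] * 8 for _ in range(6)]        # toy map with no obstacles
print(candidate_behaviors(game_map, (3, 6)))  # character at the fourth row, seventh column (0-indexed here)
```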
Secondly, in the embodiment of the present application, a manner of generating a rendering result based on the position data is provided. In this manner, the position of the game character on the game map can be determined according to the position data of the game character, and the positions that may appear at the next moment are then determined in combination with the game map, so as to generate the corresponding behavior data and render the game pictures corresponding to the different behavior data. Since the game pictures are rendered in advance, the cloud game server does not have to sacrifice frame rate and picture sharpness to balance the delay; instead, the cloud side can spend more resources to deliver a higher frame rate, lower delay and clearer pictures, and accordingly bring the user a higher game frame rate and a clearer game picture.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the embodiments of the present application, the game state data includes operation data of the game character at the first time;
the cloud game server obtains N behavior data corresponding to the second moment according to the game state data, and may specifically include:
the cloud game server acquires N operation rules from an operation rule set according to operation data of a game role at a first moment, wherein the operation rule set comprises M operation rules, and M is an integer greater than or equal to 1;
and the cloud game server generates N pieces of behavior data corresponding to the second moment according to the N pieces of operation rules.
In this embodiment, a manner of generating a rendering result based on position data and operation data is described. In predicting the behavior data of the game character which is possible at the second moment, one or more possible operations at the second moment may be determined based on the operation data of the game character at the first moment, and then corresponding behavior data may be generated according to each possible operation. Specifically, for ease of understanding, the manner in which the N behavior data are determined will be described below in conjunction with the figures.
For example, referring to fig. 7, fig. 7 is a schematic diagram illustrating rendering results generated based on historical operations in an embodiment of the present application. As shown in (A) of fig. 7, in a game scene, a game character A controlled by the user encounters a Non-Player Character (NPC) at the first moment. The NPC is a "shepherd" who has a conversation with game character A and asks whether game character A wants to replace an item. Game character A has two options, each of which corresponds to an operation rule, and one of the options is to agree to replace the item. Based on this, one piece of behavior data corresponding to the second moment is generated, namely clicking the control "good, I like", and a rendering result is then generated based on that behavior data, i.e., the rendering result shown in (B) of fig. 7. In addition, combined with the historical game data accumulated by a large number of users, the occurrence probability of the behavior data "click the control 'good, I like'" can be summarized, for example, a corresponding occurrence probability of 80%.
For example, referring to fig. 8, fig. 8 is another schematic diagram illustrating rendering results generated based on historical operations in an embodiment of the present application. As shown in (A) of fig. 8, in a game scene, the user-controlled game character A encounters the NPC at the first moment. Similarly, the NPC is the "shepherd", who converses with game character A and asks whether game character A wants to replace the item. Game character A has two options, each of which corresponds to an operation rule, and the other option is to refuse to replace the item. Based on this, one piece of behavior data corresponding to the second moment is generated, namely clicking the control "no, thanks", and a rendering result is then generated based on that behavior data, i.e., the rendering result shown in (B) of fig. 8. In addition, combined with the historical game data accumulated by a large number of users, the occurrence probability of the behavior data "click the control 'no, thank you'" can be summarized, for example, a corresponding occurrence probability of 20%.
It should be noted that a large number of operation rules are usually stored on the cloud game server side. These operation rules may be set based on the game scenario, the game terrain, or a common operation mode, which is not limited herein. When the game character controlled by the user reaches a certain game scene, one or more corresponding operation rules in the operation rule set may be selected based on that game scene; for example, if the operation rule is "agree to replace the item", the corresponding behavior data is clicking the control "good, I like".
It is understood that the operation data of the game character at the current moment is used to estimate the possible behavior data of the game character at the next moment (or at a future moment), which is more suitable for the game type of the scenario class.
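A sketch of the rule-based generation under assumed scene names and rules (none of these identifiers come from the application):

```python
# Hypothetical sketch: map the game scene implied by the first-moment
# operation data to the operation rules it allows, and turn each selected
# rule into one piece of candidate behavior data.
from typing import Dict, List

OPERATION_RULES: Dict[str, List[str]] = {
    "npc_trade_dialog": ["agree_to_replace_item", "refuse_to_replace_item"],
    "crossroads": ["walk_up", "walk_left", "walk_right", "return_same_way"],
}

def behaviors_from_rules(scene: str) -> List[str]:
    rules = OPERATION_RULES.get(scene, [])      # the N rules picked from the set of M
    return [f"click:{rule}" for rule in rules]  # one behavior data per rule

print(behaviors_from_rules("npc_trade_dialog"))
```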
Secondly, in the embodiment of the present application, a manner of generating rendering results based on the position data and the operation data is provided, by which a behavior of a game character that may appear at a next moment can be determined according to the operation data of the game character and by combining a preset operation rule set, and based on this, corresponding behavior data can be generated, thereby rendering a game screen corresponding to different behavior data. Because the game picture is rendered well in advance, the cloud game server does not need to sacrifice the frame rate and the picture definition for balancing the delay, so that the cloud game server can increase resource consumption through the cloud end to bring a clearer picture with a higher frame rate and a lower delay, and can bring a higher game frame rate and a clearer game picture for a user.
Optionally, on the basis of the foregoing respective embodiments corresponding to fig. 3, in another optional embodiment provided by the embodiments of the present application, the game state data includes a character type of the game character and operation data of the game character at the first time;
the cloud game server obtains N behavior data corresponding to the second moment according to the game state data, and may specifically include:
the cloud game server obtains Q optional behavior data and the occurrence probability corresponding to each optional behavior data through a behavior prediction model based on the role type of the game role and the operation data of the game role at the first moment, wherein Q is an integer greater than or equal to 1;
and the cloud game server determines the N optional behavior data as the N behavior data if the occurrence probability corresponding to the N optional behavior data is greater than or equal to the probability threshold value based on the occurrence probability corresponding to each optional behavior data.
In this embodiment, a manner of generating a rendering result based on a role type and operation data is introduced. In predicting behavior data of a game character that is likely to occur at a second time, one or more operations that are likely to occur at the second time may be predicted by a behavior prediction model based on a character type of the game character and operation data of the game character at the first time, and then corresponding behavior data may be generated from each of the likely operations, respectively.
Specifically, taking the RPG as an example, the operation behaviors of the user often have a large number of similarities, for example, limited options of a game scenario, or a running direction of a game character, or, further, some common operations of the game character (for example, picking up an article, jumping, squatting, etc.). Based on the above, the character type of the game character and the operation data of the game character at the first time are input to the behavior prediction model, and the Q pieces of optional behavior data and the occurrence probability corresponding to each piece of optional behavior data are output through the behavior prediction model.
For example, at an intersection in a game, a user must select one of the directions of up, down, left and right, and please refer to table 1 in conjunction with a specific scenario, where table 1 shows Q selectable behavior data output based on a behavior prediction model and a probability of occurrence corresponding to each selectable behavior data.
TABLE 1
Optional behavior data | Probability of occurrence
Go upwards | 50%
Go to the left | 30%
Walk to the right | 18%
Return from original path | 2%
The behavior prediction model may also output the duration of the optional behavior data, which is not limited herein. Based on the occurrence probability corresponding to each piece of optional behavior data, the cloud game server selects the optional behavior data whose occurrence probability is greater than or equal to a probability threshold as the required behavior data. Assuming the probability threshold is 20%, the occurrence probabilities of "walk up" and "walk left" are both greater than 20%, so it is determined that the N behavior data include the optional behavior data "walk up" and the optional behavior data "walk left".
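As a concrete illustration of this threshold filtering step, the following is a minimal sketch in Python. It is not part of the filing: the function names, the stand-in model that simply returns the Table 1 values, and the 20% threshold are all illustrative assumptions.

```python
# Illustrative sketch only: the filing does not prescribe an API. All names and the
# stand-in model below (which simply returns the Table 1 values) are assumptions.

PROBABILITY_THRESHOLD = 0.20   # assumed value, matching the 20% used in the example

def predict_optional_behaviors(character_type, operation_data):
    """Stand-in for the behavior prediction model: returns the Q optional behavior
    data with their occurrence probabilities."""
    return [
        ("walk up", 0.50),
        ("walk left", 0.30),
        ("walk right", 0.18),
        ("return the way it came", 0.02),
    ]

def select_behavior_data(character_type, operation_data, threshold=PROBABILITY_THRESHOLD):
    """Keeps the N optional behaviors whose occurrence probability reaches the threshold."""
    optional = predict_optional_behaviors(character_type, operation_data)
    return [(behavior, p) for behavior, p in optional if p >= threshold]

print(select_behavior_data("mage", "at an intersection"))
# [('walk up', 0.5), ('walk left', 0.3)]
```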
It will be appreciated that using the character type of the game character and the operation data of the game character at the current time to infer the behavior data that the game character may produce at the next time (or at some future time) is well suited to RPGs, but in practical applications the scheme is also applicable to, for example, ACT, STG, FTG, AVG, SLG, strategy games, music games, casual games, sports games, racing games, and so on. Character types include, but are not limited to, "mage," "warrior," "tank," "shooter," "support," and the like.
Secondly, in the embodiment of the present application, a manner of generating rendering results based on the character type and the operation data is provided. In this manner, the behavior that the game character may exhibit at the next moment can be determined from the character type and the operation data of the game character by using a trained behavior prediction model, corresponding behavior data can be generated on that basis, and the game pictures corresponding to the different behavior data can then be rendered. Because the game pictures are rendered in advance, the cloud game server does not need to sacrifice frame rate and picture definition to balance delay; instead, it can spend more cloud resources to deliver a clearer picture at a higher frame rate and with lower delay, thereby giving the user a higher game frame rate and a clearer game picture.
Optionally, on the basis of the foregoing embodiments corresponding to fig. 3, in another optional embodiment provided by the embodiment of the present application, before the obtaining, by the behavior prediction model, the Q pieces of optional behavior data and the occurrence probability corresponding to each piece of optional behavior data, based on the character type of the game character and the operation data of the game character at the first time, the method may further include:
the cloud game server obtains a to-be-trained sample set, wherein the to-be-trained sample set is derived from at least two operation objects, the to-be-trained sample set comprises at least two to-be-trained samples, each to-be-trained sample comprises a role type of a game role to be trained, labeling operation data of the game role to be trained at a third moment and labeling operation data of the game role to be trained at a fourth moment, and the fourth moment appears after the third moment;
for each sample to be trained in the sample set to be trained, the cloud game server calls a behavior prediction model to be trained to process the character type of the game character to be trained and the labeled operation data of the game character to be trained at the third moment, so as to obtain the predicted operation data of each sample to be trained at the fourth moment;
and the cloud game server updates the model parameters of the behavior prediction model to be trained according to the predicted operation data of each sample to be trained at the fourth time and the labeled operation data at the fourth time, until the model training condition is met, so as to obtain the behavior prediction model.
In this embodiment, a way of training a behavior prediction model is introduced. The cloud game server obtains a sample set to be trained, the sample set to be trained comprises at least two samples to be trained, and the samples to be trained have different operation objects, wherein the operation objects are users (or players), each operation object has a corresponding identifier, and the identifier can be a user account, a mobile phone number, a mailbox or other identification. And the cloud game server trains the behavior prediction model based on the samples to be trained.
Specifically, each sample to be trained includes the character type of the game character to be trained, the labeled operation data of the game character to be trained at a third time, and the labeled operation data of the game character to be trained at a fourth time. For example, for a game character A to be trained, the corresponding character type is "mage", the labeled operation data at the third time is "at an intersection in the game", and the labeled operation data at the fourth time is "walk left". The fourth time occurs after the third time, and the third time and the fourth time may be several seconds or several image frames apart, for example 60 frames apart. The character type of the game character to be trained in each sample to be trained and the labeled operation data of that game character at the third time are input into the behavior prediction model to be trained, and the predicted operation data at the fourth time is output by the behavior prediction model to be trained. On this basis, the model parameters of the behavior prediction model to be trained can be updated by using a loss function (for example, a cross-entropy loss function), and the behavior prediction model is output once the model training condition is met.
In general, there are two ways to determine whether the model training condition is satisfied. The first is that the model training condition is considered to be satisfied if the number of training iterations reaches an iteration threshold. The second is that the model training condition is considered to be satisfied if the value of the loss function converges.
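For readers who want to see the training procedure end to end, the following is a hedged sketch of one possible implementation. The filing does not prescribe a framework, an architecture, or an encoding; PyTorch, the single linear layer, the toy sample set, and the concrete values of both training conditions are assumptions made only for illustration.

```python
# Hedged sketch only: PyTorch, the linear model, the feature sizes, and the toy data
# are assumptions for illustration; the filing allows LR/CNN/DNN/RNN models generally.
import torch
import torch.nn as nn

NUM_FEATURES = 8    # assumed encoding of character type + labeled operation data at the third time
NUM_BEHAVIORS = 4   # assumed number of possible operations at the fourth time

model = nn.Linear(NUM_FEATURES, NUM_BEHAVIORS)   # behavior prediction model to be trained
criterion = nn.CrossEntropyLoss()                # cross-entropy loss, as mentioned in the text
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Toy sample set to be trained: (encoded features, labeled operation at the fourth time).
samples = [(torch.randn(NUM_FEATURES), torch.tensor(1)) for _ in range(32)]

MAX_ITERATIONS = 100     # first training condition: iteration count reaches a threshold
CONVERGENCE_EPS = 1e-4   # second training condition: the loss value converges
previous_loss = float("inf")

for iteration in range(MAX_ITERATIONS):
    features = torch.stack([f for f, _ in samples])
    labels = torch.stack([y for _, y in samples])
    logits = model(features)               # predicted operation data at the fourth time
    loss = criterion(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if abs(previous_loss - loss.item()) < CONVERGENCE_EPS:
        break                              # loss has converged; training condition met
    previous_loss = loss.item()
```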
It can be understood that as the cloud game runs over a long period and user behavior data accumulates, the behavior prediction model can be further trained on that behavior data, so its prediction accuracy keeps improving.
It should be noted that the behavior prediction model in the present application may be a Logistic Regression (LR) model, or a Convolutional Neural Network (CNN) model, or a Deep Neural Network (DNN) model, or a Recurrent Neural Network (RNN) model, or another type of model, which is not limited herein.
The behavior prediction model is obtained through Machine Learning (ML) training. ML is a multi-field interdisciplinary subject involving probability theory, statistics, approximation theory, convex analysis, algorithm complexity theory, and other disciplines. It specially studies how a computer can simulate or realize human learning behavior in order to acquire new knowledge or skills and reorganize existing knowledge structures so as to continuously improve its own performance. ML is the core of artificial intelligence and the fundamental way to make computers intelligent, and its applications cover all fields of artificial intelligence. ML and deep learning generally include techniques such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, and formal learning. ML is implemented on the basis of Artificial Intelligence (AI) technology, where AI is a theory, method, technique, and application system that uses a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain optimal results. In other words, AI is a comprehensive technique of computer science that attempts to understand the essence of intelligence and to produce a new kind of intelligent machine that can react in a manner similar to human intelligence. AI studies the design principles and implementation methods of various intelligent machines, so that the machines have the functions of perception, reasoning, and decision making.
AI technology is a comprehensive discipline covering a wide range of fields, including both hardware-level and software-level technology. Basic AI technologies generally include sensors, dedicated AI chips, cloud computing, distributed storage, big data processing, operation/interaction systems, mechatronics, and the like. AI software technology mainly includes several major directions such as computer vision, speech processing, natural language processing, and ML/deep learning.
In the embodiment of the application, a method for training a behavior prediction model is provided, and in the method, samples to be trained from different operation objects are used for training the behavior prediction model, so that a general model is obtained, and more accurate behavior data can be predicted based on the behavior prediction model, so that the feasibility and the operability of the scheme are improved.
Optionally, on the basis of the respective embodiments corresponding to fig. 3, in another optional embodiment provided by the embodiments of the present application, the game state data includes an identifier of the target operation object, a character type of the game character, and operation data of the game character at the first time;
the cloud game server obtains N behavior data corresponding to the second moment according to the game state data, and may specifically include:
the cloud game server determines a behavior prediction model according to the identification of the target operation object;
the cloud game server obtains Q optional behavior data and the occurrence probability corresponding to each optional behavior data through a behavior prediction model based on the role type of the game role and the operation data of the game role at the first moment, wherein Q is an integer greater than or equal to 1;
and the cloud game server, based on the occurrence probability corresponding to each piece of optional behavior data, determines the N pieces of optional behavior data whose occurrence probability is greater than or equal to a probability threshold as the N behavior data.
In this embodiment, a manner of generating a rendering result based on the identifier of the target operation object, the character type, and the operation data is described. When predicting the behavior data that the game character may produce at the second time, the behavior prediction model to be used can first be selected according to the identifier of the target operation object; then, based on the character type of the game character and the operation data of the game character at the first time, one or more operations that may occur at the second time are predicted by that behavior prediction model, and corresponding behavior data are generated for each possible operation.
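A minimal sketch of the model selection just described is given below, assuming the cloud game server keeps trained models in a dictionary keyed by the identifier of the operation object and falls back to a general model when no personalized model exists; all names are hypothetical.

```python
# Assumed storage scheme: one trained model per operation object identifier,
# plus an optional general model as a fallback. Not specified in the filing.

behavior_models = {}      # {identifier of operation object: trained behavior prediction model}
general_model = None      # assumed fallback when no personalized model exists

def get_behavior_model(target_object_id):
    """Determine the behavior prediction model according to the identifier of the
    target operation object."""
    return behavior_models.get(target_object_id, general_model)

# Usage: model = get_behavior_model("user_42"); the returned model is then fed the
# character type and the operation data at the first moment, as described above.
```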
Specifically, taking the RPG as an example, the operation behaviors of users often share many similarities, for example the limited options of a game plot, the running direction of a game character, or some common operations of a game character (for example picking up an item, jumping, squatting, etc.). On this basis, the character type of the game character and the operation data of the game character at the first time are input into the behavior prediction model, and the behavior prediction model outputs the Q pieces of optional behavior data and the occurrence probability corresponding to each piece of optional behavior data.
For example, at an intersection in a game, a user must select one of the directions of up, down, left and right, and please refer to table 2 in conjunction with a specific scenario, where table 2 shows Q selectable behavior data output based on a behavior prediction model and a probability of occurrence corresponding to each selectable behavior data.
TABLE 2
[Table 2 is provided as an image in the original publication; it lists the Q selectable behavior data and the occurrence probability corresponding to each piece of selectable behavior data, analogous to Table 1.]
The behavior prediction model may also output the duration of the optional behavior data, which is not limited herein. Based on the occurrence probability corresponding to each piece of optional behavior data, the cloud game server selects the optional behavior data whose occurrence probability is greater than or equal to a probability threshold as the required behavior data. Assuming the probability threshold is 20%, the occurrence probabilities of "walk right" and "walk left" are both greater than 20%, so it is determined that the N behavior data include the optional behavior data "walk right" and the optional behavior data "walk left".
It will be appreciated that using the character type of the game character and the operation data of the game character at the current time to infer the behavior data that the game character may produce at the next time (or at some future time) is well suited to RPGs, but in practical applications the scheme is also applicable to, for example, ACT, STG, FTG, AVG, SLG, strategy games, music games, casual games, sports games, racing games, and so on. Character types include, but are not limited to, "mage," "warrior," "tank," "shooter," "support," and the like.
Secondly, in the embodiment of the present application, a manner of generating rendering results based on the identifier of the target operation object, the character type, and the operation data is provided. In this manner, a behavior prediction model applicable to the target operation object is selected based on the identifier of the target operation object, the behavior that the game character may exhibit at the next moment is then determined from the character type and the operation data of the game character by using the trained behavior prediction model, corresponding behavior data can be generated on that basis, and the game pictures corresponding to the different behavior data can then be rendered. On the one hand, the behavior prediction model is trained and used in a targeted manner, which can improve the accuracy and reliability of the prediction; on the other hand, because the game pictures are rendered in advance, the cloud game server does not need to sacrifice frame rate and picture definition to balance delay; instead, it can spend more cloud resources to deliver a clearer picture at a higher frame rate and with lower delay, thereby giving the user a higher game frame rate and a clearer game picture.
Optionally, on the basis of the foregoing embodiments corresponding to fig. 3, in another optional embodiment provided by the embodiment of the present application, before the obtaining, by the behavior prediction model, the Q pieces of optional behavior data and the occurrence probability corresponding to each piece of optional behavior data, based on the character type of the game character and the operation data of the game character at the first time, the method may further include:
the cloud game server obtains a to-be-trained sample set, wherein the to-be-trained sample set is derived from a target operation object, the to-be-trained sample set comprises at least one to-be-trained sample, each to-be-trained sample comprises a role type of a game role to be trained, marking operation data of the game role to be trained at a third moment and marking operation data of the game role to be trained at a fourth moment, and the fourth moment appears after the third moment;
for each sample to be trained in the sample set to be trained, the cloud game server calls a behavior prediction model to be trained to process the character type of the game character to be trained and the labeled operation data of the game character to be trained at the third moment, so as to obtain the predicted operation data of each sample to be trained at the fourth moment;
and the cloud game server updates the model parameters of the behavior prediction model to be trained according to the predicted operation data of each sample to be trained at the fourth time and the labeled operation data at the fourth time, until the model training condition is met, so as to obtain the behavior prediction model.
In this embodiment, another way of training a behavior prediction model is introduced. The cloud game server obtains a sample set to be trained, where the sample set includes at least one sample to be trained and all the samples to be trained come from the same operation object, the operation object being a user (or player); that is, a corresponding behavior prediction model is trained for each user. The cloud game server then trains the behavior prediction model based on these samples to be trained.
Specifically, similar to the foregoing embodiment, each sample to be trained includes the character type of the game character to be trained, the labeled operation data of the game character to be trained at a third time, and the labeled operation data of the game character to be trained at a fourth time. For example, for a game character A to be trained, the corresponding character type is "mage", the labeled operation data at the third time is "at an intersection in the game", and the labeled operation data at the fourth time is "walk left". The fourth time occurs after the third time, and the third time and the fourth time may be several seconds or several image frames apart, for example 60 frames apart. The character type of the game character to be trained in each sample to be trained and the labeled operation data of that game character at the third time are input into the behavior prediction model to be trained, and the predicted operation data at the fourth time is output by the behavior prediction model to be trained. On this basis, the model parameters of the behavior prediction model to be trained can be updated by using a loss function (for example, a cross-entropy loss function), and the behavior prediction model is output once the model training condition is met.
In general, there are two ways to determine whether the model training condition is satisfied. The first is that the model training condition is considered to be satisfied if the number of training iterations reaches an iteration threshold. The second is that the model training condition is considered to be satisfied if the value of the loss function converges.
It can be understood that as the cloud game runs over a long period and the behavior data of a given user accumulates, the behavior prediction model can be further trained on that user's behavior data, so its prediction accuracy keeps improving. It should be noted that the behavior prediction model in the present application may be an LR model, a CNN model, a DNN model, or an RNN model, which is not limited herein.
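As a brief sketch of this per-user variant, and only under the assumption that samples arrive tagged with the identifier of their operation object, the samples can be grouped by user before one model is trained per user; train_model() stands in for the training loop sketched earlier.

```python
# Hypothetical grouping step for per-user training; the data layout is an assumption.
from collections import defaultdict

def train_per_user_models(tagged_samples, train_model):
    """tagged_samples: iterable of (user_id, sample); returns {user_id: trained model}."""
    grouped = defaultdict(list)
    for user_id, sample in tagged_samples:
        grouped[user_id].append(sample)
    # One behavior prediction model is trained for each operation object (user).
    return {user_id: train_model(samples) for user_id, samples in grouped.items()}
```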
In the embodiment of the application, another way of training the behavior prediction model is provided. In this way, the behavior prediction model is trained by using samples to be trained that come from the same operation object, so that a model targeted to that operation object is obtained; more accurate behavior data can then be predicted based on this behavior prediction model, which improves the feasibility and operability of the scheme.
Optionally, on the basis of the respective embodiments corresponding to fig. 3, in another optional embodiment provided in the embodiments of the present application, N is an integer greater than or equal to 2;
the cloud game server performs rendering processing on each behavior data in the N behavior data to obtain N rendering results, which may specifically include:
the cloud game server acquires behavior data corresponding to the maximum value of the occurrence probability from the N behavior data according to the occurrence probability corresponding to each optional behavior data, wherein each behavior data corresponds to one occurrence probability;
the cloud game server carries out rendering processing on the behavior data corresponding to the maximum occurrence probability to obtain a first rendering result;
the cloud game server acquires behavior data corresponding to the second largest value of the occurrence probability from the N behavior data according to the occurrence probability corresponding to each optional behavior data;
the cloud game server carries out rendering processing on the behavior data corresponding to the occurrence probability secondary maximum value to obtain a second rendering result;
the cloud game server sends the N rendering results to the terminal device, and may specifically include:
in a first time period, the cloud game server sends a first rendering result to the terminal equipment;
and the cloud game server sends a second rendering result to the terminal device within a second time period, wherein the second time period is a time period after the first time period.
In this embodiment, a manner of sending rendering results to a terminal device according to occurrence probability is described. As can be seen from the foregoing embodiment, each piece of behavior data corresponds to an occurrence probability; the greater the occurrence probability, the more likely the game character controlled by the user is to perform that behavior at the future time, and therefore the more likely the rendering result corresponding to that behavior data is to be displayed.
Specifically, for ease of understanding, please refer to Table 3, which illustrates the correspondence between the behavior data and the occurrence probabilities.
TABLE 3
Behavior data | Probability of occurrence | Rendering result
Behavior data A | 15% | Rendering result A
Behavior data B | 10% | Rendering result B
Behavior data C | 60% | Rendering result C
Behavior data D | 20% | Rendering result D
It can be seen that N is assumed to be 4, that is, there are 4 behavior data, and each behavior data has a rendering result obtained after rendering. According to the occurrence probability corresponding to each behavior data, the behavior data can be rearranged according to the sequence of the occurrence probability from large to small, so as to obtain an indication of the corresponding relationship between the behavior data and the occurrence probability as shown in table 4.
TABLE 4
Behavior data | Probability of occurrence | Rendering result
Behavior data C | 60% | Rendering result C
Behavior data D | 20% | Rendering result D
Behavior data A | 15% | Rendering result A
Behavior data B | 10% | Rendering result B
The behavior data C has the maximum occurrence probability, namely 60%, so the first rendering result is the rendering result C. The occurrence probability corresponding to the behavior data D is smaller only than that of the behavior data C, that is, it is the second largest value, 20%, so the second rendering result is the rendering result D. Similarly, the occurrence probability corresponding to the behavior data A is the third largest, 15%, so the third rendering result is the rendering result A. The occurrence probability corresponding to the behavior data B is the fourth largest, 10%, so the fourth rendering result is the rendering result B, and so on; this is not exhausted here.
Based on the above, the cloud game server sends the first rendering result to the terminal device within a first time period, and then sends the second rendering result to the terminal device within a second time period. For example, the cloud game server starts to send the rendering result C to the terminal device at 10:12:08. After the rendering result C has been sent, the cloud game server starts to send the rendering result D to the terminal device at 10:12:11. After the rendering result D has been sent, the cloud game server starts to send the rendering result A to the terminal device at 10:12:15. After the rendering result A has been sent, the cloud game server starts to send the rendering result B to the terminal device at 10:12:18.
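The ordering and staged sending described above can be sketched as follows; the transport function and the length of a time period are assumptions, since the filing does not fix them.

```python
# Illustrative sketch: send_to_terminal and interval_seconds are placeholders.
import time

def send_rendering_results(results, send_to_terminal, interval_seconds=3.0):
    """results: list of (behavior_data, occurrence_probability, rendering_result).
    Sends the rendering results in descending order of occurrence probability,
    one per time period."""
    ordered = sorted(results, key=lambda item: item[1], reverse=True)
    for behavior_data, probability, rendering_result in ordered:
        send_to_terminal(rendering_result)  # first period: maximum probability, and so on
        time.sleep(interval_seconds)        # wait until the next time period
```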
Further, in the embodiment of the present application, a manner of sending rendering results to a terminal device according to occurrence probability is provided. Considering that many rendering results may be generated in some complex game scenes, sending them all to the user's terminal device at once may cause network congestion or delay. Therefore, the rendering result most likely to be needed can be sent to the terminal device first, based on the occurrence probabilities of the behavior data, and the remaining rendering results can likewise be sent to the terminal device in descending order of probability. In this way, on the one hand, the probability of hitting the needed rendering result early is improved, and on the other hand, phenomena such as network congestion and delay are less likely to occur.
Optionally, on the basis of each embodiment corresponding to fig. 3, in another optional embodiment provided in this embodiment of the present application, after the cloud game server sends N rendering results to the terminal device, so that the terminal device determines a target rendering result from the N rendering results according to the target behavior data, the method may further include:
the cloud game server receives a picture rendering response sent by the terminal equipment, wherein the picture rendering response comprises an identifier corresponding to a target rendering result;
the cloud game server determines a target rendering result from the N rendering results according to the picture rendering response;
and the cloud game server deletes (N-1) rendering results from the N rendering results, wherein the (N-1) rendering results are rendering results left after the target rendering result is removed from the N rendering results.
In this embodiment, a method of deleting displayed rendering results is introduced. As can be seen from the foregoing embodiment, after the terminal device displays the target rendering result, it may further send a picture rendering response to the cloud game server, where the picture rendering response includes the identifier corresponding to the target rendering result. The cloud game server thus determines and retains the target rendering result among the N rendering results according to the picture rendering response, which facilitates subsequent operations based on the target rendering result, and the other (N-1) rendering results that are not the target rendering result can be deleted directly.
Specifically, for ease of understanding, please refer to Table 5, which illustrates the correspondence between the behavior data and the rendering results.
TABLE 5
Identifier of rendering result | Behavior data | Rendering result
1 | Behavior data A | Rendering result A
2 | Behavior data B | Rendering result B
3 | Behavior data C | Rendering result C
4 | Behavior data D | Rendering result D
Each rendering result corresponds to an identifier, and each piece of behavior data corresponds to one rendering result. Assuming that the target behavior data is the behavior data B, the target rendering result is the rendering result B. On this basis, the picture rendering response carries the identifier "2", so the cloud game server, according to the picture rendering response, deletes the rendering result A corresponding to the behavior data A, the rendering result C corresponding to the behavior data C, and the rendering result D corresponding to the behavior data D.
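The deletion step can be sketched as follows, assuming the cloud game server keeps the N rendering results in a dictionary keyed by their identifiers; the data structure is an assumption, not part of the filing.

```python
# Hypothetical in-memory bookkeeping of the N rendering results on the server side.

def handle_picture_rendering_response(rendering_results, target_identifier):
    """rendering_results: {identifier: rendering result}. Retains only the target
    rendering result named in the picture rendering response and deletes the
    remaining (N-1) rendering results."""
    target = rendering_results.get(target_identifier)
    for identifier in list(rendering_results):
        if identifier != target_identifier:
            del rendering_results[identifier]
    return target
```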
Secondly, in the embodiment of the present application, a method for deleting a displayed rendering result is provided, and in the above manner, the cloud game server may delete a rendering result that is not displayed based on a rendering result that is actually displayed by the terminal device, so as to save more storage space.
With reference to the above description, a method for displaying a game screen in the present application will be described below from the perspective of a terminal device, and referring to fig. 9, another embodiment of the method for displaying a game screen in the present application embodiment includes:
201. the terminal device sends game state data corresponding to a first moment to the cloud game server so that the cloud game server obtains N behavior data corresponding to a second moment according to the game state data, wherein the second moment occurs after the first moment, and N is an integer greater than or equal to 1;
in this embodiment, a user starts a cloud game through a terminal device, so that the terminal device receives an input of the user at a first time, converts the input of the user into game state data corresponding to the first time based on a network protocol, and then sends the game state data to an access proxy service, and the access proxy service sends the game state data to a cloud game server. The access proxy service is usually deployed on the edge node to reduce the game delay, or the access proxy service can be directly deployed on the cloud game server.
Specifically, since the game runs in the cloud, the user's input needs to be converted into a data format that the network protocol can carry; for example, the mouse and keyboard are USB input devices, and the gamepad may be a Bluetooth device. Since these devices are all on the terminal device side, the information they input needs to be converted into game state data that the cloud game server can simulate, so that the game scene can be reproduced in the cloud. The operating systems of the terminal device include, but are not limited to, Windows, iOS, Android, and macOS. The user's input methods include, but are not limited to, input by mouse, keyboard, gamepad, touch screen, voice, and the like.
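One possible shape of such game state data is sketched below. The filing does not define a wire format; JSON and every field name here are illustrative assumptions.

```python
# Hypothetical encoding of a local input event into game state data for the cloud.
import json
import time

def build_game_state_data(device_type, raw_input, character_id):
    """Wraps an input event from a mouse, keyboard, gamepad, touch screen, or voice
    source into game state data that the cloud game server can replay."""
    return json.dumps({
        "timestamp": time.time(),   # the first moment
        "character_id": character_id,
        "device": device_type,      # e.g. "keyboard" or "gamepad"
        "input": raw_input,         # e.g. {"key": "W", "state": "down"}
    })
```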
202. The terminal equipment receives N rendering results sent by the cloud game server, wherein the N rendering results are obtained after the cloud game server renders each behavior data in the N behavior data, and the rendering results and the behavior data have corresponding relations;
in this embodiment, the cloud game server may obtain, according to the game state data at the first time, N behavior data corresponding to the second time based on a preset rule or a neural network model. It should be noted that, if the first time is the current time, the second time is a future time. If the second time is the current time, then the first time is some time in the past.
Specifically, in a specific game scenario, N pieces of behavior data represent the possibility that a user controls a game character to perform some kind of operation, for example, behavior data 1 indicates that the game character turns to the left, behavior data 2 indicates that the game character turns to the right, and behavior data 3 indicates that the game character goes straight.
After predicting the N behavior data, the cloud game server may perform rendering processing on each piece of behavior data, so as to obtain the corresponding rendering result; that is, the rendering results and the behavior data are in one-to-one correspondence. For example, for the N behavior data described in connection with step 102, rendering result 1 may be generated based on behavior data 1, rendering result 2 based on behavior data 2, and rendering result 3 based on behavior data 3.
It should be noted that rendering in computer graphics refers to the process of generating an image from a model using software. A model is a description of a three-dimensional object in a well-defined language or data structure that includes geometric, viewpoint, texture, and lighting information. It should be noted that the rendering process according to the present application is not limited to rendering a picture, but includes rendering sound, vibration feedback of a gamepad, case switching of a keyboard, and the like, and therefore, the rendering result includes not only a game picture but also one or more of audio data, vibration feedback data, and other data.
203. The terminal equipment receives target behavior data corresponding to the second moment, wherein the target behavior data represent data for controlling the behavior of the game role at the second moment;
in this embodiment, when the terminal device receives a real operation triggered by the user at the second time, the target behavior data may be determined. Specifically, in connection with the description of step 202, assuming that the user triggers the real operation to control the game character to turn left at the second time, it is determined that the target behavior data matches behavior data 1 successfully.
204. And if the N behavior data comprise target behavior data, the terminal equipment determines a target rendering result from the N rendering results and displays the target rendering result, wherein the target rendering result and the target behavior data have a corresponding relation, and the target rendering result comprises a game picture of the cloud game.
In this embodiment, the cloud game server sends the N rendering results obtained after the rendering processing to the terminal device, and the terminal device stores the N rendering results in advance. Specifically, with reference to the contents described in step 202 and step 203, if the N behavior data include the target behavior data, the terminal device determines, from the N rendering results, that the target rendering result is rendering result 1, and displays the target rendering result. The target rendering result at least includes the game picture obtained through rendering, and may further include audio data, gamepad vibration data, keyboard locking data, and the like, which is not limited here.
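The terminal-side matching can be sketched as follows, assuming the N rendering results are cached in a dictionary keyed by behavior data; the cache layout and the fallback path for a miss are assumptions, not part of the filing.

```python
# Illustrative client-side lookup of a pre-rendered result; display() and
# request_render_from_server() are placeholders for terminal-specific logic.

def on_player_input(target_behavior, cached_results, display, request_render_from_server):
    """cached_results: {behavior data: rendering result} received in advance."""
    if target_behavior in cached_results:           # the N behavior data include the target
        display(cached_results[target_behavior])    # show the pre-rendered game picture
    else:
        display(request_render_from_server(target_behavior))  # assumed miss path
```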
In the embodiment of the application, a method for displaying a game picture is provided. In this manner, the cloud game server predicts the N behavior data that may appear at a future time by using the game state data at the current time, and then renders each piece of behavior data in advance to obtain the corresponding game picture; in effect, the game progress runs ahead of the user's real input. When the user actually inputs the target behavior data, the terminal device directly pulls the already rendered target rendering result, that is, displays the corresponding game picture. Since the game picture is rendered in advance, the delay experienced by the user is shorter.
Referring to fig. 10, fig. 10 is a schematic view of an embodiment of a game screen display device in an embodiment of the present application, and the game screen display device 30 includes:
an obtaining module 301, configured to obtain game state data corresponding to a first moment;
the obtaining module 301 is further configured to obtain, according to the game state data, N behavior data corresponding to a second time, where the second time occurs after the first time, and N is an integer greater than or equal to 1;
the rendering module 302 is configured to perform rendering processing on each behavior data of the N behavior data to obtain N rendering results, where the rendering results have a corresponding relationship with the behavior data;
a sending module 303, configured to send the N rendering results to the terminal device, so that the terminal device determines a target rendering result from the N rendering results according to the target behavior data, and displays the target rendering result, where the target behavior data represents data for controlling a behavior of a game role at the second time, and the target rendering result includes a game picture of the cloud game.
In the embodiment of the application, a game picture display device is provided. With this device, the cloud game server predicts the N behavior data that may appear at a future time by using the game state data at the current time, and then renders each piece of behavior data in advance to obtain the corresponding game picture; in effect, the game progress runs ahead of the user's real input. When the user inputs the real target behavior data, the already rendered target rendering result is directly pulled, that is, the corresponding game picture is displayed. Since the game picture is rendered in advance, the delay experienced by the user is shorter.
Optionally, on the basis of the embodiment corresponding to fig. 10, in another embodiment of the game screen display device 30 provided in the embodiment of the present application, the game state data includes position data of the game character at the first time;
an obtaining module 301, specifically configured to obtain game map data;
determining N optional positions corresponding to the game role at a second moment according to the game map data and the position data of the game role at the first moment;
and generating N behavior data corresponding to the second moment according to the N optional positions corresponding to the game role at the second moment.
In the embodiment of the application, a game screen display device is provided. With this device, the position of the game character on the game map can be determined according to the position data of the game character, the positions at which the game character may appear at the next moment are then determined in combination with the game map, corresponding behavior data are generated on that basis, and the game pictures corresponding to the different behavior data are rendered. Since the game pictures are rendered in advance, the cloud game server does not have to sacrifice frame rate and picture definition to balance delay; instead, the cloud can spend more resources to deliver a clearer picture at a higher frame rate and with lower delay, thereby giving the user a higher game frame rate and a clearer game picture.
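A minimal sketch of this position-based variant follows, under the assumption that the game map has been pre-processed into an adjacency mapping from a position to the positions reachable one step later; the data layout and names are illustrative.

```python
# Hypothetical map representation: {position: [positions reachable at the next moment]}.

def optional_positions(game_map, position_at_first_moment):
    """Returns the N optional positions of the game character at the second moment."""
    return game_map.get(position_at_first_moment, [])

def behaviors_from_positions(position_at_first_moment, positions):
    """Turns each optional position into a piece of behavior data (here, a move)."""
    return [{"action": "move", "from": position_at_first_moment, "to": p} for p in positions]

# Example: at a crossroads tile, the character may move to any of three neighbouring tiles.
demo_map = {(5, 5): [(5, 6), (4, 5), (6, 5)]}
print(behaviors_from_positions((5, 5), optional_positions(demo_map, (5, 5))))
```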
Optionally, on the basis of the embodiment corresponding to fig. 10, in another embodiment of the game screen display device 30 provided in the embodiment of the present application, the game state data includes operation data of the game character at the first time;
an obtaining module 301, configured to obtain N operation rules from an operation rule set according to operation data of a game character at a first time, where the operation rule set includes M operation rules, and M is an integer greater than or equal to 1;
and generating N behavior data corresponding to the second moment according to the N operation rules.
In the embodiment of the present application, a game screen display device is provided. With this device, the behavior that the game character may exhibit at the next time can be determined from the operation data of the game character in combination with a preset operation rule set, corresponding behavior data can be generated on that basis, and the game pictures corresponding to the different behavior data can then be rendered. Because the game pictures are rendered in advance, the cloud game server does not need to sacrifice frame rate and picture definition to balance delay; instead, it can spend more cloud resources to deliver a clearer picture at a higher frame rate and with lower delay, thereby giving the user a higher game frame rate and a clearer game picture.
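The rule-based variant can be sketched in the same spirit; the rule format (a predicate paired with the behavior it generates) is an assumption, since the filing only speaks of an operation rule set with M rules.

```python
# Hypothetical operation rule set: each rule is (predicate on operation data, behavior data).

OPERATION_RULES = [
    (lambda op: op.get("near_item", False), {"action": "pick up item"}),
    (lambda op: op.get("moving", False),    {"action": "keep moving"}),
    (lambda op: True,                       {"action": "stand still"}),
]

def behaviors_from_rules(operation_data, rules=OPERATION_RULES):
    """Selects the N operation rules matched by the operation data at the first moment
    and generates the corresponding behavior data for the second moment."""
    return [behavior for predicate, behavior in rules if predicate(operation_data)]

print(behaviors_from_rules({"near_item": True, "moving": False}))
# [{'action': 'pick up item'}, {'action': 'stand still'}]
```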
Optionally, on the basis of the embodiment corresponding to fig. 10, in another embodiment of the game screen display device 30 provided in the embodiment of the present application, the game state data includes a character type of the game character and operation data of the game character at the first time;
the obtaining module 301 is specifically configured to obtain Q optional behavior data and an occurrence probability corresponding to each optional behavior data through a behavior prediction model based on a role type of a game role and operation data of the game role at a first time, where Q is an integer greater than or equal to 1;
based on the occurrence probability corresponding to each selectable behavior data, if the occurrence probability corresponding to the N selectable behavior data is greater than or equal to the probability threshold, the N selectable behavior data is determined as the N behavior data.
In the embodiment of the present application, a game screen display device is provided. With this device, the behavior that the game character may exhibit at the next moment can be determined from the character type and the operation data of the game character by using a trained behavior prediction model, corresponding behavior data can be generated on that basis, and the game pictures corresponding to the different behavior data can then be rendered. Because the game pictures are rendered in advance, the cloud game server does not need to sacrifice frame rate and picture definition to balance delay; instead, it can spend more cloud resources to deliver a clearer picture at a higher frame rate and with lower delay, thereby giving the user a higher game frame rate and a clearer game picture.
Optionally, on the basis of the embodiment corresponding to fig. 10, in another embodiment of the game screen display device 30 provided in the embodiment of the present application, the game screen display device 30 includes a processing module 304 and a training module 305;
the obtaining module 301 is further configured to obtain a to-be-trained sample set before obtaining, based on a role type of a game role and operation data of the game role at a first time, Q pieces of selectable behavior data and occurrence probability corresponding to each piece of selectable behavior data through a behavior prediction model, where the to-be-trained sample set is derived from at least two operation objects, the to-be-trained sample set includes at least two to-be-trained samples, each to-be-trained sample includes a role type of the game role to be trained, labeled operation data of the game role to be trained at a third time, and labeled operation data of the game role to be trained at a fourth time, and the fourth time occurs after the third time;
the processing module 304 is configured to, for each to-be-trained sample in the to-be-trained sample set, invoke a to-be-trained behavior prediction model to process the character type of the to-be-trained game character and the labeled operation data of the to-be-trained game character at the third time, so as to obtain predicted operation data of each to-be-trained sample at the fourth time;
the training module 305 is configured to update the model parameters of the behavior prediction model to be trained according to the prediction operation data of each sample to be trained at the fourth time and the labeled operation data at the fourth time until the model training conditions are met, so as to obtain the behavior prediction model.
In the embodiment of the application, the game picture display device is provided, and by adopting the device, the behavior prediction model is trained by using samples to be trained from different operation objects, so that a universal model is obtained, more accurate behavior data can be predicted based on the behavior prediction model, and therefore the feasibility and operability of the scheme are improved.
Optionally, on the basis of the embodiment corresponding to fig. 10, in another embodiment of the game screen display device 30 provided in the embodiment of the present application, the game state data includes an identifier of the target operation object, a character type of the game character, and operation data of the game character at the first time;
an obtaining module 301, configured to determine a behavior prediction model according to the identifier of the target operation object;
based on the role type of the game role and the operation data of the game role at the first moment, Q optional behavior data and the occurrence probability corresponding to each optional behavior data are obtained through a behavior prediction model, wherein Q is an integer greater than or equal to 1;
based on the occurrence probability corresponding to each selectable behavior data, if the occurrence probability corresponding to the N selectable behavior data is greater than or equal to the probability threshold, the N selectable behavior data is determined as the N behavior data.
In the embodiment of the present application, a game screen display device is provided. With this device, a behavior prediction model applicable to the target operation object is selected based on the identifier of the target operation object, the behavior that the game character may exhibit at the next moment is then determined from the character type and the operation data of the game character by using the trained behavior prediction model, corresponding behavior data can be generated on that basis, and the game pictures corresponding to the different behavior data can then be rendered. On the one hand, the behavior prediction model is trained and used in a targeted manner, which can improve the accuracy and reliability of the prediction; on the other hand, because the game pictures are rendered in advance, the cloud game server does not need to sacrifice frame rate and picture definition to balance delay; instead, it can spend more cloud resources to deliver a clearer picture at a higher frame rate and with lower delay, thereby giving the user a higher game frame rate and a clearer game picture.
Optionally, on the basis of the embodiment corresponding to fig. 10, in another embodiment of the game screen display device 30 provided in the embodiment of the present application, the game screen display device 30 includes a processing module 304 and a training module 305;
the obtaining module 301 is further configured to obtain a to-be-trained sample set before obtaining, by using a behavior prediction model, Q pieces of selectable behavior data and occurrence probability corresponding to each piece of selectable behavior data based on a role type of a game role and operation data of the game role at a first time, where the to-be-trained sample set is derived from a target operation object, the to-be-trained sample set includes at least one to-be-trained sample, each to-be-trained sample includes a role type of the game role to be trained, labeled operation data of the game role to be trained at a third time, and labeled operation data of the game role to be trained at a fourth time, and the fourth time occurs after the third time;
the processing module 304 is configured to, for each to-be-trained sample in the to-be-trained sample set, invoke a to-be-trained behavior prediction model to process the character type of the to-be-trained game character and the labeled operation data of the to-be-trained game character at the third time, so as to obtain predicted operation data of each to-be-trained sample at the fourth time;
the training module 305 is configured to update the model parameters of the behavior prediction model to be trained according to the prediction operation data of each sample to be trained at the fourth time and the labeled operation data at the fourth time until the model training conditions are met, so as to obtain the behavior prediction model.
In the embodiment of the application, a game picture display device is provided. With this device, the behavior prediction model is trained by using samples to be trained that come from the same operation object, so that a model targeted to that operation object is obtained; more accurate behavior data can then be predicted based on this behavior prediction model, which improves the feasibility and operability of the scheme.
Optionally, on the basis of the embodiment corresponding to fig. 10, in another embodiment of the game screen display device 30 provided in the embodiment of the present application, N is an integer greater than or equal to 2;
the rendering module 302 is specifically configured to obtain behavior data corresponding to a maximum value of occurrence probability from the N behavior data according to the occurrence probability corresponding to each selectable behavior data, where each behavior data corresponds to one occurrence probability;
rendering the behavior data corresponding to the maximum occurrence probability to obtain a first rendering result;
acquiring behavior data corresponding to the second largest value of the occurrence probability from the N behavior data according to the occurrence probability corresponding to each optional behavior data;
rendering the behavior data corresponding to the occurrence probability secondary maximum value to obtain a second rendering result;
a sending module 303, configured to send a first rendering result to the terminal device in a first time period;
and transmitting the second rendering result to the terminal equipment within a second time period, wherein the second time period is a time period after the first time period.
In the embodiment of the application, a game screen display device is provided. With this device, considering that many rendering results may be generated in some complex game scenes, sending them all to the user's terminal device at once may cause network congestion or delay; therefore, the rendering result most likely to be needed can be sent to the terminal device first, based on the occurrence probabilities of the behavior data, and the remaining rendering results can likewise be sent to the terminal device in descending order of probability. In this way, on the one hand, the probability of hitting the needed rendering result early is improved, and on the other hand, phenomena such as network congestion and delay are less likely to occur.
Optionally, on the basis of the embodiment corresponding to fig. 10, in another embodiment of the game screen display device 30 provided in the embodiment of the present application, the game screen display device 30 includes a receiving module 306, a determining module 307, and a deleting module 308;
a receiving module 306, configured to receive, after the sending module 303 sends the N rendering results to the terminal device so that the terminal device determines a target rendering result from the N rendering results according to the target behavior data, a picture rendering response sent by the terminal device, where the picture rendering response includes the identifier corresponding to the target rendering result;
a determining module 307, configured to determine a target rendering result from the N rendering results according to the screen rendering response;
and a deleting module 308, configured to delete (N-1) rendering results from the N rendering results, where the (N-1) rendering results are rendering results left after the target rendering result is removed from the N rendering results.
In the embodiment of the application, a game picture display device is provided, and by adopting the device, a cloud game server deletes the rendering result which is not displayed based on the rendering result which is actually displayed by the terminal equipment, so that more storage space is saved.
Referring to fig. 11, fig. 11 is a schematic view of another embodiment of the game screen display device in the embodiment of the present application, and the game screen display device 40 includes:
a sending module 401, configured to send game state data corresponding to a first time to a cloud game server, so that the cloud game server obtains N behavior data corresponding to a second time according to the game state data, where the second time occurs after the first time, and N is an integer greater than or equal to 1;
a receiving module 402, configured to receive N rendering results sent by the cloud game server, where the N rendering results are obtained after the cloud game server performs rendering processing on each behavior data in the N behavior data, and the rendering results and the behavior data have a corresponding relationship;
the receiving module 402 is further configured to receive target behavior data corresponding to a second time, where the target behavior data represents data for controlling a behavior of a game character at the second time;
the display module 403 is configured to determine a target rendering result from the N rendering results and display the target rendering result if the N behavior data includes target behavior data, where the target rendering result and the target behavior data have a corresponding relationship, and the target rendering result includes a game screen of the cloud game.
In the embodiment of the application, a game picture display device is provided, and by adopting the device, a cloud game server predicts N behavior data which may appear at a future moment by using game state data at the current moment, and then renders each behavior data in advance to obtain a corresponding game picture, which is equivalent to that game progress can be input by a user actually in advance, and when the user inputs actual target behavior data, a terminal device directly pulls a rendered target rendering result, namely displays the corresponding game picture. Since the game screen is rendered in advance, a shorter delay is brought to the user.
The embodiment of the application also provides another game picture display device, wherein the game picture display device is deployed in a server, and the server can be a cloud game server. For ease of understanding, referring to fig. 12, fig. 12 is a schematic structural diagram of a server provided in the present embodiment, where the server 500 may have a relatively large difference due to different configurations or performances, and may include one or more Central Processing Units (CPUs) 522 (e.g., one or more processors) and a memory 532, and one or more storage media 530 (e.g., one or more mass storage devices) storing an application 542 or data 544. Memory 532 and storage media 530 may be, among other things, transient storage or persistent storage. The program stored on the storage medium 530 may include one or more modules (not shown), each of which may include a series of instruction operations for the server. Still further, the central processor 522 may be configured to communicate with the storage medium 530, and execute a series of instruction operations in the storage medium 530 on the server 500.
The server 500 may also include one or more power supplies 526, one or more wired or wireless network interfaces 550, one or more input/output interfaces 558, and/or one or more operating systems 541, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
The steps performed by the server in the above embodiment may be based on the server structure shown in fig. 12.
The embodiment of the application also provides another game picture display device, which is deployed on the terminal device. For ease of understanding, please refer to fig. 13. For convenience of description, only the portions related to the embodiment of the present application are shown; for specific technical details that are not disclosed, please refer to the method portion of the embodiment of the present application. In the embodiment of the present application, a smartphone is taken as an example of the terminal device for explanation:
Fig. 13 is a block diagram illustrating a partial structure of a smartphone serving as the terminal device provided in the embodiment of the present application. Referring to fig. 13, the smartphone includes: a Radio Frequency (RF) circuit 610, a memory 620, an input unit 630, a display unit 640, a sensor 650, an audio circuit 660, a wireless fidelity (WiFi) module 670, a processor 680, and a power supply 690. Those skilled in the art will appreciate that the smartphone structure shown in fig. 13 does not constitute a limitation; the smartphone may include more or fewer components than shown, combine some components, or arrange the components differently.
The following describes each component of the smart phone in detail with reference to fig. 13:
The RF circuit 610 may be used for receiving and transmitting signals during information transmission and reception or during a call. In particular, after receiving downlink information from a base station, the RF circuit 610 forwards it to the processor 680 for processing; in addition, it transmits uplink data to the base station. In general, the RF circuit 610 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuit 610 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 620 may be used to store software programs and modules, and the processor 680 executes various functional applications and data processing of the smartphone by running the software programs and modules stored in the memory 620. The memory 620 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function and an image playing function), and the like; the data storage area may store data (such as audio data and a phonebook) created according to the use of the smartphone, and the like. Further, the memory 620 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input unit 630 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the smartphone. Specifically, the input unit 630 may include a touch panel 631 and other input devices 632. The touch panel 631, also referred to as a touch screen, may collect touch operations of a user on or near it (e.g., operations performed by the user on or near the touch panel 631 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 631 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects a signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 680, and can receive and execute commands sent by the processor 680. In addition, the touch panel 631 may be implemented as a resistive, capacitive, infrared, or surface acoustic wave type, among others. Besides the touch panel 631, the input unit 630 may also include other input devices 632. Specifically, the other input devices 632 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and a switch key), a trackball, a mouse, a joystick, and the like.
The display unit 640 may be used to display information input by the user or provided to the user and various menus of the smartphone. The display unit 640 may include a display panel 641; optionally, the display panel 641 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like. Further, the touch panel 631 may cover the display panel 641. When the touch panel 631 detects a touch operation on or near it, the touch operation is transmitted to the processor 680 to determine the type of the touch event, and then the processor 680 provides a corresponding visual output on the display panel 641 according to the type of the touch event. Although in fig. 13 the touch panel 631 and the display panel 641 are two separate components to implement the input and output functions of the smartphone, in some embodiments the touch panel 631 and the display panel 641 may be integrated to implement the input and output functions of the smartphone.
The smartphone may also include at least one sensor 650, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel 641 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 641 and/or the backlight when the smartphone is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that recognize the attitude of the smartphone (such as switching between landscape and portrait screens, related games, and magnetometer attitude calibration) and for vibration-recognition related functions (such as a pedometer and tapping); other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor may also be configured on the smartphone, and are not described herein again.
The audio circuit 660, the speaker 661, and the microphone 662 can provide an audio interface between the user and the smartphone. The audio circuit 660 may transmit an electrical signal converted from received audio data to the speaker 661, and the speaker 661 converts the electrical signal into a sound signal for output; on the other hand, the microphone 662 converts a collected sound signal into an electrical signal, which is received by the audio circuit 660 and converted into audio data; the audio data is then processed by the processor 680 and sent through the RF circuit 610 to, for example, another smartphone, or output to the memory 620 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 670, the smartphone can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband internet access. Although fig. 13 shows the WiFi module 670, it can be understood that it is not an essential component of the smartphone and may be omitted as needed without changing the essence of the invention.
The processor 680 is the control center of the smartphone. It connects the various parts of the entire smartphone using various interfaces and lines, and performs the various functions of the smartphone and processes data by running or executing the software programs and/or modules stored in the memory 620 and calling the data stored in the memory 620, thereby monitoring the smartphone as a whole. Optionally, the processor 680 may include one or more processing units; optionally, the processor 680 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 680.
The smartphone also includes a power supply 690 (e.g., a battery) that supplies power to the various components. Optionally, the power supply may be logically connected to the processor 680 through a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system.
Although not shown, the smartphone may further include a camera, a Bluetooth module, and the like, which are not described herein.
The steps performed by the terminal device in the above-described embodiment may be based on the terminal device structure shown in fig. 13.
Embodiments of the present application also provide a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the method described in the foregoing embodiments.
Embodiments of the present application also provide a computer program product including a program, which, when run on a computer, causes the computer to perform the methods described in the foregoing embodiments.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solution of the present application that essentially contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (15)

1. A method for displaying a game screen, comprising:
acquiring game state data corresponding to a first moment;
acquiring N behavior data corresponding to a second moment according to the game state data, wherein the second moment occurs after the first moment, and N is an integer greater than or equal to 1;
rendering each behavior data in the N behavior data to obtain N rendering results, wherein the rendering results and the behavior data have a corresponding relation;
and sending the N rendering results to the terminal device, so that the terminal device determines a target rendering result from the N rendering results according to target behavior data, and displays the target rendering result, wherein the target behavior data represents data for controlling the behavior of the game character at the second moment, and the target rendering result comprises a game picture of the cloud game.
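For ease of understanding only, the following Python sketch walks through the four steps of claim 1 under the assumption that each behavior data can be keyed by a string; the callbacks are placeholders rather than interfaces defined by this application.

```python
# Illustrative sketch of the server-side flow in claim 1. predict_behaviors,
# render and send_to_terminal are placeholders supplied by the caller; they
# are not APIs defined by the patent.
from typing import Callable, Dict, List


def handle_first_moment(game_state: dict,
                        predict_behaviors: Callable[[dict], List[dict]],
                        render: Callable[[dict, dict], bytes],
                        send_to_terminal: Callable[[Dict[str, bytes]], None]) -> None:
    # Step 1: the game state data corresponding to the first moment is given.
    # Step 2: obtain the N behavior data that may occur at the second moment.
    behaviors = predict_behaviors(game_state)

    # Step 3: render each behavior data, keeping the correspondence between
    # behavior data and rendering result.
    results = {b["key"]: render(game_state, b) for b in behaviors}

    # Step 4: send the N rendering results to the terminal device, which later
    # selects the one matching the actual target behavior data.
    send_to_terminal(results)


if __name__ == "__main__":
    handle_first_moment(
        game_state={"hp": 100, "position": (1, 0)},
        predict_behaviors=lambda state: [{"key": "move_left"}, {"key": "jump"}],
        render=lambda state, behavior: f"<frame for {behavior['key']}>".encode(),
        send_to_terminal=lambda results: print(sorted(results)),
    )
```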
2. The display method according to claim 1, wherein the game state data includes position data of the game character at the first time;
the obtaining N behavior data corresponding to a second time according to the game state data includes:
acquiring game map data;
determining N optional positions corresponding to the game character at the second moment according to the game map data and the position data of the game character at the first moment;
and generating the N behavior data corresponding to the second moment according to the N optional positions corresponding to the game character at the second moment.
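The following sketch is one hypothetical way to realize the steps of claim 2, assuming a grid-shaped game map and one-step movement; the map encoding and move set are illustrative assumptions, not limitations of the claim.

```python
# Hypothetical sketch of claim 2: derive candidate positions for the second
# moment from the game map data and the character's position at the first
# moment, then turn each optional position into one behavior data.
from typing import List, Tuple

Position = Tuple[int, int]


def candidate_positions(game_map: List[List[int]], pos: Position) -> List[Position]:
    """Return walkable cells (value 0) reachable in one step, plus staying put."""
    rows, cols = len(game_map), len(game_map[0])
    moves = [(0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)]
    result = []
    for dr, dc in moves:
        r, c = pos[0] + dr, pos[1] + dc
        if 0 <= r < rows and 0 <= c < cols and game_map[r][c] == 0:
            result.append((r, c))
    return result


def behaviors_from_positions(positions: List[Position]) -> List[dict]:
    # Each optional position becomes one behavior data for the second moment.
    return [{"key": f"move_to_{r}_{c}", "target": (r, c)} for r, c in positions]


if __name__ == "__main__":
    game_map = [[0, 0, 1],
                [0, 1, 0],
                [0, 0, 0]]
    print(behaviors_from_positions(candidate_positions(game_map, (1, 0))))
```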
3. The display method according to claim 1, wherein the game state data includes operation data of the game character at the first time;
the obtaining N behavior data corresponding to a second time according to the game state data includes:
acquiring N operation rules from an operation rule set according to the operation data of the game character at the first moment, wherein the operation rule set comprises M operation rules, and M is an integer greater than or equal to 1;
and generating N behavior data corresponding to the second moment according to the N operation rules.
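As a hypothetical illustration of claim 3, the sketch below selects the operation rules whose precondition matches the operation data at the first moment; the rule structure shown is an assumption made for the example only.

```python
# Hypothetical sketch of claim 3: select N operation rules from a rule set of
# M rules according to the operation data at the first moment, then turn each
# selected rule into one behavior data for the second moment.
from typing import List

RULE_SET = [
    {"after": "jump", "next": "attack"},
    {"after": "jump", "next": "land"},
    {"after": "run",  "next": "slide"},
]  # M = 3 operation rules


def matching_rules(last_operation: str, rules: List[dict]) -> List[dict]:
    # Keep only rules whose precondition matches the operation at the first moment.
    return [rule for rule in rules if rule["after"] == last_operation]


def behaviors_from_rules(rules: List[dict]) -> List[dict]:
    return [{"key": rule["next"]} for rule in rules]


if __name__ == "__main__":
    selected = matching_rules("jump", RULE_SET)   # N = 2 rules selected here
    print(behaviors_from_rules(selected))         # two behavior data
```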
4. The display method according to claim 1, wherein the game state data includes a character type of the game character and operation data of the game character at the first time;
the obtaining N behavior data corresponding to a second time according to the game state data includes:
acquiring Q optional behavior data and the occurrence probability corresponding to each optional behavior data through a behavior prediction model based on the character type of the game character and the operation data of the game character at the first moment, wherein Q is an integer greater than or equal to 1;
and determining, based on the occurrence probability corresponding to each optional behavior data, N optional behavior data whose occurrence probabilities are greater than or equal to a probability threshold as the N behavior data.
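The following sketch illustrates the filtering step of claim 4 with a stubbed behavior prediction model and made-up occurrence probabilities; it is not the model itself.

```python
# Hypothetical sketch of claim 4's filtering step: a behavior prediction model
# (stubbed here) yields Q optional behavior data with occurrence probabilities,
# and only those at or above the probability threshold become the N behavior
# data to be rendered. The stub probabilities are invented for illustration.
from typing import List, Tuple


def predict_optional_behaviors(character_type: str,
                               operation_data: dict) -> List[Tuple[dict, float]]:
    # Stand-in for the behavior prediction model: returns Q candidates
    # with occurrence probabilities.
    return [({"key": "attack"}, 0.55),
            ({"key": "retreat"}, 0.30),
            ({"key": "idle"}, 0.15)]


def select_behaviors(candidates: List[Tuple[dict, float]],
                     threshold: float = 0.25) -> List[dict]:
    # Keep the optional behavior data whose probability meets the threshold.
    return [behavior for behavior, p in candidates if p >= threshold]


if __name__ == "__main__":
    q = predict_optional_behaviors("archer", {"last_op": "draw_bow"})
    print(select_behaviors(q))   # N = 2 behavior data in this example
```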
5. The display method according to claim 4, wherein before the obtaining of the Q optional behavior data and the occurrence probability corresponding to each optional behavior data by the behavior prediction model based on the character type of the game character and the operation data of the game character at the first time, the method further comprises:
obtaining a sample set to be trained, wherein the sample set to be trained is derived from at least two operation objects, the sample set to be trained comprises at least two samples to be trained, each sample to be trained comprises a character type of a game character to be trained, labeled operation data of the game character to be trained at a third moment, and labeled operation data of the game character to be trained at a fourth moment, and the fourth moment occurs after the third moment;
for each sample to be trained in the sample set to be trained, calling a behavior prediction model to be trained to process the character type of the game character to be trained and the labeled operation data of the game character to be trained at the third moment, so as to obtain the predicted operation data of each sample to be trained at the fourth moment;
and updating the model parameters of the behavior prediction model to be trained according to the prediction operation data of each sample to be trained at the fourth moment and the labeled operation data of each sample to be trained at the fourth moment until model training conditions are met, so as to obtain the behavior prediction model.
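As a simplified stand-in for the training procedure of claim 5, the sketch below replaces the learned behavior prediction model with frequency counting so that only the data flow of the samples is shown; a real model would have its parameters updated by an optimizer until the model training conditions are met.

```python
# Very small stand-in for claim 5's training: each sample pairs
# (character type, labeled operation at the third moment) with the labeled
# operation at the fourth moment, and "training" here is just frequency
# counting. This only illustrates the data flow, not the actual model.
from collections import Counter, defaultdict
from typing import Dict, List, Tuple

Sample = Tuple[str, str, str]  # (character_type, op_at_t3, op_at_t4)


def train(samples: List[Sample]) -> Dict[Tuple[str, str], Dict[str, float]]:
    counts: Dict[Tuple[str, str], Counter] = defaultdict(Counter)
    for character_type, op_t3, op_t4 in samples:
        counts[(character_type, op_t3)][op_t4] += 1
    # Normalize counts into occurrence probabilities per (type, op_t3) context.
    model = {}
    for context, counter in counts.items():
        total = sum(counter.values())
        model[context] = {op: n / total for op, n in counter.items()}
    return model


if __name__ == "__main__":
    samples = [("warrior", "charge", "attack"),
               ("warrior", "charge", "attack"),
               ("warrior", "charge", "block")]
    model = train(samples)
    print(model[("warrior", "charge")])  # e.g. {'attack': 0.66..., 'block': 0.33...}
```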
6. The display method according to claim 1, wherein the game state data includes an identification of the target operation object, a character type of the game character, and operation data of the game character at the first time;
the obtaining N behavior data corresponding to a second time according to the game state data includes:
determining a behavior prediction model according to the identification of the target operation object;
based on the character type of the game character and the operation data of the game character at the first moment, Q optional behavior data and the occurrence probability corresponding to each optional behavior data are obtained through the behavior prediction model, wherein Q is an integer greater than or equal to 1;
and determining, based on the occurrence probability corresponding to each optional behavior data, N optional behavior data whose occurrence probabilities are greater than or equal to a probability threshold as the N behavior data.
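A hypothetical illustration of the model-selection step in claim 6 is given below: the behavior prediction model is looked up by the identification of the target operation object, with a shared fallback model; the stub predictors are illustrative only.

```python
# Hypothetical sketch of claim 6's extra step: the behavior prediction model is
# selected according to the identification of the target operation object
# (i.e. the player), so that per-player habits drive the prediction.
from typing import Callable, Dict, List, Tuple

Predictor = Callable[[str, dict], List[Tuple[dict, float]]]

MODELS: Dict[str, Predictor] = {
    "player_001": lambda ctype, op: [({"key": "rush"}, 0.7), ({"key": "hide"}, 0.3)],
    "default":    lambda ctype, op: [({"key": "idle"}, 1.0)],
}


def model_for(player_id: str) -> Predictor:
    # Fall back to a shared model when no per-player model has been trained.
    return MODELS.get(player_id, MODELS["default"])


if __name__ == "__main__":
    predictor = model_for("player_001")
    print(predictor("assassin", {"last_op": "crouch"}))
```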
7. The display method according to claim 6, wherein before the obtaining, through the behavior prediction model, of the Q optional behavior data and the occurrence probability corresponding to each optional behavior data based on the character type of the game character and the operation data of the game character at the first moment, the method further comprises:
obtaining a sample set to be trained, wherein the sample set to be trained is derived from the target operation object, the sample set to be trained comprises at least one sample to be trained, each sample to be trained comprises a character type of a game character to be trained, labeled operation data of the game character to be trained at a third moment, and labeled operation data of the game character to be trained at a fourth moment, and the fourth moment occurs after the third moment;
for each sample to be trained in the sample set to be trained, calling a behavior prediction model to be trained to process the character type of the game character to be trained and the labeled operation data of the game character to be trained at the third moment, so as to obtain the predicted operation data of each sample to be trained at the fourth moment;
and updating the model parameters of the behavior prediction model to be trained according to the prediction operation data of each sample to be trained at the fourth moment and the labeled operation data of each sample to be trained at the fourth moment until model training conditions are met, so as to obtain the behavior prediction model.
8. The display method according to any one of claims 4 to 7, wherein N is an integer greater than or equal to 2;
the rendering processing is performed on each behavior data in the N behavior data to obtain N rendering results, including:
acquiring behavior data corresponding to the maximum value of the occurrence probability from the N behavior data according to the occurrence probability corresponding to each optional behavior data, wherein each behavior data corresponds to one occurrence probability;
rendering the behavior data corresponding to the maximum occurrence probability to obtain a first rendering result;
acquiring behavior data corresponding to the second largest value of the occurrence probability from the N behavior data according to the occurrence probability corresponding to each optional behavior data;
rendering the behavior data corresponding to the occurrence probability secondary maximum value to obtain a second rendering result;
the sending the N rendering results to the terminal device includes:
sending the first rendering result to the terminal equipment within a first time period;
and sending the second rendering result to the terminal equipment within a second time period, wherein the second time period is a time period after the first time period.
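The ordering described in claim 8 can be sketched as follows, with rendering and sending stubbed out; the sketch transmits results in descending order of occurrence probability, the most probable one in the first time period.

```python
# Hypothetical sketch of claim 8: when N >= 2, render and send the rendering
# result of the most probable behavior data in the first time period, then the
# second most probable in a later time period, so the likeliest picture arrives
# earliest under limited bandwidth. Rendering and sending are stubbed.
from typing import Callable, List, Tuple


def send_by_probability(behaviors: List[Tuple[dict, float]],
                        render: Callable[[dict], bytes],
                        send: Callable[[bytes, int], None]) -> None:
    # Sort by occurrence probability, highest first.
    ordered = sorted(behaviors, key=lambda item: item[1], reverse=True)
    for period, (behavior, _probability) in enumerate(ordered, start=1):
        result = render(behavior)
        send(result, period)   # period 1 = first time period, 2 = second, ...


if __name__ == "__main__":
    behaviors = [({"key": "retreat"}, 0.3), ({"key": "attack"}, 0.6)]
    send_by_probability(
        behaviors,
        render=lambda b: f"<frame for {b['key']}>".encode(),
        send=lambda data, t: print(f"time period {t}: {data!r}"),
    )
```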
9. The display method according to claim 1, wherein after sending the N rendering results to the terminal device to enable the terminal device to determine a target rendering result from the N rendering results according to target behavior data, the method further comprises:
receiving a picture rendering response sent by the terminal equipment, wherein the picture rendering response comprises an identifier corresponding to the target rendering result;
determining the target rendering result from the N rendering results according to the picture rendering response;
deleting (N-1) rendering results from the N rendering results, wherein the (N-1) rendering results are rendering results left after the target rendering result is removed from the N rendering results.
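By way of illustration, the sketch below shows the cleanup of claim 9: after the picture rendering response identifies the target rendering result, the remaining (N-1) rendering results are deleted; the in-memory dictionary is an assumption standing in for the server's actual storage.

```python
# Hypothetical sketch of claim 9's cleanup: once the terminal reports which
# rendering result it displayed (by identifier), the server keeps only that
# result and deletes the other N-1 to free memory.
from typing import Dict


def handle_render_response(results: Dict[str, bytes], target_id: str) -> Dict[str, bytes]:
    """Keep the target rendering result, discard the remaining N-1 results."""
    target = results.get(target_id)
    results.clear()
    if target is not None:
        results[target_id] = target
    return results


if __name__ == "__main__":
    store = {"move_left": b"<A>", "move_right": b"<B>", "jump": b"<C>"}
    print(handle_render_response(store, "jump"))   # only the target remains
```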
10. A method for displaying a game screen, comprising:
sending game state data corresponding to a first moment to a cloud game server so that the cloud game server obtains N behavior data corresponding to a second moment according to the game state data, wherein the second moment occurs after the first moment, and N is an integer greater than or equal to 1;
receiving N rendering results sent by the cloud game server, wherein the N rendering results are obtained after the cloud game server performs rendering processing on each behavior data in the N behavior data, and the rendering results and the behavior data have a corresponding relation;
receiving target behavior data corresponding to the second moment, wherein the target behavior data represents data for controlling the behavior of the game character at the second moment;
if the N behavior data comprise the target behavior data, determining a target rendering result from the N rendering results, and displaying the target rendering result, wherein the target rendering result and the target behavior data have a corresponding relation, and the target rendering result comprises a game picture of a cloud game.
11. A game screen display device, comprising:
the acquisition module is used for acquiring game state data corresponding to the first moment;
the obtaining module is further configured to obtain N behavior data corresponding to a second time according to the game state data, where the second time occurs after the first time, and N is an integer greater than or equal to 1;
the rendering module is used for rendering each behavior data in the N behavior data to obtain N rendering results, wherein the rendering results and the behavior data have a corresponding relation;
a sending module, configured to send the N rendering results to the terminal device, so that the terminal device determines a target rendering result from the N rendering results according to target behavior data, and displays the target rendering result, where the target behavior data represents data for controlling a behavior of a game character at the second time, and the target rendering result includes a game screen of a cloud game.
12. A game screen display device, comprising:
a sending module, configured to send game state data corresponding to a first moment to a cloud game server, so that the cloud game server obtains N behavior data corresponding to a second moment according to the game state data, where the second moment occurs after the first moment, and N is an integer greater than or equal to 1;
a receiving module, configured to receive N rendering results sent by the cloud game server, where the N rendering results are obtained after the cloud game server performs rendering processing on each behavior data in the N behavior data, and the rendering results and the behavior data have a corresponding relationship;
the receiving module is further configured to receive target behavior data corresponding to the second time, where the target behavior data represents data for controlling a game character behavior at the second time;
a display module, configured to determine a target rendering result from the N rendering results and display the target rendering result if the N behavior data includes the target behavior data, where the target rendering result and the target behavior data have a corresponding relationship, and the target rendering result includes a game screen of a cloud game.
13. A cloud gaming server, comprising: a memory, a processor, and a bus system;
wherein the memory is used for storing programs;
the processor is configured to execute the program in the memory, and to perform the display method of any one of claims 1 to 9 according to instructions in the program code;
the bus system is used for connecting the memory and the processor so as to enable the memory and the processor to communicate.
14. A terminal device, comprising: a memory, a processor, and a bus system;
wherein the memory is used for storing programs;
the processor is configured to execute the program in the memory, and to perform the display method of claim 10 according to instructions in the program code;
the bus system is used for connecting the memory and the processor so as to enable the memory and the processor to communicate.
15. A computer-readable storage medium comprising instructions that, when executed on a computer, cause the computer to perform the display method of any one of claims 1 to 9, or perform the display method of claim 10.
GR01 Patent grant