CN111127576A - Game picture rendering method and device and electronic equipment - Google Patents


Info

Publication number
CN111127576A
CN111127576A (application CN201911315201.XA)
Authority
CN
China
Prior art keywords
rendering
texture
target
preset
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911315201.XA
Other languages
Chinese (zh)
Other versions
CN111127576B (en)
Inventor
罗树权
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Pixel Software Technology Co Ltd
Original Assignee
Beijing Pixel Software Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Pixel Software Technology Co Ltd
Priority to CN201911315201.XA
Publication of CN111127576A
Application granted
Publication of CN111127576B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/001: Texturing; Colouring; Generation of texture or colour
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a game picture rendering method and apparatus and an electronic device, relating to the technical field of games. The game picture rendering method comprises: extracting the color of the vegetation texture in an initial game picture; mapping the color of the vegetation texture onto a preset background texture to obtain a target texture; determining a target rendering strength based on a preset attenuation starting distance and a preset rendering distance; and interpolating in the initial game picture based on the target texture and the target rendering strength, then rendering to obtain a final game picture. The invention effectively improves the transition between vegetation and terrain and enhances the realism of the game picture.

Description

Game picture rendering method and device and electronic equipment
Technical Field
The present invention relates to the field of game technologies, and in particular, to a method and an apparatus for rendering a game screen, and an electronic device.
Background
In many games, a large amount of vegetation often needs to be rendered to enhance the realism of the game picture. Because plant leaves cannot be modeled in detail without a very large number of vertices, rendering them incurs a significant performance cost, so vegetation cannot be rendered over a large range of the game picture. Instead, the display distance of the vegetation is shortened so that vegetation only covers the area close to the camera. However, this degrades the rendering of the whole game picture: where the vegetation cuts off, bare ground is exposed and the transition looks unnatural.
No effective solution has yet been proposed for the problem of unnatural transitions where the vegetation effect ends in the game picture.
Disclosure of Invention
The object of the invention is to provide a game picture rendering method and apparatus and an electronic device that can effectively improve the transition between vegetation and terrain and enhance the realism of the game picture.
In a first aspect, an embodiment provides a game picture rendering method, including: extracting the color of the vegetation texture in an initial game picture; mapping the color of the vegetation texture onto a preset background texture to obtain a target texture; determining a target rendering strength based on a preset attenuation starting distance and a preset rendering distance; and interpolating in the initial game picture based on the target texture and the target rendering strength, and rendering to obtain a final game picture.
In an alternative embodiment, the step of extracting the color of the vegetation texture in the initial game picture includes: extracting the color of the plant leaves in the initial vegetation texture of the initial game picture; and down-sampling the extracted leaf colors of the vegetation texture to obtain a first rendering color with a resolution of 1 × 1.
In an alternative embodiment, the step of mapping the color of the vegetation texture onto a preset background texture to obtain a target texture includes: determining a second rendering color of each plant in the preset background texture based on the first rendering color and a preset vegetation vertex color, there being at least one plant; determining the rendering position of each plant in the preset background texture; and filling the second rendering color of each plant into the corresponding rendering position to obtain the target texture.
In an alternative embodiment, the step of determining the rendering position of each plant in the preset background texture includes: obtaining the axis-aligned bounding box of the vegetation template, and determining a projection radius based on the maximum coordinate and the minimum coordinate of the bounding box projected onto the horizontal plane; obtaining the first rendering position of each plant in the target texture; determining the rendering range of each plant in the target texture based on the projection radius and a scaling factor; and determining the target rendering position of each plant in the preset background texture based on the first rendering position and the rendering range.
In an alternative embodiment, the method further includes: performing a preset blurring operation on the target texture.
In an alternative embodiment, the step of determining the target rendering strength based on the preset attenuation starting distance and the preset rendering distance includes: determining the distance between the rendering position of the final game picture and the camera as a rendering distance l; and determining a target rendering strength f based on the rendering distance l, the preset attenuation starting distance l_f and the preset rendering distance l_r according to a formula that is given as an image in the original publication.
In an alternative embodiment, the method further comprises: judging whether the terrain is covered by vegetation; if so, setting a coverage coefficient to 1; if not, setting the coverage coefficient to 0. The step of interpolating in the initial game picture based on the target texture and the target rendering strength and rendering to obtain the final game picture includes: obtaining the terrain vertex colors of the initial game picture between the preset attenuation starting distance l_f and the preset rendering distance l_r; interpolating according to the target texture, the terrain vertex color, the target rendering strength f and the coverage coefficient using a preset interpolation operation to obtain the vegetation-rendered vertex color, where the preset interpolation operation is C = lerp(C_t, C_e, f × c), C_t is the terrain vertex color, C_e is the color of the target texture, c is the coverage coefficient, and C is the vegetation-rendered vertex color; and rendering based on the vegetation-rendered vertex color to obtain the final game picture.
In a second aspect, an embodiment provides an apparatus for rendering a game screen, the apparatus including: the color extraction module is used for extracting the color of the vegetation texture in the initial game picture; the color mapping module is used for mapping the colors of the vegetation textures to preset background textures to obtain target textures; the rendering intensity determining module is used for determining the target rendering intensity based on the preset attenuation starting distance and the preset rendering distance; and the rendering module is used for interpolating in the initial game picture based on the target texture and the target rendering strength and rendering to obtain a final game picture.
In a third aspect, an embodiment provides an electronic device, including a processor and a memory; the memory has stored thereon a computer program which, when executed by the processor, performs the method according to any of the preceding embodiments.
In a fourth aspect, embodiments provide a computer readable storage medium for storing computer software instructions for a method according to any one of the preceding embodiments.
The game picture rendering method first extracts the color of the vegetation texture in an initial game picture, maps the vegetation texture color onto a preset background texture to obtain a target texture, determines a target rendering strength based on a preset attenuation starting distance and a preset rendering distance, and then interpolates in the initial game picture based on the target texture and the target rendering strength to render the final game picture. Because the vegetation color extracted from the initial game picture is mapped into a target texture, and that target texture is interpolated between the preset attenuation starting distance and the preset rendering distance according to the target rendering strength, the vegetation effect remains visible far from the camera (i.e., between the preset attenuation starting distance and the preset rendering distance), so the vegetation and the terrain join realistically and the realism of the game picture improves. The embodiment of the invention therefore effectively improves the transition between vegetation and terrain and enhances the realism of the game picture.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic flowchart of a method for rendering a game screen according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating the result of a target texture according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating the result of another target texture provided by an embodiment of the present invention;
fig. 4 is a schematic diagram of the positional relationship between the attenuation starting distance and the preset rendering distance according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a game screen rendering apparatus according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present invention, it should be noted that the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
Furthermore, the term "horizontal" merely means that its orientation is more horizontal than "vertical" and does not mean that the structure must be perfectly horizontal, but may be slightly inclined.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Considering that existing game pictures show an unnatural transition where the vegetation disappears, and that in real nature, as one moves farther from vegetation-covered ground, details such as the plant structure and the veins of the leaves gradually become invisible until only terrain covered by a patch of color can be seen, the present invention provides a game picture rendering method and apparatus and an electronic device that can effectively improve the transition between vegetation and terrain and enhance the realism of the game picture.
For convenience of understanding, first, a method for rendering a game screen according to an embodiment of the present invention is described in detail, referring to a flowchart of a method for rendering a game screen shown in fig. 1, where the method mainly includes the following steps S102 to S108:
step S102: and extracting the color of the vegetation texture in the initial game picture.
In one embodiment, the initial game picture may be a game picture that has already been rendered. Because some game pictures look unnatural where the vegetation is far from the camera (i.e., where the vegetation disappears), vegetation still needs to be rendered on the part of the scene far from the camera to obtain a naturally connected, realistic picture. To do this, the color of the plant leaves in the initial vegetation texture of the initial game picture is extracted first, and the extracted leaf colors are down-sampled, for example with a box filter that halves the resolution each time, until a first rendering color (also called the texture color) with a resolution of 1 × 1 is obtained; this provides the base rendering color for rendering the distant part of the game picture.
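Purely as an illustration of this step (not code from the patent), the sketch below averages the leaf texels of an RGBA vegetation texture down to a single color, which is what repeated box-filter halving converges to; the alpha-threshold mask used to pick out the leaf texels is an assumption.

    import numpy as np

    def leaf_average_color(texture_rgba: np.ndarray, alpha_threshold: float = 0.5) -> np.ndarray:
        """Collapse an H x W x 4 vegetation texture (values in [0, 1]) to one RGB color.

        Texels whose alpha is below the threshold are treated as empty background and
        ignored; averaging the remaining leaf texels gives the same 1 x 1 result as
        repeatedly box-filtering them down by half.
        """
        rgb = texture_rgba[..., :3]
        leaf_mask = texture_rgba[..., 3] >= alpha_threshold  # assumed leaf/background split
        if not leaf_mask.any():
            return np.zeros(3)
        return rgb[leaf_mask].mean(axis=0)  # first rendering color (resolution 1 x 1)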
Step S104: and mapping the colors of the vegetation textures to preset background textures to obtain target textures.
In an embodiment, the preset background texture may be a texture whose size is set according to the size of the vegetation texture. It has three RGB channels whose color data is initialized to 0, its unit is the pixel, and its size may be set to w. The extracted vegetation color is color-processed and blurred to approximate how vegetation looks from far away in reality, and the processed vegetation color is mapped onto the preset background texture, yielding a target texture in which the vegetation-based color is filled in according to the position and extent of each plant.
Step S106: and determining the target rendering intensity based on the preset attenuation starting distance and the preset rendering distance.
In an embodiment, the preset attenuation starting distance may be a distance from a camera to a place where vegetation begins to disappear in an initial game picture, and the preset rendering distance may be a farthest distance that can be displayed in the picture.
Step S108: and based on the target texture and the target rendering strength, performing interpolation in the initial game picture, and rendering to obtain a final game picture.
The obtained target texture is rendered, at the target rendering strength, between the preset attenuation starting distance and the preset rendering distance by interpolation, which produces a realistic vegetation transition in the game picture.
The game picture rendering method provided by this embodiment first extracts the color of the vegetation texture in an initial game picture, maps the vegetation texture color onto a preset background texture to obtain a target texture, determines a target rendering strength based on a preset attenuation starting distance and a preset rendering distance, and then interpolates in the initial game picture based on the target texture and the target rendering strength to render the final game picture. Because the vegetation color extracted from the initial game picture is mapped into a target texture, and that target texture is interpolated between the preset attenuation starting distance and the preset rendering distance according to the target rendering strength, the vegetation effect remains visible far from the camera (i.e., between the preset attenuation starting distance and the preset rendering distance), so the vegetation and the terrain join realistically and the realism of the game picture improves. The embodiment of the invention therefore effectively improves the transition between vegetation and terrain and enhances the realism of the game picture.
In one embodiment, the game scene may be divided into small blocks of a fixed size; the vegetation data is organized per block and stored in a file, and one piece of vegetation data represents the vegetation covering a certain range of the terrain. To help understand step S104, the present invention provides a specific implementation of mapping the color of the vegetation texture onto a preset background texture to obtain a target texture, which mainly includes the following steps 1 to 3:
step 1, determining a second rendering color of each plant in a preset background texture based on the first rendering color and a preset vegetation vertex color. The plant is at least one plant; the preset vegetation vertex color is typically drawn by the designer at the time of editing, such as noting the first rendered color as CtColor of vegetation vertex CvThen the color of each plant in the background texture (i.e., the second rendered color) may be Cr=Cv×Ct
Step 2, determining the rendering position of each plant in the preset background texture. In practice, the vegetation in a game picture consists of a number of plants or clusters of plants, and each plant or cluster has its own color so that the result looks realistic; therefore, after the color needed to render each plant is determined, the rendering position of each plant is calculated so that a realistic rendering effect can be displayed.
Step 3, filling the second rendering color of each plant into the corresponding rendering position to obtain the target texture, as shown in the schematic diagram of a target texture in fig. 2.
In addition, because the texture calculated as above has relatively hard boundaries at the edges where the vegetation color is filled in, a preset blurring operation needs to be performed on the target texture in order to obtain a smooth and natural result. In this embodiment the target texture may be blurred by, for example, Gaussian blur; the Gaussian-blurred target texture is shown in fig. 3.
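The following sketch (an illustration only, with the disc-shaped fill, texture size and blur strength as assumptions) builds a zero-initialized RGB background texture, fills each plant's second rendering color at its rendering position, and then softens the hard edges with a Gaussian blur:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def build_target_texture(w: int, plants, blur_sigma: float = 2.0) -> np.ndarray:
        """plants: iterable of (t_x, t_y, r_v, color) with pixel coordinates, a pixel
        radius and an RGB second rendering color C_r."""
        target = np.zeros((w, w, 3), dtype=np.float32)  # background texture, RGB initialized to 0
        ys, xs = np.mgrid[0:w, 0:w]
        for t_x, t_y, r_v, color in plants:
            covered = (xs - t_x) ** 2 + (ys - t_y) ** 2 <= r_v ** 2  # assumed circular footprint
            target[covered] = color
        # Preset blurring operation: Gaussian-blur each channel to smooth the hard boundaries.
        for ch in range(3):
            target[..., ch] = gaussian_filter(target[..., ch], sigma=blur_sigma)
        return target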
The step of determining the rendering position of each plant in the preset background texture may further include the following steps 2.1 to 2.4 (illustrated by the code sketch after step 2.4):
step 2.1, obtaining an axis alignment bounding box of the vegetation template, and determining a projection radius r based on a maximum coordinate max (x, z) and a minimum coordinate min (x, z) of the axis alignment bounding box projected to a horizontal plane, wherein the projection radius r can be represented by a formula
Figure BDA0002324015210000081
And (4) performing representation.
Step 2.2, a first rendering position in the target texture of each plant is obtained. Since the position of the vegetation is a local coordinate value relative to the block in which it is located, it needs to be converted into a pixel coordinate on the target texture, such as can be calculated using the following formula, where h is the size of the vegetation texture, w is the size of the background texture, (t) is the size of the background texturex,ty) I.e., the pixel coordinates in the target texture, where floor is the operation of rounding down.
Figure BDA0002324015210000082
Figure BDA0002324015210000083
Step 2.3, determining the rendering range of each plant in the target texture based on the projection radius and the scaling factor, for example, the rendering range can apply a formula
Figure BDA0002324015210000091
A determination is made, where s is a scaling factor.
Step 2.4, determining the target rendering position of each plant in the preset background texture based on the first rendering position and the rendering range, namely based on the pixel coordinate (t) obtained by the calculationx,ty) And a rendering radius rvAnd determining the target rendering position of each plant in the preset background texture.
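Since the formulas of steps 2.1 to 2.3 appear only as images in the published text, the sketch below uses plausible reconstructions: the projection radius as half the diagonal of the projected bounding box, pixel coordinates as floor(x / h · w), and the rendering radius as floor(r · s / h · w). All three expressions are assumptions, not the patent's verbatim formulas.

    import math

    def projection_radius(max_xz, min_xz):
        """Assumed: half the diagonal of the AABB projected onto the horizontal plane (step 2.1)."""
        dx = max_xz[0] - min_xz[0]
        dz = max_xz[1] - min_xz[1]
        return 0.5 * math.hypot(dx, dz)

    def first_rendering_position(local_x, local_z, h, w):
        """Assumed mapping from block-local coordinates to pixel coordinates (t_x, t_y) (step 2.2)."""
        t_x = math.floor(local_x / h * w)
        t_y = math.floor(local_z / h * w)
        return t_x, t_y

    def rendering_radius(r, s, h, w):
        """Assumed: scale the projection radius r by the plant scale s, then convert to pixels (step 2.3)."""
        return math.floor(r * s / h * w)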
In one embodiment, after the target texture of the vegetation is obtained, it needs to be applied to the rendering of the ground of the game scene (i.e., the game picture) as a terrain vertex color. Real vegetation is rendered close to the camera, and the target texture should only start to appear where the vegetation is far away, so a rendering strength is determined from the preset attenuation starting distance l_f and the preset rendering distance l_r; their positional relationship is shown in fig. 4. The specific embodiment of determining the target rendering strength based on the preset attenuation starting distance and the preset rendering distance includes the following steps A and B:
Step A, determining the distance between the rendering position of the final game picture and the camera as a rendering distance l;
Step B, determining a target rendering strength f based on the positional relationship between the rendering distance l, the preset attenuation starting distance l_f and the preset rendering distance l_r, according to a formula that is given as an image in the original publication.
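The published formula for f is available only as an image. A reconstruction consistent with the surrounding description (f is 0 up to the attenuation starting distance and reaches full strength at the rendering distance) is a clamped linear ramp; this is an assumption, not the patent's verbatim formula:

    def target_rendering_strength(l: float, l_f: float, l_r: float) -> float:
        """Assumed clamped linear fade from 0 at l_f to 1 at l_r."""
        if l_r <= l_f:  # degenerate configuration: treat everything beyond l_r as fully faded in
            return 1.0 if l >= l_r else 0.0
        return max(0.0, min(1.0, (l - l_f) / (l_r - l_f)))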
In one embodiment, it is first judged whether the terrain is covered by vegetation; if so, a coverage coefficient is set to 1, otherwise it is set to 0, so that no vegetation is rendered on parts that were never covered and the realism of the game scene is preserved. The step of interpolating in the initial game picture based on the target texture and the target rendering strength and rendering to obtain the final game picture includes the following steps (1) to (3):
step (1), obtaining the initial attenuation starting distance l of the initial game picturefAnd a preset rendering distance lrThe top color of the terrain in between. In one embodiment, since different terrain vertex colors are set in different seasons, such as green in summer and yellow in autumn, in order to achieve realistic effects of the game picture, a more realistic game picture effect is obtained according to the different terrain vertex colors.
Step (2), interpolating according to the target texture, the terrain vertex color, the target rendering strength f and the coverage coefficient using a preset interpolation operation to obtain the vegetation-rendered vertex color (see the sketch after step (3)). The preset interpolation operation may be C = lerp(C_t, C_e, f × c), where C_t is the terrain vertex color, C_e is the color of the target texture, c is the coverage coefficient, and C is the vegetation-rendered vertex color.
Step (3), rendering based on the vegetation-rendered vertex color to obtain the final game picture. The vegetation-rendered vertex color C obtained by the above interpolation is filled in at each rendering position to obtain the final game picture.
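A minimal sketch of the blend in step (2), assuming lerp(a, b, t) = a + (b - a) · t applied per RGB channel; the example colors are made up for illustration:

    def lerp(a, b, t):
        """Component-wise linear interpolation: a + (b - a) * t."""
        return tuple(x + (y - x) * t for x, y in zip(a, b))

    def vegetation_rendered_vertex_color(c_t, c_e, f, c):
        """C = lerp(C_t, C_e, f * c): blend the terrain vertex color toward the target-texture color."""
        return lerp(c_t, c_e, f * c)

    # Example: halfway through the fade range (f = 0.5) on vegetation-covered terrain (c = 1).
    print(vegetation_rendered_vertex_color((0.35, 0.30, 0.20), (0.20, 0.45, 0.15), 0.5, 1))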
For the above rendering method of a game screen, an embodiment of the present invention provides a rendering apparatus of a game screen, referring to a schematic structural diagram of a rendering apparatus of a game screen shown in fig. 5, the apparatus mainly includes the following components:
a color extraction module 502, configured to extract colors of vegetation textures in an initial game picture;
a color mapping module 504, configured to map a color of a vegetation texture to a preset background texture to obtain a target texture;
a rendering strength determination module 506, configured to determine a target rendering strength based on the preset attenuation starting distance and the preset rendering distance;
and a rendering module 508, configured to interpolate in the initial game image based on the target texture and the target rendering strength, and render to obtain a final game image.
The game picture rendering device extracts the color of the vegetation texture in the initial game picture, maps the vegetation texture color onto a preset background texture to obtain a target texture, determines a target rendering strength based on the preset attenuation starting distance and the preset rendering distance, and then interpolates in the initial game picture based on the target texture and the target rendering strength to render the final game picture. Because the vegetation color extracted from the initial game picture is mapped into a target texture, and that target texture is interpolated between the preset attenuation starting distance and the preset rendering distance according to the target rendering strength, the vegetation effect remains visible far from the camera (i.e., between the preset attenuation starting distance and the preset rendering distance), so the vegetation and the terrain join realistically and the realism of the game picture improves. The embodiment of the invention therefore effectively improves the transition between vegetation and terrain and enhances the realism of the game picture.
In an embodiment, the color extraction module 502 is further configured to extract the color of the plant leaves in the initial vegetation texture of the initial game picture, and to down-sample the extracted leaf colors of the vegetation texture to obtain a first rendering color with a resolution of 1 × 1.
In an embodiment, the color mapping module 504 is further configured to determine a second rendering color of each plant in a preset background texture based on the first rendering color and a preset vegetation vertex color, there being at least one plant; to determine the rendering position of each plant in the preset background texture; and to fill the second rendering color of each plant into the corresponding rendering position to obtain the target texture.
In one embodiment, the above apparatus further comprises a rendering position determining module configured to obtain the axis-aligned bounding box of the vegetation template and determine a projection radius based on the maximum coordinate and the minimum coordinate of the bounding box projected onto the horizontal plane; to obtain the first rendering position of each plant in the target texture; to determine the rendering range of each plant in the target texture based on the projection radius and a scaling factor; and to determine the target rendering position of each plant in the preset background texture based on the first rendering position and the rendering range.
In an embodiment, the apparatus further includes a blurring module, configured to perform a predetermined blurring operation on the target texture.
In an embodiment, the rendering strength determining module 506 is further configured to determine the distance between the rendering position of the final game picture and the camera as a rendering distance l, and to determine the target rendering strength f based on the rendering distance l, the preset attenuation starting distance l_f and the preset rendering distance l_r according to a formula that is given as an image in the original publication.
In one embodiment, the above apparatus further comprises a judging module configured to judge whether the terrain is covered by vegetation, and to set a coverage coefficient to 1 if so and to 0 if not. The rendering module 508 is further configured to: obtain the terrain vertex colors of the initial game picture between the preset attenuation starting distance l_f and the preset rendering distance l_r; interpolate according to the target texture, the terrain vertex color, the target rendering strength f and the coverage coefficient using a preset interpolation operation to obtain the vegetation-rendered vertex color, where the preset interpolation operation is C = lerp(C_t, C_e, f × c), C_t is the terrain vertex color, C_e is the color of the target texture, c is the coverage coefficient, and C is the vegetation-rendered vertex color; and render based on the vegetation-rendered vertex color to obtain the final game picture.
The device provided by the embodiment of the present invention has the same implementation principle and technical effects as the foregoing method embodiments; for brevity, where the device embodiment does not mention a detail, reference may be made to the corresponding content in the method embodiments.
The device is an electronic device, and particularly, the electronic device comprises a processor and a storage device; the storage means has stored thereon a computer program which, when executed by the processor, performs the method of any of the above described embodiments.
Fig. 6 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present invention, where the electronic device 100 includes: a processor 60, a memory 61, a bus 62 and a communication interface 63, wherein the processor 60, the communication interface 63 and the memory 61 are connected through the bus 62; the processor 60 is arranged to execute executable modules, such as computer programs, stored in the memory 61.
The memory 61 may include a high-speed Random Access Memory (RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory. The communication connection between the network element of the system and at least one other network element is realized through at least one communication interface 63 (which may be wired or wireless), and the internet, a wide area network, a local network, a metropolitan area network, and the like can be used.
The bus 62 may be an ISA bus, PCI bus, EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one double-headed arrow is shown in FIG. 6, but that does not indicate only one bus or one type of bus.
The memory 61 is used for storing a program, the processor 60 executes the program after receiving an execution instruction, and the method executed by the apparatus defined by the flow process disclosed in any of the foregoing embodiments of the present invention may be applied to the processor 60, or implemented by the processor 60.
The processor 60 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 60. The processor 60 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the various methods, steps and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in RAM, flash memory, ROM, PROM or EPROM, registers, or other storage media well known in the art. The storage medium is located in the memory 61, and the processor 60 reads the information in the memory 61 and, in combination with its hardware, performs the steps of the above method.
The computer program product of the game picture rendering method and apparatus and the electronic device provided in the embodiments of the present invention includes a computer-readable storage medium storing non-volatile program code executable by a processor; when the computer program stored on the computer-readable storage medium is executed by the processor, the method described in the foregoing method embodiments is performed.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the system described above may refer to the corresponding process in the foregoing embodiments, and is not described herein again.
The computer program product of the readable storage medium provided in the embodiment of the present invention includes a computer readable storage medium storing a program code, where instructions included in the program code may be used to execute the method described in the foregoing method embodiment, and specific implementation may refer to the method embodiment, which is not described herein again.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. A rendering method of a game screen is characterized by comprising the following steps:
extracting the color of vegetation texture in the initial game picture;
mapping the colors of the vegetation textures to preset background textures to obtain target textures;
determining target rendering strength based on a preset attenuation starting distance and a preset rendering distance;
and based on the target texture and the target rendering strength, performing interpolation in the initial game picture, and rendering to obtain a final game picture.
2. The method of claim 1, wherein the step of extracting the color of the vegetation texture in the initial game frame comprises:
extracting the color of plant leaves in the initial vegetation texture in the initial game picture;
and performing down-sampling on the extracted colors of the plant leaves in the vegetation texture to obtain a first rendering color with the resolution of 1 x 1.
3. The method of claim 2, wherein the step of mapping the color of the vegetation texture to a preset background texture to obtain a target texture comprises:
determining a second rendering color of each plant in the preset background texture based on the first rendering color and a preset vegetation vertex color; the plant is at least one;
determining a rendering position of each plant in the preset background texture;
and filling the second rendering color of each plant into the corresponding rendering position to obtain a target texture.
4. The method according to claim 3, wherein the step of determining a rendering position of each of the plants in the preset background texture comprises:
obtaining an axis alignment bounding box of the vegetation template, and determining a projection radius based on the maximum coordinate and the minimum coordinate of the axis alignment bounding box projected to a horizontal plane;
acquiring a first rendering position of each plant in the target texture;
determining a rendering range of each of the plants in the target texture based on the projection radius and a scaling factor;
and determining a target rendering position of each plant in the preset background texture based on the first rendering position and the rendering range.
5. The method of claim 3, further comprising: performing a preset blurring operation on the target texture.
6. The method of claim 1, wherein the step of determining the target rendering strength based on the preset attenuation start distance and the preset rendering distance comprises:
determining the distance between the rendering position of the final game picture and the camera as a rendering distance l;
determining a target rendering strength f based on the rendering distance l, the preset attenuation starting distance l_f and the preset rendering distance l_r according to a formula that is given as an image in the original publication.
7. The method of claim 6, further comprising: judging whether the terrain has vegetation coverage, and if so, setting a coverage coefficient to be 1; if not, setting the coverage coefficient to 0;
the step of interpolating and rendering the initial game picture to obtain a final game picture based on the target texture and the target rendering strength includes:
obtaining the terrain vertex colors of the initial game picture between the preset attenuation starting distance l_f and the preset rendering distance l_r;
interpolating according to the target texture, the terrain vertex color, the target rendering strength f and the coverage coefficient using a preset interpolation operation to obtain a vegetation-rendered vertex color; the preset interpolation operation is C = lerp(C_t, C_e, f × c), wherein C_t is the terrain vertex color, C_e is the color of the target texture, c is the coverage coefficient, and C is the vegetation-rendered vertex color;
and rendering based on the vegetation-rendered vertex color to obtain the final game picture.
8. An apparatus for rendering a game screen, the apparatus comprising:
the color extraction module is used for extracting the color of the vegetation texture in the initial game picture;
the color mapping module is used for mapping the colors of the vegetation textures to preset background textures to obtain target textures;
the rendering intensity determining module is used for determining the target rendering intensity based on the preset attenuation starting distance and the preset rendering distance;
and the rendering module is used for interpolating in the initial game picture based on the target texture and the target rendering strength and rendering to obtain a final game picture.
9. An electronic device comprising a processor and a memory;
the memory has stored thereon a computer program which, when executed by the processor, performs the method of any of claims 1 to 6.
10. A computer readable storage medium for storing computer software instructions for use in the method of any one of claims 1 to 6.
CN201911315201.XA 2019-12-18 2019-12-18 Game picture rendering method and device and electronic equipment Active CN111127576B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911315201.XA CN111127576B (en) 2019-12-18 2019-12-18 Game picture rendering method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911315201.XA CN111127576B (en) 2019-12-18 2019-12-18 Game picture rendering method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN111127576A true CN111127576A (en) 2020-05-08
CN111127576B CN111127576B (en) 2023-11-17

Family

ID=70500101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911315201.XA Active CN111127576B (en) 2019-12-18 2019-12-18 Game picture rendering method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111127576B (en)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040263512A1 (en) * 2002-03-11 2004-12-30 Microsoft Corporation Efficient scenery object rendering
WO2019015591A1 (en) * 2017-07-21 2019-01-24 腾讯科技(深圳)有限公司 Method for rendering game, and method, apparatus and device for generating game resource file
CN107952241A (en) * 2017-12-05 2018-04-24 北京像素软件科技股份有限公司 Render control method, device and readable storage medium storing program for executing
CN110115841A (en) * 2019-05-10 2019-08-13 网易(杭州)网络有限公司 The rendering method and device of vegetation object in a kind of scene of game

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
汪丽萍; 何火娇: "Research progress on plant leaf rendering methods" (植物叶片渲染方法研究进展), no. 08 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111760290A (en) * 2020-06-11 2020-10-13 网易(杭州)网络有限公司 Information processing method and device, computer equipment and storage medium
CN111798554A (en) * 2020-07-24 2020-10-20 上海米哈游天命科技有限公司 Rendering parameter determination method, device, equipment and storage medium
CN112206528A (en) * 2020-10-12 2021-01-12 网易(杭州)网络有限公司 Vegetation model rendering method, device, equipment and storage medium
CN112206528B (en) * 2020-10-12 2024-03-01 网易(杭州)网络有限公司 Vegetation model rendering method, device, equipment and storage medium
CN112215968A (en) * 2020-10-29 2021-01-12 网易(杭州)网络有限公司 Model paste processing method and device, storage medium and electronic equipment
CN112669425A (en) * 2020-12-23 2021-04-16 北京像素软件科技股份有限公司 Hair rendering method, hair rendering device, electronic equipment and readable storage medium
CN112807685A (en) * 2021-01-22 2021-05-18 珠海天燕科技有限公司 Grassland rendering method, grassland rendering device and grassland rendering equipment based on game role track
CN113450443A (en) * 2021-07-08 2021-09-28 网易(杭州)网络有限公司 Rendering method and device of sea surface model
CN113450443B (en) * 2021-07-08 2023-03-24 网易(杭州)网络有限公司 Rendering method and device of sea surface model

Also Published As

Publication number Publication date
CN111127576B (en) 2023-11-17


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant