CN117475053A - Grassland rendering method and device


Info

Publication number: CN117475053A
Application number: CN202311172742.8A
Authority: CN (China)
Legal status: Pending
Prior art keywords: rendering, grass, unit area, determining, area
Other languages: Chinese (zh)
Inventor: 刘立
Current Assignee: Guangzhou Yiju Future Network Technology Co ltd
Original Assignee: Guangzhou Yiju Future Network Technology Co ltd
Application filed by Guangzhou Yiju Future Network Technology Co ltd
Priority to CN202311172742.8A
Publication of CN117475053A

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 15/00 3D [Three Dimensional] image rendering
                    • G06T 15/005 General purpose rendering architectures
                • G06T 11/00 2D [Two Dimensional] image generation
                    • G06T 11/001 Texturing; Colouring; Generation of texture or colour
    • A HUMAN NECESSITIES
        • A63 SPORTS; GAMES; AMUSEMENTS
            • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
                • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
                    • A63F 13/50 Controlling the output signals based on the game progress
                        • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene

Landscapes

  • Engineering & Computer Science
  • Physics & Mathematics
  • General Physics & Mathematics
  • Theoretical Computer Science
  • Computer Graphics
  • Multimedia
  • Image Generation

Abstract

The application discloses a grassland rendering method and device. A rendering center in a rendering area is acquired; a grass blade is determined based on the rendering distance between a unit area in the rendering area and the rendering center, the grass blade including at least one vertex and the number of vertices in the grass blade being inversely related to the rendering distance; the grass blade is then inserted into the unit area. The method reduces the risk of excessive processing overhead caused by every inserted grass blade having too many vertices, and also reduces the risk of a poor rendering result caused by every inserted grass blade having too few vertices. It therefore balances rendering quality against the processing overhead of rendering, improves the performance and resource utilization of the computer without sacrificing rendering quality as far as possible, and at the same time increases the density of the grass, making the virtual scene more vivid and detailed.

Description

Grassland rendering method and device
Technical Field
The application relates to the technical field of computers, in particular to a grassland rendering method and device.
Background
To render the effect of grass on terrain, an area to be covered with grass is typically selected manually within the rendering region, and grass blades are inserted into the selected area. Each grass blade is a polygonal mesh with a fixed number of vertices: for example, every inserted grass blade is either a triangle with three vertices, or a quad ("diamond") built from four vertices and two triangles.
With this approach, if every inserted grass blade is a three-vertex triangle, the rendering quality is poor and it is difficult to reproduce the shape of grass realistically; if every inserted grass blade is a four-vertex quad, the hardware responsible for rendering in the computing device has to process an excessive number of vertices, so the processing overhead of rendering becomes too high.
Therefore, when rendering grass scenes, how to balance rendering quality against the processing overhead of rendering to some extent still needs further study.
Disclosure of Invention
The application provides a grassland rendering method and device that can, to a certain extent, balance rendering quality and the processing overhead of rendering. The technical scheme is as follows.
In a first aspect, there is provided a method of grass rendering, the method comprising:
acquiring a rendering center in a rendering area;
determining a grass blade based on a rendering distance between a unit area in the rendering area and the rendering center, the grass blade comprising at least one vertex, the number of vertices in the grass blade being inversely related to the rendering distance;
inserting the grass blade into the unit area.
In one possible implementation, the grass blades include a first grass blade and a second grass blade, the number of vertices in the first grass blade being greater than the number of vertices in the second grass blade, the determining a grass blade based on a rendering distance between a unit area in the rendering area and the rendering center, comprising:
in response to determining that the rendering distance is less than or equal to a distance threshold, determining the first grass blade; or,
in response to determining that the rendering distance is greater than or equal to the distance threshold, determining the second grass blade.
In one possible implementation, the determining a grass blade based on a rendering distance between a unit area in the rendering area and the rendering center includes:
determining a grass blade based on the rendering distance and a first correspondence, where the first correspondence indicates a correspondence between the rendering distance and a first number, and the number of vertices in the grass blade is the first number.
In one possible implementation, the inserting the grass blades into the unit area includes:
determining the number of grass blades based on the rendering data within the unit area;
inserting that number of grass blades into the unit area.
In one possible implementation, the rendering data includes a coverage ratio of the grass texture within the unit area, and the determining the number of grass blades based on the rendering data of the unit area includes:
determining the number of grass blades based on the coverage ratio of the grass texture within the unit area, the number of grass blades being positively correlated with that coverage ratio.
In one possible implementation, the rendering data includes color data of pixels in the unit area, and the determining the number of the grass blades based on the rendering data in the unit area includes:
and determining the number of the grass blades based on the ratio between the color data of the pixel points and the grass color threshold.
In one possible implementation, the determining the number of the grass blades based on a ratio between the color data of the pixel point and a grass color threshold includes:
determining the number of the grass blades based on the ratio between the average of the color data of all pixel points in the unit area and the grass color threshold; or,
determining the number of the grass blades based on the ratio between the color data of the central pixel point in the unit area and the grass color threshold; or,
determining the number of the grass blades based on the ratio between the average of the color data of the pixel points at the corner points of the unit area and the grass color threshold.
In one possible implementation, before the determining of the grass blade based on the rendering distance between the unit area in the rendering area and the rendering center, the method further includes:
determining that the rendered data for the unit area satisfies a grass texture feature.
In one possible implementation manner, before the acquiring the rendering center in the rendering area, the method further includes:
determining a rendering area based on the position of the virtual object and a preset radius, where the distance between the boundary of the rendering area and the position of the virtual object is the preset radius.
In one possible implementation manner, the acquiring a rendering center in the rendering area includes:
and acquiring the position of the virtual object as a rendering center.
In a second aspect, there is provided a lawn rendering apparatus, the apparatus comprising:
The acquisition module is used for acquiring a rendering center in the rendering area;
a determining module for determining a grass blade based on a rendering distance between a unit area in the rendering area and the rendering center, the grass blade comprising at least one vertex, the number of vertices in the grass blade being inversely related to the rendering distance;
and the rendering module is used for inserting the grass blade into the unit area.
In one possible implementation, the grass blades include a first grass blade and a second grass blade, the number of vertices in the first grass blade being greater than the number of vertices in the second grass blade, the determining module to determine the first grass blade in response to determining that the rendering distance is less than or equal to a distance threshold; alternatively, the second grass blade is determined in response to determining that the rendering distance is greater than or equal to a distance threshold.
In one possible implementation manner, the determining module is configured to determine a grass blade based on the rendering distance and a first correspondence, where the first correspondence indicates a correspondence between the rendering distance and a first number, and the number of vertices in the grass blade is the first number.
In one possible implementation, the determining module is configured to determine the number of the grass blades based on the rendering data in the unit area; the rendering module is configured to insert that number of grass blades into the unit area.
In one possible implementation, the rendering data includes the coverage ratio of the grass texture within the unit area, and the determining module is configured to determine the number of grass blades based on that coverage ratio, the number of grass blades being positively correlated with the coverage ratio of the grass texture.
In one possible implementation, the rendering data includes color data of pixels in the unit area, and the determining module is configured to determine the number of the grass blades based on a ratio between the color data of the pixels and a grass color threshold.
In one possible implementation manner, the determining module is configured to determine the number of grass blades based on the ratio between the average of the color data of all pixel points in the unit area and the grass color threshold; or based on the ratio between the color data of the central pixel point in the unit area and the grass color threshold; or based on the ratio between the average of the color data of the pixel points at the corner points of the unit area and the grass color threshold.
In one possible implementation, the determining module is configured to determine that the rendering data of the unit area satisfies a grassland texture feature.
In one possible implementation manner, the determining module is further configured to determine a rendering area based on the position of the virtual object and a preset radius, where the distance between the boundary of the rendering area and the position of the virtual object is the preset radius.
In one possible implementation manner, the obtaining module is configured to obtain, as a rendering center, a location where the virtual object is located.
In a third aspect, a server is provided, the server comprising: a processor coupled to a memory having stored therein at least one computer program instruction that is loaded and executed by the processor to cause the server to implement the method of the first aspect or any of the alternatives of the first aspect.
In a fourth aspect, there is provided a computer readable storage medium having stored therein at least one instruction which when executed on a computer causes the computer to perform the method of the first aspect or any of the alternatives of the first aspect.
In a fifth aspect, there is provided a computer program product comprising one or more computer program instructions which, when loaded and run by a computer, cause the computer to carry out the method of the first aspect or any of the alternatives of the first aspect.
In a sixth aspect, there is provided a chip comprising programmable logic circuitry and/or program instructions for implementing the method of the first aspect or any of the alternatives of the first aspect, when the chip is run.
In a seventh aspect, a server cluster is provided, where the server cluster includes a first server and a second server, and the first server and the second server are configured to cooperatively implement a method according to the first aspect or any of the alternatives of the first aspect.
The embodiments of the present application therefore provide the following beneficial effects:
Because the number of vertices in a grass blade is inversely related to the rendering distance, unit areas far from the rendering center receive grass blades with fewer vertices while unit areas near the rendering center receive grass blades with more vertices. This reduces the risk of excessive processing overhead caused by every inserted grass blade having too many vertices, and also reduces the risk of a poor rendering result caused by every inserted grass blade having too few vertices. Rendering quality and processing overhead are therefore both taken into account: the performance and resource utilization of the computer are improved without sacrificing rendering quality as far as possible, and the density of the grass is increased at the same time, making the virtual scene more vivid and detailed.
Drawings
FIG. 1 is a flow chart of a grassland rendering method provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of terrain mesh data provided by an embodiment of the present application;
fig. 3 is a schematic structural diagram of a lawn rendering device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a grassland rendering method according to an embodiment of the present application. The method shown in fig. 1 includes the following steps S110 to S130.
Step S110, a rendering center in the rendering area is acquired.
The rendering area may be a sub-area in the terrain mesh area. One terrain mesh region includes one or more rendering regions. In one possible implementation, the terrain mesh area is partitioned according to a preset rule to obtain one or more rendering areas.
In one possible way of determining the rendering regions, the size of a rendering region is determined first; the number of rendering regions to be divided is then derived from the total size of the terrain mesh area and the size of a rendering region; and the complete terrain mesh area is divided according to the determined size and number. The terrain mesh area may be divided into rendering regions of equal size, or it may be divided unevenly according to specific needs and terrain features so as to better fit different parts of the terrain. For example, where the terrain changes sharply or contains more detail, the rendering regions may be made smaller so that the terrain details are depicted more finely; where the terrain changes gently or contains little detail, the rendering regions may be made larger to reduce the amount of rendering computation.
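For illustration, the uniform partitioning step described above can be sketched in Python as follows; the RenderRegion structure, the function name and the 64-unit tile size are illustrative assumptions rather than values taken from the embodiment.

```python
from dataclasses import dataclass

@dataclass
class RenderRegion:
    # Axis-aligned square tile of the terrain mesh area, in terrain units.
    x0: float
    z0: float
    size: float

def partition_terrain(terrain_width: float, terrain_depth: float,
                      region_size: float) -> list[RenderRegion]:
    """Evenly divide a terrain mesh area into square rendering regions."""
    nx = max(1, round(terrain_width / region_size))
    nz = max(1, round(terrain_depth / region_size))
    return [RenderRegion(ix * region_size, iz * region_size, region_size)
            for ix in range(nx) for iz in range(nz)]

# Example: a 1024 x 1024 terrain split into 64 x 64 tiles gives 256 regions.
regions = partition_terrain(1024.0, 1024.0, 64.0)
```

An uneven split would simply vary the region size per tile according to the local amount of terrain detail.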
Dividing the terrain mesh area into one or more rendering regions splits a complex terrain rendering task (rendering the entire terrain mesh area) into multiple smaller rendering tasks (rendering one region each). This makes parallel processing possible, for example rendering several regions in parallel, and thus improves rendering performance and efficiency. Rendering every region in turn then yields the rendering of the whole terrain mesh area.
In another possible way of determining the rendering area, the rendering area is determined from the terrain mesh area as needed. For example, a rendering area is determined from the terrain mesh area based on the position of a virtual object in the terrain mesh area and a preset radius, where the distance between the boundary of the rendering area and the position of the virtual object is the preset radius. Optionally, when N virtual objects exist in the terrain mesh area, N rendering areas are determined based on the N virtual objects, the i-th of the N rendering areas containing the i-th virtual object, where N is a positive integer greater than or equal to 1 and i is a positive integer less than or equal to N.
The virtual object is, for example, an object in a game, such as a virtual character (a game character). Game characters have models, animations and interactive behaviours; they can move freely in the game world and interact with other objects. The virtual object may also be a virtual prop, for example a collectible item, a weapon, a piece of equipment or another buff item. Alternatively, the virtual object is an enemy or monster designed to fight or otherwise interact with the player; an environmental object in the game scene such as a virtual tree, virtual building or virtual rock; an NPC (non-player character), that is, a virtual object controlled by the game program, such as a merchant, resident or other character that offers tasks, conversations or purchases and with which the user can interact to obtain tasks, information or items; or a game task target.
The position of the virtual object is, for example, its two-dimensional or three-dimensional coordinates. As one possible way of determining the rendering area from that position, the position of the virtual object is taken as the center of a sphere, a radius is set, and every pixel point whose distance from the sphere center is less than or equal to the radius is selected as a pixel point of the rendering area; the distance between a pixel point and the sphere center can be computed as the Euclidean distance. For example, if the virtual object is located at (x, y, z) and the preset radius is r, the rendering area is the spherical region of radius r centered on the virtual object's position. As another possibility, the position of the virtual object is taken as the center of a rectangle with a given width and length, and whether a pixel point belongs to the rendering area is decided by comparing its lateral and longitudinal distances from the virtual object with those dimensions. As yet another possibility, the position of the virtual object is taken as the center of a circle with a set radius, and pixel points whose distance from the circle center is less than or equal to the radius are selected as pixel points of the rendering area.
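As a minimal sketch of the spherical and rectangular selection rules above (the function names and the coordinate convention are assumptions made here for clarity):

```python
import math

def in_spherical_render_area(point: tuple[float, float, float],
                             object_pos: tuple[float, float, float],
                             radius: float) -> bool:
    """A point belongs to the rendering area when its Euclidean distance to
    the virtual object's position is less than or equal to the preset radius."""
    dx, dy, dz = (p - o for p, o in zip(point, object_pos))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= radius

def in_rectangular_render_area(point: tuple[float, float, float],
                               object_pos: tuple[float, float, float],
                               half_width: float, half_length: float) -> bool:
    """Rectangular variant: compare the lateral (x) and longitudinal (z)
    offsets from the virtual object with the rectangle's half extents."""
    return (abs(point[0] - object_pos[0]) <= half_width and
            abs(point[2] - object_pos[2]) <= half_length)
```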
Because the rendering area is limited to the region around the virtual object, regions far from the virtual object do not need to be rendered, unlike rendering the whole terrain mesh area. This reduces the amount of rendering computation on the processor, saves the memory occupied by the rendering process and lowers the hardware requirements of rendering. In addition, rendering the region around the virtual character with a grass texture makes it contrast sharply with other regions; for example, the green colour and rich detail of the grass texture make the virtual character stand out, so the player's attention is drawn to the character and its presence in the game scene is strengthened.
In one possible implementation, the position of the virtual object within the rendering area is obtained and used as the rendering center. When the rendering area was determined from the position of the virtual object, or the terrain mesh area was divided into rendering areas using that position, the previously obtained position can be reused directly when the rendering center is determined, so the position does not have to be acquired again, which reduces the amount of data processing and improves efficiency.
Step S120, determining a grass blade based on a rendering distance between the unit area and the rendering center in the rendering area.
A grass blade is a small sheet-like element used to represent grass; it stands for a small clump or leaf of grass in a real lawn and usually takes the form of a planar or near-planar geometry. A grass blade includes at least one vertex. For example, a grass blade may have four vertices and take a diamond or rectangular shape, or it may have three vertices and take a triangular shape. Three and four vertices are only examples: a grass blade may have five vertices and an irregular polygonal shape, or six vertices and a convex hexagonal shape. A grass blade may also be a 3D model.
In one possible way of obtaining a grass blade, the number of vertices of the grass blade is determined, a geometry with that number of vertices is created, and the grass map is applied to the geometry to obtain the grass blade, where the grass map carries characteristics of the grass such as its colour, texture and transparency.
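A minimal sketch of creating a blade geometry with a chosen vertex count, assuming a flat blade in a local 2D plane and UV coordinates through which the grass map would later be sampled (the dimensions and the function name are illustrative assumptions):

```python
def make_grass_blade(vertex_count: int, width: float = 0.1, height: float = 0.5):
    """Return (vertices, triangle indices, uvs) for a flat grass blade.

    Three vertices give a single triangle; four give a quad made of two
    triangles, matching the two blade shapes discussed above.
    """
    if vertex_count == 3:
        vertices = [(-width / 2, 0.0), (width / 2, 0.0), (0.0, height)]
        indices = [(0, 1, 2)]
        uvs = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]
    elif vertex_count == 4:
        vertices = [(-width / 2, 0.0), (width / 2, 0.0),
                    (width / 2, height), (-width / 2, height)]
        indices = [(0, 1, 2), (0, 2, 3)]
        uvs = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
    else:
        raise ValueError("this sketch only handles 3- or 4-vertex blades")
    # The grass map (color, texture, transparency) is applied by sampling it
    # through the returned UV coordinates when the blade is shaded.
    return vertices, indices, uvs
```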
In general, the more vertices a grass blade has, the richer the geometric detail, the smoother the surfaces and the finer the texture mapping it can provide, so the precision and the rendering quality are higher; but more vertices also mean more vertex computation and rendering work and therefore higher hardware requirements. Conversely, the fewer vertices a grass blade has, the simpler or coarser the grass looks and the lower the precision, which reduces rendering quality, but it also reduces vertex computation and rendering work and therefore the hardware requirements.
For example, four vertices forming a quad of two triangles make the grass blade look closer to real grass, but each blade then requires four vertices to be rendered, which demands more from the hardware. By contrast, a three-vertex triangular blade looks slightly less realistic but lowers the hardware requirements.
To balance rendering quality against the processing overhead of rendering, the grass blade to be inserted into a unit area can be determined from the rendering distance between that unit area and the rendering center, with the number of vertices in the grass blade inversely related to that distance. In other words, the closer a unit area is to the rendering center, the more vertices the grass blades inserted into it have, so the rendering looks more realistic; the farther a unit area is from the rendering center, the fewer vertices its grass blades have, which reduces the processing overhead of rendering and lowers the hardware requirements. In particular, when the position of the virtual character is used as the rendering center, grass far from the character appears with less detail and fewer vertices while grass near the character appears with more detail and more vertices, which simulates a visual perspective effect and improves the realism and sense of depth of the game.
In one possible implementation, the grass blades include a first grass blade and a second grass blade, the number of vertices in the first grass blade being greater than the number of vertices in the second grass blade. The rendering distance is compared with a distance threshold: in response to determining that the rendering distance is less than or equal to the distance threshold, the first grass blade is selected; alternatively, in response to determining that the rendering distance is greater than or equal to the distance threshold, the second grass blade is selected. For example, when the rendering distance is less than or equal to the threshold, a four-vertex blade (the first grass blade) is determined and inserted into the unit area; when the rendering distance is greater than or equal to the threshold, a three-vertex blade (the second grass blade) is determined and inserted. The distance threshold may be set according to the size of the terrain mesh area or rendering area, or according to the required rendering quality.
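A sketch of the threshold rule, with the threshold value and the two vertex counts chosen here only as placeholders:

```python
def choose_blade_vertex_count(render_distance: float,
                              distance_threshold: float = 30.0) -> int:
    """Use the 4-vertex (first) blade near the rendering center and the
    3-vertex (second) blade beyond the distance threshold."""
    return 4 if render_distance <= distance_threshold else 3
```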
In another possible implementation, the grass blade is determined from the rendering distance and a first correspondence. The first correspondence maps the rendering distance to a first number, and the number of vertices in the grass blade is that first number. The correspondence may be represented as an array, a table or a function. For example, the rendering distance is used as an index into a table-form first correspondence to obtain the first number, and the candidate grass blades are then filtered for a blade with that number of vertices. The correspondence may also be computed by linear interpolation, expressed as an exponential function so that the vertex count falls off exponentially with distance, or fitted as a nonlinear function (a curve, polynomial, etc.) to obtain a more precise vertex count.
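The table-form and exponential-form correspondences could, under assumed breakpoints and parameters, look like this:

```python
import bisect
import math

# Hypothetical first correspondence: distance breakpoints -> vertex counts
# (one more count than breakpoints, one count per distance band).
DISTANCE_BREAKS = [10.0, 30.0, 60.0]
VERTEX_COUNTS = [7, 5, 4, 3]

def vertex_count_from_table(render_distance: float) -> int:
    """Table form: pick the vertex count of the band the distance falls into."""
    return VERTEX_COUNTS[bisect.bisect_left(DISTANCE_BREAKS, render_distance)]

def vertex_count_exponential(render_distance: float, max_vertices: int = 7,
                             min_vertices: int = 3, falloff: float = 0.05) -> int:
    """Function form: the vertex count decays exponentially with distance."""
    count = min_vertices + (max_vertices - min_vertices) * math.exp(-falloff * render_distance)
    return max(min_vertices, round(count))
```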
Step S130, inserting the grass blade into the unit area.
In one possible implementation, before the grass blades are inserted, it is determined whether the rendering data of the unit area satisfies the grass texture feature. If it does, the step of determining a grass blade based on the rendering distance is performed; if it does not, that step is skipped. Because no grass area has to be selected manually, the labour cost of manual selection is saved and rendering efficiency is improved. Moreover, when the rendering data of a unit area does not satisfy the grass texture feature, no grass blade needs to be inserted, which amounts to filtering out areas unsuitable for grass blades before insertion; this reduces the probability of inserting grass blades in unnecessary areas and lowers the rendering load and resource consumption.
In one possible way of determining that the rendering data of the unit area satisfies the grass texture feature, the number of target pixel points in the unit area is determined based on the color data of the pixel points in the unit area and a grass color interval; based on that number of target pixel points, it is determined that the rendering data of the unit area satisfies the grass texture feature, and a grass blade is inserted into the unit area.
The grass color interval is a range describing the color characteristics of grass. For example, in RGB space a grass color interval may consist of ranges for the R, G and B components; in HSV space it may consist of ranges for the H (hue), S (saturation) and V (brightness) components. In one possible way of determining the grass color interval, real grass images are sampled: a variety of real grass photos are collected, covering different scenes and lighting conditions, sample images representative of grass color are selected from them, and color data is extracted from the sample images to form the grass color interval. Optionally, the range of the interval is adjusted according to the environmental conditions of the grass images, such as sunlight intensity, shadows and reflections from surrounding objects, so that the interval covers grass color variations in different environments.
A target pixel point is a pixel point whose color data falls within the grass color interval. For example, the color data of a pixel point is compared with the upper and lower bounds of the grass color interval, and if it lies within the interval, the pixel point is a target pixel point. Since color data is usually defined over several color channels, in one possible implementation the value of each color channel of a pixel point is compared with the bounds of the corresponding channel of the grass color interval; the pixel point is a target pixel point only if every channel value lies within the bounds of its channel. In RGB space, for example, a pixel point is a target pixel point if its red value lies within the red-channel bounds of the grass color interval, its green value lies within the green-channel bounds, and its blue value lies within the blue-channel bounds.
In one possible implementation, each pixel point in the unit area is traversed; whenever a pixel point's color data falls within the defined grass color interval, it is marked as a target pixel point and the recorded count of target pixel points is incremented by one. After the last pixel point of the unit area has been traversed, the recorded count is output.
In one possible implementation, the number of target pixel points in the unit area is compared with a count threshold. When the number of target pixel points is greater than or equal to the count threshold, the rendering data of the unit area is determined to satisfy the grass texture feature and grass blades are inserted into the unit area; when it is smaller than the count threshold, the rendering data is determined not to satisfy the grass texture feature and no grass blades need to be inserted. Comparing the count of target pixel points with a threshold is a simple and easily implemented way to decide quickly whether the grass texture feature is satisfied, and compared with manual processing it selects the areas to be rendered as grass automatically, which is faster and more accurate.
In another possible implementation, the ratio between the number of target pixel points in the unit area and the total number of pixel points in the unit area is obtained and compared with a first ratio threshold. When the ratio is greater than or equal to the first ratio threshold, the rendering data of the unit area is determined to satisfy the grass texture feature and grass blades are inserted into the unit area; when the ratio is smaller than the first ratio threshold, the rendering data is determined not to satisfy the grass texture feature and no grass blades need to be inserted. In one exemplary scenario, several terrain texture maps (for example five) are used to render a unit area, one of them being a grass texture map, and the coverage ratio of the grass texture map decides whether grass blades are inserted. The coverage ratio of the grass texture map is, for example, the ratio between the number of pixels occupied by the grass texture (the target pixel points) and the total number of pixels, that is, the ratio between the number of target pixel points and the total number of pixel points in the unit area; if the grass texture occupies most of the pixels, its coverage ratio is high. For example, with a first ratio threshold of 20%, a coverage ratio above 20% indicates that enough grass texture exists in the unit area, so grass blades are inserted to increase the realism of the grass. Using the ratio of target pixels to total pixels captures the relative proportion of grass in the rendering data better than an absolute count; and while the total number of pixels in a unit area changes with zoom level or viewing angle, this ratio is relatively invariant, so it suits rendering at different scales. Compared with manual processing, it also selects the areas to be rendered as grass automatically, which is faster and more accurate.
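A sketch of the target-pixel test and the two decision rules above, assuming an RGB grass color interval and placeholder threshold values:

```python
# Hypothetical per-channel (lower, upper) bounds of the grass color interval.
GRASS_INTERVAL = {"r": (20, 110), "g": (90, 200), "b": (20, 110)}

def is_target_pixel(rgb: tuple[int, int, int]) -> bool:
    """A pixel is a target pixel when every channel lies within its bounds."""
    r, g, b = rgb
    return (GRASS_INTERVAL["r"][0] <= r <= GRASS_INTERVAL["r"][1] and
            GRASS_INTERVAL["g"][0] <= g <= GRASS_INTERVAL["g"][1] and
            GRASS_INTERVAL["b"][0] <= b <= GRASS_INTERVAL["b"][1])

def target_pixel_count(pixels: list[tuple[int, int, int]]) -> int:
    """Traverse the unit area and count pixels inside the grass color interval."""
    return sum(1 for p in pixels if is_target_pixel(p))

def satisfies_by_count(pixels, count_threshold: int = 512) -> bool:
    """First rule: enough target pixels in absolute terms."""
    return target_pixel_count(pixels) >= count_threshold

def satisfies_by_ratio(pixels, first_ratio_threshold: float = 0.20) -> bool:
    """Second rule: target pixels make up a large enough share of the area."""
    return target_pixel_count(pixels) / len(pixels) >= first_ratio_threshold
```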
In one possible implementation, a statistic of the color data of the pixel points in the unit area is obtained from the color data of the individual pixel points, the ratio between that statistic and the grass color threshold is determined, and this ratio is compared with a second ratio threshold. If the ratio is greater than or equal to the second ratio threshold, the rendering data of the unit area is determined to satisfy the grass texture feature and grass blades are inserted into the unit area; if it is smaller, the rendering data is determined not to satisfy the grass texture feature and no grass blades need to be inserted. The ratio between the statistic of the color data and the grass color threshold can be understood as the grass texture coverage ratio. Taking a second ratio threshold of 20% as an example, grass blades are inserted into the unit area if that coverage ratio is greater than or equal to 20%.
One way of obtaining the statistic of the color data is to take the average of the color data of all pixel points in the unit area: the color data of all pixel points are summed and the sum is divided by the number of pixel points. Accordingly, if the ratio between this average and the grass color threshold is greater than or equal to the second ratio threshold, grass blades are inserted into the unit area. Because the color data of every pixel point in the unit area is used, the color characteristics of the whole unit area are taken into account, which is more accurate; using the average as the quantitative criterion for deciding whether the area is grass, or whether grass blades should be inserted, also reduces the influence of noise, making the result smoother and more stable.
Another way of obtaining the statistic is to take the color data of the central pixel point of the unit area. Accordingly, if the ratio between the central pixel point's color data and the grass color threshold is greater than or equal to the second ratio threshold, grass blades are inserted into the unit area. Because only the central pixel point is used instead of every pixel point in the unit area, the amount of data to process is reduced and computation is more efficient.
A further way of obtaining the statistic is to take the average of the color data of the pixel points at the corner points of the unit area, for example the four corner points at the upper left, lower left, upper right and lower right. Accordingly, if the ratio between this corner average and the grass color threshold is greater than or equal to the second ratio threshold, grass blades are inserted into the unit area. This variant better captures the color characteristics at the corners of the unit area and suits scenes with clearly visible grass edges.
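The three statistics and the second-ratio-threshold test could be sketched as follows, treating the per-pixel color data as a single scalar (for example the green channel) for simplicity; the names and default values are assumptions:

```python
def mean_statistic(grid: list[list[float]]) -> float:
    """Average of the color data of every pixel point in the unit area."""
    values = [v for row in grid for v in row]
    return sum(values) / len(values)

def center_statistic(grid: list[list[float]]) -> float:
    """Color data of the central pixel point of the unit area."""
    return grid[len(grid) // 2][len(grid[0]) // 2]

def corner_statistic(grid: list[list[float]]) -> float:
    """Average of the color data of the four corner pixel points."""
    return (grid[0][0] + grid[0][-1] + grid[-1][0] + grid[-1][-1]) / 4.0

def needs_grass(grid, grass_color_threshold: float = 255.0,
                second_ratio_threshold: float = 0.20,
                statistic=mean_statistic) -> bool:
    """Insert grass blades when statistic / grass color threshold is large enough."""
    return statistic(grid) / grass_color_threshold >= second_ratio_threshold
```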
In another possible implementation, the rendering data includes texture data of the pixel points within the unit area, and whether to insert grass blades into the unit area can be decided from that texture data. Grass materials usually have a certain fineness and may contain small texture elements (such as blades and stems). The texture data of the unit area can be checked for texture elements matching grass characteristics, such as fine, regular patterns, elongated speckles or mottled shapes; if such elements are present, the unit area is judged to be a grass area and grass blades are inserted into it.
In another possible implementation, the rendering data includes illumination data of the pixel points within the unit area, and whether to insert grass blades can be decided from that illumination data. Grass usually shows light-and-shade variation under illumination, with certain highlight and shadow effects. The illumination data of the unit area can be checked for an illumination distribution and shadow effect matching grass characteristics. For example, if the illumination data shows relatively bright reflections that change with the illumination angle, the unit area is judged to be a grass area and grass blades are inserted into it.
In another possible implementation, at least two of the color data, texture data, illumination data and geometric data of the unit area are combined to decide whether the unit area satisfies the grass texture feature; grass blades are inserted when it does and are not inserted when it does not.
In one possible implementation, the number of grass blades is determined based on the rendering data within the unit area, and that number of grass blades is inserted into the unit area. In this way the number of grass blades in the unit area better matches its rendering data, so the distribution of grass blades is consistent with the overall environment of the unit area. An appropriate number of blades also reduces the likelihood of increased rendering load and degraded performance from inserting too many blades, as well as the likelihood of a poor rendering result from inserting too few.
In one possible implementation, the rendering data includes the coverage ratio of the grass texture within the unit area, and the number of grass blades is determined from that coverage ratio, the number of blades being positively correlated with it: the larger the coverage ratio of the grass texture, the more grass blades. Illustratively, a second correspondence between the number of grass blades and the coverage ratio of the grass texture is preset; its input parameter is the coverage ratio of the grass texture within the unit area and its output parameter is the number of grass blades, and the number of blades is determined from the coverage ratio and this second correspondence. The second correspondence may be a linear function, or a nonlinear function such as an exponential, logarithmic or piecewise function. Because the number of grass blades is positively correlated with the coverage ratio of the grass texture, greener parts of the lawn receive more grass, so the amount of grass better matches the colour of the lawn.
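A linear form of the second correspondence might look like this (the maximum blade count is a placeholder):

```python
def blade_count_from_coverage(grass_coverage: float, max_blades: int = 64) -> int:
    """Blade count grows linearly with the coverage ratio (0.0 .. 1.0) of the
    grass texture within the unit area."""
    clamped = max(0.0, min(1.0, grass_coverage))
    return round(max_blades * clamped)

# e.g. 30 % grass texture coverage -> 19 blades with the placeholder maximum.
```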
In one possible implementation, the rendering data includes the color data of the pixel points within the unit area; the ratio between the color data of the pixel points and the grass color threshold is obtained, and the number of grass blades is determined from that ratio. The grass color threshold is used to delimit the color range that counts as grass. The ratio may be obtained, for example, by computing a distance (such as a Euclidean distance or a degree of difference) between the pixel point's color value and the grass color threshold and converting it into a ratio; by computing the similarity between the pixel point's color value and the grass color threshold and converting that into a ratio; or by comparing the pixel point's color value with the grass color threshold to obtain a Boolean value indicating whether the color value lies within the color range defined by the threshold.
In one possible implementation, the number of grass blades is determined from the ratio between the average of the color data of the pixel points in the unit area and the grass color threshold. For example, a third correspondence between the average of the color data and the number of grass blades is set, and the number of blades is determined from the average and this third correspondence. As another example, the number of blades is set to a first number when the ratio between the average and the grass color threshold is greater than or equal to a threshold, and to a second number when it is smaller. The number of grass blades may be positively correlated with this ratio, so that as the average of the color data increases the number of blades increases accordingly, simulating denser grass in greener areas.
In one possible implementation, the number of grass blades is determined from the ratio between the color data of the central pixel point of the unit area and the grass color threshold. For example, a fourth correspondence between the color data of the central pixel point and the number of grass blades is set, and the number of blades is determined from the central pixel point's color data and this fourth correspondence. As another example, the number of blades is set to a first number when the ratio between the central pixel point's color data and the grass color threshold is greater than or equal to a threshold, and to a second number when it is smaller. The number of grass blades may be positively correlated with this ratio, so that as the color data of the central pixel point increases the number of blades increases accordingly, simulating denser grass in greener areas.
In one possible implementation, the number of grass blades is determined from the ratio between the average of the color data of the pixel points at the corner points of the unit area and the grass color threshold. For example, a fifth correspondence between that corner average and the number of grass blades is set, and the number of blades is determined from the corner average and this fifth correspondence. As another example, the number of blades is set to a first number when the ratio between the corner average and the grass color threshold is greater than or equal to a threshold, and to a second number when it is smaller. The number of grass blades may be positively correlated with this ratio, so that as the corner color data increases the number of blades increases accordingly, simulating denser grass in greener areas.
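A piecewise sketch covering the three variants above; the statistic passed in may be the per-pixel mean, the central pixel value or the corner average, and the numeric defaults are placeholders:

```python
def blade_count_from_color_ratio(color_statistic: float,
                                 grass_color_threshold: float = 255.0,
                                 ratio_threshold: float = 0.5,
                                 first_number: int = 32,
                                 second_number: int = 8) -> int:
    """Use the first number when the ratio of the color statistic to the grass
    color threshold reaches the threshold, otherwise the second number."""
    ratio = color_statistic / grass_color_threshold
    return first_number if ratio >= ratio_threshold else second_number
```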
According to the method provided by this embodiment, because the number of vertices in a grass blade is inversely related to the rendering distance, unit areas far from the rendering center receive grass blades with fewer vertices while unit areas near the rendering center receive grass blades with more vertices. This reduces the risk of excessive processing overhead caused by every inserted grass blade having too many vertices, and also reduces the risk of a poor rendering result caused by every inserted grass blade having too few vertices. Rendering quality and processing overhead are therefore both taken into account: the performance and resource utilization of the computer are improved without sacrificing rendering quality as far as possible, and the density of the grass is increased at the same time, making the virtual scene more vivid and detailed.
In addition, region-selective rendering and per-blade level-of-detail rendering may be carried out simultaneously when rendering dynamically around virtual objects. For example, when a first virtual object and a second virtual object exist in the terrain mesh area, a first thread renders the area within the preset radius around the first virtual object as its rendering area, while a second thread determines, from the rendering distance between a unit area in the rendering area around the second virtual object and the second virtual object, the vertex count corresponding to that distance and inserts grass blades with that number of vertices.
In another possible implementation, the camera position is acquired as the rendering center. A grass blade is determined from the rendering distance between a unit area in the rendering area and the camera position, the grass blade including at least one vertex and the number of vertices being inversely related to that distance; the grass blade is then inserted into the unit area. Because the camera position corresponds to the user's viewpoint, making the vertex count inversely related to the distance from the camera lets the precision of the grass blades change as the view moves toward or away from a unit area, producing a gradual visual transition and more realistic scene detail. In addition, blades with few vertices are used when the view is far from a unit area, which reduces the processing overhead of rendering and lowers the hardware requirements.
In another possible implementation, the light source position is acquired as the rendering center. A grass blade is determined from the rendering distance between a unit area in the rendering area and the light source position, the grass blade including at least one vertex and the number of vertices being inversely related to that distance; the grass blade is then inserted into the unit area. In other words, the closer a unit area is to the light source, the more vertices its inserted grass blades have, so the rendering looks more realistic; the farther it is, the fewer vertices its blades have, which reduces the processing overhead of rendering and lowers the hardware requirements. Placing more detailed grass blades near the light source also makes the grass distribution around the light source look more real and natural, improving the visual effect.
In another possible implementation, the user interaction position is acquired as the rendering center. A grass blade is determined from the rendering distance between a unit area in the rendering area and the user interaction position, the grass blade including at least one vertex and the number of vertices being inversely related to that distance; the grass blade is then inserted into the unit area. In other words, the closer a unit area is to the user interaction position, the more vertices its inserted grass blades have, so the rendering looks more realistic; the farther it is, the fewer vertices its blades have, which reduces the processing overhead of rendering and lowers the hardware requirements. Moreover, as the user moves or operates, the vertex counts of the grass blades change accordingly, so the user perceives the scene responding to their position and actions, making the grass feel more dynamic; and because the user's own interaction chooses the rendering center, rendering can be customized to the user's actual needs.
In one possible implementation, a resolution of the texture map is determined based on the distance between a unit area in the rendering area and the rendering center, the resolution of the texture map being inversely related to the distance; a grass insert whose texture map has that resolution is then determined.
The resolution of the texture map is the number of pixels contained in the texture map; the higher the resolution, the denser the pixels in the texture image and the clearer the details of the grass insert. Since the resolution of the texture map is determined dynamically according to the distance from the unit area to the rendering center, a unit area closer to the rendering center uses a higher-resolution texture map, while a unit area farther away can use a lower-resolution texture map. For a unit area far from the rendering center, using a low-resolution texture map reduces memory occupation and rendering cost and improves rendering performance. For a unit area close to the rendering center, using a high-resolution texture map gives the grass insert higher-precision detail and creates a more realistic visual effect.
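The following sketch illustrates one way the texture-map resolution could be tiered by distance; the tier boundaries and pixel sizes are assumptions for illustration only:

```python
# Hypothetical resolution tiers: (max distance from rendering center, pixels per side).
RESOLUTION_TIERS = [(15.0, 1024), (40.0, 512), (80.0, 256)]

def texture_resolution(distance):
    """Resolution of the grass insert's texture map, inversely related to distance."""
    for max_dist, resolution in RESOLUTION_TIERS:
        if distance <= max_dist:
            return resolution
    return 128  # lowest-detail map for the farthest unit areas

print(texture_resolution(10.0))   # 1024: unit area near the rendering center
print(texture_resolution(100.0))  # 128: unit area far from the rendering center
```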
In one possible implementation, boundaries between multiple rendering regions in a terrain mesh area are smoothed. Smoothing refers to processing the boundaries between rendering regions to reduce discontinuities, jagged edges, or noticeable transitions, making the rendering boundaries more natural and smooth. The manner of smoothing includes, but is not limited to, at least one of blending seams, texture transitions, normal averaging, vertex scaling, or edge blurring.
Blending seams, for example, blends the texture, color, or other properties of adjacent rendering regions; this may be done by drawing a seam region at the boundary and performing texture interpolation, color blending, and the like within the seam region. The seam region may use fade and transition effects to gradually merge the features of the adjacent rendering regions. Texture transition, for example, blends multiple textures to smooth the boundaries between rendering regions; it can be realized with techniques such as map fusion and weight blending, and the transition effect between different textures can be controlled by adjusting texture transparency or using an alpha map. Normal averaging, for example, smooths the normal vectors between adjacent rendering regions: by computing the normal directions of adjacent surfaces and averaging or interpolating them, the normal difference is reduced, the boundary becomes smoother, and the smoothed normals produce a more realistic rendering result. Vertex scaling, for example, adjusts the vertex positions on the boundaries of adjacent rendering regions so that heights and shapes transition gradually between the regions; it may be implemented by interpolation, weighting, or similar means. Edge blurring, for example, applies a blurring effect around the rendering boundary to reduce the jagged feel and hard edges; edge pixels may be blurred with an algorithm such as Gaussian blur and their colors smoothly blended with surrounding pixels.
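As one example of the normal-averaging option above, a hypothetical helper (not part of the disclosure) could average and renormalize the normals of the vertices shared along a seam between two rendering regions:

```python
def average_boundary_normals(normals_a, normals_b):
    """Average the normals of vertices shared by two adjacent rendering regions.

    normals_a / normals_b: lists of (x, y, z) normals for the same boundary
    vertices as computed within each region. Returns renormalized averages.
    """
    smoothed = []
    for (ax, ay, az), (bx, by, bz) in zip(normals_a, normals_b):
        mx, my, mz = (ax + bx) / 2, (ay + by) / 2, (az + bz) / 2
        length = (mx * mx + my * my + mz * mz) ** 0.5 or 1.0
        smoothed.append((mx / length, my / length, mz / length))
    return smoothed

# Both regions then share the smoothed normals along the seam, so lighting
# does not change abruptly at the rendering-region boundary.
print(average_boundary_normals([(0.0, 1.0, 0.0)], [(1.0, 0.0, 0.0)]))
```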
The process of constructing a terrain mesh area is illustrated below.
A terrain mesh area is a discretized area used to represent terrain, typically for terrain rendering and simulation. In computer graphics, a terrain mesh area may be regarded as a two-dimensional or three-dimensional mesh consisting of a series of adjacent vertices and the edges connecting them. In the two-dimensional case, the terrain mesh area is a mesh plane made up of a series of vertices and edges: each vertex represents a point on the terrain surface, and the edges represent the connections between adjacent points. Each vertex may be assigned a height value to simulate height changes in the terrain, and interpolation algorithms can infer the heights of unknown points from the known height values to form a smooth terrain model. In the three-dimensional case, the terrain mesh area is a mesh of connected triangles (or other polygons) approximating the terrain surface: each vertex represents a point on the terrain surface and each triangle represents a small patch of terrain. Each vertex may be assigned a height value to simulate height changes in the terrain, and a continuous terrain surface model is obtained by interpolating and smoothing the height values over each triangle. The fineness of the terrain mesh area depends on the resolution of the mesh, i.e., the number of vertices. Higher resolution provides more detailed and realistic terrain but also increases computational and rendering complexity; lower resolution improves performance but may reduce the definition of the terrain surface.
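As a small illustration of the interpolation idea mentioned above, the following sketch (the function name and the unit grid spacing are assumptions) bilinearly interpolates a terrain height between four neighboring grid vertices:

```python
def bilinear_height(heights, x, z):
    """Interpolate the terrain height at (x, z) from a row-major grid of vertex heights.

    heights[row][col] is the height at integer grid coordinates (col, row);
    the grid spacing is assumed to be 1 world unit.
    """
    col0, row0 = int(x), int(z)
    col1 = min(col0 + 1, len(heights[0]) - 1)
    row1 = min(row0 + 1, len(heights) - 1)
    tx, tz = x - col0, z - row0
    top = heights[row0][col0] * (1 - tx) + heights[row0][col1] * tx
    bottom = heights[row1][col0] * (1 - tx) + heights[row1][col1] * tx
    return top * (1 - tz) + bottom * tz

# Halfway between four vertices with heights 0, 2, 4 and 6, the height is their average.
print(bilinear_height([[0.0, 2.0], [4.0, 6.0]], 0.5, 0.5))  # 3.0
```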
In one possible implementation, in the selected grid, the height value of each pixel point in the grid and the position information of each pixel point are determined through a preset sequence array. And constructing a terrain grid area based on the height value of the pixel point and the position information of the pixel point.
In one possible implementation, the terrain mesh data is represented as a grayscale map. For example, the height variation of the terrain is represented using color shading (gray levels) in a height map. Specifically, the whiter a region of the height map, the greater its height value, i.e., the higher the terrain in that region. Conversely, the darker a region of the height map, the smaller its height value, i.e., the lower the terrain. For example, referring to fig. 2, fig. 2 is a schematic diagram of terrain mesh data according to an embodiment of the present application; by observing the brightness of the colors in fig. 2, the height variation of the terrain can be understood intuitively.
A sequence array is an array whose elements are sequences; in other words, each element in the sequence array is a data item of sequence type. The number of bits of each element (i.e., each sequence) in the sequence array may be set according to the accuracy requirement; for example, a sequence may be 8 bits or 16 bits. The greater the number of bits of the sequence, the finer the height values that can be represented. The value of a sequence corresponds to a height value, and the sequence is stored in binary format.
The array storage form means that the sequences are laid out in order. The number of sequence data items in each row and each column of the array corresponds to the number of pixel points in the terrain grid: the number of sequence data items per row indicates the number of pixel points per row in the terrain grid, and the number per column indicates the number of pixel points per column. For example, if the array has 1080 sequences per row and 960 sequences per column, the data represent a terrain grid containing 1080×960 pixel points, with 1080 pixel points per row and 960 pixel points per column. The horizontal distance between pixel points is the same, so the sequence values can be used directly as the height values of the pixel points to render the height map.
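A minimal sketch of turning such a sequence array into terrain vertices might look as follows; the spacing and height-scale parameters are assumptions for illustration:

```python
def heights_to_vertices(sequence_array, spacing=1.0, height_scale=0.1):
    """Convert a row-major array of height sequences into terrain vertices.

    sequence_array: list of rows, one height value per pixel point.
    spacing: horizontal distance between adjacent pixel points (assumed constant).
    height_scale: factor mapping the stored sequence value to a world-space height.
    """
    vertices = []
    for row_index, row in enumerate(sequence_array):
        for col_index, value in enumerate(row):
            x = col_index * spacing
            z = row_index * spacing
            y = value * height_scale
            vertices.append((x, y, z))
    return vertices

# A 2x3 array produces 6 vertices; the 1080-per-row, 960-per-column layout
# described above would correspond to 960 rows of 1080 sequences each.
grid = [[0, 128, 255], [64, 128, 192]]
print(len(heights_to_vertices(grid)))  # 6
```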
The process of constructing a rendering region is illustrated below.
In one possible implementation, the terrain mesh region is rendered based on vertex data in the terrain mesh data and texture feature configuration, resulting in a rendered region.
In one possible implementation, vertex data in the terrain mesh data are obtained based on the coordinate information of the pixel points in the terrain mesh data and the height values of those pixel points. Each pixel point in the sampled image serves as a vertex in the terrain mesh area. The data of a vertex include its coordinate information and height value, which are subsequently used for texture rendering of the vertex.
And the server acquires all vertex data of the terrain grid area, and then performs texture rendering on all vertices of the terrain grid area according to texture feature configuration required to be rendered.
In general, texture rendering needs to be performed on the whole terrain mesh area, but some terrains are large enough that, limited by hardware, the whole terrain mesh area cannot be rendered in one pass, or cannot be rendered in one pass without occupying too much hardware. Therefore, the complete terrain mesh area is divided into a plurality of rendering areas, and the rendering areas are then rendered in turn according to a preset rule, so that the whole terrain mesh area is rendered. Alternatively, instead of rendering the entire terrain mesh area at once, a specific rendering area may be rendered as needed; for example, the rendering area within a preset radius of a designated virtual character's position may be used as the specific rendering area, which reduces the hardware requirement for rendering.
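The division into rendering areas and the selection of the areas around a designated virtual character could be sketched as follows; the region size, radius, and function names are illustrative assumptions:

```python
def split_into_regions(width, height, region_size):
    """Divide a terrain grid of width x height pixel points into square rendering regions."""
    regions = []
    for y0 in range(0, height, region_size):
        for x0 in range(0, width, region_size):
            regions.append((x0, y0, min(x0 + region_size, width), min(y0 + region_size, height)))
    return regions

def regions_near(regions, character_pos, radius):
    """Keep only the regions whose center lies within the preset radius of the character."""
    cx, cy = character_pos
    selected = []
    for x0, y0, x1, y1 in regions:
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        if (mx - cx) ** 2 + (my - cy) ** 2 <= radius ** 2:
            selected.append((x0, y0, x1, y1))
    return selected

# Render only the regions around the character instead of the whole terrain grid.
regions = split_into_regions(1080, 960, 64)
print(len(regions_near(regions, (540, 480), 200)))
```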
Texture feature configuration uses a terrain texture map as the terrain texture data. Specifically, the terrain texture map is parsed to obtain the color distribution data of all pixel points in the map, where the color data are RGB values; this color distribution data (including the number of pixel points and the color corresponding to each pixel point) is then used as the terrain texture data.
In one possible implementation, the terrain mesh area to be rendered is fully rendered by tiling the terrain texture data closely within the terrain mesh area.
Tiling means laying and filling the terrain texture data (or map) repeatedly within the terrain mesh area so that the entire surface of the terrain mesh area is completely covered by the map.
In one possible implementation, the precision of tiling is adjusted by configuring tiling parameters. For example, if the pixel size of the terrain texture map is 20×20 and a rendering area of size 40×40 needs to be tiled, different tiling degrees can be selected. At a 1:1 tiling degree, every pixel point in the 40×40 rendering area is covered and rendered one by one, and the number of tiled terrain texture maps is 4. At a 1:4 tiling degree, only one quarter of the pixel points in the 40×40 rendering area are covered, i.e., one pixel point out of every 4 is rendered.
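The arithmetic of the tiling example above can be reproduced with a small helper; the function name and the tiling-ratio convention are assumptions for illustration:

```python
def tiled_texture_count(region_size, texture_size, tiling_ratio=1.0):
    """Number of texture maps' worth of pixels needed to cover a square rendering area.

    tiling_ratio = 1.0 covers every pixel point (1:1); 0.25 covers one in four (1:4).
    Sizes are in pixels per side, mirroring the 20x20 map / 40x40 area example above.
    """
    covered_pixels = (region_size ** 2) * tiling_ratio
    return covered_pixels / (texture_size ** 2)

print(tiled_texture_count(40, 20, 1.0))   # 4.0 maps at 1:1
print(tiled_texture_count(40, 20, 0.25))  # 1.0 map's worth of pixels at 1:4
```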
Alternatively, the method of fig. 1 is performed by a computing device, or performed cooperatively by a computing device cluster that includes a plurality of computing devices; for example, computing device A performs S110 in the method shown in fig. 1 and computing device B performs S120. The computing device is, for example, a terminal or a server. In one possible implementation, the method shown in fig. 1 is performed by a computing device by running an application program, for example browser software or client software; the execution subject of the method shown in fig. 1 is not limited in this embodiment.
Fig. 3 is a schematic structural diagram of a grassland rendering device according to an embodiment of the present application, where the device 200 shown in fig. 3 includes:
an obtaining module 210, configured to obtain a rendering center in the rendering area;
a determining module 220 for determining a grass insert based on a rendering distance between the unit area and the rendering center in the rendering area, the grass insert including at least one vertex, the number of vertices in the grass insert being inversely related to the rendering distance;
and a rendering module 230 for inserting the grass blades into the unit area.
In one possible implementation, the grass blades include a first grass blade and a second grass blade, the number of vertices in the first grass blade being greater than the number of vertices in the second grass blade, the determining module 220 to determine the first grass blade in response to determining that the rendering distance is less than or equal to the distance threshold; alternatively, in response to determining that the rendering distance is greater than or equal to the distance threshold, a second grass blade is determined.
In one possible implementation, the determining module 220 is configured to determine the grass insert based on the rendering distance and a first correspondence, where the first correspondence indicates a correspondence between the rendering distance and a first number, and the number of vertices in the grass insert is the first number.
In one possible implementation, the determining module 220 is configured to determine the number of grass blades based on rendering data within the unit area; and the rendering module 230 is configured to insert the determined number of grass blades into the unit area.
In one possible implementation, the rendering data includes a duty cycle of the grass texture within the unit area, and the determining module 220 is configured to determine a number of grass blades based on the duty cycle of the grass texture within the unit area, the number of grass blades being positively correlated with the duty cycle of the grass texture.
In one possible implementation, the rendering data includes color data of pixels in a unit area, and the determining module 220 is configured to determine the number of grass blades based on a ratio between the color data of the pixels and a grass color threshold.
In one possible implementation, the determining module 220 is configured to determine the number of grass blades based on the ratio between the average value of the color data of the pixel points in the unit area and the grass color threshold; or to determine the number of grass blades based on the ratio between the color data of the central pixel point in the unit area and the grass color threshold; or to determine the number of grass blades based on the ratio between the average value of the color data of the corner pixel points in the unit area and the grass color threshold.
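A hypothetical sketch of the color-ratio variant handled by the determining module might look as follows; the cap on the blade count and all parameter names are assumptions, not part of the disclosure:

```python
def blade_count_from_color(pixel_colors, grass_color_threshold, max_blades=32):
    """Estimate how many grass blades to insert into a unit area.

    pixel_colors: iterable of color values (e.g., green-channel intensities) for
    the pixel points considered in the unit area. The count scales with the ratio
    of the average color to the grass color threshold, capped at max_blades.
    """
    average = sum(pixel_colors) / len(pixel_colors)
    ratio = min(average / grass_color_threshold, 1.0)
    return round(ratio * max_blades)

# A unit area whose average color is half the threshold gets half the maximum blades.
print(blade_count_from_color([60, 70, 50, 60], grass_color_threshold=120))  # 16
```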
In one possible implementation, the determining module 220 is configured to determine that the rendering data of the unit area satisfies the grass texture feature.
In a possible implementation manner, the determining module 220 is further configured to determine the rendering area based on the location of the virtual object and a preset radius, where the distance between the boundary of the rendering area and the location of the virtual object is the preset radius.
In one possible implementation, the obtaining module 210 is configured to obtain, as a rendering center, a location where the virtual object is located.
Fig. 4 is a schematic structural diagram of a server according to an embodiment of the present application, where the server 300 includes: processor 301, processor 301 being coupled to memory 302, memory 302 having stored therein at least one computer program instruction that is loaded and executed by processor 301 to cause server 300 to implement the method provided by the embodiment of fig. 1.
In this specification, the embodiments are described in a progressive manner; for identical or similar parts, the embodiments may refer to one another, and each embodiment focuses on its differences from the other embodiments.
"A refers to B" means that A is the same as B, or that A is a simple variation of B.
The terms "first" and "second" and the like in the description and in the claims of embodiments of the present application are used for distinguishing between different objects and not necessarily for describing a particular sequential or chronological order of the objects, and should not be interpreted to indicate or imply relative importance. For example, the first and second grass blades are used to distinguish between different grass blades, rather than to describe a particular sequence of grass blades, nor is it to be understood that the first grass blade is more important than the second grass blade.
The information (including but not limited to user equipment information and user personal information), data (including but not limited to data used for analysis, stored data, and displayed data), and signals involved in this application are all authorized by the user or fully authorized by all parties, and the collection, use, and processing of the relevant data comply with the relevant laws, regulations, and standards of the relevant countries and regions.
The above-described embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example by wired means (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), etc.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.

Claims (15)

1. A method of grass rendering, the method comprising:
acquiring a rendering center in a rendering area;
determining a grass insert based on a rendering distance between a unit area in the rendering area and the rendering center, the grass insert comprising at least one vertex, the number of vertices in the grass insert being inversely related to the rendering distance;
the grass blades are inserted into the unit areas.
2. The method of claim 1, wherein the grass blades include a first grass blade and a second grass blade, the number of vertices in the first grass blade being greater than the number of vertices in the second grass blade, the determining a grass blade based on a rendering distance between a unit area in the rendering area and the rendering center comprising:
In response to determining that the rendering distance is less than or equal to a distance threshold, determining the first grass insert; or,
in response to determining that the rendering distance is greater than or equal to a distance threshold, the second grass insert sheet is determined.
3. The method of claim 1, wherein the determining a grass insert based on a rendering distance between a unit area of the rendering area and the rendering center comprises:
and determining a grass insert based on the rendering distance and a first corresponding relation, wherein the first corresponding relation indicates a corresponding relation between the rendering distance and a first quantity, and the quantity of vertexes in the grass insert is the first quantity.
4. The method of claim 1, wherein said inserting the grass blades into the unit area comprises:
determining the number of grass blades based on the rendering data within the unit area;
inserting said number of said grass blades into said unit area.
5. The method of claim 4, wherein the rendering data includes a duty cycle of a grass texture within the unit area, and wherein determining the number of grass blades based on the rendering data of the unit area includes:
And determining the number of the grass blades based on the duty ratio of the grass texture in the unit area, wherein the number of the grass blades is positively correlated with the duty ratio of the grass texture.
6. The method of claim 4, wherein the rendering data includes color data for pixels within the unit area, and wherein the determining the number of grass blades based on the rendering data within the unit area includes:
and determining the number of the grass blades based on the ratio between the color data of the pixel points and the grass color threshold.
7. The method of claim 6, wherein the determining the number of grass blades based on a ratio between the color data of the pixel points and a grass color threshold comprises:
determining the number of the grass blades based on the ratio between the average value of the color data of each pixel point in the unit area and the grass color threshold value; or,
determining the number of the grass blades based on the ratio between the color data of the central pixel point in the unit area and the grass color threshold; or,
and determining the number of the grass inserting sheets based on the ratio between the average value of the color data of the pixel points of each angular point in the unit area and the grassland color threshold value.
8. The method of claim 1, wherein before the determining a grass insert based on a rendering distance between a unit area in the rendering area and the rendering center, the method further comprises:
determining that the rendered data for the unit area satisfies a grass texture feature.
9. The method of claim 1, wherein prior to the acquiring the rendering center in the rendering region, the method further comprises:
and determining a rendering area based on the position of the virtual object and a preset radius, wherein the distance between the boundary of the rendering area and the position of the virtual object is the preset radius.
10. The method of claim 1, wherein the obtaining a rendering center in a rendering region comprises:
and acquiring the position of the virtual object as a rendering center.
11. The method of claim 1, wherein the obtaining a rendering center in a rendering region comprises:
acquiring a camera position as a rendering center; or,
acquiring a light source position as a rendering center; or,
and acquiring the user interaction position as a rendering center.
12. The method of claim 1, wherein the determining a grass insert based on a distance between a unit area in the rendering area and the rendering center comprises:
Determining a resolution of a texture map based on a distance between a unit area in the rendering area and the rendering center, the resolution of the texture map being inversely related to the distance;
a grass insert having a resolution of the texture map is determined.
13. A lawn rendering device, comprising:
the acquisition module is used for acquiring a rendering center in the rendering area;
a determining module for determining a grass blade based on a rendering distance between a unit area in the rendering area and the rendering center, the grass blade comprising at least one vertex, the number of vertices in the grass blade being inversely related to the rendering distance;
an insertion module for inserting the grass blades into the unit area.
14. A server, the server comprising: a processor coupled to a memory having stored therein at least one computer program instruction that is loaded and executed by the processor to cause the server to implement the method of any of claims 1-12.
15. A computer readable storage medium, characterized in that at least one instruction is stored in the storage medium, which instructions, when run on a computer, cause the computer to perform the method according to any of claims 1-12.