CN117504280A - Fragment effect rendering method, device, equipment and storage medium - Google Patents

Fragment effect rendering method, device, equipment and storage medium

Info

Publication number
CN117504280A
CN117504280A (application CN202311221544.6A)
Authority
CN
China
Prior art keywords
illumination
information
target model
vertex
calculating
Prior art date
Legal status
Pending
Application number
CN202311221544.6A
Other languages
Chinese (zh)
Inventor
李家辉
陈纾
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202311221544.6A priority Critical patent/CN117504280A/en
Publication of CN117504280A publication Critical patent/CN117504280A/en


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/005: General purpose rendering architectures

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The invention provides a fragment effect rendering method, apparatus, device, and storage medium. The method constructs an illumination information matrix that records every illumination position in the fragment diffusion process, and then computes from that matrix the approximate light information required for the diffusion. Because the information of many illumination points is computed at once, the computational cost of rendering the fragment effect is greatly reduced, the load on device performance drops, processing becomes smoother, latency falls, stuttering is avoided, and the displayed effect improves.

Description

Fragment effect rendering method, device, equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for rendering a fragment effect.
Background
As games continue to develop, players demand ever higher picture quality. Special effect production must weigh not only realism but also whether the device can support it, and the rendering of explosion fragment effects is especially demanding.
At present, fragment effects, especially the fragment effects of explosion special effects, are generally rendered from the illumination that the explosion produces, which is either real-time illumination or baked illumination. With real-time illumination, the lighting of every pixel must be computed on the fly and every vertex must sum the contribution of every individual illumination point; this consumes substantial performance resources and is unsuitable for mobile devices. With baked illumination, the lighting must be baked into the corresponding textures in advance, so every pixel of every model receives a fixed, unchanging lighting result.
Disclosure of Invention
Accordingly, the present invention provides a method, apparatus, device, and storage medium for rendering a fragment effect, so as to solve the high device performance cost and the poor rendering quality of existing fragment effect schemes.
In a first aspect, an embodiment of the present invention provides a method for rendering a fragment effect, including:
determining a target model and a diffusion animation of fragments matched with the target model;
resolving the diffusion animation to obtain an illumination information matrix, wherein illumination information corresponding to each illumination position is recorded in the illumination information matrix;
and calculating approximate light information of a vertex corresponding to the illumination position in the target model based on illumination information in the illumination information matrix and vertex information of the target model, and generating a fragment effect on the target model based on each piece of approximate light information.
In a second aspect, an embodiment of the present invention provides a rendering apparatus for a fragment effect, including:
the acquisition module, configured to determine a target model and a diffusion animation of fragments matched with the target model;
the resolving module, configured to resolve the diffusion animation to obtain an illumination information matrix, wherein illumination information corresponding to each illumination position is recorded in the illumination information matrix;
the calculation module, configured to calculate approximate light information of the vertex corresponding to the illumination position in the target model based on the illumination information in the illumination information matrix and the vertex information of the target model;
and the rendering module, configured to generate a fragment effect on the target model based on each piece of approximate light information.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor and a memory, where the memory stores machine executable instructions executable by the processor, and the processor executes the machine executable instructions to implement the method for rendering the fragment effect provided above.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium storing computer-executable instructions that, when invoked and executed by a processor, cause the processor to implement a method of rendering a fragmentation effect as provided above.
The embodiment of the invention has the following beneficial effects:
The fragment effect rendering method, apparatus, device, and storage medium of the embodiments analyze the illumination information of each illumination position in the diffusion animation to construct an illumination information matrix, compute the approximate light information of the corresponding vertices on the target model from that matrix, and finally modify the colors of those vertices based on the approximate light information, thereby rendering the fragment effect. Because multiple illumination points are recorded in a single illumination information matrix, their rendering parameters are computed simultaneously and a whole region is rendered at once, which reduces the performance cost of computing each point independently and achieves both rendering performance and display quality.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the above objects, features and advantages of the present invention more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are needed in the description of the embodiments or the prior art will be briefly described, it being obvious that the drawings in the description below are some embodiments of the invention and that other drawings may be obtained from these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of a method for rendering a fragment effect according to an embodiment of the present invention;
FIG. 2 is another flow chart of a method for rendering a fragment effect according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a fragment effect rendering apparatus according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of another structure of a fragment effect rendering apparatus according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
First, application scenarios applicable to the present application will be described. The method and the device can be applied to the technical field of image processing.
In a three-dimensional game scene, presenting scene changes or special operations of a target virtual character realistically requires various special effects, and the explosion effect is one of them; in particular, the rendering of explosion fragment effects is indispensable in the combat scenes of multiplayer battle games.
Based on the above, the fragment effect rendering method, apparatus, device, and storage medium provided by the embodiments of the invention can be used in any software or device that presents virtual special effects, and also in specific applications such as game programs, instant messaging programs, or other applications that need to render the fragment effects of explosion type special effects.
The fragment effect rendering method provided by the invention can run on a terminal device or on a server. The terminal device may be a local terminal device. When the method runs on a server, it can be implemented and executed on a cloud interaction system, which comprises the server and client devices; corresponding applications, such as game programs or apps, run on the client devices.
In a possible implementation manner, the embodiment of the invention provides a rendering method of a fragment effect, and a game scene is provided through terminal equipment, wherein the terminal equipment can be local terminal equipment or client equipment in a cloud interaction system. A game scene is provided through the terminal equipment, and the game scene can be added with a fragment special effect according to specific requirements or special object requirements of the scene.
To facilitate understanding of the present embodiment, the fragment effect rendering method disclosed herein is first described in detail. As shown in FIG. 1, a game scene is provided through a terminal device and a fragment effect is added to a target in the scene, for example when an explosion occurs. The method includes the following steps:
step S101, determining a target model and a diffusion animation of fragments matched with the target model;
In this step, the target model can be understood as an object in the game scene, such as an enemy virtual character, a stone, or a tree, and the model matching the game operation is determined from the player's real-time operation. For example, during combat a player throws a mine at the scene position of an enemy virtual character; after a certain time the mine explodes and produces an explosion effect. This is a ranged special effect that actually combines the light produced by the explosion with fragments flying outward. The player's operation is detected to determine where the explosion occurs, the scene model at that position, such as an object model, is acquired as the target model, and the action range of the fragment effect is determined according to the actual requirements of the operation. The same applies to the special effects of ranged skills, for example a ranged attack produced by a piece of equipment breaking apart, whose release requires the equipment fragments to spread in every direction out to the maximum attack range; such effects can likewise be achieved by the fragment effect rendering method provided in this embodiment.
Further, the material of a fragment diffusion animation matching the target is retrieved from an explosion effect material library according to the actual requirements, or an animation tool is used directly, invoking a simulation program that forces the fragments outward under a diffusion force to generate a segment of fragment diffusion animation.
In practical application, before generating the diffusion animation, the range parameters of the fragment effect required on the target model and the desired diffusion behavior, such as how many fragments fly out and the size of the explosion range, can be determined first; these parameters are then input into the fragment diffusion simulation program to generate the corresponding diffusion animation.
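As an illustrative sketch only, the parameter-driven generation of a diffusion animation can be pictured as follows. The code is a minimal stand-in for the simulation program mentioned above, and every name in it (simulate_fragment_spread, max_radius, and so on) is an assumption of this sketch rather than part of the claimed method. Fragments fly outward from the origin along random directions and decelerate as they approach the configured spread range:

```python
import math
import random

def simulate_fragment_spread(origin, num_fragments, max_radius, num_frames, seed=0):
    # Hypothetical sketch of the fragment-diffusion simulation: each fragment
    # flies outward from the origin along a random direction, decelerating
    # (ease-out) until it reaches the requested spread range.
    rng = random.Random(seed)
    dirs = []
    for _ in range(num_fragments):
        theta = rng.uniform(0.0, 2.0 * math.pi)
        z = rng.uniform(-1.0, 1.0)               # uniform direction on the sphere
        r_xy = math.sqrt(1.0 - z * z)
        dirs.append((r_xy * math.cos(theta), r_xy * math.sin(theta), z))
    frames = []
    for f in range(num_frames):
        t = f / (num_frames - 1)                 # normalized time in [0, 1]
        r = max_radius * (1.0 - (1.0 - t) ** 2)  # fast start, slow end
        frames.append([(origin[0] + d[0] * r,
                        origin[1] + d[1] * r,
                        origin[2] + d[2] * r) for d in dirs])
    return frames
```

A real tool would also output per-fragment rotation and the illumination information of the explosion; this sketch records fragment positions only.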
Step S102, resolving the diffusion animation to obtain an illumination information matrix, wherein illumination information corresponding to each illumination position is recorded in the illumination information matrix;
In this embodiment, resolving means extracting the animation content from the fragment diffusion animation, such as the position where the explosion occurs, the positions where it emits light, and the fragments themselves. Preferably, the illumination information of the positions where the fragments emit light while spreading is selected, namely each illumination position and the light at that position; the computed information is then assembled into a matrix, that is, a matrix that stores the illumination information.
Specifically, when resolving the fragment diffusion animation, all illumination positions generated while the fragments spread are identified first. The light emitted at each illumination position is then collected to form an animation of changing light, and the light influence range of each such animation is computed to obtain the maximum and minimum illumination influence. A preset standard function is invoked to generate an illumination function for the position from those extrema, and the illumination function of every illumination position is recorded into a matrix, yielding the illumination information matrix.
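A minimal sketch of this resolving step, assuming a simple linear falloff as the "preset standard function" (the text does not specify the function's shape, so the choice here is purely illustrative, as are all identifiers):

```python
def make_illumination_function(intensity, r_min, r_max):
    # Illustrative stand-in for the preset standard function: full intensity
    # up to the minimum influence range, fading linearly to zero at the maximum.
    def light_at(distance):
        if distance <= r_min:
            return intensity
        if distance >= r_max:
            return 0.0
        return intensity * (r_max - distance) / (r_max - r_min)
    return light_at

def build_illumination_matrix(light_points):
    # light_points: one (position, intensity, r_min, r_max) tuple per
    # illumination position identified while resolving the animation.
    # The "matrix" here is simply a list of (position, illumination function).
    return [(pos, make_illumination_function(i, r_min, r_max))
            for pos, i, r_min, r_max in light_points]
```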
Step S103, calculating approximate light information of the vertices corresponding to the illumination positions in the target model based on the illumination information in the illumination information matrix and the vertex information of the target model, and generating a fragment effect on the target model based on each piece of approximate light information.
In this step, the approximate light information corresponding to the target model is calculated from each piece of illumination information recorded in the illumination information matrix; the approximate light information can be understood as an illumination coefficient at each illumination position while the fragments spread.
When the fragment effect is generated on the target model, the light of the spreading fragments is produced by combining the illumination coefficients of the illumination positions with the vertex colors of the corresponding vertices on the target model, and finally the light of all vertices is merged to obtain the fragment effect.
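The combination of illumination coefficients with vertex colors can be sketched as follows; all names are illustrative, and a real implementation would run per-vertex in a shader rather than in Python:

```python
import math

def apply_fragment_lighting(vertices, vertex_colors, illumination_matrix):
    # Sum the contribution of every illumination point at each vertex and
    # brighten the vertex color by that coefficient, clamping channels to
    # [0, 1]. The per-vertex results together form the fragment effect.
    lit = []
    for v, c in zip(vertices, vertex_colors):
        coeff = sum(f(math.dist(v, pos)) for pos, f in illumination_matrix)
        lit.append(tuple(min(1.0, ch * (1.0 + coeff)) for ch in c))
    return lit
```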
With the fragment effect rendering method described above, the fragment diffusion animation is resolved to obtain the illumination information matrix, in which the illumination information corresponding to each illumination position is recorded; the approximate light information of the vertices corresponding to the illumination positions in the target model is calculated from the illumination information in the matrix and the vertex information of the target model, and the fragment effect is generated on the target model from the approximate light information. Compared with the prior art, constructing a matrix that records all illumination positions during fragment diffusion and then computing the required approximate light information from that matrix allows the information of many illumination points to be computed at once, which greatly reduces the computational cost of fragment effect rendering, lowers the load on device performance, makes processing smoother, reduces latency, avoids stuttering, and improves the displayed effect.
The following embodiment provides a specific implementation of the fragment effect rendering method. In this embodiment the method is applied to a game device comprising a client and a server: the client displays the game screen, including the fragment effect, and the server runs the game program and, in response to the user's operations relayed by the client, provides the game scene, which includes the fragment special effect imagery. On this basis, a fragment effect rendering method is provided, illustrated with the fragment effect produced by an explosion. As shown in FIG. 2, the method specifically includes the following steps:
Step S201, determining a target model on which the fragment effect is to be rendered, and using a vertex animation tool to produce a diffusion animation that meets the fragment effect requirements of the target model;
Specifically, based on a control operation performed by the user on a virtual object in the game scene, the target model, that is, the model of the object on which the fragment effect is to be rendered, is determined from the game scene. The fragment effect here is preferably an explosion fragment effect, i.e. the special effect of fragments at the explosion position being flung outward as the explosion takes place.
In practical application, the user may also directly select a specific scene to determine the scene area that needs the explosion fragment effect; the scene information of that area is then extracted to construct the target model.
Parameters are extracted from the target model to obtain the region in which the fragment effect is to be rendered, and an animation simulation program is invoked to generate the corresponding fragment diffusion animation based on the information of that region and the required fragment effect parameters, such as the trajectories of the exploded fragments. For example, an explosion simulation program is invoked to generate the corresponding explosion animation; this explosion animation serves as the diffusion animation and contains both the movement trajectories of the fragments under the external force and the illumination information of the explosion effect that produces that force.
The animation simulation program may be a vertex animation tool, which can be any software capable of dynamic authoring, such as 3D Max or PS, or an animation engine such as UE. The above parameters are input into the tool, which simulates the external force that drives the fragments apart and generates the fragment diffusion animation.
In another embodiment, besides a fragment effect produced by an explosion, the fragment effect may be one produced when a skill is released, that is, a skill fragment effect. For a ranged skill, for example, when the game character releases the skill and an enemy character is recognized within its attack range, the enemy character is attacked and the attack produces an effect of released skill fragments. Further, the fragment effect may also be the effect of equipment or an object breaking apart.
Step S202, extracting the animation of each illumination position in the diffusion animation;
In this embodiment, the illumination points that emit light while the fragments in the diffusion animation spread are analyzed; based on these points, the illumination information of each point is collected at preset time steps and a corresponding animation recording the light information is generated. In an explosion scene, for example, the illumination points produced when the explosion occurs are analyzed in the explosion animation and their illumination information is collected in the same way.
Step S203, calculating parameters of the minimum bounding volume of the diffusion animation based on each animation;
In this step, the maximum and minimum influence ranges of each illumination position are calculated from the animations; the minimum bounding volume of the diffusion animation is then determined from the maximum and minimum influence ranges, and the corresponding parameters are determined from the minimum bounding volume. In an explosion scene, the minimum bounding volume of the diffusion animation is the minimum bounding volume of the fragments in the explosion animation.
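As a sketch, and under the assumption that the bounding volume is an axis-aligned box enclosing every light's maximum influence sphere (the text does not fix the volume's shape, so this choice and all names are illustrative), the minimum bounding volume can be computed like this:

```python
def minimum_bounding_volume(illumination_positions, max_ranges):
    # Axis-aligned box that encloses the maximum influence sphere of every
    # illumination position; returned as (lower corner, upper corner).
    lo = [float("inf")] * 3
    hi = [float("-inf")] * 3
    for pos, r in zip(illumination_positions, max_ranges):
        for axis in range(3):
            lo[axis] = min(lo[axis], pos[axis] - r)
            hi[axis] = max(hi[axis], pos[axis] + r)
    return tuple(lo), tuple(hi)
```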
The step of determining the corresponding parameters based on the minimum bounding volume includes:
traversing each model face in the minimum bounding volume and calculating the contribution coefficient of the face from its maximum and minimum influence ranges, to obtain the parameters of the minimum bounding volume, where the contribution coefficient is the light contribution value of the face's illumination information at the corresponding illumination position.
In this embodiment, the step of calculating the contribution coefficient of a model face from its maximum and minimum influence ranges to obtain the parameters of the minimum bounding volume includes:
calculating an illumination influence factor based on the distance between each model face and the corresponding illumination position;
calculating the contribution coefficient of the corresponding model face based on the illumination influence factor and the face's maximum and minimum influence ranges;
and generating the parameters of the minimum bounding volume based on the contribution coefficients of the model faces.
The step of calculating the illumination influence factor based on the distance between each model face and the corresponding illumination position includes:
determining the distance from the illumination position to each model face;
constructing a spherical light volume based on that distance and the illumination position;
and analyzing the illumination distribution on the sphere to obtain the illumination influence factor.
In practical application, the distance from the illumination position to each model face can be determined;
following the rule that a light source's illumination effect gradually weakens as distance increases, a sphere is constructed centered on the illumination position with radius equal to that distance;
and the illumination distribution on the sphere is calculated, the illumination influence factor being obtained from that distribution.
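The influence-factor logic, i.e. distance-based decay around the illumination position combined with the Lambert term described later in this embodiment, can be sketched as follows; the exponential decay constant and all identifiers are assumptions of this sketch:

```python
import math

def illumination_influence_factor(light_pos, face_centroid, face_normal, decay=0.5):
    # Exponential falloff with the distance from the illumination position,
    # multiplied by a Lambert term that weakens faces turned away from the
    # light. face_normal is assumed to be a unit vector.
    to_light = tuple(l - c for l, c in zip(light_pos, face_centroid))
    dist = math.sqrt(sum(t * t for t in to_light))
    if dist == 0.0:
        return 1.0
    l_dir = tuple(t / dist for t in to_light)
    lambert = max(0.0, sum(n * l for n, l in zip(face_normal, l_dir)))
    return math.exp(-decay * dist) * lambert
```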
The faces of each sphere are then counted and a function is constructed from the statistics;
and the basis function is solved to obtain the parameters of the minimum bounding volume.
In this embodiment, the minimum bounding volume is divided into a number of small cubes, each cube is treated as independently illuminated, and the actual influence factor of each light source on it is calculated. The calculation logic of the influence factor is as follows:
First, the distance from the light source to the cube is determined; by default the light decays exponentially with distance.
Second, each light source also has Lambert shading, i.e. when a face of the cube is turned away from the light source, a weaker illumination effect is produced. The faces of the cube are subdivided in this way until a near-spherical body is formed.
Finally, statistics are gathered over the faces of each spherical body and fitted against an arbitrary fixed set of basis functions. The definition of the basis functions is analogous to a Fourier transform: the function set is not fixed and may be any set of functions. The corresponding basis function parameters are then obtained and returned.
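A sketch of the basis-function fitting, assuming the simple fixed basis [1, x, y, z] over unit directions (the text allows any fixed function set, so this band-0/band-1 style choice, like every identifier here, is illustrative):

```python
def project_onto_basis(samples):
    # samples: list of (unit_direction, illumination_value).
    # Accumulate the value-weighted average of each basis function.
    coeffs = [0.0, 0.0, 0.0, 0.0]
    for (x, y, z), value in samples:
        for i, b in enumerate((1.0, x, y, z)):
            coeffs[i] += value * b
    n = len(samples)
    return [c / n for c in coeffs]

def evaluate_basis(coeffs, direction):
    # Reconstruct the approximate illumination along a direction. The factor
    # 3 compensates for the mean of x^2 over the sphere being 1/3.
    x, y, z = direction
    return coeffs[0] + 3.0 * (coeffs[1] * x + coeffs[2] * y + coeffs[3] * z)
```

Projecting onto a small fixed basis is what lets many lights be stored compactly: only the coefficients, not the per-direction samples, need to be kept in the matrix.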
Step S204, generating an illumination information matrix based on the parameters, where the illumination information matrix is a three-dimensional array that stores the illumination functions;
In this embodiment, the maximum and minimum of the illumination range are calculated from the animation content to obtain the parameters of the minimum bounding volume. A corresponding three-dimensional array is generated from those parameters to store the illumination functions, the illumination positions in the diffusion animation are resolved, and a corresponding illumination function is computed for every region that the light may affect.
In practical application, to generate an illumination function, an animation model is built from the diffusion animation and analyzed into its vertices and faces; the minimum bounding volume is constructed from the faces, the centroid and normal of each face are calculated, and the illumination function is generated from a preset standard function together with the centroid and the normal.
Further, the step of generating the illumination information matrix based on the parameters includes:
determining the light influence area of each illumination position based on its illumination influence factor;
and calculating the corresponding illumination function based on the parameters and the light influence area, to obtain the illumination information matrix.
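The three-dimensional array matrix can be sketched as a grid over the minimum bounding volume in which each cell stores the evaluated illumination of every light at the cell center. Storing a scalar per cell instead of a full function is a simplification of this sketch, and all names are illustrative:

```python
import math

def build_light_grid(lo, hi, resolution, illumination_matrix):
    # lo/hi: corners of the minimum bounding volume; illumination_matrix:
    # list of (position, falloff function). Each cell sums every light's
    # falloff evaluated at the cell center.
    grid = [[[0.0] * resolution for _ in range(resolution)] for _ in range(resolution)]
    for i in range(resolution):
        for j in range(resolution):
            for k in range(resolution):
                center = tuple(lo[a] + (hi[a] - lo[a]) * ((idx + 0.5) / resolution)
                               for a, idx in zip(range(3), (i, j, k)))
                grid[i][j][k] = sum(f(math.dist(center, pos))
                                    for pos, f in illumination_matrix)
    return grid
```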
Step S205, preprocessing the illumination information in the illumination information matrix, with the vertex information of the target model as input;
Preprocessing here can be understood as converting the illumination position of each piece of illumination information recorded in the matrix into the corresponding vertex of the target model, based on the coordinate transformation between the target model and the model of the diffusion animation, and then fusing the illumination information with the other model parameters of each vertex, such as the vertex normal and the color ratio of the light.
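The preprocessing step can be sketched as follows, assuming for simplicity that the coordinate transformation between the two models is a pure translation (a full implementation would apply the complete transform, and all names here are illustrative); each illumination position is moved into model space and paired with its nearest vertex:

```python
import math

def preprocess_illumination(illumination_matrix, anim_to_model_offset, vertices):
    # Translate each illumination position into the target model's space and
    # record the index of the nearest model vertex alongside the light.
    entries = []
    for pos, light_fn in illumination_matrix:
        model_pos = tuple(p + o for p, o in zip(pos, anim_to_model_offset))
        nearest = min(range(len(vertices)),
                      key=lambda i: math.dist(vertices[i], model_pos))
        entries.append((nearest, model_pos, light_fn))
    return entries
```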
Step S206, calculating the approximate value of the corresponding illumination position on the target model based on the preprocessed illumination information;
In this embodiment, to obtain the approximate value, the illumination influence coefficient of the vertex corresponding to each illumination position of the target model may be extracted, and the vertex's illumination computed from the preprocessed illumination information based on that coefficient, yielding the approximate value of the vertex.
Step S207, the pixel shader corrects the vertex color of the vertex corresponding to the vertex information based on the approximate value, thereby obtaining the fragment effect.
In this embodiment, based on the approximate value, the pixel shader adjusts the vertex color according to the illumination contribution function and the normal direction of the environment to be rendered, and merges the illumination effects of the illumination positions to obtain the fragment effect, where merging is understood as a superposition operation.
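The vertex color correction can be sketched as follows (illustrative names; in practice this runs in the pixel shader): the approximate values of all illumination positions affecting a vertex are superposed, tinted by the light color, and added to the base vertex color with clamping:

```python
def correct_vertex_color(base_color, approximations, light_color=(1.0, 1.0, 1.0)):
    # Merge the approximate values by superposition, tint by the light color,
    # add to the base vertex color, and clamp each channel to [0, 1].
    total = sum(approximations)
    return tuple(min(1.0, b + total * l) for b, l in zip(base_color, light_color))
```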
In practical application, after the basis functions are generated from the parameters of the minimum bounding volume, they can be added to, deleted, or modified according to actual needs; a larger number of basis functions can usually express a more accurate effect.
Further, matching is performed between the basis functions and the normal direction of the current actual fragment: for any surface normal direction, a Lambert factor is calculated and matched with the basis function variables to obtain approximate illumination parameters, and the illumination parameters are then summed to obtain the approximate illumination effect.
In another embodiment, besides the fragment effect of an explosion, the fragment effect may be a skill effect of a game character, in particular a skill with an area-of-effect attack to which equipment or weapon fragments are attached. For example, the passive skill of Zhuge Liang in Honor of Kings, when triggered, creates a plurality of orbs around the game character, similar to the effect of fragments flying outward in an explosion. In this regard, the fragments in the fragment effect may be understood as skill fragments, object fragments, or even equipment or weapon fragments, and each fragment may be given a corresponding light-emitting effect, i.e., illumination information, for display or for a more realistic effect.
In summary, this scheme can achieve a better illumination effect on a mobile phone, while avoiding the problem that statically baked illumination cannot act on a dynamic object.
Corresponding to the above method embodiment, referring to a schematic diagram of a rendering device of a fragment effect shown in fig. 3, a game scene is provided through a terminal device; the device comprises:
an acquisition module 310, configured to determine a target model and a diffusion animation of fragments matched with the target model;
The resolving module 320 is configured to resolve the diffusion animation to obtain an illumination information matrix, where illumination information corresponding to each illumination position is recorded in the illumination information matrix;
the rendering module 330 is configured to calculate approximate light information of a vertex corresponding to the illumination position in the target model based on the illumination information in the illumination information matrix and vertex information of the target model, and generate a fragment effect on the target model based on each of the approximate light information.
According to the fragment effect rendering device, the illumination information matrix is constructed by analyzing the illumination information of each illumination position in the fragment diffusion animation, the approximate light information of the corresponding vertex on the target model is calculated based on the illumination information matrix, and finally, the color of each vertex on the target model is modified based on the approximate light information, so that the fragment effect rendering is realized, and the problems that the performance consumption of equipment is large and the rendering effect is poor in the existing fragment effect implementation scheme are solved.
Referring to fig. 4, a second embodiment of a fragment effect rendering device according to an embodiment of the present invention includes:
an acquisition module 310, configured to determine a target model and a diffusion animation of fragments matched with the target model;
The resolving module 320 is configured to resolve the diffusion animation to obtain an illumination information matrix, where illumination information corresponding to each illumination position is recorded in the illumination information matrix;
the rendering module 330 is configured to calculate approximate light information of a vertex corresponding to the illumination position in the target model based on the illumination information in the illumination information matrix and vertex information of the target model, and generate a fragment effect on the target model based on each of the approximate light information.
In this embodiment, the obtaining module 310 is specifically configured to:
and determining a target model of the fragment effect to be rendered, and utilizing a vertex animation tool to produce a diffusion animation meeting the fragment effect requirement on the target model.
In this embodiment, the resolving module 320 includes:
an extracting unit 321, configured to extract an animation at each illumination position in the diffusion animation;
a calculating unit 322, configured to calculate parameters of a minimum bounding volume of the diffusion animation based on each of the animations;
the generating unit 323 is configured to generate an illumination information matrix based on the parameters, wherein the illumination information matrix is a three-dimensional array matrix storing illumination functions.
In this embodiment, the calculating unit 322 is specifically configured to:
calculating the maximum influence range and the minimum influence range of each illumination position based on each animation;
and determining a minimum bounding volume of the diffusion animation based on the maximum influence range and the minimum influence range, and determining corresponding parameters based on the minimum bounding volume.
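One way to realize this step (an axis-aligned sketch; the patent does not fix the bounding-volume shape, so the box form is an assumption) is to expand every illumination position by its maximum influence range and take the enclosing box:

```python
def minimum_bounding_volume(lights):
    """Axis-aligned box enclosing each illumination position expanded
    by its maximum influence range; returns (min_corner, max_corner).
    `lights` is a list of (position, max_range) pairs."""
    mins = tuple(min(p[axis] - r for p, r in lights) for axis in range(3))
    maxs = tuple(max(p[axis] + r for p, r in lights) for axis in range(3))
    return mins, maxs

lo, hi = minimum_bounding_volume([((0.0, 0.0, 0.0), 1.0),
                                  ((4.0, 0.0, 0.0), 2.0)])
```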
In this embodiment, the calculating unit 322 is specifically configured to:
traversing each model surface in the minimum bounding volume, and calculating a contribution coefficient of the model surface based on the maximum influence range and the minimum influence range of the model surface to obtain a parameter of the minimum bounding volume, wherein the contribution coefficient is a light contribution value of illumination information of the model surface to a corresponding illumination position;
In this embodiment, the calculating unit 322 is specifically configured to:
calculating illumination influence factors based on the distance between each model surface and the corresponding illumination position;
calculating a contribution coefficient of the corresponding model surface based on the illumination influence factor and the maximum influence range and the minimum influence range of the corresponding model surface;
and generating parameters of the minimum bounding volume based on the contribution coefficients of the model surfaces.
In this embodiment, the calculating unit 322 is specifically configured to:
determining the distance from the illumination position to each model surface;
constructing a light sphere based on the distance and the illumination position;
analyzing the illumination distribution on the sphere, and obtaining an illumination influence factor based on the illumination distribution.
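The light-sphere construction can be read as an inverse-square distribution: the light's intensity spread over the surface of a sphere whose radius is the light-to-surface distance. This standard model is offered here as an assumption, not the patented formula:

```python
import math

def illumination_influence_factor(light_pos, surface_point, intensity=1.0):
    """Intensity per unit area on a sphere of radius |light - surface|
    centred on the light, i.e. inverse-square falloff."""
    radius = math.dist(light_pos, surface_point)
    return intensity / (4.0 * math.pi * radius * radius)

near = illumination_influence_factor((0.0, 0.0, 0.0), (1.0, 0.0, 0.0))
far = illumination_influence_factor((0.0, 0.0, 0.0), (2.0, 0.0, 0.0))
# doubling the distance quarters the influence factor
```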
In this embodiment, the calculating unit 322 is specifically configured to:
determining a light influence area of the illumination position based on the illumination influence factor;
and calculating a corresponding illumination function based on the parameters and the light influence area to obtain an illumination information matrix.
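Sampled onto a regular grid, the three-dimensional array matrix could look like this sketch, which stores a precomputed light value per cell in place of a full illumination function (the resolution and the falloff term are illustrative assumptions):

```python
def build_illumination_matrix(bounds_min, bounds_max, n, lights):
    """Sample the combined light field at the centre of each cell of an
    n x n x n grid spanning the bounding volume."""
    def centre(axis, i):
        a, b = bounds_min[axis], bounds_max[axis]
        return a + (b - a) * (i + 0.5) / n

    matrix = [[[0.0] * n for _ in range(n)] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                point = (centre(0, i), centre(1, j), centre(2, k))
                for position, intensity in lights:
                    d2 = sum((a - b) ** 2
                             for a, b in zip(point, position))
                    matrix[i][j][k] += intensity / (1.0 + d2)
    return matrix

m = build_illumination_matrix((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), 2,
                              [((0.25, 0.25, 0.25), 1.0)])
```

Cells near a light hold larger values, so a vertex lookup into this matrix replaces per-light evaluation at render time.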
In this embodiment, the rendering module 330 includes:
a computing unit 331, configured to pre-process the illumination information in the illumination information matrix with vertex information of the target model as input; obtaining an approximate value of the corresponding illumination position on the target model based on the preprocessed illumination information;
and a rendering unit 332, configured to correct, by using a pixel shader, a vertex color of a vertex corresponding to the vertex information based on the approximation value, so as to obtain a fragment effect.
In this embodiment, the rendering unit 332 is specifically configured to:
And adjusting the vertex color of the vertex corresponding to the vertex information according to the illumination contribution function and the normal direction of the environment to be rendered based on the approximate value through a pixel shader, and fusing the illumination effect of each illumination position to obtain the fragment effect.
In the example, the illumination information matrix is constructed by analyzing illumination information of each illumination position in the diffusion animation of the fragment, approximate light information of corresponding vertexes on the target model is calculated based on the illumination information matrix, and finally colors of the vertexes on the target model are modified based on the approximate light information so as to realize the rendering of the fragment effect, so that the problems of larger consumption of equipment performance and poor rendering effect of the existing fragment effect realization scheme are solved.
The present embodiment also provides an electronic device including a processor and a memory storing machine-executable instructions executable by the processor, the processor executing the machine-executable instructions to implement the fragment effect rendering method provided by the above embodiment. The electronic device may be a server or a terminal device.
Referring to fig. 5, the electronic device includes a processor 500 and a memory 501, the memory 501 storing machine executable instructions that can be executed by the processor 500, the processor 500 executing the machine executable instructions to implement the above-described fragment effect rendering method.
Further, the electronic device shown in fig. 5 further includes a bus 502 and a communication interface 503, and the processor 500, the communication interface 503, and the memory 501 are connected by the bus 502.
The memory 501 may include a high-speed random access memory (RAM, Random Access Memory), and may further include a non-volatile memory (non-volatile memory), such as at least one magnetic disk memory. The communication connection between the system network element and at least one other network element is implemented via at least one communication interface 503 (which may be wired or wireless), which may use the internet, a wide area network, a local area network, a metropolitan area network, etc. The bus 502 may be an ISA bus, a PCI bus, an EISA bus, or the like. The bus may be classified as an address bus, a data bus, a control bus, etc. For ease of illustration, only one bi-directional arrow is shown in fig. 5, but this does not indicate that there is only one bus or one type of bus.
The processor 500 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuitry in hardware or instructions in software in the processor 500. The processor 500 may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU for short), a network processor (Network Processor, NP for short), etc.; but also digital signal processors (Digital Signal Processor, DSP for short), application specific integrated circuits (Application Specific Integrated Circuit, ASIC for short), field-programmable gate arrays (Field-Programmable Gate Array, FPGA for short) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components. The disclosed methods, steps, and logic blocks in the embodiments of the present invention may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly in the execution of a hardware decoding processor, or in the execution of a combination of hardware and software modules in a decoding processor. The software modules may be located in a random access memory, flash memory, read only memory, programmable read only memory, or electrically erasable programmable memory, registers, etc. as well known in the art. The storage medium is located in the memory 501, and the processor 500 reads the information in the memory 501, and in combination with its hardware, performs the following steps:
Determining a target model and a diffusion animation of fragments matched with the target model;
calculating the diffusion animation to obtain an illumination information matrix, wherein illumination information corresponding to each illumination position is recorded in the illumination information matrix;
and calculating approximate light information of a vertex corresponding to the illumination position in the target model based on illumination information in the illumination information matrix and vertex information of the target model, and generating a fragment effect on the target model based on each approximate light information.
The step of determining the target model and the diffusion animation of fragments matched with the target model comprises the following steps:
and determining a target model of the fragment effect to be rendered, and utilizing a vertex animation tool to produce a diffusion animation meeting the fragment effect requirement on the target model.
The step of calculating the diffusion animation to obtain the illumination information matrix comprises the following steps:
extracting the animation of each illumination position in the diffusion animation;
calculating parameters of a minimum bounding volume of the diffusion animation based on each animation;
and generating an illumination information matrix based on the parameters, wherein the illumination information matrix is a three-dimensional array matrix storing illumination functions.
The step of calculating the parameters of the minimum bounding volume of the diffusion animation based on each animation includes:
calculating the maximum influence range and the minimum influence range of each illumination position based on each animation;
and determining a minimum bounding volume of the diffusion animation based on the maximum influence range and the minimum influence range, and determining corresponding parameters based on the minimum bounding volume.
The step of determining the corresponding parameters based on the minimum bounding volume includes:
traversing each model surface in the minimum bounding volume, and calculating a contribution coefficient of the model surface based on the maximum influence range and the minimum influence range of the model surface to obtain a parameter of the minimum bounding volume, wherein the contribution coefficient is a light contribution value of illumination information of the model surface to a corresponding illumination position;
the step of calculating the contribution coefficient of the model surface based on the maximum influence range and the minimum influence range of the model surface to obtain the parameter of the minimum bounding volume includes:
calculating illumination influence factors based on the distance between each model surface and the corresponding illumination position;
calculating a contribution coefficient of the corresponding model surface based on the illumination influence factor and the maximum influence range and the minimum influence range of the corresponding model surface;
And generating parameters of the minimum bounding volume based on the contribution coefficients of the model surfaces.
The step of calculating the illumination influence factor based on the distance between each model surface and the corresponding illumination position includes:
determining the distance from the illumination position to each model surface;
constructing a light sphere based on the distance and the illumination position;
analyzing the illumination distribution on the sphere, and obtaining an illumination influence factor based on the illumination distribution.
The step of generating the illumination information matrix based on the parameters comprises the following steps:
determining a light influence area of the illumination position based on the illumination influence factor;
and calculating a corresponding illumination function based on the parameters and the light influence area to obtain an illumination information matrix.
The step of calculating the approximate lighting information of the vertex corresponding to the lighting position in the target model based on the lighting information in the lighting information matrix and the vertex information of the target model, and generating the fragment effect on the target model based on each approximate lighting information comprises the following steps:
taking vertex information of the target model as input, and preprocessing illumination information in the illumination information matrix;
Obtaining an approximate value of the corresponding illumination position on the target model based on the preprocessed illumination information;
and correcting the vertex color of the vertex corresponding to the vertex information based on the approximate value through a pixel shader to obtain a fragment effect.
The step of correcting the vertex color of the vertex corresponding to the vertex information based on the approximate value by the pixel shader to obtain the fragment effect comprises the following steps:
and adjusting the vertex color of the vertex corresponding to the vertex information according to the illumination contribution function and the normal direction of the environment to be rendered based on the approximate value through a pixel shader, and adding the illumination effect of each illumination position to obtain the fragment effect.
According to this embodiment, an illumination information matrix recording all illumination positions in the fragment diffusion process is constructed, and the approximate light information required for the fragment diffusion is then calculated based on the matrix, so that the information of multiple illumination points is calculated simultaneously, the calculation amount of fragment effect rendering is greatly reduced, the occupancy of device performance is reduced, the processing is smoother, the time delay is reduced, the stutter caused by lag is avoided, and the display effect is improved.
The present embodiment also provides a computer-readable storage medium storing computer-executable instructions that, when invoked and executed by a processor, cause the processor to perform the steps of:
determining a target model and a diffusion animation of fragments matched with the target model;
calculating the diffusion animation to obtain an illumination information matrix, wherein illumination information corresponding to each illumination position is recorded in the illumination information matrix;
and calculating approximate light information of a vertex corresponding to the illumination position in the target model based on illumination information in the illumination information matrix and vertex information of the target model, and generating a fragment effect on the target model based on each approximate light information.
The step of determining the target model and the diffusion animation of fragments matched with the target model comprises the following steps:
and determining a target model of the fragment effect to be rendered, and utilizing a vertex animation tool to produce a diffusion animation meeting the fragment effect requirement on the target model.
The step of calculating the diffusion animation to obtain the illumination information matrix comprises the following steps:
Extracting the animation of each illumination position in the diffusion animation;
calculating parameters of a minimum bounding volume of the diffusion animation based on each animation;
and generating an illumination information matrix based on the parameters, wherein the illumination information matrix is a three-dimensional array matrix storing illumination functions.
The step of calculating the parameters of the minimum bounding volume of the diffusion animation based on each animation includes:
calculating the maximum influence range and the minimum influence range of each illumination position based on each animation;
and determining a minimum bounding volume of the diffusion animation based on the maximum influence range and the minimum influence range, and determining corresponding parameters based on the minimum bounding volume.
The step of determining the corresponding parameters based on the minimum bounding volume includes:
traversing each model surface in the minimum bounding volume, and calculating a contribution coefficient of the model surface based on the maximum influence range and the minimum influence range of the model surface to obtain a parameter of the minimum bounding volume, wherein the contribution coefficient is a light contribution value of illumination information of the model surface to a corresponding illumination position.
The step of calculating the contribution coefficient of the model surface based on the maximum influence range and the minimum influence range of the model surface to obtain the parameter of the minimum bounding volume includes:
Calculating illumination influence factors based on the distance between each model surface and the corresponding illumination position;
calculating a contribution coefficient of the corresponding model surface based on the illumination influence factor and the maximum influence range and the minimum influence range of the corresponding model surface;
and generating parameters of the minimum bounding volume based on the contribution coefficients of the model surfaces.
The step of calculating the illumination influence factor based on the distance between each model surface and the corresponding illumination position includes:
determining the distance from the illumination position to each model surface;
constructing a light sphere based on the distance and the illumination position;
analyzing the illumination distribution on the sphere, and obtaining an illumination influence factor based on the illumination distribution.
The step of generating the illumination information matrix based on the parameters comprises the following steps:
determining a light influence area of the illumination position based on the illumination influence factor;
and calculating a corresponding illumination function based on the parameters and the light influence area to obtain an illumination information matrix.
The step of calculating the approximate lighting information of the vertex corresponding to the lighting position in the target model based on the lighting information in the lighting information matrix and the vertex information of the target model, and generating the fragment effect on the target model based on each approximate lighting information comprises the following steps:
Taking vertex information of the target model as input, and preprocessing illumination information in the illumination information matrix;
obtaining an approximate value of the corresponding illumination position on the target model based on the preprocessed illumination information;
and correcting the vertex color of the vertex corresponding to the vertex information based on the approximate value through a pixel shader to obtain a fragment effect.
The step of correcting the vertex color of the vertex corresponding to the vertex information based on the approximate value by the pixel shader to obtain the fragment effect comprises the following steps:
and adjusting the vertex color of the vertex corresponding to the vertex information according to the illumination contribution function and the normal direction of the environment to be rendered based on the approximate value through a pixel shader, and fusing the illumination effect of each illumination position to obtain the fragment effect.
According to this embodiment, an illumination information matrix recording all illumination positions in the fragment diffusion process is constructed, and the approximate light information required for the fragment diffusion is then calculated based on the matrix, so that the information of multiple illumination points is calculated simultaneously, the calculation amount of fragment effect rendering is greatly reduced, the occupancy of device performance is reduced, the processing is smoother, the time delay is reduced, the stutter caused by lag is avoided, and the display effect is improved.
The computer program product of the fragment effect rendering method, device, electronic device, and storage medium provided by the embodiments of the present invention includes a computer-readable storage medium storing program code; the instructions included in the program code may be used to execute the method described in the foregoing method embodiments, and for specific implementation reference may be made to the method embodiments, which will not be described herein again.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again.
In addition, in the description of embodiments of the present invention, unless explicitly stated and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present invention will be understood by those skilled in the art in specific cases.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In the description of the present invention, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that: the above examples are only specific embodiments of the present invention for illustrating the technical solution of the present invention, but not for limiting the scope of the present invention, and although the present invention has been described in detail with reference to the foregoing examples, it will be understood by those skilled in the art that the present invention is not limited thereto: any person skilled in the art may modify or easily conceive of the technical solution described in the foregoing embodiments, or perform equivalent substitution of some of the technical features, while remaining within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included in the scope of the present invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (13)

1. A method of rendering a fragment effect, the method comprising:
determining a target model and a diffusion animation of fragments matched with the target model;
calculating the diffusion animation to obtain an illumination information matrix, wherein illumination information corresponding to each illumination position is recorded in the illumination information matrix;
And calculating approximate light information of a vertex corresponding to the illumination position in the target model based on illumination information in the illumination information matrix and vertex information of the target model, and generating a fragment effect on the target model based on each approximate light information.
2. The method of claim 1, wherein the step of determining a target model and a diffusion animation of fragments matching the target model comprises:
and determining a target model of the fragment effect to be rendered, and utilizing a vertex animation tool to produce a diffusion animation meeting the fragment effect requirement on the target model.
3. The method for rendering a fragment effect according to claim 1, wherein the step of calculating the diffusion animation to obtain an illumination information matrix includes:
extracting the animation of each illumination position in the diffusion animation;
calculating parameters of a minimum bounding volume of the diffusion animation based on each animation;
and generating an illumination information matrix based on the parameters, wherein the illumination information matrix is a three-dimensional array matrix storing illumination functions.
4. A method of rendering a fragment effect as claimed in claim 3, wherein the step of calculating parameters of a minimum bounding volume of the diffusion animation based on each of the animations comprises:
Calculating the maximum influence range and the minimum influence range of each illumination position based on each animation;
and determining a minimum bounding volume of the diffusion animation based on the maximum influence range and the minimum influence range, and determining corresponding parameters based on the minimum bounding volume.
5. The method of claim 4, wherein the step of determining the corresponding parameters based on the minimum bounding volume comprises:
traversing each model surface in the minimum bounding volume, and calculating a contribution coefficient of the model surface based on the maximum influence range and the minimum influence range of the model surface to obtain a parameter of the minimum bounding volume, wherein the contribution coefficient is a light contribution value of illumination information of the model surface to a corresponding illumination position.
6. The method of claim 5, wherein the step of calculating the contribution coefficient of the model surface based on the maximum influence range and the minimum influence range of the model surface to obtain the parameter of the minimum bounding volume comprises:
calculating illumination influence factors based on the distance between each model surface and the corresponding illumination position;
Calculating a contribution coefficient of the corresponding model surface based on the illumination influence factor and the maximum influence range and the minimum influence range of the corresponding model surface;
and generating parameters of the minimum bounding volume based on the contribution coefficients of the model surfaces.
7. The method of claim 6, wherein the step of calculating the illumination influence factor based on the distance between each of the model surfaces and the corresponding illumination position comprises:
determining the distance from the illumination position to each model surface;
constructing a light sphere based on the distance and the illumination position;
analyzing the illumination distribution on the sphere, and obtaining an illumination influence factor based on the illumination distribution.
8. The method of claim 7, wherein the step of generating the illumination information matrix based on the parameters comprises:
determining a light influence area of the illumination position based on the illumination influence factor;
and calculating a corresponding illumination function based on the parameters and the light influence area to obtain an illumination information matrix.
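One way to read claim 8 is that the matrix holds one row per illumination position, combining that position, its influence area, and the evaluated illumination function. The row layout and the illumination function below are assumptions for illustration, not the patent's definition:

```python
def illumination_matrix(light_positions, parameters, influence_areas):
    # One row per illumination position: the position, its influence
    # area, and the value of an (assumed) illumination function.
    def illumination_function(params, area):
        # Illustrative: total surface contribution scaled by the area.
        return sum(params) * area
    return [
        [*pos, area, illumination_function(params, area)]
        for pos, params, area in zip(light_positions, parameters, influence_areas)
    ]
```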
9. The method according to any one of claims 1 to 8, wherein the step of calculating approximate light information of vertices corresponding to the illumination positions in the target model based on the illumination information in the illumination information matrix and the vertex information of the target model, and generating a fragment effect on the target model based on each piece of approximate light information, comprises:
taking vertex information of the target model as input, and preprocessing the illumination information in the illumination information matrix;
obtaining an approximate value of the corresponding illumination position on the target model based on the preprocessed illumination information;
and correcting the vertex color of the vertex corresponding to the vertex information based on the approximate value through a pixel shader to obtain a fragment effect.
10. The method for rendering a fragment effect according to claim 9, wherein the step of correcting, by the pixel shader, the vertex color of the vertex corresponding to the vertex information based on the approximate value to obtain the fragment effect comprises:
adjusting, through the pixel shader, the vertex color of the vertex corresponding to the vertex information according to the illumination contribution function and the normal direction of the environment to be rendered, based on the approximate value, and fusing the illumination effects of the illumination positions to obtain the fragment effect.
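The per-vertex correction in claim 10 would normally run in a pixel shader; as a CPU-side sketch, a Lambert-style dot product with the normal can stand in for the "illumination contribution function", which the patent does not define. The `approx` field is the precomputed approximate value; all names are hypothetical:

```python
def shade_vertex(base_color, normal, lights):
    # Fuse each light's contribution using a Lambert-style term (sketch).
    r, g, b = base_color
    for light in lights:
        ndotl = max(sum(n * d for n, d in zip(normal, light["dir"])), 0.0)
        w = light["approx"] * ndotl  # approx = precomputed approximate value
        r += light["color"][0] * w
        g += light["color"][1] * w
        b += light["color"][2] * w
    return (min(r, 1.0), min(g, 1.0), min(b, 1.0))
```

Summing the weighted contributions of every light before clamping is what "fusing the illumination effects of the illumination positions" amounts to in this reading.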
11. A fragment effect rendering apparatus, the apparatus comprising:
the acquisition module is used for determining a target model and a diffusion animation of fragments matched with the target model;
the calculating module is used for calculating the diffusion animation to obtain an illumination information matrix, wherein illumination information corresponding to each illumination position is recorded in the illumination information matrix;
the rendering module is used for calculating approximate light information of the vertex corresponding to the illumination position in the target model based on the illumination information in the illumination information matrix and the vertex information of the target model, and generating a fragment effect on the target model based on each piece of approximate light information.
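The three-module apparatus of claim 11 can be outlined as a thin pipeline object; the class and parameter names here are illustrative, and each module is modeled as a pluggable callable:

```python
class FragmentEffectRenderer:
    # Sketch of the three-module apparatus of claim 11 (names illustrative).
    def __init__(self, acquisition, calculation, rendering):
        self.acquisition = acquisition  # scene -> (target model, diffusion animation)
        self.calculation = calculation  # animation -> illumination info matrix
        self.rendering = rendering      # (matrix, model) -> fragment effect

    def render(self, scene):
        model, animation = self.acquisition(scene)
        matrix = self.calculation(animation)
        return self.rendering(matrix, model)
```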
12. An electronic device comprising a processor and a memory, the memory storing machine-executable instructions executable by the processor, the processor executing the machine-executable instructions to implement the fragment effect rendering method of any one of claims 1 to 10.
13. A computer-readable storage medium storing computer-executable instructions which, when invoked and executed by a processor, cause the processor to implement the fragment effect rendering method of any one of claims 1 to 10.
CN202311221544.6A 2023-09-20 2023-09-20 Fragment effect rendering method, device, equipment and storage medium Pending CN117504280A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311221544.6A CN117504280A (en) 2023-09-20 2023-09-20 Fragment effect rendering method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117504280A true CN117504280A (en) 2024-02-06

Family

ID=89763270

Country Status (1)

Country Link
CN (1) CN117504280A (en)

Similar Documents

Publication Publication Date Title
US20200394842A1 (en) Method and device for a placement of a virtual object of an augmented or mixed reality application in a real-world 3d environment
CN112915542B (en) Collision data processing method and device, computer equipment and storage medium
CN114419240B (en) Illumination rendering method and device, computer equipment and storage medium
JP2017514188A (en) Method, apparatus and terminal for simulating sound in a virtual scenario
US7508391B2 (en) Determining illumination of models using an ambient framing abstractions
US20230230311A1 (en) Rendering Method and Apparatus, and Device
CN112712582A (en) Dynamic global illumination method, electronic device and computer-readable storage medium
CN108295471B (en) Model vibration simulation method and device, storage medium, processor and terminal
CN111145329A (en) Model rendering method and system and electronic device
CN110694279A (en) Game special effect display method, device, equipment and medium
CN112184873A (en) Fractal graph creating method and device, electronic equipment and storage medium
WO2023142354A1 (en) Target locking method and apparatus, and electronic device and storage medium
CN112819940B (en) Rendering method and device and electronic equipment
CN112619152A (en) Game bounding box processing method and device and electronic equipment
CN113786624B (en) Game resource manufacturing method, device and equipment
CN112473135B (en) Real-time illumination simulation method, device and equipment for mobile game and storage medium
CN113117334A (en) Method for determining visible area of target point and related device
CN111862345A (en) Information processing method and device, electronic equipment and computer readable storage medium
CN115177953A (en) Object processing method and device
US12008716B2 (en) Systems and methods for generating a simplified polygonal mesh
US11810241B2 (en) Systems and methods for ray traced contact shadows
WO2022121655A1 (en) Transparency determining method and apparatus, electronic device, and storage medium
CN115671748A (en) Weapon hit processing method, device, equipment and storage medium
CN115671749A (en) Weapon hit processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination