CN111862254A - Cross-rendering platform based material rendering method and system

Info

Publication number: CN111862254A
Authority: CN (China)
Prior art keywords: rendering, shadow, illumination, light, light source
Legal status: Granted
Application number: CN202010689461.XA
Other languages: Chinese (zh)
Other versions: CN111862254B (en)
Inventors: 刘德建, 高山晓, 陈宏展
Current Assignee: Fujian TQ Digital Co Ltd
Original Assignee: Fujian TQ Digital Co Ltd
Priority date: 2020-07-17
Filing date: 2020-07-17
Publication date: 2020-10-30
Application filed by Fujian TQ Digital Co Ltd on 2020-07-17, priority to CN202010689461.XA
Publication of CN111862254A on 2020-10-30; application granted; publication of CN111862254B on 2023-06-16
Current legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The invention provides a material rendering method based on a cross-rendering platform, comprising the following steps: step S1, obtaining the hardware information of the rendering platform; step S2, according to that hardware information, performing in sequence the map format support operation, shadow support setting, illumination support setting and rendering support setting, so that the material adapts to the rendering platform; and step S3, in the real-time rendering stage, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value, and, after the core algorithm formulas of photorealistic and non-photorealistic rendering have been unified, obtaining the three areas, namely the illumination area, the shadow area and the light-dark transition area, so that an operator can color-process them; once the illumination color value is obtained, the textures are merged and output, finally yielding the rendering result and completing the material rendering. The rendering operation can thus be completed across rendering platforms.

Description

Cross-rendering platform based material rendering method and system
Technical Field
The invention relates to the technical field of game rendering, and in particular to a material rendering method and system based on a cross-rendering platform.
Background
Native material rendering solutions often run into adaptation problems on cross-platform systems: the rendering platform cannot fit a suitable shadow map, does not support a complex illumination system with multiple light sources, or does not support a given map size or format. With the development of mobile terminals, the rendering terminals on the market have become increasingly diverse, and the disadvantages of the traditional material rendering scheme have become increasingly obvious: failing to adapt to a given rendering terminal means losing the entire user group on that platform, which is a substantial economic loss. Existing rendering platforms include, but are not limited to: PC, PS4, Xbox, Switch, Chrome, Safari, Huawei nova6, Huawei P20, iPhone7, iPhone8 and iPhone ES rendering platforms.
Disclosure of Invention
In order to overcome the above problems, an object of the present invention is to provide a material rendering method based on a cross-rendering platform, which can complete the rendering operation across rendering platforms so that each rendering platform displays the rendering result normally.
The invention is realized by the following scheme: a material rendering method based on a cross-rendering platform, the method comprising the following steps:
Step S1, obtaining the hardware information of the rendering platform;
step S2, according to the hardware information of the rendering platform, performing in sequence the map format support operation, shadow support setting, illumination support setting and rendering support setting, so that the material adapts to the rendering platform;
and step S3, in the real-time rendering stage, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value, and, after the core algorithm formulas of photorealistic and non-photorealistic rendering have been unified, obtaining the illumination area, the shadow area and the light-dark transition area so that an operator can color-process these three areas; once the illumination color value is obtained, the textures are merged and output to finally obtain the rendering result and complete the material rendering.
Further, the map format support operation includes format setting and size upper-limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support determination.
Further, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value is more specifically: before the light-dark rendering of the material, the illumination value in the bidirectional reflectance distribution function (BRDF) formula is improved so as to unify the photorealistic and non-photorealistic illumination models, the improved formula being:

[Formula 1 is an image in the original; it computes $I_{final}$ from $I$ by accumulating, over the $n$ scene light sources, per-light terms built from $N \cdot L_i$, $S_i$ and the transition parameters $e_0$, $e_1$]

where $I$ is the model illumination value, $I_{final}$ is the final illumination value of the model, $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, and $S_i$ is the influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$). $e_0$ and $e_1$ are input parameters, and the light-dark transition area is controlled through these two parameters: for photorealistic rendering, $e_0 = 0$ and $e_1 = 1$; for non-photorealistic rendering, the parameters are adjusted according to actual requirements.
Further, obtaining the illumination area is specifically: calculating the illumination area by the formula:

$C_{Lit} = \left[R_{Lit} + I_{final}\right] \cdot P_{LitColor} \cdot P_{LitColorIntensity}$

where $P_{LitColor}$ and $P_{LitColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the illumination and the offset color intensity of the illumination, respectively; $R_{Lit}$ is the light value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $I_{final}$ is the final illumination value of the model; and $C_{Lit}$ is the final color value of the illumination.
Further, obtaining the shadow area is specifically: calculating the shadow area by the formula:

[Formula 3 is an image in the original; it combines $R_{shadow}$ with a product, over the $n$ scene light sources, of per-light terms built from $N \cdot (-L_i)$ and $1 - S_i$, scaled by $P_{shadowColor}$ and $P_{shadowColorIntensity}$]

where $P_{shadowColor}$ and $P_{shadowColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the shadow and the offset color intensity of the shadow, respectively; $R_{shadow}$ is the shadow value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $i$ is the current light source id; $n$ is the total number of light sources in the current scene; $N$ is the rendering model normal; $-L_i$ is the inverse of the incident light vector of the current light source; $1 - S_i$ is the inverse influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$); and $C_{shadow}$ is the final color value of the shadow. The product term over the $n$ light sources (an image in the original) represents their fully shaded area.
Further, obtaining the light-dark transition area is specifically: calculating the light-dark transition area by the formula:

[Formula 4 is an image in the original; it computes $C_{Tline}$ from the sums, over the $n$ scene light sources, of the theoretical shadow areas, the theoretical illumination areas and the light-dark boundary transition areas, scaled by $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$]

where $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$ are input control parameters provided to the artist, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition, respectively; $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, $-L_i$ is its inverse, and $C_{Tline}$ is the final color value of the light-dark transition. Three sub-expressions (images in the original) represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the $n$ light sources.
The invention also provides a material rendering system based on a cross-rendering platform, the system comprising a hardware information acquisition module, a material setting module and a rendering operation module;
the hardware information acquisition module is used for acquiring the hardware information of the rendering platform;
the material setting module is used for performing in sequence, according to the hardware information of the rendering platform, the map format support operation, shadow support setting, illumination support setting and rendering support setting, so that the material adapts to the rendering platform;
the rendering operation module is used for, in the real-time rendering stage, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value, obtaining the illumination area, the shadow area and the light-dark transition area after the core algorithm formulas of photorealistic and non-photorealistic rendering have been unified, so that an operator can color-process these three areas, and, once the illumination color value is obtained, merging the textures and outputting the rendering result to complete the material rendering.
Further, the map format support operation includes format setting and size upper-limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support determination.
Further, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value is more specifically: before the light-dark rendering of the material, the illumination value in the bidirectional reflectance distribution function (BRDF) formula is improved so as to unify the photorealistic and non-photorealistic illumination models, the improved formula being:

[Formula 1 is an image in the original; it computes $I_{final}$ from $I$ by accumulating, over the $n$ scene light sources, per-light terms built from $N \cdot L_i$, $S_i$ and the transition parameters $e_0$, $e_1$]

where $I$ is the model illumination value, $I_{final}$ is the final illumination value of the model, $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, and $S_i$ is the influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$). $e_0$ and $e_1$ are input parameters, and the light-dark transition area is controlled through these two parameters: for photorealistic rendering, $e_0 = 0$ and $e_1 = 1$; for non-photorealistic rendering, the parameters are adjusted according to actual requirements.
Further, obtaining the illumination area is specifically: calculating the illumination area by the formula:

$C_{Lit} = \left[R_{Lit} + I_{final}\right] \cdot P_{LitColor} \cdot P_{LitColorIntensity}$

where $P_{LitColor}$ and $P_{LitColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the illumination and the offset color intensity of the illumination, respectively; $R_{Lit}$ is the light value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $I_{final}$ is the final illumination value of the model; and $C_{Lit}$ is the final color value of the illumination.
Further, obtaining the shadow area is specifically: calculating the shadow area by the formula:

[Formula 3 is an image in the original; it combines $R_{shadow}$ with a product, over the $n$ scene light sources, of per-light terms built from $N \cdot (-L_i)$ and $1 - S_i$, scaled by $P_{shadowColor}$ and $P_{shadowColorIntensity}$]

where $P_{shadowColor}$ and $P_{shadowColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the shadow and the offset color intensity of the shadow, respectively; $R_{shadow}$ is the shadow value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $i$ is the current light source id; $n$ is the total number of light sources in the current scene; $N$ is the rendering model normal; $-L_i$ is the inverse of the incident light vector of the current light source; $1 - S_i$ is the inverse influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$); and $C_{shadow}$ is the final color value of the shadow. The product term over the $n$ light sources (an image in the original) represents their fully shaded area.
Further, obtaining the light-dark transition area is specifically: calculating the light-dark transition area by the formula:

[Formula 4 is an image in the original; it computes $C_{Tline}$ from the sums, over the $n$ scene light sources, of the theoretical shadow areas, the theoretical illumination areas and the light-dark boundary transition areas, scaled by $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$]

where $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$ are input control parameters provided to the artist, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition, respectively; $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, $-L_i$ is its inverse, and $C_{Tline}$ is the final color value of the light-dark transition. Three sub-expressions (images in the original) represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the $n$ light sources.
The invention has the following beneficial effects: compared with the traditional rendering scheme, the invention performs better on cross-platform systems. Under complex lighting (scene systems with more than 8 light sources and multiple light source types), the traditional material rendering scheme cannot produce a result on the mobile side or the H5 side, whereas this patent shields the problematic data and renders the corresponding result normally.
In addition, under a simple illumination scene (a single light source, which is parallel light), because the rendering platform is adaptively optimized, with cascaded shadows eliminated and the shadow map resolution optimized, bandwidth usage on the mobile terminal is greatly reduced and performance is improved.
Drawings
FIG. 1 is a schematic flow diagram of the process of the present invention.
Fig. 2 is a shadow distribution diagram of a photorealistic rendering or a non-photorealistic rendering according to a first embodiment of the present invention.
Fig. 3 is a flowchart illustrating a first embodiment of the present invention.
Fig. 4 is a schematic diagram of the system of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
Referring to Fig. 1, a material rendering method based on a cross-rendering platform according to the present invention includes the following steps:
step S1, obtaining the hardware information of the rendering platform;
step S2, according to the hardware information of the rendering platform, performing in sequence the map format support operation, shadow support setting, illumination support setting and rendering support setting, so that the material adapts to the rendering platform;
and step S3, in the real-time rendering stage, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value, and, after the core algorithm formulas of photorealistic and non-photorealistic rendering have been unified, obtaining the illumination area, the shadow area and the light-dark transition area so that an operator can color-process these three areas; once the illumination color value is obtained, the textures are merged and output to finally obtain the rendering result and complete the material rendering.
Wherein the map format support operation includes format setting and size upper-limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support determination.
The invention is further illustrated below with reference to a specific embodiment:
Cross-platform rendering support requires two things. First, in the data acquisition and processing stage, hardware differences must be shielded; that is, no matter how large the hardware differences are, a reliable data set can be supplied to the renderer for material rendering. Second, the material rendering core algorithm must be sufficiently refined and abstract: a refined rendering core reduces unnecessary data input and minimizes the error introduced into the rendering core by cross-platform differences, while an abstract rendering core guarantees parameterized adjustment of the renderer, which facilitates visual adjustment of the rendering material by artists. This patent therefore divides rendering into two parts, a data-set processing part and a rendering-core processing part: the data-set processing part shields the differences between cross-platform systems and keeps the data set used by the rendering core relatively safe and stable; the rendering-core algorithm part renders the different materials, and its parameterized abstraction enables visual adjustment of the rendering material.
1. Data acquisition and processing stage: according to actual project experience, the biggest differences in rendering support between systems lie in five main areas, namely maximum map size support, picture format, shadow support, illumination support and rendering support, so a preset scheme is prepared for each of the five.
Maximum map size upper limit: for high-end platforms such as PC, Xbox and PS4, maps of high-resolution specifications such as 2048 and 4096 are permitted. For mobile platforms, such as mobile phones and handheld consoles, all maps are converted to resolutions of 1024 and below; for device models produced before 2016, all maps are converted to resolutions of 512 and below.
Picture format: for the H5 platform or the iOS platform, all DDS-format maps are converted to PNG format. On the H5 platform, all pictures not in JPG or PNG format are additionally replaced.
Shadow support determination: for high-end platforms such as PC, Xbox and PS4, cascaded shadows are allowed, shadow maps up to 2048 resolution are allowed, and soft shadows are supported. The mobile platform supports neither cascaded shadows nor soft shadows; mobile phones produced after 2018 support shadow maps up to 1024 resolution, phones produced between 2016 and 2018 support shadow maps up to 512 resolution, and phones produced before 2016 do not support shadow maps. The H5 platform supports neither cascaded shadows, soft shadows nor shadow maps.
Light source support determination: for high-end platforms such as PC, Xbox and PS4, at most 8 light sources are allowed to participate in the rendering calculation of a single material, and four types of light source data are supported: parallel light, point light, spot light and area light. For the mobile platform, at most 2 light sources are allowed per material; parallel light and point light data are supported, and spot and area light sources are excluded. For the H5 platform, at most 1 light source is allowed per material; only parallel light sources are supported, and the three light source types of point light, spot light and area light are excluded.
Rendering support determination: the high-end rendering platform does not limit the number of post-processing passes, supports deferred rendering and supports HDR; the mobile platform limits the post-processing queue to no more than 4 post-processing filters, does not support deferred rendering, and supports HDR; the H5 platform limits the post-processing queue to no more than 2, and supports neither deferred rendering nor HDR.
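Taken together, the five determinations amount to a per-platform capability table. A minimal sketch of this data-set processing stage is given below; the class, field and function names are illustrative assumptions, not identifiers from the patent:

```python
import dataclasses
from dataclasses import dataclass

@dataclass
class RenderCaps:
    """One preset bundling the five determinations for a platform tier."""
    max_map_size: int        # map size upper limit
    picture_formats: tuple   # permitted picture formats
    cascaded_shadows: bool
    soft_shadows: bool
    max_shadow_map: int      # 0 means shadow maps are unsupported
    max_lights: int          # lights per single material
    light_types: tuple
    max_post_filters: int    # -1 means unlimited post-processing
    deferred: bool
    hdr: bool

# Presets following the determinations above.
PRESETS = {
    "high_end": RenderCaps(4096, ("DDS", "PNG", "JPG"), True, True, 2048,
                           8, ("parallel", "point", "spot", "area"), -1, True, True),
    "mobile":   RenderCaps(1024, ("PNG", "JPG"), False, False, 1024,
                           2, ("parallel", "point"), 4, False, True),
    "h5":       RenderCaps(1024, ("PNG", "JPG"), False, False, 0,
                           1, ("parallel",), 2, False, False),
}

def caps_for(platform: str, year: int = 2020) -> RenderCaps:
    """Select a preset from the reported hardware information, applying the
    production-year tiers the patent gives for mobile phones."""
    caps = PRESETS[platform]
    if platform == "mobile":
        if year < 2016:
            caps = dataclasses.replace(caps, max_map_size=512, max_shadow_map=0)
        elif year < 2018:
            caps = dataclasses.replace(caps, max_shadow_map=512)
    return caps

print(caps_for("mobile", year=2017).max_shadow_map)  # 512
```

Keeping these limits in a single table is what lets the data-set stage shield hardware differences from the rendering core: the core only ever sees a data set already clamped to the platform's capabilities.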
2. Rendering core algorithm stage: photorealistic and non-photorealistic rendering are unified into three main parts, namely illumination rendering, shadow rendering and light-dark transition rendering. The self-luminous (emissive) rendering part depends on whether the rendering platform supports HDR, so self-luminous rendering is disabled on the H5 platform.
Before the light-dark rendering of the material, the illumination value in the original BRDF formula is improved so as to unify the photorealistic and non-photorealistic illumination models. The improved Formula 1 is:

[Formula 1 is an image in the original; it computes $I_{final}$ from $I$ by accumulating, over the $n$ scene light sources, per-light terms built from $N \cdot L_i$, $S_i$ and the transition parameters $e_0$, $e_1$]

where $I$ is the model illumination value, $I_{final}$ is the final illumination value of the model, $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, and $S_i$ is the influence factor of the current light source on the current model: for a parallel light source $S_i = 1$, while for spot light and area light $S_i$ is the attenuation intensity of the light source. $e_0$ and $e_1$ are input parameters, and the light-dark transition area is controlled through these two parameters: for photorealistic rendering, $e_0 = 0$ and $e_1 = 1$; for non-photorealistic rendering, the parameters can be adjusted to actual requirements, and project experience suggests that $e_0 = 0$, $e_1 = 0.1$ gives a good effect.
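Formula 1 itself survives only as an image, so the sketch below is a hedged reconstruction from the variable definitions: each light's illumination contribution is gated by a smoothstep-style transition of $N \cdot L_i$ between $e_0$ and $e_1$ and weighted by $S_i$. The smoothstep choice and the exact combination are assumptions, not the patent's verbatim formula:

```python
import numpy as np

def smoothstep(e0: float, e1: float, x: float) -> float:
    """Hermite transition, used here to model the e0/e1-controlled
    light-dark band (an assumption; the original formula is an image)."""
    t = min(max((x - e0) / (e1 - e0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def improved_illumination(normal: np.ndarray, lights: list,
                          e0: float = 0.0, e1: float = 1.0) -> float:
    """Assumed reading of Formula 1:
    I_final = sum_i S_i * smoothstep(e0, e1, N . L_i) * I_i.
    With e0 = 0, e1 = 1 it behaves photorealistically; the patent reports
    e0 = 0, e1 = 0.1 as a good non-photorealistic setting."""
    n_hat = normal / np.linalg.norm(normal)
    i_final = 0.0
    for light in lights:
        l_i = light["L"] / np.linalg.norm(light["L"])  # incident light vector
        s_i = light.get("S", 1.0)                      # parallel light: S_i = 1
        i_final += s_i * smoothstep(e0, e1, float(n_hat @ l_i)) * light["I"]
    return i_final

# Single parallel light, non-photorealistic setting.
lights = [{"L": np.array([0.0, 1.0, 0.3]), "I": 1.0}]
print(improved_illumination(np.array([0.0, 1.0, 0.0]), lights, e0=0.0, e1=0.1))
```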
After the core algorithm formulas of photorealistic and non-photorealistic rendering have been unified by the above method, the illumination area, the shadow area and the light-dark transition area are separated so that the artist can color-process the three areas.
The illumination area is calculated by Formula 2:

$C_{Lit} = \left[R_{Lit} + I_{final}\right] \cdot P_{LitColor} \cdot P_{LitColorIntensity} \quad \text{(Formula 2)}$

where $P_{LitColor}$ and $P_{LitColorIntensity}$ are the input control parameters provided to the artist, representing the offset color of the illumination and the offset color intensity of the illumination, respectively; $R_{Lit}$ is the light value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $I_{final}$ is the final illumination value calculated by Formula 1; and $C_{Lit}$ is the final color value of the illumination.
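Unlike the other formulas, Formula 2 is fully recoverable from the text, so it can be written down directly; only the variable spellings below are adapted:

```python
import numpy as np

def lit_color(r_lit: float, i_final: float,
              p_lit_color: np.ndarray, p_lit_color_intensity: float) -> np.ndarray:
    """Formula 2: C_Lit = [R_Lit + I_final] * P_LitColor * P_LitColorIntensity.
    r_lit is the light value queried from the shadow map; it is 0 on
    platforms that do not support shadow maps."""
    return (r_lit + i_final) * p_lit_color * p_lit_color_intensity

# Example: no shadow-map support (r_lit = 0), warm artist offset color.
print(lit_color(0.0, 0.8, np.array([1.0, 0.9, 0.7]), 1.2))
```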
The shadow area is calculated by Formula 3:

[Formula 3 is an image in the original; it combines $R_{shadow}$ with a product, over the $n$ scene light sources, of per-light terms built from $N \cdot (-L_i)$ and $1 - S_i$, scaled by $P_{shadowColor}$ and $P_{shadowColorIntensity}$]

where $P_{shadowColor}$ and $P_{shadowColorIntensity}$ are the input control parameters provided to the artist, representing the offset color of the shadow and the offset color intensity of the shadow, respectively; $R_{shadow}$ is the shadow value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $i$ is the current light source id; $n$ is the total number of light sources in the current scene; $N$ is the rendering model normal; $-L_i$ is the inverse of the incident light vector of the current light source; and $1 - S_i$ is the inverse influence factor of the current light source on the current model: for a parallel light source $S_i = 1$, while for spot light and area light $S_i$ is the attenuation intensity of its light source. $C_{shadow}$ is the final color value of the shadow; the product term over the $n$ light sources (an image in the original) represents their fully shaded area.
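Formula 3 is also an image in the original. By symmetry with Formulas 1 and 2 and from the variable definitions (the inverse incidence $N \cdot (-L_i)$ and a product term representing the fully shaded area), one plausible sketch follows; treat its structure as an assumption rather than the patent's exact formula, and note that the per-light attenuation term built from $1 - S_i$ is omitted for brevity:

```python
import numpy as np

def smoothstep(e0, e1, x):
    t = min(max((x - e0) / (e1 - e0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def shadow_color(normal, lights, r_shadow, p_shadow_color,
                 p_shadow_color_intensity, e0=0.0, e1=1.0):
    """Assumed reading of Formula 3:
    C_shadow = [R_shadow + prod_i smoothstep(e0, e1, N . (-L_i))]
               * P_shadowColor * P_shadowColorIntensity.
    The product over the n lights models the area fully shaded from every
    light; r_shadow is 0 where shadow maps are unsupported."""
    n_hat = normal / np.linalg.norm(normal)
    fully_shaded = 1.0
    for light in lights:
        l_i = light["L"] / np.linalg.norm(light["L"])
        fully_shaded *= smoothstep(e0, e1, float(n_hat @ -l_i))
    return (r_shadow + fully_shaded) * np.asarray(p_shadow_color) * p_shadow_color_intensity
```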
The light-dark transition area is calculated by Formula 4:

[Formula 4 is an image in the original; it computes $C_{Tline}$ from the sums, over the $n$ scene light sources, of the theoretical shadow areas, the theoretical illumination areas and the light-dark boundary transition areas, scaled by $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$]

where $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$ are the input control parameters provided to the artist, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition, respectively; $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, $-L_i$ is its inverse, and $C_{Tline}$ is the final color value of the light-dark transition. Three sub-expressions (images in the original) represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the $n$ light sources. $P_e$ is a subjective artist control parameter: the larger its value, the narrower the boundary transition area, and vice versa. After the illumination color values have been calculated, the textures can be merged and output to finally obtain the rendering result.
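Formula 4 is likewise an image. A sketch under one stated reading: the transition band is what remains after the theoretical illumination and shadow areas of all lights are removed, and the hardness $P_e$ sharpens it (larger $P_e$, narrower band, matching the description above). Every structural choice below is an assumption:

```python
import numpy as np

def smoothstep(e0, e1, x):
    t = min(max((x - e0) / (e1 - e0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def tline_color(normal, lights, p_e, p_tline_color, p_tline_intensity,
                e0=0.0, e1=0.1):
    """Assumed reading of Formula 4: the transition weight is whatever
    neither the theoretical lit areas (from N . L_i) nor the theoretical
    shadow areas (from N . (-L_i)) claim; raising it to the power P_e
    narrows the band as P_e grows."""
    n_hat = normal / np.linalg.norm(normal)
    lit = shadow = 0.0
    for light in lights:
        l_i = light["L"] / np.linalg.norm(light["L"])
        lit += smoothstep(e0, e1, float(n_hat @ l_i))
        shadow += smoothstep(e0, e1, float(n_hat @ -l_i))
    band = min(max(1.0 - lit - shadow, 0.0), 1.0) ** p_e
    return band * np.asarray(p_tline_color) * p_tline_intensity
```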
In the real-time rendering stage, the shadow distribution for photorealistic or non-photorealistic rendering is determined by adjusting the parameters of Formula 1; the effect is shown in Fig. 2. After the core algorithm formulas of photorealistic and non-photorealistic rendering have been unified by the above method, the illumination area, the shadow area and the light-dark transition area are separated so that the artist can color-process the three areas: the offset color and offset color intensity of the illumination range are adjusted by Formula 2, the offset color and offset color intensity of the shadow range by Formula 3, and the light-dark boundary range, the softness or hardness of the transition edge, and the offset color and offset color intensity of the light-dark boundary line by Formula 4. The flow is shown in Fig. 3.
As shown in Fig. 4, the present invention further provides a material rendering system based on a cross-rendering platform, the system comprising a hardware information acquisition module, a material setting module and a rendering operation module;
the hardware information acquisition module is used for acquiring the hardware information of the rendering platform;
the material setting module is used for performing in sequence, according to the hardware information of the rendering platform, the map format support operation, shadow support setting, illumination support setting and rendering support setting, so that the material adapts to the rendering platform;
the rendering operation module is used for, in the real-time rendering stage, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value, obtaining the illumination area, the shadow area and the light-dark transition area after the core algorithm formulas of photorealistic and non-photorealistic rendering have been unified, so that an operator can color-process these three areas, and, once the illumination color value is obtained, merging the textures and outputting the rendering result to complete the material rendering.
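A minimal sketch of this three-module decomposition, composing the earlier sketches (`caps_for`, `improved_illumination`, `lit_color` are assumed to be in scope); all class and method names are illustrative, not from the patent:

```python
import numpy as np

class HardwareInfoModule:
    """Acquires the rendering platform's hardware information (step S1)."""
    def acquire(self) -> dict:
        return {"platform": "mobile", "year": 2019}  # stand-in for a real query

class MaterialSetupModule:
    """Applies the map/shadow/illumination/rendering presets (step S2)."""
    def configure(self, hw: dict):
        return caps_for(hw["platform"], hw["year"])  # preset sketch above

class RenderModule:
    """Unified lit/shadow/transition shading, then texture merge (step S3)."""
    def render(self, caps, normal, lights):
        lights = lights[: caps.max_lights]           # enforce the light budget
        i_final = improved_illumination(normal, lights)
        r_lit = 0.0 if caps.max_shadow_map == 0 else 1.0  # shadow-map query stub
        return lit_color(r_lit, i_final, np.array([1.0, 1.0, 1.0]), 1.0)

hw = HardwareInfoModule().acquire()
caps = MaterialSetupModule().configure(hw)
```

The point of the split is the same as in the method: the first two modules clamp the data set to the platform, so the rendering module never has to branch on hardware.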
Further, the map format support operation includes format setting and size upper-limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support determination.
Further, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value is more specifically: before the light-dark rendering of the material, the illumination value in the bidirectional reflectance distribution function (BRDF) formula is improved so as to unify the photorealistic and non-photorealistic illumination models, the improved formula being:

[Formula 1 is an image in the original; it computes $I_{final}$ from $I$ by accumulating, over the $n$ scene light sources, per-light terms built from $N \cdot L_i$, $S_i$ and the transition parameters $e_0$, $e_1$]

where $I$ is the model illumination value, $I_{final}$ is the final illumination value of the model, $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, and $S_i$ is the influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$). $e_0$ and $e_1$ are input parameters, and the light-dark transition area is controlled through these two parameters: for photorealistic rendering, $e_0 = 0$ and $e_1 = 1$; for non-photorealistic rendering, the parameters are adjusted according to actual requirements.
Further, obtaining the illumination area is specifically: calculating the illumination area by the formula:

$C_{Lit} = \left[R_{Lit} + I_{final}\right] \cdot P_{LitColor} \cdot P_{LitColorIntensity}$

where $P_{LitColor}$ and $P_{LitColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the illumination and the offset color intensity of the illumination, respectively; $R_{Lit}$ is the light value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $I_{final}$ is the final illumination value of the model; and $C_{Lit}$ is the final color value of the illumination.
Further, obtaining the shadow area is specifically: calculating the shadow area by the formula:

[Formula 3 is an image in the original; it combines $R_{shadow}$ with a product, over the $n$ scene light sources, of per-light terms built from $N \cdot (-L_i)$ and $1 - S_i$, scaled by $P_{shadowColor}$ and $P_{shadowColorIntensity}$]

where $P_{shadowColor}$ and $P_{shadowColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the shadow and the offset color intensity of the shadow, respectively; $R_{shadow}$ is the shadow value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $i$ is the current light source id; $n$ is the total number of light sources in the current scene; $N$ is the rendering model normal; $-L_i$ is the inverse of the incident light vector of the current light source; $1 - S_i$ is the inverse influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$); and $C_{shadow}$ is the final color value of the shadow. The product term over the $n$ light sources (an image in the original) represents their fully shaded area.
Further, obtaining the light-dark transition area is specifically: calculating the light-dark transition area by the formula:

[Formula 4 is an image in the original; it computes $C_{Tline}$ from the sums, over the $n$ scene light sources, of the theoretical shadow areas, the theoretical illumination areas and the light-dark boundary transition areas, scaled by $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$]

where $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$ are input control parameters provided to the artist, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition, respectively; $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, $-L_i$ is its inverse, and $C_{Tline}$ is the final color value of the light-dark transition. Three sub-expressions (images in the original) represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the $n$ light sources. $P_e$ is a subjective artist control parameter: the larger its value, the narrower the boundary transition area, and vice versa.
The rendering results of the rendering scheme of the present invention and the traditional rendering scheme are shown in Table 1 below:

Table 1. Comparison of rendering result accuracy

[Table 1 is an image in the original]
In summary, under a simple illumination scene (a single light source, which is parallel light), because the rendering platform is adaptively optimized, with cascaded shadows eliminated and the shadow map resolution optimized, bandwidth usage on the mobile terminal is greatly reduced and performance is improved.
The above description is only a preferred embodiment of the present invention; all equivalent changes and modifications made within the scope of the claims of the present invention shall be covered by the present invention.

Claims (12)

1. A material rendering method based on a cross-rendering platform, characterized in that the method comprises the following steps:
step S1, obtaining the hardware information of the rendering platform;
step S2, according to the hardware information of the rendering platform, performing in sequence the map format support operation, shadow support setting, illumination support setting and rendering support setting, so that the material adapts to the rendering platform;
and step S3, in the real-time rendering stage, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value, and, after the core algorithm formulas of photorealistic and non-photorealistic rendering have been unified, obtaining the illumination area, the shadow area and the light-dark transition area so that an operator can color-process these three areas; once the illumination color value is obtained, the textures are merged and output to finally obtain the rendering result and complete the material rendering.
2. The material rendering method based on a cross-rendering platform according to claim 1, characterized in that: the map format support operation includes format setting and size upper-limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support determination.
3. The material rendering method based on a cross-rendering platform according to claim 1, characterized in that: determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value is more specifically: before the light-dark rendering of the material, the illumination value in the bidirectional reflectance distribution function (BRDF) formula is improved so as to unify the photorealistic and non-photorealistic illumination models, the improved formula being:

[Formula rendered as two images in the original; it computes $I_{final}$ from $I$ by accumulating, over the $n$ scene light sources, per-light terms built from $N \cdot L_i$, $S_i$ and the transition parameters $e_0$, $e_1$]

where $I$ is the model illumination value, $I_{final}$ is the final illumination value of the model, $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, and $S_i$ is the influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$). $e_0$ and $e_1$ are input parameters, and the light-dark transition area is controlled through these two parameters: for photorealistic rendering, $e_0 = 0$ and $e_1 = 1$; for non-photorealistic rendering, the parameters are adjusted according to actual requirements.
4. The material rendering method based on a cross-rendering platform according to claim 3, characterized in that: obtaining the illumination area is specifically: calculating the illumination area by the formula:

$C_{Lit} = \left[R_{Lit} + I_{final}\right] \cdot P_{LitColor} \cdot P_{LitColorIntensity}$

where $P_{LitColor}$ and $P_{LitColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the illumination and the offset color intensity of the illumination, respectively; $R_{Lit}$ is the light value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $I_{final}$ is the final illumination value of the model; and $C_{Lit}$ is the final color value of the illumination.
5. The material rendering method based on a cross-rendering platform according to claim 1, characterized in that: obtaining the shadow area is specifically: calculating the shadow area by the formula:

[Formula is an image in the original; it combines $R_{shadow}$ with a product, over the $n$ scene light sources, of per-light terms built from $N \cdot (-L_i)$ and $1 - S_i$, scaled by $P_{shadowColor}$ and $P_{shadowColorIntensity}$]

where $P_{shadowColor}$ and $P_{shadowColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the shadow and the offset color intensity of the shadow, respectively; $R_{shadow}$ is the shadow value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $i$ is the current light source id; $n$ is the total number of light sources in the current scene; $N$ is the rendering model normal; $-L_i$ is the inverse of the incident light vector of the current light source; $1 - S_i$ is the inverse influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$); and $C_{shadow}$ is the final color value of the shadow. The product term over the $n$ light sources (an image in the original) represents their fully shaded area.
6. The material rendering method based on a cross-rendering platform according to claim 1, characterized in that: obtaining the light-dark transition area is specifically: calculating the light-dark transition area by the formula:

[Formula is an image in the original; it computes $C_{Tline}$ from the sums, over the $n$ scene light sources, of the theoretical shadow areas, the theoretical illumination areas and the light-dark boundary transition areas, scaled by $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$]

where $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$ are input control parameters provided to the artist, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition, respectively; $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, $-L_i$ is its inverse, and $C_{Tline}$ is the final color value of the light-dark transition. Three sub-expressions (images in the original) represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the $n$ light sources.
7. A material rendering system based on a cross-rendering platform, characterized in that the system comprises a hardware information acquisition module, a material setting module and a rendering operation module;
the hardware information acquisition module is used for acquiring the hardware information of the rendering platform;
the material setting module is used for performing in sequence, according to the hardware information of the rendering platform, the map format support operation, shadow support setting, illumination support setting and rendering support setting, so that the material adapts to the rendering platform;
and the rendering operation module is used for, in the real-time rendering stage, determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value, obtaining the illumination area, the shadow area and the light-dark transition area after the core algorithm formulas of photorealistic and non-photorealistic rendering have been unified, so that an operator can color-process these three areas, and, once the illumination color value is obtained, merging the textures and outputting the rendering result to complete the material rendering.
8. The material rendering system based on a cross-rendering platform according to claim 7, characterized in that: the map format support operation includes format setting and size upper-limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support determination.
9. The material rendering system based on a cross-rendering platform according to claim 7, characterized in that: determining the shadow distribution for photorealistic or non-photorealistic rendering through the improved illumination value is more specifically: before the light-dark rendering of the material, the illumination value in the bidirectional reflectance distribution function (BRDF) formula is improved so as to unify the photorealistic and non-photorealistic illumination models, the improved formula being:

[Formula rendered as two images in the original; it computes $I_{final}$ from $I$ by accumulating, over the $n$ scene light sources, per-light terms built from $N \cdot L_i$, $S_i$ and the transition parameters $e_0$, $e_1$]

where $I$ is the model illumination value, $I_{final}$ is the final illumination value of the model, $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, and $S_i$ is the influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$). $e_0$ and $e_1$ are input parameters, and the light-dark transition area is controlled through these two parameters: for photorealistic rendering, $e_0 = 0$ and $e_1 = 1$; for non-photorealistic rendering, the parameters are adjusted according to actual requirements.
10. The material rendering system based on a cross-rendering platform according to claim 9, characterized in that: obtaining the illumination area is specifically: calculating the illumination area by the formula:

$C_{Lit} = \left[R_{Lit} + I_{final}\right] \cdot P_{LitColor} \cdot P_{LitColorIntensity}$

where $P_{LitColor}$ and $P_{LitColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the illumination and the offset color intensity of the illumination, respectively; $R_{Lit}$ is the light value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $I_{final}$ is the final illumination value of the model; and $C_{Lit}$ is the final color value of the illumination.
11. The material rendering system based on a cross-rendering platform according to claim 7, characterized in that: obtaining the shadow area is specifically: calculating the shadow area by the formula:

[Formula is an image in the original; it combines $R_{shadow}$ with a product, over the $n$ scene light sources, of per-light terms built from $N \cdot (-L_i)$ and $1 - S_i$, scaled by $P_{shadowColor}$ and $P_{shadowColorIntensity}$]

where $P_{shadowColor}$ and $P_{shadowColorIntensity}$ are input control parameters provided to the artist, representing the offset color of the shadow and the offset color intensity of the shadow, respectively; $R_{shadow}$ is the shadow value queried from the shadow map, which is 0 if the rendering platform does not support shadow maps; $i$ is the current light source id; $n$ is the total number of light sources in the current scene; $N$ is the rendering model normal; $-L_i$ is the inverse of the incident light vector of the current light source; $1 - S_i$ is the inverse influence factor of the current light source on the current model (for a parallel light source, $S_i = 1$); and $C_{shadow}$ is the final color value of the shadow. The product term over the $n$ light sources (an image in the original) represents their fully shaded area.
12. The material rendering system based on a cross-rendering platform according to claim 7, characterized in that: obtaining the light-dark transition area is specifically: calculating the light-dark transition area by the formula:

[Formula is an image in the original; it computes $C_{Tline}$ from the sums, over the $n$ scene light sources, of the theoretical shadow areas, the theoretical illumination areas and the light-dark boundary transition areas, scaled by $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$]

where $P_e$, $P_{TlineColor}$ and $P_{TlineIntensity}$ are input control parameters provided to the artist, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition, respectively; $i$ is the current light source id, $n$ is the total number of light sources in the current scene, $N$ is the rendering model normal, $L_i$ is the incident light vector of the current light source, $-L_i$ is its inverse, and $C_{Tline}$ is the final color value of the light-dark transition. Three sub-expressions (images in the original) represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the $n$ light sources.
CN202010689461.XA (filed 2020-07-17, priority 2020-07-17): Cross-rendering platform-based material rendering method and system. Granted as CN111862254B. Status: Active.

Priority Applications (1)

Application Number: CN202010689461.XA; Priority Date: 2020-07-17; Filing Date: 2020-07-17; Title: Cross-rendering platform-based material rendering method and system

Publications (2)

CN111862254A, published 2020-10-30
CN111862254B, published 2023-06-16

Family ID: 72983989

Family Applications (1): CN202010689461.XA (filed 2020-07-17, granted, active): Cross-rendering platform-based material rendering method and system

Country: CN


Also Published As

CN111862254B, published 2023-06-16


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant