CN111862254B - Cross-rendering platform-based material rendering method and system - Google Patents

Cross-rendering platform-based material rendering method and system

Info

Publication number
CN111862254B
CN111862254B (application CN202010689461.XA)
Authority
CN
China
Prior art keywords
rendering
shadow
illumination
light
light source
Prior art date
Legal status
Active
Application number
CN202010689461.XA
Other languages
Chinese (zh)
Other versions
CN111862254A (en)
Inventor
刘德建
高山晓
陈宏展
Current Assignee
Fujian TQ Digital Co Ltd
Original Assignee
Fujian TQ Digital Co Ltd
Priority date
Filing date
Publication date
Application filed by Fujian TQ Digital Co Ltd filed Critical Fujian TQ Digital Co Ltd
Priority to CN202010689461.XA priority Critical patent/CN111862254B/en
Publication of CN111862254A publication Critical patent/CN111862254A/en
Application granted granted Critical
Publication of CN111862254B publication Critical patent/CN111862254B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The invention provides a material rendering method based on a cross-rendering platform, which comprises the following steps. Step S1, acquiring rendering platform hardware information. Step S2, according to the rendering platform hardware information, carrying out the map format support operation, the shadow support setting, the illumination support setting and the rendering support setting in sequence, so that the material adapts to the rendering platform. Step S3, in the real-time rendering stage, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value; after unifying the core algorithm formulas of photorealistic and non-photorealistic rendering, obtaining the illumination region, the shadow region and the light-dark transition region so that the operator can perform color processing on the three regions; and after the illumination color values are obtained, merging them with the texture output to obtain the final rendering result and complete the material rendering, so that the rendering operation can be completed on a cross-rendering platform.

Description

Cross-rendering platform-based material rendering method and system
Technical Field
The invention relates to the technical field of game rendering, in particular to a material rendering method and a material rendering system based on a cross-rendering platform.
Background
Native material rendering solutions often run into various material rendering adaptation problems on cross-platform systems: the rendering platform cannot adapt a suitable shadow map, does not support a complex multi-light-source illumination system, does not support a given map size or format, and so on. With the development of mobile terminals, rendering terminals on the market are becoming more and more diverse. This makes the defects of traditional material rendering schemes increasingly obvious: when a rendering terminal cannot be adapted, the entire user group on that platform is lost, which is a real economic loss. Existing rendering platforms include, but are not limited to: PC, PS4, Xbox, Switch, Chrome, Safari, Huawei nova6, Huawei p20, iPhone7, iPhone8 and iPhone ES rendering platforms.
Disclosure of Invention
In order to overcome these problems, the invention aims to provide a material rendering method based on a cross-rendering platform, which can complete rendering operations across rendering platforms and display the rendering results normally on each rendering platform.
The invention is realized by adopting the following scheme: a material rendering method based on a cross-rendering platform, the method comprising the following steps:
Step S1, acquiring rendering platform hardware information;
Step S2, according to the rendering platform hardware information, carrying out the map format support operation, the shadow support setting, the illumination support setting and the rendering support setting in sequence, so that the material adapts to the rendering platform;
Step S3, in the real-time rendering stage, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value; after unifying the core algorithm formulas of photorealistic and non-photorealistic rendering, obtaining the illumination region, the shadow region and the light-dark transition region so that the operator can perform color processing on the three regions; and after the illumination color values are obtained, merging them with the texture output to obtain the final rendering result and complete the material rendering.
Further, the map format support operation includes format setting and upper size limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support decisions.
Further, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value specifically includes: before material shading rendering, improving the illumination value in the bidirectional reflectance distribution function (BRDF) formula so as to unify the photorealistic and non-photorealistic illumination models, the improved formula being:
[Formula 1: equation image in the original publication]
wherein I is the model illumination value, I_final is the final illumination value of the model, i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, and S_i is the influence factor of the current light source on the current model, with S_i = 1 for a parallel light source; e_0 and e_1 are input parameters, and the light-dark transition region is controlled by adjusting these two parameters: for photorealistic rendering e_0 = 0 and e_1 = 1, and for non-photorealistic rendering the parameters are adjusted according to actual requirements.
Further, obtaining the illumination region specifically includes: the illumination region is calculated as follows:
C_Lit = [R_Lit + I_final] · P_LitColor · P_LitcolorIntensity
wherein P_LitColor and P_LitcolorIntensity are input control parameters provided by the art staff, representing the offset color of the illumination and the offset color intensity of the illumination respectively; R_Lit is the illumination value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; I_final is the final illumination value of the model; and C_Lit is the final color value of the illumination.
Further, obtaining the shadow region specifically includes: the shadow region is calculated as follows:
[Formula 3: equation image in the original publication]
wherein P_shadowColor and P_shadowcolorIntensity are input control parameters provided by the art staff, representing the offset color of the shadow and the offset color intensity of the shadow respectively; R_shadow is the shadow value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, -L_i is the inverse incident light vector of the current light source, and 1-S_i is the influence factor of the current light source on the current model, with S_i = 1 for a parallel light source; C_shadow is the final color value of the shadow, and the summation term in the formula represents the total shadow area of the n light sources.
Further, the light-dark transition region is specifically calculated as follows:
[Formula 4: equation image in the original publication]
wherein P_e, P_TlineColor and P_TlineIntensity are input control parameters provided by the art staff, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition respectively; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, -L_i is the inverse incident light vector of the current light source, and C_Tline is the final color value of the light-dark transition; the three summation terms in the formula represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the n light sources.
The invention also provides a system for rendering the material based on the cross-rendering platform, which comprises a hardware information acquisition module, a material setting module and a rendering operation module;
the hardware information acquisition module is used for acquiring the hardware information of the rendering platform;
the material setting module is used for carrying out the map format support operation, the shadow support setting, the illumination support setting and the rendering support setting in sequence according to the rendering platform hardware information, so that the material adapts to the rendering platform;
the rendering operation module is used for, in the real-time rendering stage, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value; after unifying the core algorithm formulas of photorealistic and non-photorealistic rendering, obtaining the illumination region, the shadow region and the light-dark transition region so that the operator can perform color processing on the three regions; and after the illumination color values are obtained, merging them with the texture output to obtain the final rendering result and complete the material rendering.
Further, the map format support operation includes format setting and upper size limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support decisions.
Further, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value specifically includes: before material shading rendering, improving the illumination value in the bidirectional reflectance distribution function (BRDF) formula so as to unify the photorealistic and non-photorealistic illumination models, the improved formula being:
[Formula 1: equation image in the original publication]
wherein I is the model illumination value, I_final is the final illumination value of the model, i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, and S_i is the influence factor of the current light source on the current model, with S_i = 1 for a parallel light source; e_0 and e_1 are input parameters, and the light-dark transition region is controlled by adjusting these two parameters: for photorealistic rendering e_0 = 0 and e_1 = 1, and for non-photorealistic rendering the parameters are adjusted according to actual requirements.
Further, obtaining the illumination region specifically includes: the illumination region is calculated as follows:
C_Lit = [R_Lit + I_final] · P_LitColor · P_LitcolorIntensity
wherein P_LitColor and P_LitcolorIntensity are input control parameters provided by the art staff, representing the offset color of the illumination and the offset color intensity of the illumination respectively; R_Lit is the illumination value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; I_final is the final illumination value of the model; and C_Lit is the final color value of the illumination.
Further, obtaining the shadow region specifically includes: the shadow region is calculated as follows:
[Formula 3: equation image in the original publication]
wherein P_shadowColor and P_shadowcolorIntensity are input control parameters provided by the art staff, representing the offset color of the shadow and the offset color intensity of the shadow respectively; R_shadow is the shadow value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, -L_i is the inverse incident light vector of the current light source, and 1-S_i is the influence factor of the current light source on the current model, with S_i = 1 for a parallel light source; C_shadow is the final color value of the shadow, and the summation term in the formula represents the total shadow area of the n light sources.
Further, the light-dark transition region is specifically calculated as follows:
[Formula 4: equation image in the original publication]
wherein P_e, P_TlineColor and P_TlineIntensity are input control parameters provided by the art staff, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition respectively; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, -L_i is the inverse incident light vector of the current light source, and C_Tline is the final color value of the light-dark transition; the three summation terms in the formula represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the n light sources.
The invention has the following beneficial effects: compared with traditional rendering schemes, the method performs better on cross-platform systems. Under complex illumination conditions (a single scene with more than 8 light sources of multiple types), traditional material rendering schemes cannot produce results on the mobile and H5 ends, whereas this patent shields the problematic data and can therefore render the corresponding results normally.
Moreover, in a simple illumination scene (only one light source, and that light source is parallel light), because adaptive optimization has been performed for the rendering platform, cascaded shadows are removed and the shadow map resolution is optimized, so bandwidth data on the mobile end can be significantly reduced and performance improved.
Drawings
FIG. 1 is a schematic flow chart of the method of the present invention.
Fig. 2 is a schematic diagram of a shadow distribution of a photorealistic or non-photorealistic rendering according to a first embodiment of the present invention.
Fig. 3 is a schematic flow chart of a first embodiment of the present invention.
Fig. 4 is a schematic diagram of the system principle of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
Referring to fig. 1, the method for rendering materials based on a cross-rendering platform of the present invention includes the following steps:
Step S1, acquiring rendering platform hardware information;
Step S2, according to the rendering platform hardware information, carrying out the map format support operation, the shadow support setting, the illumination support setting and the rendering support setting in sequence, so that the material adapts to the rendering platform;
Step S3, in the real-time rendering stage, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value; after unifying the core algorithm formulas of photorealistic and non-photorealistic rendering, obtaining the illumination region, the shadow region and the light-dark transition region so that the operator can perform color processing on the three regions; and after the illumination color values are obtained, merging them with the texture output to obtain the final rendering result and complete the material rendering.
Wherein the map format support operation includes format setting and upper size limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support decisions.
The invention is further described with reference to the following specific examples:
Cross-platform rendering support requires two things. First, the data acquisition and processing stage must shield hardware differences, i.e., no matter how large those differences are, a reliable data set can be supplied to the renderer for rendering the relevant materials. Second, the material rendering core algorithm must be sufficiently refined and abstract: a refined rendering core reduces unnecessary data input and minimizes the error that cross-platform differences introduce into the rendering core, while an abstract rendering core guarantees parameterized adjustment by the renderer, which facilitates visual adjustment of rendering materials by the art staff. This patent therefore divides rendering into two parts, a data set processing part and a rendering core processing part: the data set processing part shields cross-platform differences and keeps the data set used by the rendering core relatively safe and stable, while the rendering core algorithm part renders the different materials and achieves visual adjustment of rendering materials through the relevant parameterized abstraction.
1. Data acquisition and processing: according to practical project experience, the largest differences in rendering support between systems lie in five main areas: maximum map size, picture format, shadow support, illumination support and rendering support. Preset schemes are therefore provided for each of the five, and they are condensed in the code sketch after this list.
Maximum map size upper limit: for high-end platforms such as PC, Xbox and PS4, maps at high-resolution specifications such as 2048 and 4096 are allowed. For mobile platforms such as mobile phones and handheld consoles, all maps are converted to 1024 resolution or below; for device models produced before 2016, all maps are converted to 512 resolution or below.
Picture format: for the H5 or iOS platform, all DDS-format maps are converted to PNG-format maps. If the system is an H5 platform, all pictures in formats other than JPG and PNG are additionally replaced.
Shadow support determination: for high-end platforms such as PC, Xbox and PS4, cascaded shadows are allowed, shadow maps up to 2048 resolution are allowed, and soft shadows are supported. For mobile platforms, cascaded shadows and soft shadows are not supported; phones produced after 2018 support shadow maps up to 1024 resolution, phones produced between 2016 and 2018 support shadow maps up to 512 resolution, and phones produced before 2016 do not support shadow maps. For the H5 platform, cascaded shadows, soft shadows and shadow maps are all unsupported.
Light source support determination: for high-end platforms such as PC, Xbox and PS4, a single material is allowed at most 8 light sources participating in its rendering calculation, and four types of light source information data are supported: parallel light, spot light, condensed light and area light. For mobile platforms, a single material is allowed at most 2 light sources, only parallel light and spot light source data are supported, and the condensed light and area light types are removed. For the H5 platform, a single material is allowed at most 1 light source, only parallel light sources are supported, and the point light, condensed light and area light types are removed.
Rendering support determination: high-end rendering platforms have no limit on the post-processing rendering queue, and support deferred rendering and HDR. For mobile platforms the post-processing queue is limited to at most 4 post-processing filters, deferred rendering is not supported, and HDR is supported. For the H5 platform the post-processing queue holds at most 2 filters, and neither deferred rendering nor HDR is supported.
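The five presets above amount to a per-platform capability table. The following is a minimal illustrative sketch in Python; the tier names, data structure and field names are assumptions for illustration (the patent does not disclose an implementation), while the numeric limits are the ones stated above (the year-based phone tiers are collapsed into a single mobile row):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class RenderCaps:
        max_map_size: int      # upper limit on map resolution
        formats: tuple         # allowed picture formats
        cascaded_shadows: bool
        soft_shadows: bool
        max_shadow_map: int    # 0 = shadow maps unsupported
        max_lights: int        # light sources per material
        light_types: tuple
        max_post_filters: int  # -1 = unlimited post-processing queue
        deferred: bool
        hdr: bool

    PRESETS = {
        # PC / Xbox / PS4
        "high_end": RenderCaps(4096, ("DDS", "PNG", "JPG"), True, True, 2048, 8,
                               ("parallel", "spot", "condensed", "area"), -1, True, True),
        # phones / handhelds (post-2018 shadow tier shown; older tiers drop to 512 or 0)
        "mobile":   RenderCaps(1024, ("PNG", "JPG"), False, False, 1024, 2,
                               ("parallel", "spot"), 4, False, True),
        "h5":       RenderCaps(1024, ("PNG", "JPG"), False, False, 0, 1,
                               ("parallel",), 2, False, False),
    }

    def clamp_map(size: int, caps: RenderCaps) -> int:
        # Halve a power-of-two map until it fits the platform's upper size limit.
        while size > caps.max_map_size:
            size //= 2
        return size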
2. Rendering core algorithm stage: the main body of both photorealistic and non-photorealistic rendering is unified into three parts: illumination rendering, shadow rendering and light-dark transition rendering. The self-luminous (emissive) rendering part depends on whether the rendering platform supports HDR, so self-luminous rendering can be disabled on the H5 platform.
Before material shading rendering, the illumination value in the original BRDF formula is improved so as to unify the photorealistic and non-photorealistic illumination models. The improved Formula 1 is:
[Formula 1: equation image in the original publication]
wherein I is the model illumination value, I_final is the final illumination value of the model, i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, and S_i is the influence factor of the current light source on the current model: for a parallel light source S_i = 1, and for a spot light S_i is the attenuated intensity of its light source. e_0 and e_1 are input parameters, and the light-dark transition region is controlled by adjusting these two parameters: for photorealistic rendering e_0 = 0 and e_1 = 1; for non-photorealistic rendering the parameter sizes can be adjusted according to actual requirements, and project experience shows that e_0 = 0, e_1 = 0.1 works well.
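In this version of the document Formula 1 survives only as an image. A plausible reconstruction from the variable definitions above, offered as an assumption rather than the published equation, remaps the Lambert term N·L_i through the (e_0, e_1) window:

    % Assumed reconstruction of Formula 1 (the published equation is an image).
    I_{final} = I \cdot \sum_{i=1}^{n} S_i \cdot
                \operatorname{clamp}\!\left( \frac{N \cdot L_i - e_0}{e_1 - e_0},\ 0,\ 1 \right)

Under this reading, e_0 = 0 and e_1 = 1 reduce the remapping to the ordinary clamped Lambert term, matching the stated photorealistic setting, while a narrow window such as e_0 = 0, e_1 = 0.1 compresses the light-dark transition into a thin band, giving the toon-style non-photorealistic look.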
After the core algorithm formulas of photorealistic and non-photorealistic rendering are unified in this way, the illumination region, the shadow region and the light-dark transition region are separated so that the art staff can perform color processing on the three regions.
The illumination region is calculated as follows:
C_Lit = [R_Lit + I_final] · P_LitColor · P_LitcolorIntensity    (Formula 2)
wherein the input control parameters provided by the art staff are P_LitColor and P_LitcolorIntensity, the offset color of the illumination and the offset color intensity of the illumination respectively; R_Lit is the illumination value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; I_final is the final illumination value calculated in Formula 1; and C_Lit is the final color value of the illumination.
The shadow region is calculated as follows:
[Formula 3: equation image in the original publication]
wherein the input control parameters provided by the art staff are P_shadowColor and P_shadowcolorIntensity, the offset color of the shadow and the offset color intensity of the shadow respectively; R_shadow is the shadow value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, -L_i is the inverse incident light vector of the current light source, and 1-S_i is the influence factor of the current light source on the current model: for a parallel light source S_i = 1, and for a spot light S_i is the attenuated intensity of its light source; C_shadow is the final color value of the shadow, and the summation term in the formula represents the total shadow area of the n light sources.
The light-dark transition region is calculated as follows:
[Formula 4: equation image in the original publication]
wherein the input control parameters provided by the art staff are P_e, P_TlineColor and P_TlineIntensity, the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition respectively; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, -L_i is the inverse incident light vector of the current light source, and C_Tline is the final color value of the light-dark transition. The three summation terms in the formula represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the n light sources. P_e is a subjective control parameter for the art staff: the larger its value, the smaller the light-dark boundary transition area, and the smaller its value, the larger the transition area. After the illumination color values are calculated, they can be combined with the texture output to obtain the final rendering result.
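To make the three-region separation concrete, the following is a minimal per-pixel sketch in Python. Formula 2 is implemented exactly as printed above; since Formulas 1, 3 and 4 survive only as images, the lighting, shadow and transition terms below use the assumed remapped-Lambert form from the Formula 1 reconstruction, and all parameter names (lit_color, edge_hardness, and so on) are illustrative stand-ins for P_LitColor, P_e and the other art-provided inputs:

    import numpy as np

    def remap(x, e0, e1):
        # Assumed Formula 1 kernel: clamped linear remap of N.L into [0, 1].
        return float(np.clip((x - e0) / max(e1 - e0, 1e-6), 0.0, 1.0))

    def shade(N, lights, I, R_lit, R_shadow, p, e0=0.0, e1=1.0):
        # N        : unit surface normal, shape (3,)
        # lights   : list of (L_i, S_i) pairs; S_i = 1 for a parallel light
        # I        : model illumination value from the BRDF stage
        # R_lit    : shadow-map illumination query (0 if unsupported)
        # R_shadow : shadow-map shadow query (0 if unsupported)
        # p        : art-provided offset colors / intensities (Formulas 2-4)
        lit = [remap(np.dot(N, L), e0, e1) for L, _ in lights]
        I_final = I * sum(S * t for (L, S), t in zip(lights, lit))  # assumed Formula 1

        # Formula 2 (as printed): illumination region color.
        C_lit = (R_lit + I_final) * p["lit_color"] * p["lit_intensity"]

        # Assumed Formula 3: shadow region as the complement of the lit terms.
        C_shadow = (R_shadow + sum(1.0 - t for t in lit)) \
                   * p["shadow_color"] * p["shadow_intensity"]

        # Assumed Formula 4: a band around the light-dark boundary whose width
        # shrinks as the boundary-hardness parameter (P_e stand-in) grows.
        band = sum((t * (1.0 - t)) ** p["edge_hardness"] for t in lit)
        C_tline = band * p["tline_color"] * p["tline_intensity"]

        return C_lit + C_shadow + C_tline  # merged with texture output downstream

    # Usage: photorealistic uses (e0, e1) = (0, 1); toon-style e.g. (0, 0.1).
    color = shade(np.array([0.0, 1.0, 0.0]),
                  [(np.array([0.0, 1.0, 0.0]), 1.0)],  # one parallel light
                  I=1.0, R_lit=0.0, R_shadow=0.0,
                  p={"lit_color": np.array([1.0, 0.95, 0.9]), "lit_intensity": 1.0,
                     "shadow_color": np.array([0.2, 0.2, 0.35]), "shadow_intensity": 1.0,
                     "tline_color": np.array([1.0, 0.5, 0.3]), "tline_intensity": 1.0,
                     "edge_hardness": 2.0},
                  e0=0.0, e1=0.1)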
In the real-time rendering stage, the shadow distribution of photorealistic or non-photorealistic rendering is determined by parameter adjustment according to Formula 1; the effect is shown in Fig. 2. After the core algorithm formulas of photorealistic and non-photorealistic rendering are unified, the illumination region, the shadow region and the light-dark transition region are separated so that the art staff can perform color processing on the three regions: the offset color and offset color intensity of the illumination range are adjusted according to Formula 2, the offset color and offset color intensity of the shadow range according to Formula 3, and the light-dark boundary range, transition edge hardness, boundary offset color and offset color intensity according to Formula 4. The flow is shown in Fig. 3.
As shown in fig. 4, the present invention further provides a system for rendering materials based on a cross-rendering platform, where the system includes a hardware information acquisition module, a material setting module, and a rendering operation module;
the hardware information acquisition module is used for acquiring the hardware information of the rendering platform;
the material setting module is used for carrying out the map format support operation, the shadow support setting, the illumination support setting and the rendering support setting in sequence according to the rendering platform hardware information, so that the material adapts to the rendering platform;
the rendering operation module is used for, in the real-time rendering stage, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value; after unifying the core algorithm formulas of photorealistic and non-photorealistic rendering, obtaining the illumination region, the shadow region and the light-dark transition region so that the operator can perform color processing on the three regions; and after the illumination color values are obtained, merging them with the texture output to obtain the final rendering result and complete the material rendering.
Further, the map format support operation includes format setting and upper size limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support decisions.
Further, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value specifically includes: before material shading rendering, improving the illumination value in the bidirectional reflectance distribution function (BRDF) formula so as to unify the photorealistic and non-photorealistic illumination models, the improved formula being:
[Formula 1: equation image in the original publication]
wherein I is the model illumination value, I_final is the final illumination value of the model, i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, and S_i is the influence factor of the current light source on the current model, with S_i = 1 for a parallel light source; e_0 and e_1 are input parameters, and the light-dark transition region is controlled by adjusting these two parameters: for photorealistic rendering e_0 = 0 and e_1 = 1, and for non-photorealistic rendering the parameters are adjusted according to actual requirements.
Further, obtaining the illumination region specifically includes: the illumination region is calculated as follows:
C_Lit = [R_Lit + I_final] · P_LitColor · P_LitcolorIntensity
wherein P_LitColor and P_LitcolorIntensity are input control parameters provided by the art staff, representing the offset color of the illumination and the offset color intensity of the illumination respectively; R_Lit is the illumination value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; I_final is the final illumination value of the model; and C_Lit is the final color value of the illumination.
Further, obtaining the shadow region specifically includes: the shadow region is calculated as follows:
[Formula 3: equation image in the original publication]
wherein P_shadowColor and P_shadowcolorIntensity are input control parameters provided by the art staff, representing the offset color of the shadow and the offset color intensity of the shadow respectively; R_shadow is the shadow value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, -L_i is the inverse incident light vector of the current light source, and 1-S_i is the influence factor of the current light source on the current model, with S_i = 1 for a parallel light source; C_shadow is the final color value of the shadow, and the summation term in the formula represents the total shadow area of the n light sources.
Further, the light-dark transition region is specifically calculated as follows:
[Formula 4: equation image in the original publication]
wherein P_e, P_TlineColor and P_TlineIntensity are input control parameters provided by the art staff, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition respectively; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, -L_i is the inverse incident light vector of the current light source, and C_Tline is the final color value of the light-dark transition. The three summation terms in the formula represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the n light sources. P_e is a subjective control parameter for the art staff: the larger its value, the smaller the light-dark boundary transition area, and the smaller its value, the larger the transition area.
The rendering results of the rendering scheme of the present invention and the conventional rendering scheme are compared in Table 1 below:
TABLE 1 Comparison of rendering result accuracy
[Table 1: image in the original publication]
In summary, in a simple illumination scene (only one light source, and that light source is parallel light), this patent performs adaptive optimization for the rendering platform, removing cascaded shadows and optimizing the shadow map resolution, so it can significantly reduce bandwidth data on the mobile end and improve performance.
The foregoing description is only of the preferred embodiments of the invention, and all changes and modifications that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (10)

1. A material rendering method based on a cross-rendering platform, characterized by comprising the following steps:
Step S1, acquiring rendering platform hardware information;
Step S2, according to the rendering platform hardware information, carrying out the map format support operation, the shadow support setting, the illumination support setting and the rendering support setting in sequence, so that the material adapts to the rendering platform;
Step S3, in the real-time rendering stage, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value; after unifying the core algorithm formulas of photorealistic and non-photorealistic rendering, obtaining the illumination region, the shadow region and the light-dark transition region so that the operator can perform color processing on the three regions; and after the illumination color values are obtained, merging them with the texture output to obtain the final rendering result and complete the material rendering;
the determining of the shadow distribution of the photorealistic rendering or the non-photorealistic rendering by the illumination value improvement further comprises the following specific steps: before material shading rendering, the illumination value in the bidirectional reflection distribution function BRDF formula is improved so as to unify the photorealistic illumination model and the photorealistic illumination model, and the improvement formula is as follows:
Figure FDA0004231223240000011
wherein I is a model illumination value, I final I is the current light source id, N is the total number of current scene light sources, N is the rendering model normal, and L is the final illumination value of the model i Is the incident light vector of the current light source,S i For the influence factor of the current model to the current light source, the parallel light source S i =1,e 0 ,e 1 For inputting parameters, controlling the light-dark transition region by controlling the two parameters, rendering e if it is realism 0 =0,e 1 =1; and if the rendering is non-realistic, adjusting the parameter size.
2. The material rendering method based on a cross-rendering platform according to claim 1, wherein the map format support operation includes format setting and upper size limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support decisions.
3. The material rendering method based on a cross-rendering platform according to claim 1, wherein obtaining the illumination region specifically includes: the illumination region is calculated as follows:
C_Lit = [R_Lit + I_final] · P_LitColor · P_LitcolorIntensity
wherein P_LitColor and P_LitcolorIntensity are input control parameters provided by the art staff, representing the offset color of the illumination and the offset color intensity of the illumination respectively; R_Lit is the illumination value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; I_final is the final illumination value of the model; and C_Lit is the final color value of the illumination.
4. The material rendering method based on a cross-rendering platform according to claim 1, wherein obtaining the shadow region specifically includes: the shadow region is calculated as follows:
[Formula 3: equation image in the original publication]
wherein P_shadowColor and P_shadowcolorIntensity are input control parameters provided by the art staff, representing the offset color of the shadow and the offset color intensity of the shadow respectively; R_shadow is the shadow value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, -L_i is the inverse incident light vector of the current light source, and 1-S_i is the influence factor of the current light source on the current model, with S_i = 1 for a parallel light source; C_shadow is the final color value of the shadow, and the summation term in the formula represents the total shadow area of the n light sources.
5. The material rendering method based on a cross-rendering platform according to claim 1, wherein the light-dark transition region is specifically calculated as follows:
[Formula 4: equation image in the original publication]
wherein P_e, P_TlineColor and P_TlineIntensity are input control parameters provided by the art staff, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition respectively; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, -L_i is the inverse incident light vector of the current light source, and C_Tline is the final color value of the light-dark transition; the three summation terms in the formula represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the n light sources.
6. A material rendering system based on a cross-rendering platform, characterized in that the system comprises a hardware information acquisition module, a material setting module and a rendering operation module;
the hardware information acquisition module is used for acquiring the hardware information of the rendering platform;
the material setting module is used for carrying out the map format support operation, the shadow support setting, the illumination support setting and the rendering support setting in sequence according to the rendering platform hardware information, so that the material adapts to the rendering platform;
the rendering operation module is used for, in the real-time rendering stage, determining the shadow distribution of photorealistic or non-photorealistic rendering through the improved illumination value; after unifying the core algorithm formulas of photorealistic and non-photorealistic rendering, obtaining the illumination region, the shadow region and the light-dark transition region so that the operator can perform color processing on the three regions; and after the illumination color values are obtained, merging them with the texture output to obtain the final rendering result and complete the material rendering;
the determining of the shadow distribution of the photorealistic rendering or the non-photorealistic rendering by the illumination value improvement further comprises the following specific steps: before material shading rendering, the illumination value in the bidirectional reflection distribution function BRDF formula is improved so as to unify the photorealistic illumination model and the photorealistic illumination model, and the improvement formula is as follows:
Figure FDA0004231223240000034
wherein I is a model illumination value, I final I is the current light source id, N is the total number of current scene light sources, N is the rendering model normal, and L is the final illumination value of the model i Is the incident light vector of the current light source, S i For the influence factor of the current model to the current light source, the parallel light source S i =1,e 0 ,e 1 For inputting parameters, controlling the light-dark transition region by controlling the two parameters, rendering e if it is realism 0 =0,e 1 =1; and if the rendering is non-realistic, adjusting the parameter size.
7. The material rendering system based on a cross-rendering platform according to claim 6, wherein the map format support operation includes format setting and upper size limit setting; the shadow support setting includes cascaded shadow setting, shadow map size setting and soft shadow support setting; the illumination support setting includes effective light source number setting and light source type limitation; and the rendering support setting includes post-processing rendering, deferred rendering and high dynamic range (HDR) rendering support decisions.
8. The material rendering system based on a cross-rendering platform according to claim 6, wherein obtaining the illumination region specifically includes: the illumination region is calculated as follows:
C_Lit = [R_Lit + I_final] · P_LitColor · P_LitcolorIntensity
wherein P_LitColor and P_LitcolorIntensity are input control parameters provided by the art staff, representing the offset color of the illumination and the offset color intensity of the illumination respectively; R_Lit is the illumination value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; I_final is the final illumination value of the model; and C_Lit is the final color value of the illumination.
9. The material rendering system based on a cross-rendering platform according to claim 6, wherein obtaining the shadow region specifically includes: the shadow region is calculated as follows:
[Formula 3: equation image in the original publication]
wherein P_shadowColor and P_shadowcolorIntensity are input control parameters provided by the art staff, representing the offset color of the shadow and the offset color intensity of the shadow respectively; R_shadow is the shadow value query result of the shadow map, which is 0 if the rendering platform does not support shadow maps; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, -L_i is the inverse incident light vector of the current light source, and 1-S_i is the influence factor of the current light source on the current model, with S_i = 1 for a parallel light source; C_shadow is the final color value of the shadow, and the summation term in the formula represents the total shadow area of the n light sources.
10. The material rendering system based on a cross-rendering platform according to claim 6, wherein the light-dark transition region is specifically calculated as follows:
[Formula 4: equation image in the original publication]
wherein P_e, P_TlineColor and P_TlineIntensity are input control parameters provided by the art staff, representing the boundary hardness, the offset color of the light-dark transition and the offset color intensity of the light-dark transition respectively; i is the current light source id, n is the total number of light sources in the current scene, N is the rendering model normal, L_i is the incident light vector of the current light source, -L_i is the inverse incident light vector of the current light source, and C_Tline is the final color value of the light-dark transition; the three summation terms in the formula represent, respectively, the sum of the theoretical shadow areas, the sum of the theoretical illumination areas, and the sum of the light-dark boundary transition areas of the n light sources.
CN202010689461.XA 2020-07-17 2020-07-17 Cross-rendering platform-based material rendering method and system Active CN111862254B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010689461.XA CN111862254B (en) 2020-07-17 2020-07-17 Cross-rendering platform-based material rendering method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010689461.XA CN111862254B (en) 2020-07-17 2020-07-17 Cross-rendering platform-based material rendering method and system

Publications (2)

Publication Number Publication Date
CN111862254A CN111862254A (en) 2020-10-30
CN111862254B true CN111862254B (en) 2023-06-16

Family

ID=72983989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010689461.XA Active CN111862254B (en) 2020-07-17 2020-07-17 Cross-rendering platform-based material rendering method and system

Country Status (1)

Country Link
CN (1) CN111862254B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112835621B (en) * 2021-01-13 2024-04-02 西安飞蝶虚拟现实科技有限公司 Cross-platform virtual reality resource processing method and processing system
CN113052947B (en) * 2021-03-08 2022-08-16 网易(杭州)网络有限公司 Rendering method, rendering device, electronic equipment and storage medium
CN113112582B (en) * 2021-04-20 2022-07-12 浙江凌迪数字科技有限公司 Real-time rendering method of sidelight fabric in realistic clothing rendering
CN113096230B (en) * 2021-04-20 2022-06-10 浙江凌迪数字科技有限公司 Real-time rendering method of laser fabric in realistic clothing rendering
CN114494570A (en) * 2021-10-18 2022-05-13 北京市商汤科技开发有限公司 Rendering method and device of three-dimensional model, storage medium and computer equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106846238A (en) * 2017-03-01 2017-06-13 北京趣酷科技有限公司 A kind of cross-platform automotive engine system of Elf3D
CN107103638A (en) * 2017-05-27 2017-08-29 杭州万维镜像科技有限公司 A kind of Fast rendering method of virtual scene and model
CN108876883A (en) * 2018-05-24 2018-11-23 武汉斗鱼网络科技有限公司 Texture creation method, device, equipment and storage medium based on OpenGLES
CN108984169A (en) * 2017-06-01 2018-12-11 刘开元 A kind of cross-platform Multielement integration development system
CN110659024A (en) * 2019-08-21 2020-01-07 北京达佳互联信息技术有限公司 Graphic resource conversion method, apparatus, electronic device and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9075618B2 (en) * 2012-11-02 2015-07-07 Microsoft Technology Licensing, Llc Cross-platform data visualizations using common descriptions

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106846238A (en) * 2017-03-01 2017-06-13 北京趣酷科技有限公司 A kind of cross-platform automotive engine system of Elf3D
CN107103638A (en) * 2017-05-27 2017-08-29 杭州万维镜像科技有限公司 A kind of Fast rendering method of virtual scene and model
CN108984169A (en) * 2017-06-01 2018-12-11 刘开元 A kind of cross-platform Multielement integration development system
CN108876883A (en) * 2018-05-24 2018-11-23 武汉斗鱼网络科技有限公司 Texture creation method, device, equipment and storage medium based on OpenGLES
CN110659024A (en) * 2019-08-21 2020-01-07 北京达佳互联信息技术有限公司 Graphic resource conversion method, apparatus, electronic device and storage medium

Also Published As

Publication number Publication date
CN111862254A (en) 2020-10-30

Similar Documents

Publication Publication Date Title
CN111862254B (en) Cross-rendering platform-based material rendering method and system
CN108537861B (en) Map generation method, device, equipment and storage medium
CN112116692B (en) Model rendering method, device and equipment
JP5139293B2 (en) Imaging camera processing apparatus and imaging camera processing method
CN111696188B (en) Rendering graph rapid illumination editing method and device and rendering method
US7583264B2 (en) Apparatus and program for image generation
US7173631B2 (en) Flexible antialiasing in embedded devices
CN110599574A (en) Rendering method and device of game scene and electronic equipment
CN111179150B (en) Shader automatic simplification method and system based on drawing instruction stream
JP2008522530A (en) Electronic color image saturation processing method
Sheng et al. Global illumination compensation for spatially augmented reality
CN110288670B (en) High-performance rendering method for UI (user interface) tracing special effect
CN111489430B (en) Game light and shadow data processing method and device and game equipment
CN114004923B (en) WebGL-based three-dimensional model shadow mapping texture rendering method
US9626774B2 (en) Saturation varying color space
CN111383320B (en) Virtual model processing method, device, equipment and storage medium
CN111476861A (en) Image rendering method and device, electronic equipment and storage medium
CN111784814A (en) Virtual character skin adjusting method and device
CN113554554B (en) Image color filtering method and device, electronic equipment and storage medium
JP4583844B2 (en) Image processing apparatus, image processing method, and program
CN112967363A (en) 8K three-dimensional ink-wash animation production method
CN100596166C (en) white point judgement method and correciton method for white balance
CN116188667B (en) Method for realizing map grid tile filter based on GLSL (global navigation satellite system) shader
Hathaway Alpha Blending as a Post-Process
Lakshmi et al. Analysis of tone mapping operators on high dynamic range images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant