CN117078838B - Object rendering method and device, storage medium and electronic equipment - Google Patents


Info

Publication number
CN117078838B
CN117078838B (application CN202310830103.XA)
Authority
CN
China
Prior art keywords
value
curve
fitting
information
illumination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310830103.XA
Other languages
Chinese (zh)
Other versions
CN117078838A (en)
Inventor
陈仁松 (Chen Rensong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Sanbao Information Technology Co., Ltd.
Original Assignee
Shanghai Sanbao Information Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Sanbao Information Technology Co., Ltd.
Priority to CN202310830103.XA
Publication of CN117078838A
Application granted
Publication of CN117078838B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/50: Lighting effects
    • G06T15/80: Shading

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

An object rendering method and device, an electronic device, and a storage medium, wherein the object rendering method comprises the following steps: S1, confirming model information of a target object model, which includes smoothing the vertex normals in the modeling data and then storing the adjusted vertex normals in the vertex colors; S2, obtaining illumination parameter information of the target object model, wherein the illumination parameter information is obtained by fitting curves with curve-fitting functions, and the curve information is stored on a ramp map to express different material effects; and S3, rendering the target object model based on the adjusted vertex normal information and the illumination parameter information. At the production end, an artist adjusts the vertex normals with a normal-smoothing tool and the adjusted vertex normals are saved into the vertex colors; the NPR part is embedded into the illumination calculation of the PBR, and diffuse reflection, specular reflection, and the like are each normalized and passed through the ramp. The rendered object thus presents a physically realistic look while also achieving a cartoon effect, improving the visual quality of the rendered image.

Description

Object rendering method and device, storage medium and electronic equipment
Technical Field
The embodiments of the invention relate to the field of computer applications, and in particular to an object rendering method and device, a storage medium, and electronic equipment.
Background
PBR (Physically Based Rendering) refers to a collection of rendering techniques that are based, to varying degrees, on theory consistent with the physical principles of the real world. PBR derives, simplifies, or approximates a series of rendering equations through various mathematical methods and relies on computer hardware and graphics APIs to render pictures that approximate the real world. In PBR mode, a virtual light source is usually set up to simulate a real illumination environment, so that a virtual model under that light source exhibits a fine illumination effect, improving the expressiveness of the rendered picture.
Non-photorealistic rendering (NPR) is the opposite of PBR rendering: it mainly simulates artistic effects and is also called stylized rendering. In NPR mode the rendering code is usually custom and does not participate in the engine's underlying PBR lighting logic, so a virtual model rendered with NPR is not affected by the virtual light source. NPR is closer to a cartoon feel, makes stylized pictures easier to produce, and is often more visually appealing. To make a virtual model in NPR mode still exhibit illumination effects, a separate lighting system is generally required for the character; however, PBR lighting obeys energy conservation, which a typical NPR setup does not satisfy, so the two modes look disjointed when they appear in the same picture. How to combine the two without conflict is a considerable difficulty.
Disclosure of Invention
The invention aims to provide an object rendering method and device, a storage medium, and electronic equipment, so as to solve the prior-art difficulty of combining the PBR and NPR rendering modes.
A first aspect of the present invention provides an object rendering method, including:
Confirming model information of a target object model, wherein confirming the model information further comprises smoothing the vertex normals in the modeling data and then storing the adjusted vertex normals in the vertex colors;
Obtaining illumination parameter information of a target object model, wherein the illumination parameter information is obtained by fitting a curve by using a curve fitting function, and storing the curve information on a ramp map to express different material effects;
and rendering the target object model based on the adjusted vertex normal information and the illumination parameter information.
Preferably, the "fitting a curve using a curve-fitting function" includes numerical curve fitting of the diffuse-reflection dark-area portion using a ramp map, further comprising:
Determining a v value and a u value of the ramp map, wherein the v value is a preset first fixed value, and the u value is the dot-product-based dark-part threshold scaled by a first fitting constant;
Performing the corresponding curve-fitting output by sampling the ramp-map pixel at the v value and u value, so as to map the input threshold to the corresponding color, wherein the dark-part threshold is calculated as Shading = Shadow × NoL, Shadow being the pre-computed projection (shadow) and NoL the dot product of the normal and the light direction; in this way a hard light-dark contrast is adjusted into a soft light-dark transition. The first fixed value and the first fitting constant are adjustable, and the adjustment conditions are set to tune how well the virtual model matches the environment of the virtual scene.
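The dark-part threshold and ramp lookup described above can be sketched in Python as follows. The one-dimensional ramp row and nearest-neighbour sampling are illustrative assumptions; the fitting-constant default of 0.3333 is taken from the detailed description later in the text.

```python
# Hypothetical sketch of the diffuse dark-area ramp lookup.
# A ramp row (at the fixed v value) is modeled as a 1-D list of colors;
# the dark-part threshold Shading = Shadow * NoL, scaled by a fitting
# constant, selects the u coordinate.

def dark_threshold(shadow, n_dot_l):
    """Shading = Shadow * NoL, clamped to [0, 1]."""
    return max(0.0, min(1.0, shadow * n_dot_l))

def sample_ramp_row(ramp_row, u):
    """Nearest-neighbour sample of a 1-D ramp row at u in [0, 1]."""
    u = max(0.0, min(1.0, u))
    index = min(int(u * len(ramp_row)), len(ramp_row) - 1)
    return ramp_row[index]

def shade_dark_area(ramp_row, shadow, n_dot_l, fit_constant=0.3333):
    """Map the dark-part threshold to a ramp color (soft transition)."""
    u = dark_threshold(shadow, n_dot_l) * fit_constant
    return sample_ramp_row(ramp_row, u)
```

Because the ramp row can hold any artist-authored gradient, the same lookup turns a hard step between lit and shadowed areas into whatever soft transition the row encodes.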
Preferably, the "fitting a curve using a curve-fitting function" includes numerical curve fitting of the highlight GGX-coefficient portion using a ramp map, further comprising:
determining a v value and a u value of the ramp map, wherein the v value is a preset second fixed value, and the u value is taken from the specular-highlight GGX coefficient in the D-term calculation formula;
and performing the corresponding curve-fitting output by sampling the ramp-map pixel at the v value and u value, so as to map the input highlight GGX coefficient to the corresponding color.
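A sketch of the GGX D term feeding the highlight ramp. The Trowbridge-Reitz formula and the roughness remap are standard choices, assumed here since the patent does not spell them out:

```python
import math

def ggx_distribution(n_dot_h, roughness):
    """Trowbridge-Reitz (GGX) normal distribution term D."""
    a2 = roughness ** 4  # common Disney-style remap: alpha = roughness^2
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

def remap_ggx_with_ramp(ramp_row, n_dot_h, roughness):
    """Use the clamped GGX value as the u coordinate into a highlight ramp row."""
    u = max(0.0, min(1.0, ggx_distribution(n_dot_h, roughness)))
    index = min(int(u * len(ramp_row)), len(ramp_row) - 1)
    return ramp_row[index]
```

Routing D through a ramp row lets the art side reshape the highlight falloff (e.g. into a hard-edged cartoon highlight) without touching the rest of the specular term.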
Preferably, the "fitting a curve using a curve-fitting function" includes numerical curve fitting of the ambient-light reflection portion using a ramp map, further comprising:
determining a v value and a u value of the ramp map, wherein the v value is a preset third fixed value, and the u value is taken from the coefficient of the ambient-light reflection calculation formula;
and performing the corresponding curve-fitting output by sampling the ramp-map pixel at the v value and u value, so as to map the input ambient-light reflection to the corresponding color.
Preferably, taking the u value from the coefficient of the ambient-light reflection calculation formula further includes:
The illumination data are stored in a lookup table (LUT), and the LUT map is fitted into a surface. The coefficient of the reflection calculation formula is computed as F = FresnelRamp(NoL × B), where NoL is the dot product of the normal and the light direction and B is a shading-model parameter in the ambient-light BRDF. After the calculated F value passes through the ramp, the transition of the reflection part can be adjusted via the ramp to simulate the hand-drawn reflection changes required by cartoon art.
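A minimal Python sketch of this Fresnel-ramp lookup. The one-dimensional ramp row and nearest-neighbour sampling are illustrative assumptions; FresnelRamp here simply samples that row at u = NoL × B:

```python
def fresnel_ramp_input(n_dot_l, b):
    """F = FresnelRamp(NoL * B): the clamped product is the ramp's u coordinate."""
    return max(0.0, min(1.0, n_dot_l * b))

def fresnel_ramp(ramp_row, n_dot_l, b):
    """Sample the ambient-reflection ramp row at the Fresnel input."""
    u = fresnel_ramp_input(n_dot_l, b)
    index = min(int(u * len(ramp_row)), len(ramp_row) - 1)
    return ramp_row[index]
```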
Preferably, the "fitting a curve using a curve-fitting function" includes performing a numerical curve-fitting on the diffuse dark area of the additional light source using a ramp map, further including:
determining a v value and a u value of the ramp map, wherein the v value is a preset fourth fixed value, and the u value takes the value of dot product according to the dark area of the additional light source and the second fitting constant respectively;
And performing corresponding curve fitting output through the v value and the pixel points of the u value sampling ramp map so as to realize mapping of the diffuse reflection dark part area input into the additional light source to a corresponding color.
Preferably, the present example further includes: after the diffuse-reflection dark-area curve fitting, the highlight GGX-coefficient curve fitting, the ambient-light reflection curve fitting, and the additional-light-source diffuse-reflection dark-area curve fitting, the fitted-curve information of all four parts is stored simultaneously on one ramp.
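The four fitted curves can share one ramp texture by giving each curve its own row. A minimal sketch; only v = 0.125 for the first row is stated in the text, so the remaining row centres (0.375, 0.625, 0.875) and the part names are assumptions:

```python
# Hypothetical packing of the four fitted curves into one ramp texture.
# Each curve occupies one row, addressed by its v value.
RAMP_ROWS = {
    "diffuse_dark":     0.125,  # stated in the text
    "highlight_ggx":    0.375,  # assumed
    "ambient_reflect":  0.625,  # assumed
    "extra_light_dark": 0.875,  # assumed
}

def sample_packed_ramp(ramp, part, u):
    """ramp maps a row's v value to its color list; select the row by part name."""
    row = ramp[RAMP_ROWS[part]]
    index = min(int(max(0.0, min(1.0, u)) * len(row)), len(row) - 1)
    return row[index]
```

Packing all four curves into one texture means a single sampler binding serves the whole stylized lighting path.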
Preferably, the present example further includes:
creating a new spherical harmonic illumination map in Unity;
adding a light source in the scene, and applying the illumination map to the objects in the scene;
Calculating illumination using spherical harmonics in the shader;
The color and intensity of the illumination are calculated using the spherical harmonics provided by Unity: the color in the brightest spherical-harmonic direction is taken as the illumination color; the color is normalized using the brightness ratio of this dominant color (dominantColor) to the spherical-harmonic coding coefficients, and scaled to the brightness in the brightest spherical-harmonic direction;
and applying the calculated illumination color and intensity to the material, and transmitting the information of the spherical harmonic illumination map to the shader, so that the shader can accurately calculate the illumination.
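A rough Python sketch of the dominant-direction and brightness-normalization steps. Unity's actual SphericalHarmonicsL2 API is not reproduced here; the coefficient layout (one linear L1 vector per colour channel) is an assumption for illustration:

```python
import math

def dominant_light_direction(sh_l1):
    """Approximate the brightest spherical-harmonic direction from the three
    linear (L1) coefficient vectors, one per colour channel (assumed layout)."""
    d = [sum(ch[i] for ch in sh_l1) for i in range(3)]  # sum per component
    n = math.sqrt(sum(c * c for c in d)) or 1.0
    return [c / n for c in d]

def normalize_by_brightness(color):
    """Scale a colour so its brightest channel is 1, mirroring the text's
    normalization against the dominant-colour brightness."""
    m = max(color) or 1.0
    return [c / m for c in color]
```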
Preferably, smoothing the vertex normals in the modeling data and then saving the adjusted vertex normals to the vertex colors further comprises:
dividing the object to be cartoonized into all of its sub-components;
and using a normal-smoothing tool to apply smoothing with an adaptive gradient effect to each sub-component, and storing the modified normal information of each vertex.
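The per-vertex smoothing in the two steps above can be sketched as averaging the normals of vertices that share a position, a minimal stand-in for the DCC-side tool; position rounding as the merge criterion is an assumption:

```python
import math
from collections import defaultdict

def smooth_vertex_normals(positions, normals, decimals=5):
    """Average (and renormalize) the normals of all vertices sharing a
    position, producing the smoothed normals to store in vertex colors."""
    groups = defaultdict(list)
    for i, p in enumerate(positions):
        key = tuple(round(c, decimals) for c in p)  # merge coincident vertices
        groups[key].append(i)
    smoothed = [None] * len(normals)
    for indices in groups.values():
        s = [sum(normals[i][k] for i in indices) for k in range(3)]
        length = math.sqrt(sum(c * c for c in s)) or 1.0
        avg = [c / length for c in s]
        for i in indices:
            smoothed[i] = avg
    return smoothed
```

Averaging across split vertices removes the hard faceting that would otherwise break the clean toon outline.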
Rendering the target object model based on the adjusted vertex normal information and the illumination parameter information further comprises:
calling the adjusted vertex normal information and the fitted-curve information of the four parts, and outputting the color of the pre-rendered object according to the BRDF calculation formula;
L_o = ∫_Ω ( f_d · c_diff + f_s · D · F · G / (4 · cosθ_i · cosθ_o) ) · L_i · cosθ_o dω_i
wherein L_o represents the BRDF output color;
f_d represents the diffuse reflection proportion;
c_diff represents the surface color;
f_s is the specular reflection proportion;
D is the normal distribution function;
F is the Fresnel equation coefficient;
G is the geometry function;
cosθ_i is the dot product of the view direction and the normal direction;
cosθ_o is the dot product of the light-source direction and the normal direction;
L_i is the color of the light source;
dω_i is the solid angle element.
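A minimal single-light numerical sketch of this BRDF output, using the symbol list above. Reducing the integral to one light sample and treating all inputs as precomputed scalars/colors is an assumption for illustration:

```python
def brdf_output(f_d, c_diff, f_s, D, F, G, cos_i, cos_o, light_color):
    """One-light evaluation of
    Lo = (f_d*c_diff + f_s*D*F*G / (4*cos_i*cos_o)) * Li * cos_o,
    where cos_i = view.normal and cos_o = light.normal per the symbol list."""
    specular = f_s * D * F * G / (4.0 * cos_i * cos_o)
    return [(f_d * cd + specular) * li * cos_o
            for cd, li in zip(c_diff, light_color)]
```

In the patent's scheme, D, F, and the dark-part diffuse factor would each have been remapped through their ramp rows before reaching this formula, which is what keeps the stylized result inside the energy-conserving PBR structure.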
Compared with the prior art, the invention directly embeds the NPR part into the PBR illumination calculation: diffuse reflection, specular reflection, and the like are each normalized (the original data are linearly transformed and mapped into [0, 1]) and passed through the ramp. Because the NPR part is embedded in the PBR formula, the precondition of energy conservation is preserved, and the scene and the character can share a unified lighting system. The invention realizes cartoon-style rendering and optimizes the fusion of the cartoon-style illumination effect with the environment.
Drawings
In order to more clearly illustrate the technical solution of the embodiments of the present invention, the following description will briefly explain the drawings that are required to be used in the description of the embodiments:
FIG. 1 is an exemplary diagram of an application environment for an object rendering method;
FIG. 2 is a flow chart of an object rendering method;
FIG. 3 is an exemplary graph of smooth adjustment of vertex normals;
Fig. 4A to 4D are schematic views of the smoothing adjustment of the sub-components of the object to be cartoonized, in which fig. 4A is the garment ramp, fig. 4B the face ramp, fig. 4C the skin ramp, and fig. 4D the hair ramp;
Figs. 5A-5B are example object rendering-effect graphs;
Fig. 6 is a schematic diagram of an object rendering electronic device.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, but not all embodiments. The components of the embodiments of the present disclosure, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure provided in the accompanying drawings is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be made by those skilled in the art based on the embodiments of this disclosure without making any inventive effort, are intended to be within the scope of this disclosure.
Online games are deeply favored by users because of the high realism and strong watchability and operability of their game scenes. Different types of games have different scene styles, and presenting rich game scenes through different stylized renderings (for example, realistic-style rendering and cartoon-style rendering) can effectively improve the user's gaming experience. Regardless of the rendering style, the rendering of light and shadow is an important factor affecting the visual effect. Realistic-style light and shadow can be calculated from the surface information of the model and the physical characteristics of illumination, generally yielding shadows with rich detail and fine transitions. The cartoon style has the opposite requirements: on the one hand, it must not have too much detail, which would destroy the clean feel of the picture; on the other hand, the light-dark transition is hard, and shadows and highlights have clear, crisp outlines. According to research, cartoon rendering is commonly realized by hand-painted textures on top of physically based rendering with stylized post-processing matched to the cartoon style, by LUT-image interpolation after shadows are fused into the illumination, or by dual-texture interpolation after shadows are fused into the illumination. Although these approaches can realize cartoon-style rendering, further optimization is needed to support cartoon effects well and to optimize the fusion of the cartoon-style illumination effect with the environment.
The object rendering method provided by the embodiment of the application can be applied to an application environment shown in fig. 1. Wherein the terminal 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process. The data storage system may be integrated on the server 104 or may be located on the cloud or other servers.
Specifically, an application program may be installed on the terminal 102, where the application program provides a real-time rendering function, for example a game application. After starting the application, the terminal 102 performs real-time rendering and displays the rendered image. During real-time rendering, the terminal 102 may determine a current diffuse reflection image and a current specular reflection image, process them to obtain a target diffuse reflection image and a target specular reflection image, and fuse the two to obtain the target image. The current diffuse reflection image is an image obtained by illumination-rendering the scene area observed at the current moment with diffuse-reflection illumination, and the current specular reflection image is an image obtained by illumination-rendering the same scene area with specular-reflection illumination. The terminal 102 may send the target image to the server 104, and the server 104 may store the target image or send it to other devices. The terminal 102 may also display the target image.
The terminal 102 may be, but not limited to, various desktop computers, notebook computers, smart phones, tablet computers, internet of things devices, and portable wearable devices, where the internet of things devices may be smart speakers, smart televisions, smart air conditioners, smart vehicle devices, and the like. The portable wearable device may be a smart watch, smart bracelet, headset, or the like. The server 104 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, basic cloud computing services such as big data and artificial intelligence platforms, and the like. The terminal 102 and the server 104 may be directly or indirectly connected through wired or wireless communication, and the present application is not limited herein.
In some embodiments, as shown in fig. 2, there is provided an object rendering method, which may be performed by a terminal or a server, or may be performed by the terminal and the server together, and the method is applied to the terminal 102 in fig. 1, for example, and includes the following steps:
S110, confirming model information of a target object model, wherein the model information further comprises smooth adjustment of vertex normals in modeling data, and then storing the adjusted vertex normals in vertex colors;
S120, obtaining illumination parameter information of a target object model, wherein the illumination parameter information is obtained by fitting a curve by using a curve fitting function, and storing the curve information on a ramp map to express different material effects;
and S130, rendering the target object model based on the adjusted vertex normal information and the illumination parameter information.
The scene area refers to an area in the virtual scene. The virtual scene refers to a virtual scene that an application program displays (or provides) while running on a terminal. The virtual scene may be a simulation environment for the real world, a semi-simulation and semi-fictional virtual scene, or a pure fictional virtual scene. The Virtual scene may be, for example, a game scene, a VR (Virtual Reality) scene, a cartoon scene, or the like.
At least one virtual object may be included in the scene area. Each virtual object has its own shape and volume in the virtual scene and occupies a portion of its space. The virtual object may be an inanimate object, including but not limited to a building, vegetation, sky, road, mountain rock, or body of water, or an animate object, including but not limited to a virtual animal or a digital person. Digital persons are computer-generated characters that aim to replicate human behavior and personality traits; in other words, realistic 3D (three-dimensional) human models. Digital persons can appear anywhere on the realism spectrum, from fantasy characters representing humans to super-realistic digital actors barely distinguishable from real people. Advances in digital persons are driven mainly by talent and technology from the converging worlds of animation, visual effects, and video games. Digital persons include virtual persons whose identity is fictitious and does not exist in the real world, such as a virtual anchor, and virtual digital persons that emphasize virtual identity and digitized production. This example mainly takes object rendering as an example, and the rendering process is described using a cartoon character as the example object.
The rendered current diffuse reflection image and current specular reflection image mainly refer to the diffuse and specular reflection images of the virtual objects in the scene area. The current diffuse reflection image can be obtained directly or indirectly by illumination-rendering the current object with diffuse-reflection illumination, and the current specular reflection image directly or indirectly with specular-reflection illumination. Both belong to indirect illumination and may also be called diffuse-reflection indirect illumination and specular-reflection indirect illumination, respectively. In diffuse-reflection illumination, photons strike a rough surface and scatter randomly in all directions; in specular illumination, photons bounce in a predictable direction when they hit a strongly reflecting surface such as a mirror. In indirect illumination, light bounces off object surfaces one or more times, where "multiple times" means at least twice.
It should be noted that, the diffuse reflection illumination image corresponding to the current time and the specular reflection illumination image corresponding to the current time may be rendered at the same time, or may be sequentially rendered, for example, the diffuse reflection illumination image corresponding to the current time is rendered first, then the specular reflection illumination image corresponding to the current time is rendered, or the specular reflection illumination image corresponding to the current time is rendered first, then the diffuse reflection illumination image corresponding to the current time is rendered, and the rendering sequence is not limited here.
The core of the invention is that the NPR part is directly embedded into the illumination calculation of the PBR; diffuse reflection, specular reflection, and the like are each normalized (the original data are linearly transformed and mapped into [0, 1]) before the ramp is applied. Because the NPR is embedded into the PBR formula, the precondition of energy conservation is guaranteed, and the scene and the character can share a unified lighting system.
1. Step S110 is specifically described.
Model information of a target object model is obtained, wherein the target object can be a game role, a game prop, a game object, a game building and the like, and the method is not limited herein. The target object model refers to a model constituting various target objects in a game scene. Specifically, the target object model may be a model designed and manufactured in proportion to a game character, a game prop, a game object, a game building, or the like. For example, the target object model may be a scene model, a building model, an animation model, a character model, a prop model, a particle effect model, and the like, which are distinguished according to the type of the game model.
For example, the model information of the target object model may include model surface normal information, which is basic per-vertex information. In some possible embodiments, the model information of the target object model may further include coordinate information, color information, vertex (Vertices) information, primitive (Primitives) information, fragment (Fragments) information, texture (Texture) information, depth information, and the like of the model, which are not limited herein.
In this example, when the model is made, an artist at the production end may adjust the vertex normals (the normal information of each vertex in the modeling data) using a normal-smoothing tool (a plug-in written for the 3ds Max and Maya modeling software), and the adjusted vertex normals are saved into the vertex colors; the ramp map is then distinguished by different thresholds according to different materials, such as skin, hair, clothes, and face. A ramp in rendering generally refers to a color gradient that can be used to create smooth transitions such as gradient backgrounds and illumination simulation. Referring to fig. 3 and 4, the object to be cartoonized is divided into sub-components, including clothing, skin, face, and hair, and the gradient of each sub-component can be configured separately as in fig. 4. A normal-smoothing tool applies smoothing with an adaptive gradient effect to each sub-component, and the modified normal information of each vertex is stored. The gradient can produce a strong sense of perspective and space, and color shade, brightness, and so on can be set smoothly for each sub-component independently. For cartoon virtual objects, the per-vertex normal information does not store such information in the prior art. The invention extends the storage of per-vertex normal information by writing the adaptive-gradient smoothing settings of the sub-components (shown in fig. 4) directly into the per-vertex normal information, which is called directly during subsequent rendering. Clothing, skin, face, and hair are examples only; the sub-components of the object to be cartoonized may be divided at a finer granularity or into other sub-components.
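Saving the adjusted normal into the vertex color requires remapping the signed normal components into the unsigned color range. A minimal sketch; the [-1, 1] to [0, 1] remap convention is an assumption, as the text does not state the encoding:

```python
def normal_to_vertex_color(n):
    """Pack a unit normal from [-1, 1] into the vertex-color range [0, 1]."""
    return [c * 0.5 + 0.5 for c in n]

def vertex_color_to_normal(c):
    """Recover the stored normal in the shader: [0, 1] -> [-1, 1]."""
    return [v * 2.0 - 1.0 for v in c]
```

The shader side applies the inverse remap, so the smoothed normal survives the trip through the color channel unchanged.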
2. Step S120
Physically based rendering (PBR) is a restoration of reality, while NPR is non-photorealistic rendering; the relevant features of the two can be stitched together to present the combined effect. PBR serves the art pipeline: it lets artists use intuitive parameters and a standardized workflow to quickly achieve realistic rendering of a wide range of materials. The characteristics of NPR are transferred into PBR while keeping that usability. The ambient light and material feel brought by PBR are relatively easy to preserve, so NPR mixed into PBR keeps part of the cartoon effect while retaining the PBR material feel. After the NPR characteristics are introduced, the overall PBR calculation flow is unchanged: direct and indirect light still undergo specular-reflection and diffuse-reflection superposition. The key of this step is to perform numerical curve fitting on the light-source parameters, with the fitted curves controlled through a specific ramp map that the rendering shader distinguishes by different thresholds. That is, different fitting formulas apply transitions to the thresholds, the fitted-curve information is stored on a ramp map, and the curves express different material effects. The ramp map is mainly used for numerical curve fitting of four parts: the diffuse-reflection dark area, the highlight GGX coefficient, the ambient-light reflection, and the diffuse-reflection dark area of additional light sources; the fitted-curve information of all four parts is stored on one ramp.
For a ramp on a two-dimensional plane, the u and v values represent a position on it. Typically u and v range over [0, 1], where (0, 0) is the lower-left corner of the ramp and (1, 1) the upper-right. In rendering, the u and v values are used to obtain the color at the corresponding location: first the actual coordinates on the ramp are calculated from u and v, then the ramp texture is sampled at those coordinates to obtain the corresponding color value.
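The u/v sampling just described can be sketched as follows. Nearest-neighbour sampling and the lower-left origin follow the text; the row-major texel layout is an assumption:

```python
def uv_to_pixel(u, v, width, height):
    """Map (u, v) in [0,1]^2 (origin at the lower-left corner) to integer
    pixel coordinates, clamped to the texture bounds."""
    u = max(0.0, min(1.0, u))
    v = max(0.0, min(1.0, v))
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return x, y

def sample_texture(texels, u, v):
    """texels is a row-major 2-D list; row 0 is the bottom row of the ramp."""
    x, y = uv_to_pixel(u, v, len(texels[0]), len(texels))
    return texels[y][x]
```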
Curve fitting the thresholds means fitting a function using the pixel values of the ramp map, which is able to map the input thresholds to the corresponding colors.
Specifically, assume that there is a ramp map that contains a series of colors under different thresholds. There is also a set of known thresholds and corresponding ramp map pixel values. From these known data, a functional model is found that can input arbitrary thresholds and output the corresponding colors.
To achieve this goal, a curve fitting method may be used. The goal of curve fitting is to find a functional model that matches as closely as possible to known data points. In this case, a function is found which accepts the threshold as input and outputs the corresponding color.
Common curve-fitting methods include polynomial fitting, exponential fitting, logarithmic fitting, and so on; a suitable function model is selected according to the actual situation. The known thresholds and corresponding ramp-map pixel values may be used as training data, the parameters of the fitting function are obtained through the fitting process, and these parameters can then be used to calculate the color under any threshold.
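As a concrete instance, one of the fitting choices mentioned above, a least-squares straight-line fit of known (threshold, pixel-value) pairs, can be written as:

```python
def fit_line(xs, ys):
    """Least-squares straight-line fit y = a*x + b to the known
    (threshold, ramp-pixel) pairs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def predict(a, b, threshold):
    """Color value predicted by the fitted function for a new threshold."""
    return a * threshold + b
```

Higher-degree polynomial, exponential, or logarithmic models follow the same pattern: fit parameters on the known pairs, then evaluate the model at any threshold.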
In this step, the pixel values of the ramp map are used to curve-fit the threshold, sampling with the map's v and u values, so the previous code can be adapted to the new requirement. In this case, the sampled v and u values must be provided as inputs, with the ramp-map pixel values as the fitting outputs.
S121, "fitting a curve using a curve-fitting function" includes numerical curve fitting of diffuse-reflecting dark area portions using a ramp map, further comprising:
Determining a v value and a u value of the ramp map, wherein the v value is a preset first fixed value, and the u value is the dot-product-based dark-part threshold multiplied by a first fitting constant;
And performing corresponding curve fitting output through the v value and the pixel points of the u value sampling ramp map so as to realize mapping the input threshold value to a corresponding color.
For the dark-part threshold portion of the diffuse reflection color, the invention samples the ramp map at v = 0.125 (the first fixed value is 0.125), with the u value taken as the dot-product-based dark-part threshold multiplied by 0.3333 (the first fitting constant), and then performs the corresponding curve fitting on the dark-part threshold. The dark-part threshold portion of the diffuse reflection color mainly controls the transition between dark and bright areas; the dark-part threshold is calculated as Shading = Shadow × NoL, where Shadow is the pre-computed projection (which can be preset in Unity) and NoL is the dot product of the normal and the light source.
This scheme turns a harsh light-dark contrast into a soft light-dark transition. The first fixed value and the first fitting constant can then be adjusted to the specific simulation conditions; the core criterion is the degree of matching between the virtual model and the environment provided by the virtual scene, so that the virtual model has a cartoon look while better matching the light-and-shadow effects in the scene. For example, a simulation model can be used to output rendering-effect diagrams for different values of the first fixed value and the first fitting constant; the diagram in which the virtual model best matches the environment provided by the virtual scene is selected, and its first fixed value and first fitting constant are the values adopted.
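The u/v computation for this step can be sketched as below, using the constants 0.125 and 0.3333 given in the text; the function name and the clamped dot product are illustrative assumptions.

```python
def dark_threshold_uv(shadow, normal, light_dir,
                      first_fit_const=0.3333, first_fixed_v=0.125):
    """Compute the (u, v) used to sample the ramp for the dark-part threshold.

    shadow: pre-computed projection term; normal, light_dir: 3D vectors.
    """
    # NoL: dot product of the surface normal and the light direction,
    # clamped to [0, 1] as is usual in shading.
    nol = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    shading = shadow * nol          # dark-part threshold: Shading = Shadow * NoL
    u = shading * first_fit_const   # u = threshold * first fitting constant (0.3333)
    v = first_fixed_v               # v fixed at the first fixed value (0.125)
    return u, v
```

The returned (u, v) would then index the ramp map to fetch the color that replaces the hard light-dark cutoff.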
S122 "fitting a curve using a curve-fitting function" includes numerical curve fitting of the highlight GGX coefficient portions using a ramp map, further comprising:
determining a v value and a u value of the ramp map, wherein the v value is a preset second fixed value, and the u value takes a value according to a specular high light GGX coefficient in a D term calculation formula;
And performing corresponding curve fitting output through the pixel points of the v value and the u value sampling ramp map so as to realize mapping of the input highlight GGX coefficient part to the corresponding color.
This step mainly concerns the normal distribution function (D term), e.g. remapping the range of the GGX highlight. The specular highlight distribution function is part of the bidirectional reflectance distribution function (BRDF); here GGX serves as the specular distribution term of the BRDF. In the D term, the specular reflection factor SpecularFactor computed by GGX acts, in material terms, like a simulation of how the material reflects light; applying a ramp to SpecularFactor imitates the hand-drawn light-reflection art effect of cartoons, making the parameter's transitions feel closer to something painted.
Briefly, a specular reflection factor (SpecularFactor) is calculated whose value relates to the cosine of the angle between the reflected ray and the vector from the point of incidence to the observer. The specular effect is visible only when this angle is less than 90 degrees, so it is checked whether the specular reflection factor is greater than 0. The final highlight color is the product of the illumination color, the material's specular intensity, and the specular reflection factor. The highlight color, ambient-light color, and diffuse-reflection color are added to give the overall illumination color. Finally this value is multiplied by the value sampled from the texture, and the result is taken as the final color of the pixel. The value can be mapped into [0, 1], and the SpecularFactor parameter value that best reproduces the hand-drawn light-reflection effect can be found through multiple trials.
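The per-pixel combination just described can be sketched as follows; `shade_pixel` and its argument names are illustrative assumptions, and colors are plain RGB tuples.

```python
def shade_pixel(light_color, spec_intensity, spec_factor,
                ambient, diffuse, tex_sample):
    """Combine highlight, ambient and diffuse terms, then modulate by texture.

    Specular contributes only when the reflection factor is positive
    (i.e. the angle between reflected ray and view vector is < 90 degrees).
    """
    sf = max(spec_factor, 0.0)
    out = []
    for i in range(3):  # per RGB channel
        specular = light_color[i] * spec_intensity * sf
        total = specular + ambient[i] + diffuse[i]
        out.append(total * tex_sample[i])
    return tuple(out)
```

For example, with a white light, specular intensity 0.5 and factor 0.5, plus small ambient and diffuse terms, the specular contribution simply vanishes when the factor goes negative, matching the 90-degree check above.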
S123 "fitting a curve using a curve-fitting function" includes numerical curve fitting of the ambient light reflecting portion using a ramp map, further comprising:
determining a v value and a u value of the ramp map, wherein the v value is a preset third fixed value, and the u value takes a value according to coefficients of an ambient light reflection calculation formula respectively;
and performing corresponding curve fitting (fitting by adopting an approximate fitting function of an illumination LUT graph) output through the pixel points of the v value and the u value sampling ramp map so as to realize mapping of the input ambient light reflection part to a corresponding color.
For example, the illumination LUT map stores its data in a lookup table (LUT). The LUT is further fitted into a surface so that it can be evaluated directly in the shader, saving one texture-sampling pass. Given coefficients A and B of the light-reflection formula, the smoothed ambient-light reflection is computed as F = FresnelRamp(NoL × B), where NoL is the dot product of the normal and the light source.
When fitting with an approximate fitting function of the illumination LUT map, the incoming coefficient B is mapped in the fitting function after the following computation: B = 1.04 × a004 + r.w, with a004 = min(r.x × r.x, exp2(-9.28 × NoV)) × r.x + r.y.
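The formula for B matches the shape of the well-known mobile environment-BRDF approximation, in which the vector r is derived from roughness via two constant vectors. Those constants (c0, c1) are not given in the source and are an assumption here; only the a004 and B lines come from the text.

```python
def env_brdf_approx(roughness, nov):
    """Approximate the ambient-BRDF coefficient B from roughness and NoV.

    c0/c1 are the constants of the common mobile EnvBRDF approximation,
    assumed here because the source does not define r.
    """
    c0 = (-1.0, -0.0275, -0.572, 0.022)
    c1 = (1.0, 0.0425, 1.04, -0.04)
    r = tuple(roughness * a + b for a, b in zip(c0, c1))  # r = roughness*c0 + c1
    # a004 = min(r.x*r.x, exp2(-9.28*NoV)) * r.x + r.y, as in the text.
    a004 = min(r[0] * r[0], 2.0 ** (-9.28 * nov)) * r[0] + r[1]
    # B = 1.04 * a004 + r.w, as in the text.
    return 1.04 * a004 + r[3]
```

The result would then be passed through FresnelRamp together with NoL, as described above.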
The coefficient B is a coloring model parameter in the ambient light BRDF, and after the calculated F value passes through the ramp, the transition of the reflecting part can be adjusted according to the ramp to simulate the method of drawing the reflection change in the cartoon so as to meet the requirements of art.
S124, "fitting a curve using a curve-fitting function" includes numerical curve fitting of the diffuse dark area of the additional light source using a ramp map, further comprising:
determining a v value and a u value of the ramp map, wherein the v value is a preset fourth fixed value, and the u value takes the value of dot product according to the dark area of the additional light source and the second fitting constant respectively;
And performing the corresponding curve-fitting output through the pixel points of the ramp map sampled by the v value and u value, so as to map the input additional-light-source diffuse-reflection dark area to the corresponding color.
For example: for the diffuse-reflection dark area of the additional light source, the v value of the map is taken as 0.875, the u value is taken as the dot-product value of the additional-light-source dark area multiplied by 0.3333, the pixel points of the ramp map are sampled, and the corresponding smoothing mapping is then applied.
Generally, in order to improve the efficiency of real-time rendering, four parts, i.e. a diffuse reflection dark part area, a high-light GGX coefficient, an ambient light reflection and an additional light source diffuse reflection dark part area, are subjected to flattening transition treatment, and the coefficients of the four parts are stored on one ramp.
In step S120, the method may further include normalizing the Unity spherical harmonic illumination brightness, that is, flattening the spherical harmonic illumination color according to the ratio of brightness of the spherical harmonic illumination color to the spherical harmonic coding.
First, the basic illumination flow using spherical harmonics is introduced:
first, a new spherical harmonic lighting map is created in Unity (spherical harmonic lighting).
A light source is added to the scene and its type is set to "Realtime" or "Mixed" in its settings. Realtime light sources may be used in real-time rendering, while mixed light sources may be used for mixed-reality and virtual-reality applications.
The illumination map is applied to objects in the scene. This may be achieved by using the illumination map as an environmental map or as a texture map.
The spherical harmonics are used in the shader to calculate the illumination. Spherical harmonics are a set of basis functions that can represent arbitrary functions on a sphere. Spherical harmonics can be used to represent the intensity and direction of ambient light, thereby reducing the computational effort.
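The second-order (L2) real spherical-harmonic basis mentioned here consists of nine functions; evaluating them at a unit direction is a small piece of standard math that can be sketched directly (the constants are the usual normalization factors of the real SH basis):

```python
def sh_basis_l2(d):
    """Evaluate the nine real spherical-harmonic basis functions (up to
    order 2) at unit direction d = (x, y, z)."""
    x, y, z = d
    return [
        0.282095,                        # Y0,0
        0.488603 * y,                    # Y1,-1
        0.488603 * z,                    # Y1,0
        0.488603 * x,                    # Y1,1
        1.092548 * x * y,                # Y2,-2
        1.092548 * y * z,                # Y2,-1
        0.315392 * (3.0 * z * z - 1.0),  # Y2,0
        1.092548 * x * z,                # Y2,1
        0.546274 * (x * x - y * y),      # Y2,2
    ]
```

Ambient light at a surface point is then reconstructed as the dot product of these nine basis values with the nine stored color coefficients, which is why SH lighting is so cheap at runtime.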
The color and intensity of the illumination are calculated using the spherical harmonics provided by Unity. In Unity, the SphericalHarmonicsL2 structure may be used to represent spherical harmonics. The contribution of ambient light can be added with the SphericalHarmonicsL2.AddAmbientLight method, the contribution of directional light with the SphericalHarmonicsL2.AddDirectionalLight method, and the contribution of point light with the SphericalHarmonicsL2.AddPointLight method.
The calculated illumination color and intensity are then applied to the material. The Shader.SetGlobalVector method may be used to pass the information of the spherical harmonic lighting map into the shader, so that the shader can calculate the illumination correctly.
A mature engine such as Unity already stores the processed cubemap for us. Unity contains such a set of variables:
This is the global illumination encoded with spherical harmonics after integration. That is, Unity integrates the environment-map cubemap into a blurred global illumination map and then projects that map onto the basis functions of spherical harmonic lighting for storage; the seven parameters are the stored basis-function coefficients. The basis functions used by Unity are the third-order associated Legendre polynomials. In this example, unity_SHAr, unity_SHAg and unity_SHAb are Unity's integrated, spherical-harmonic-encoded global illumination and represent the brightest direction of the spherical harmonics; colD is the spherical harmonic illumination color, which is normalized by the ratio of its brightness to that of the spherical-harmonic coding coefficient dominantColor and scaled to the brightness of the brightest spherical-harmonic direction.
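The brightness normalization of colD can be sketched as below; the function name and the Rec.601 luminance weights are assumptions (the source does not say how "brightness" is measured), and colors are RGB tuples.

```python
def normalize_sh_color(col_d, dominant_color):
    """Flatten the SH lighting colour: scale colD by the ratio of the
    dominant (brightest-direction) brightness to its own brightness,
    so colD's brightness matches the brightest SH direction."""
    def luminance(c):
        r, g, b = c
        return 0.299 * r + 0.587 * g + 0.114 * b  # Rec.601 weights (assumed)
    lum = luminance(col_d)
    if lum == 0.0:
        return col_d  # avoid division by zero for a black colour
    scale = luminance(dominant_color) / lum
    return tuple(ch * scale for ch in col_d)
```

For instance, a mid-grey colD scaled against a white dominantColor is doubled, so its brightness equals that of the brightest direction.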
3. Step S130
Rendering the target object model based on the adjusted vertex normal information and the illumination parameter information further comprises:
Calling the adjusted vertex normal information and the adjusted fitting curve information of the four parts, and outputting color to the prerendered object according to a BRDF calculation formula;
wherein L_o, i.e. L_o(p, w_o), represents the BRDF output color;
f_d represents the diffuse reflection proportion (1 - metallic);
c_diff represents the surface color (the threshold used to calculate the color dark part is the first parameter, Diffuse, fitted in step S120);
f_s is the specular reflection proportion;
D is the normal distribution function (the calculated GGX coefficient is the second parameter, Specular, fitted in step S120);
F is the Fresnel equation coefficient (made according to the third parameter fitted in step S120);
G is the geometric function;
cos θ_i is the dot product of the line-of-sight direction and the normal direction;
cos θ_o is the dot product of the light-source direction and the normal direction;
L_i is the color of the light source;
dw_i is the solid angle.
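The symbol list above matches the standard Cook-Torrance form of the rendering equation; the patent's own formula is not reproduced in the text, so the following is a reconstruction under that assumption:

```latex
L_o(p,\omega_o) = \int_{\Omega}
  \Bigl( f_d \, c_{\mathrm{diff}}
       + f_s \, \frac{D \, F \, G}{4 \cos\theta_i \cos\theta_o} \Bigr)
  \, L_i \, \cos\theta_i \, \mathrm{d}\omega_i
```

Each fitted ramp parameter from step S120 slots into one factor of this integrand: Diffuse into c_diff, Specular into D, and the third parameter into F.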
The original formula can be extended with the fourth parameter fitted in step S120, namely the c term of the additional light source (Additional).
When an object-rendering event is started, the rendering module (essentially the formula above, or a transformed version of it) calls the relevant ramp and other resources to perform real-time rendering. Fig. 5A-5B are schematic diagrams illustrating the final display effect provided by the embodiment of the invention. The final display effect of the object to be rendered shown in Fig. 5 is obtained after executing step S130.
An object rendering apparatus, the apparatus comprising:
Vertex normal smooth adjustment module: used to confirm the model information of the target object model, which further comprises smoothing the vertex normals in the modeling data and then saving the adjusted vertex normals into the vertex colors.
When the model is made, the normals are adjusted with a normal-smoothing tool, e.g. for skin, hair, clothing, face, etc. Each such type may share a parent material, and modifying the parent material changes every virtual model that uses it. In game development, however, the gradual-change effect may be needed for only part of a virtual model's materials. This process mainly introduces cartoon elements and performs the smooth adjustment using material gradients.
Illumination parameter normalization adjustment module: used to obtain the illumination parameter information of the target object model, where the illumination parameter information is obtained by fitting a curve with a curve-fitting function and the curve information is stored on a ramp map to express different material effects; after normalization (a linear transform of the original data that maps it into [0, 1]), the ramp is used for diffuse reflection and specular reflection respectively.
And a rendering module: and rendering the target object model based on the adjusted vertex normal information and the illumination parameter information.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the object rendering method described above in the present specification. In some possible implementations, aspects of the present disclosure may also be implemented in the form of a program product comprising program code for causing an electronic device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the detailed description section above when the program product is run on the electronic device. The program product may employ a portable compact disc read-only memory (CD-ROM) and comprise program code and may be run on an electronic device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
The exemplary embodiment of the disclosure also provides an electronic device capable of implementing the rendering method. An electronic device 600 according to such an exemplary embodiment of the present disclosure is described below with reference to fig. 6. The electronic device 600 shown in fig. 6 is merely an example and should not be construed to limit the functionality and scope of use of embodiments of the present disclosure in any way. As shown in fig. 6, the electronic device 600 may be embodied in the form of a general purpose computing device. Components of electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different system components (including the memory unit 620 and the processing unit 610), and a display unit 640. The storage unit 620 stores program codes that can be executed by the processing unit 610, so that the processing unit 610 performs steps according to various exemplary embodiments of the present disclosure described in the above section of the specification. In particular, a program product stored on a computer readable storage medium may cause an electronic device to perform the steps of: confirming model information of a target object model, wherein the model information further comprises smooth adjustment of vertex normals in modeling data, and then storing the adjusted vertex normals into vertex colors; obtaining illumination parameter information of a target object model, wherein the illumination parameter information is obtained by fitting a curve by using a curve fitting function, and storing the curve information on a ramp map to express different material effects; and rendering the target object model based on the adjusted vertex normal information and the illumination parameter information.
Note that this is just a simple example, and the actual fitting process may require more complex model selection and parameter adjustment depending on the actual situation. In addition, curve fitting may be affected by data noise and the number of samples, so that attention is paid to adjustment parameters and data quality in practical applications.

Claims (12)

1. An object rendering method, comprising:
confirming model information of a target object model, wherein the model information further comprises smooth adjustment of vertex normals in modeling data, and then storing the adjusted vertex normals into vertex colors;
Obtaining illumination parameter information of a target object model, wherein the illumination parameter information is obtained by fitting a curve by using a curve fitting function, and storing the curve information on a ramp map to express different material effects, and the curve information is the illumination parameter information of the target object model;
"fitting a curve using a curve-fitting function" includes numerical curve fitting of diffuse-reflecting dark area portions using a ramp map, further comprising:
determining a v value and a u value of the ramp map, wherein the v value is a preset first fixed value, and the u value is the dot-product-based dark-part threshold multiplied by a first fitting constant;
performing the corresponding curve-fitting output through the pixel points of the ramp map sampled by the v value and u value, so as to map an input threshold to a corresponding color, wherein the dark-part threshold is calculated as Shading = Shadow × NoL, Shadow being the pre-computed projection and NoL the dot product of the normal and the light-source direction, and a harsh light-dark contrast is adjusted into a soft light-dark transition; the first fixed value and the first fitting constant are adjustable, the adjustment criterion being the degree of matching between the virtual model and the environment provided by the virtual scene;
And rendering the target object model based on the adjusted vertex normal information and the illumination parameter information.
2. The method of claim 1, wherein fitting a curve using a curve fitting function comprises numerically curve fitting a highlight GGX coefficient portion using a ramp map, further comprising:
Determining a v value and a u value of the ramp map, wherein the v value is a preset second fixed value, and the u value takes the value of a specular high-light GGX coefficient in a normal distribution function D term calculation formula;
And performing corresponding curve fitting output through the pixel points of the v value and the u value sampling ramp map so as to realize mapping of the input highlight GGX coefficient part to the corresponding color.
3. The method of claim 1, wherein fitting a curve using a curve-fitting function comprises numerically curve fitting an ambient light reflecting portion using a ramp map, further comprising:
determining a v value and a u value of the ramp map, wherein the v value is a preset third fixed value, and the u value takes a value according to a coefficient of an ambient light reflection calculation formula;
and performing corresponding curve fitting output through the pixel points of the v value and the u value sampling ramp map so as to realize mapping of the input ambient light reflection part to a corresponding color.
4. The method of claim 3, wherein the coefficient values according to the ambient light reflection calculation formula further comprise:
The illumination LUT map stores data in a lookup table (LUT), and the LUT map is fitted into a surface; with coefficient B of the light-reflection calculation formula, the smoothed reflection is computed as F = FresnelRamp(NoL × B), where NoL is the dot product of the normal and the light source; the coefficient B is a shading-model parameter in the ambient-light BRDF, and after the calculated F value passes through the ramp, the transition of the reflecting part can be adjusted via the ramp to simulate the drawn reflection changes of cartoons, as required by the art style.
5. The method of claim 1, wherein fitting a curve using a curve-fitting function comprises numerically curve fitting an additional light source diffuse dark region using a ramp map, further comprising:
determining a v value and a u value of the ramp map, wherein the v value is a preset fourth fixed value, and the u value takes the value of dot product according to the dark area of the additional light source and the second fitting constant respectively;
And performing corresponding curve fitting output through the v value and the pixel points of the u value sampling ramp map so as to realize mapping of the diffuse reflection dark part area input into the additional light source to a corresponding color.
6. The method of any one of claims 2 to 5, further comprising: and after the diffuse reflection dark part area numerical value curve fitting, the highlight GGX coefficient numerical value curve fitting, the ambient light reflection numerical value curve fitting and the additional light source diffuse reflection dark part area numerical value curve fitting, fitting curve information of four parts is stored on one ramp at the same time.
7. The method as recited in claim 1, further comprising:
creating a new spherical harmonic illumination map in Unity;
adding a light source in the scene, and applying the illumination map to the objects in the scene;
Calculating illumination using spherical harmonics in the shader;
the color and intensity of the illumination are calculated using the spherical harmonics provided by Unity: the spherical harmonic illumination color is normalized by the ratio of its brightness to that of the spherical-harmonic coding coefficient dominantColor, and is scaled to the brightness of the brightest spherical-harmonic direction;
and applying the calculated illumination color and intensity to the material, and transmitting the information of the spherical harmonic illumination map to the shader, so that the shader can accurately calculate the illumination.
8. The method of claim 1, wherein smoothing the vertex normals in the modeling data and then saving the adjusted vertex normals information to the vertex color further comprises:
dividing the object to be cartoon-rendered into all of its sub-components;
and using a normal line smoothing tool to carry out smoothing setting with an adaptive gradual change effect on each sub-component respectively, and storing the modified normal line information of each vertex.
9. The method of claim 8, wherein rendering the target object model based on the adjusted vertex normal information and the illumination parameter information further comprises:
Calling the adjusted vertex normal information and the adjusted fitting curve information of the four parts, and outputting color to the prerendered object according to a BRDF calculation formula;
wherein L_o represents the BRDF output color;
f_d represents the diffuse reflection proportion;
c_diff represents the surface color;
f_s is the specular reflection proportion;
D is the normal distribution function;
F is the Fresnel equation coefficient;
G is the geometric function;
cos θ_i is the dot product of the line-of-sight direction and the normal direction;
cos θ_o is the dot product of the light-source direction and the normal direction;
L_i is the color of the light source;
dw_i is the solid angle.
10. An object rendering apparatus, the apparatus comprising:
vertex normal smooth adjustment module: the model information is used for confirming the model information of the target object model, and further comprises the steps of smoothly adjusting the vertex normals in the modeling data, and then storing the adjusted vertex normals into the vertex color;
Illumination parameter normalization adjustment module: the method comprises the steps of obtaining illumination parameter information of a target object model, wherein the illumination parameter information is obtained by fitting a curve by using a curve fitting function, and storing the curve information on a ramp map to express different material effects, and the curve information is the illumination parameter information of the target object model;
the curve fitting includes numerical curve fitting of a diffuse-reflection dark area portion using a ramp map, further comprising:
determining a v value and a u value of the ramp map, wherein the v value is a preset first fixed value, and the u value is the dot-product-based dark-part threshold multiplied by a first fitting constant;
performing the corresponding curve-fitting output through the pixel points of the ramp map sampled by the v value and u value, so as to map an input threshold to a corresponding color, wherein the dark-part threshold is calculated as Shading = Shadow × NoL, Shadow being the pre-computed projection and NoL the dot product of the normal and the light-source direction, and a harsh light-dark contrast is adjusted into a soft light-dark transition; the first fixed value and the first fitting constant are adjustable, the adjustment criterion being the degree of matching between the virtual model and the environment provided by the virtual scene;
And a rendering module: and rendering the target object model based on the adjusted vertex normal information and the illumination parameter information.
11. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements the object rendering method according to any one of claims 1 to 9.
12. An electronic device, comprising: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the object rendering method of any of claims 1 to 9.
CN202310830103.XA 2023-07-07 2023-07-07 Object rendering method and device, storage medium and electronic equipment Active CN117078838B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310830103.XA CN117078838B (en) 2023-07-07 2023-07-07 Object rendering method and device, storage medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN117078838A CN117078838A (en) 2023-11-17
CN117078838B true CN117078838B (en) 2024-04-19

Family

ID=88716013

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310830103.XA Active CN117078838B (en) 2023-07-07 2023-07-07 Object rendering method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN117078838B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103164855A (en) * 2013-02-26 2013-06-19 清华大学深圳研究生院 Bayesian Decision Theory foreground extraction method combined with reflected illumination
CN112053423A (en) * 2020-09-18 2020-12-08 网易(杭州)网络有限公司 Model rendering method and device, storage medium and computer equipment
CN113012273A (en) * 2021-03-24 2021-06-22 网易(杭州)网络有限公司 Illumination rendering method, device, medium and equipment based on target model
CN114998501A (en) * 2022-04-22 2022-09-02 网易(杭州)网络有限公司 Color gradation rendering method and device for model
CN115845369A (en) * 2022-12-27 2023-03-28 北京字跳网络技术有限公司 Cartoon style rendering method and device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2536964B (en) * 2015-04-02 2019-12-25 Ge Aviat Systems Ltd Avionics display system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Analysis of 3D Game Engine Architecture and Game Animation Rendering Technology; Chen Lingchao et al.; China New Technologies and New Products; 2022-01-31 (Issue 1) *

Also Published As

Publication number Publication date
CN117078838A (en) 2023-11-17

Similar Documents

Publication Publication Date Title
US11694392B2 (en) Environment synthesis for lighting an object
CN110599574B (en) Game scene rendering method and device and electronic equipment
US9619920B2 (en) Method and system for efficient modeling of specular reflection
Lu et al. Illustrative interactive stipple rendering
KR102173546B1 (en) Apparatus and method of rendering game objects
Okabe et al. Illumination brush: Interactive design of all-frequency lighting
US9905045B1 (en) Statistical hair scattering model
WO2023109486A1 (en) Hair model generation method and apparatus, electronic device, and storage medium
CN116228943B (en) Virtual object face reconstruction method, face reconstruction network training method and device
Marques et al. Deep spherical harmonics light probe estimator for mixed reality games
JP4975159B2 (en) Colorless lighting in a graphics system, method and program for generating graphics images
CN115082607A (en) Virtual character hair rendering method and device, electronic equipment and storage medium
CN117078838B (en) Object rendering method and device, storage medium and electronic equipment
US20180005432A1 (en) Shading Using Multiple Texture Maps
US20030025706A1 (en) System and method for rendering a texture map utilizing an illumination modulation value
US7164421B2 (en) Image generation system, program, and information storage medium
Mortensen et al. Real-time global illumination for vr applications
CN116883580B (en) Silk stocking object rendering method and device
US7710419B2 (en) Program, information storage medium, and image generation system
US7724255B2 (en) Program, information storage medium, and image generation system
CN117745915B (en) Model rendering method, device, equipment and storage medium
CN117671110B (en) Real-time rendering system and method based on artificial intelligence
CN116883567A (en) Fluff rendering method and device
CN117409131A (en) Model rendering method and device, computer readable storage medium and electronic equipment
CN117218271A (en) Dough sheet generation method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant