CN113838155A - Method and device for generating material map and electronic equipment - Google Patents

Method and device for generating material map and electronic equipment

Info

Publication number
CN113838155A
Authority
CN
China
Prior art keywords
rendered
model
information
parameter
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110976154.4A
Other languages
Chinese (zh)
Other versions
CN113838155B (en)
Inventor
杨斌
胡欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110976154.4A priority Critical patent/CN113838155B/en
Publication of CN113838155A publication Critical patent/CN113838155A/en
Application granted granted Critical
Publication of CN113838155B publication Critical patent/CN113838155B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/40 Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a method and a device for generating a material map, and an electronic device. The method comprises the following steps: obtaining a parameter file corresponding to a material file of a preset material type, wherein the parameter file comprises material parameters for generating the material file; receiving an adjustment instruction and adjusting the material parameters corresponding to the parameter file to obtain adjusted material parameters; and generating a material map based on the adjusted material parameters. The invention solves the technical problem of low production efficiency of material maps in the related art.

Description

Method and device for generating material map and electronic equipment
Technical Field
The invention relates to the field of procedural map production, and in particular to a method and a device for generating a material map, and an electronic device.
Background
In the field of map making, map assets and material files are typically created using material-authoring software such as Substance Designer, while maps for virtual models are typically created using texturing software such as Substance Painter.
In the traditional process of producing a hand-drawn-style material map, an artist working in SP (short for Substance Painter) first fills a layer and then controls the display range of the layer through a mask, or paints on an empty layer with a brush, to obtain the map.
However, this method of producing maps requires a large number of masks and generators, which reduces the efficiency of producing hand-drawn materials. In addition, a large number of layers is generally used when making a map in SP, and as the number of layers and hand-drawn masks grows, the map file becomes too large, which further reduces the efficiency of map production. The existing method also suffers from poor standardization, troublesome modification, and inconvenient outsourcing management.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the present invention provide a method and a device for generating a material map, and an electronic device, in order to at least solve the technical problem of low production efficiency of material maps in the related art.
According to an aspect of an embodiment of the present invention, a method for generating a material map is provided, comprising: obtaining a parameter file corresponding to a material file of a preset material type, wherein the parameter file comprises material parameters for generating the material file; receiving an adjustment instruction and adjusting the material parameters corresponding to the parameter file to obtain adjusted material parameters; and generating a material map based on the adjusted material parameters.
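The three claimed steps can be sketched as follows. All names, the preset values, and the dict-based parameter file are illustrative assumptions for this sketch, not part of the patent or of any Substance API:

```python
# Hypothetical sketch of the claimed flow: load a parameter file for a
# preset material type, apply an adjustment instruction, regenerate the map.

def load_parameter_file(material_type):
    # In practice this would be read from a file authored in Substance
    # Designer; here we return a plain dict of material parameters.
    presets = {
        "hand_drawn": {"light_angle": 45.0, "block_contrast": 0.6,
                       "stylization": 0.8},
    }
    return dict(presets[material_type])

def apply_adjustment(params, instruction):
    # An "adjustment instruction" is modeled as a dict of parameter
    # overrides; only parameters already in the file may be changed.
    adjusted = dict(params)
    for key, value in instruction.items():
        if key not in adjusted:
            raise KeyError(f"unknown material parameter: {key}")
        adjusted[key] = value
    return adjusted

def generate_material_map(params):
    # Stand-in for the actual map synthesis: return a description string.
    return (f"map(light={params['light_angle']}, "
            f"contrast={params['block_contrast']})")

params = load_parameter_file("hand_drawn")
params = apply_adjustment(params, {"light_angle": 30.0})
material_map = generate_material_map(params)
```

Note that, as in the claim, the user only ever touches named parameters; no mask or layer object appears anywhere in the flow.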
Further, the method for generating the material map further comprises the following steps: before receiving the adjustment instruction and adjusting the material parameters corresponding to the parameter file to obtain the adjusted material parameters, obtaining initial material parameters corresponding to the parameter file; and generating an initial material map based on the initial material parameters.
Further, the method for generating the material map further comprises the following steps: determining a material parameter to be adjusted corresponding to the parameter file and a parameter value corresponding to the material parameter to be adjusted according to the model to be rendered; and adjusting the material parameter to be adjusted based on the parameter value to obtain the adjusted material parameter.
Further, the method for generating the material map further comprises the following steps: determining the volume relationship of the model to be rendered according to the spatial coordinate information of the model to be rendered to obtain a volume parameter; determining color information of the model to be rendered to obtain a color parameter; determining structure information and light and shadow information of the model to be rendered, wherein the light and shadow information at least comprises an illumination parameter of the model to be rendered, and the structure information at least comprises one of the following: light and shade information of the model to be rendered, three-dimensional layer information of the model to be rendered, and target highlight information of the model to be rendered; and generating the parameter file based on the volume parameter, the color parameter, the structure information and the light and shadow information.
Further, the method for generating the material map further comprises: acquiring a position map corresponding to the model to be rendered, wherein the position map at least comprises data describing a color gradient of the model to be rendered along a preset direction; acquiring an environment mask map corresponding to the model to be rendered, wherein the environment mask map at least comprises illumination information corresponding to the distance between a preset light source and the model to be rendered; and blending the position map and the environment mask map to obtain the volume parameter.
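One plausible reading of this blending step, with the position map as a vertical gradient and the environment mask as a uniform ambient-occlusion-like attenuation (both choices are assumptions, as is the multiplicative mix), is a per-texel multiply:

```python
import numpy as np

h, w = 4, 4
# Position map: color gradient along a preset (here vertical) direction,
# 0 at the top of the model, 1 at the bottom.
position_map = np.linspace(0.0, 1.0, h)[:, None] * np.ones((1, w))
# Environment mask map: illumination attenuation from light-source
# distance; a uniform 0.8 placeholder for this sketch.
env_mask = np.full((h, w), 0.8)

# Blend the two maps; a per-texel multiply is one simple mixing choice.
volume_param = position_map * env_mask
```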
Further, the method for generating the material map further comprises: determining first color information of the model to be rendered, wherein the first color information represents grayscale information of the model to be rendered; overlaying the color information corresponding to the volume parameter on the first color information to obtain second color information; determining the direction of the surface normals of the model to be rendered from a normal map, wherein the normal map at least comprises normal information of the surface normals of the model to be rendered; and determining third color information of the model to be rendered according to the normal direction, wherein the color parameter comprises the second color information and the third color information, and the third color information represents the color information of the model to be rendered.
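A minimal sketch of this color layering follows; the overlay rule, the two tint colors, and the up/side blend are illustrative assumptions, not taken from the patent:

```python
import numpy as np

# First color information: grayscale base of the model to be rendered.
gray = np.full((2, 2), 0.5)
# Color associated with the volume parameter, one value per texel.
volume_color = np.array([[0.2, 0.4], [0.6, 0.8]])

# Second color information: overlay the volume color on the grayscale base.
second = np.clip(gray + (volume_color - 0.5), 0.0, 1.0)

# Third color information: a tint chosen from the normal direction; a
# surface facing straight up gets the "up" tint, a side-facing one the
# "side" tint, with a smooth blend in between.
normal = np.array([0.0, 0.0, 1.0])   # one normal sampled from the normal map
up_tint = np.array([1.0, 0.95, 0.9])
side_tint = np.array([0.8, 0.85, 1.0])
w_up = max(float(normal[2]), 0.0)    # facing-up weight
third = w_up * up_tint + (1.0 - w_up) * side_tint
```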
Further, the method for generating the material map further comprises: determining illumination information corresponding to the model to be rendered according to the world normal of the model to be rendered to obtain a brightness parameter and a darkness parameter, wherein the world normal is the normal information of the model to be rendered in world space, and the light and shade information comprises the brightness parameter and the darkness parameter.
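A minimal sketch of deriving the bright and dark parameters from the world-space normal, assuming a standard Lambert term (the positive/negative split is an assumption about how the patent separates the two parameters):

```python
import numpy as np

def light_and_dark(world_normal, light_dir):
    # Lambert term from the world-space normal: a positive dot product
    # means the texel faces the light (bright parameter); a negative one
    # means it faces away (dark parameter).
    n = np.asarray(world_normal, dtype=float)
    l = np.asarray(light_dir, dtype=float)
    lambert = float(np.dot(n / np.linalg.norm(n), l / np.linalg.norm(l)))
    return max(lambert, 0.0), max(-lambert, 0.0)

bright, dark = light_and_dark([0.0, 0.0, 1.0], [0.0, 0.0, 1.0])     # lit head-on
bright2, dark2 = light_and_dark([0.0, 0.0, 1.0], [0.0, 0.0, -1.0])  # back-lit
```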
Further, the method for generating the material map further comprises: obtaining height information corresponding to the model to be rendered from a normal map to obtain a height map, wherein the normal map at least comprises normal information of the surface normals of the model to be rendered; obtaining curvature information corresponding to the model to be rendered from the normal map to obtain a curvature map; and processing the height map and the curvature map to obtain three-dimensional layer parameters corresponding to the model to be rendered.
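The height/curvature processing could look roughly like this. Deriving height by integrating the slopes implied by the normals, taking curvature as the divergence of the projected normals, and combining the two multiplicatively are all assumptions about what "obtaining from the normal map" and "processing" mean here:

```python
import numpy as np

# Toy normal map storing only the x/y components of each surface normal.
nx = np.array([[-0.2, 0.0, 0.2]] * 3)
ny = np.zeros((3, 3))

# Curvature map: divergence of the projected normals; normals spreading
# apart (convex regions) give positive curvature.
curvature = np.gradient(nx, axis=1) + np.gradient(ny, axis=0)

# Height map: integrate the slope implied by the normals along each row.
height = np.cumsum(-nx, axis=1)

# Three-dimensional layer parameter: emphasize convex, elevated regions.
layer_param = np.clip(height, 0.0, None) * np.clip(curvature, 0.0, None)
```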
Further, the method for generating the material map further comprises: obtaining height information corresponding to the model to be rendered from a normal map to obtain a height map, wherein the normal map at least comprises normal information of the surface normals of the model to be rendered; obtaining surface highlight information corresponding to the model to be rendered from the height map; applying illumination to the world normal of the model to be rendered to obtain point highlight information corresponding to the model to be rendered, wherein the world normal is the normal information of the model to be rendered in world space; and determining target highlight information according to the surface highlight information and the point highlight information.
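One way to read this combination step is a per-texel maximum of the two highlight sources; the height threshold, the Blinn-style specular lobe, and the exponent below are illustrative assumptions:

```python
import numpy as np

height = np.array([[0.1, 0.9], [0.5, 0.3]])

# Surface highlight: the most elevated regions of the height map catch light.
surface_hl = np.clip((height - 0.6) / 0.4, 0.0, 1.0)

# Point highlight: a Blinn-style specular lobe evaluated for one world normal.
normal = np.array([0.0, 0.0, 1.0])
half_vec = np.array([0.6, 0.0, 0.8])          # normalized half vector
point_hl = max(float(np.dot(normal, half_vec)), 0.0) ** 32

# Target highlight: the stronger of the two contributions at every texel.
target_hl = np.maximum(surface_hl, point_hl)
```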
According to another aspect of the embodiments of the present invention, there is also provided a device for generating a material map, comprising: an acquisition module for obtaining a parameter file corresponding to a material file of a preset material type, wherein the parameter file comprises material parameters for generating the material file; an adjustment module for receiving an adjustment instruction and adjusting the material parameters corresponding to the parameter file to obtain adjusted material parameters; and a generation module for generating a material map based on the adjusted material parameters.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium in which a computer program is stored, wherein the computer program is configured to execute the above-described method for generating a material map when run.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device comprising one or more processors and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the above-described method for generating a material map.
In the embodiment of the invention, a material map is generated based on the material parameters in the parameter file: after the parameter file corresponding to the material file of the preset material type is obtained, an adjustment instruction is received to adjust the material parameters corresponding to the parameter file, yielding adjusted material parameters, and the material map is then generated based on the adjusted material parameters. The parameter file comprises the material parameters for generating the material file.
In this process, no mask or generator is needed to generate the material map, so maps of hand-drawn stylized material types with different degrees of stylization can be produced. The material types of the map are not limited to the cartoon type and the realistic type; the map may also be a hand-drawn type between the cartoon type and the realistic type, which enriches the material types of the material map. In addition, because the parameter file contains the material parameters for generating the material file, the user only needs to adjust the material parameters to refine the material map, which improves both the production efficiency and the modification efficiency of the material map.
Therefore, the scheme provided by the application achieves the purpose of improving the production efficiency of the material map, achieves the technical effect of improving the generation efficiency of the material map, and solves the technical problem of low production efficiency of material maps in the related art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of a layer of an SP according to the prior art;
FIG. 2 is a schematic diagram of layers of an SP according to the prior art;
FIG. 3 is a schematic diagram of layers of an SP according to the prior art;
FIG. 4 is a schematic diagram of a layer of an SP according to the prior art;
FIG. 5 is a flowchart of a method for generating a material map according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an alternative graphical user interface in accordance with an embodiment of the present invention;
FIG. 7 is a schematic diagram of an alternative graphical user interface in accordance with an embodiment of the present invention;
FIG. 8 is a schematic diagram of an alternative material parameter according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of an alternative material parameter according to an embodiment of the invention;
FIG. 10 is an alternative base volume parameter interface according to embodiments of the present invention;
FIG. 11 is a diagram of a map effect of an alternative model to be rendered according to an embodiment of the invention;
FIG. 12 is an interface diagram of an alternative hand-painted-effect random color block layer 01 according to an embodiment of the present invention;
FIG. 13 is an alternative random tile map effect diagram in accordance with embodiments of the present invention;
FIG. 14 is a parameter diagram of alternative SP software according to an embodiment of the present invention;
FIG. 15 is a diagram of a map effect of an alternative model to be rendered according to an embodiment of the invention;
FIG. 16 is a diagram of a map effect of an alternative model to be rendered according to an embodiment of the invention;
FIG. 17 is an interface corresponding to a parameter menu of an optional SP bright-surface color patch, in accordance with an embodiment of the present invention;
FIG. 18 is a diagram of a map effect of an alternative model to be rendered according to an embodiment of the invention;
FIG. 19 is an alternative SP reflection layer parameter interface according to an embodiment of the present invention;
FIG. 20 is a diagram of a map effect of an alternative model to be rendered according to an embodiment of the invention;
FIG. 21 is a diagram of a map effect of an alternative model to be rendered according to an embodiment of the invention;
FIG. 22 is a diagram of a map effect of an alternative model to be rendered according to an embodiment of the invention;
FIG. 23 is a diagram of a map effect of an alternative model to be rendered according to an embodiment of the invention;
FIG. 24 is a diagram of a map effect of an alternative model to be rendered according to an embodiment of the invention;
FIG. 25 is a block diagram of an apparatus for generating a material map according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In addition, it should be noted that, in the traditional process of producing a hand-drawn-style material map, the artist working in SP first fills a layer and then controls the display range of the layer through a mask, or paints on an empty layer with a brush, to obtain the map.
Specifically, the SP user first establishes the basic volume relationship of the virtual model by superimposing several monochromatic layers and masks; for example, in the layer diagram of the SP shown in fig. 1, five layers are superimposed to obtain the volume relationship of the virtual model. Then, the three layers shown in fig. 2 are superimposed to achieve the edge-wear effect of the virtual model, and the two layers shown in fig. 3 are superimposed to produce the texture of the virtual model and random color patches that enrich its colors. Finally, the multiple hand-drawn layers shown in fig. 4 add a hand-drawn feel and rich detail.
However, the above method for making a map has the following disadvantages:
(1) Low production efficiency. The method requires a large number of masks and generators; because the material is realistic and the masks are random, the method is mostly applied to realistic projects and is not suited to a hand-drawn style between cartoon and realistic. At present, such hand-drawn styles are achieved through manual painting, which reduces the production efficiency of hand-drawn-style materials and fails to exploit the procedural advantages of SP.
(2) Bloated files. Making a map in SP usually involves a large number of layers; for example, each layer controls one color, and rich material expression is achieved by controlling the masks under each layer and the blending between layers. The procedural masks of SP cannot directly meet the requirements of hand-drawn-style materials, so in practice the hand-drawn style is usually achieved by pure hand painting. As the number of layers and hand-drawn masks grows, the map file becomes increasingly bloated and too large, the whole map-making process becomes sluggish, and the efficiency of map production suffers further.
(3) Poor standardization. Different artists make maps in different ways, so the layers used in the process differ, and the standardization of material files is poor.
(4) Troublesome modification. Owing to the growing number of layers and the heavy use of hand-drawn masks, the standardization of the whole material production is poor, which reduces the efficiency of modifying the material.
(5) Inconvenient outsourcing management. The above problems of the existing map-making method inevitably increase the cost of communication and cooperation between in-house artists and outsourced teams, making outsourcing management inconvenient.
To solve the above problems of the prior art, according to an embodiment of the present invention, a method for generating a material map is provided. It should be noted that the steps shown in the flowchart of the figure may be executed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be executed in an order different from that shown.
In addition, it should be further noted that an electronic device may serve as the execution subject of the method provided in this embodiment. The electronic device may be, but is not limited to, a handheld device (e.g., a smartphone or smart tablet) or a non-handheld device (e.g., a computer). Optionally, an application or program for creating maps is installed on the electronic device, for example SD (Substance Designer) software and SP (Substance Painter) software.
FIG. 5 is a flowchart of a method for generating a material map according to an embodiment of the present invention. As shown in FIG. 5, the method includes the following steps:
step S502, a parameter file corresponding to a material file of a preset material type is obtained.
In step S502, the preset material type may be, but is not limited to, a hand-drawn material type, a realistic material type, or a cartoon material type. The hand-drawn material type refers to hand-painted material. Hand painting is an artistic expression derived from painting technique; for example, the hand-painted brush stroke in oil painting is an important component of Impressionist painting. A hand-drawn effect can be obtained by hand-processing aspects such as the outer contour, the contrast between the virtual and the real, the color, and random brush strokes.
In an alternative embodiment, material files of different material types correspond to different parameter files. The user can determine the material file for performing model rendering on the model to be rendered according to actual requirements, and further determine the parameter file required for generating the material map according to the material type to which the material file belongs. Optionally, the model to be rendered may be a game character in a game, or may also be a virtual model in the game, for example, a mountain, a tree, a house, or the like.
Optionally, the SD software may generate parameter files corresponding to material files of different material types, and the local memory of the electronic device may store the association between material types and parameter files. When a user generates a material map through the SP software, the electronic device can then call the parameter file corresponding to the material type of the material file. For example, in the graphical user interface shown in fig. 6, the user may select a material type, so that the SP software calls the parameter file corresponding to the selected material type, and an initial material map can then be generated from the initial parameters of the parameter file.
In the process, the parameter file contains the material parameters for generating the material file, and the user can adjust the material map corresponding to the model to be rendered only by adjusting the material parameters corresponding to the parameter file without modifying the mask or the generator, so that the modification efficiency of the material map is improved.
It should be noted that, as shown in step S502, the application can implement the making and modifying of the material file of the hand-drawn material type by adjusting the parameter file of the material file, and does not need to use a large number of layers and hand-drawn masks, thereby improving the generation efficiency and modification efficiency of the material file of the hand-drawn material type.
In addition, the material parameters corresponding to the parameter file may include, but are not limited to, color information, lighting angle, color block contrast, hand-drawn stylization intensity of color blocks, and the like. That is, the user can adjust the material parameters of the parameter file in order to adjust the material map corresponding to the model to be rendered.
Step S504, receiving an adjustment instruction, and adjusting the material parameter corresponding to the parameter file to obtain the adjusted material parameter.
In an alternative embodiment, the user may adjust the material parameters of the parameter file through the SP software on the electronic device. For example, the graphical user interface of the electronic device shown in fig. 7 shows the layers corresponding to the parameter file, where "highlight" corresponds to a highlight layer and "base_fabric" corresponds to a base layer.
It should be noted that, the electronic device can adjust the material parameters in the parameter file according to the adjustment instruction initiated by the user, which is convenient and fast.
In step S506, a material map is generated based on the adjusted material parameters.
Based on the solutions defined in steps S502 to S506 above, in the embodiment of the present invention a material map is generated based on the material parameters in the parameter file: after the parameter file corresponding to the material file of the preset material type is obtained, an adjustment instruction is received to adjust the material parameters corresponding to the parameter file, yielding adjusted material parameters, and the material map is then generated based on the adjusted material parameters. The parameter file comprises the material parameters for generating the material file.
It is easy to notice that, in the above process, no mask or generator is needed to generate the material map, so maps of hand-drawn stylized material types with different degrees of stylization can be produced. The material types of the map are not limited to the cartoon type and the realistic type; the map may also be a hand-drawn type between the cartoon type and the realistic type, which enriches the material types of the material map. In addition, because the parameter file contains the material parameters for generating the material file, the user only needs to adjust the material parameters to refine the material map, which improves both the production efficiency and the modification efficiency of the material map.
Therefore, the scheme provided by the application achieves the purpose of improving the production efficiency of the material map, achieves the technical effect of improving the generation efficiency of the material map, and solves the technical problem of low production efficiency of material maps in the related art.
In an optional embodiment, before receiving the adjustment instruction and adjusting the material parameters corresponding to the parameter file to obtain the adjusted material parameters, the electronic device first obtains the initial material parameters corresponding to the parameter file, and generates the initial material map based on the initial material parameters.
Optionally, after the user selects the parameter file corresponding to the material map through the electronic device, the SP software on the electronic device generates the initial material map according to the parameter values of the initial material parameters in the parameter file. It should be noted that the initial material map generated from these initial parameters may not meet the user's requirements; that is, the rendering effect obtained by rendering the model to be rendered with the initial material map may be unsatisfactory. In this scenario, the user may adjust the material parameters in the parameter file through the electronic device so that the material map generated from the adjusted material parameters meets the requirements.
Specifically, the electronic device determines a material parameter to be adjusted corresponding to the parameter file and a parameter value corresponding to the material parameter to be adjusted according to the model to be rendered, and adjusts the material parameter to be adjusted based on the parameter value to obtain an adjusted material parameter.
Optionally, the user may decide whether to adjust the material parameters corresponding to the material file according to the color information and the illumination information of the model to be rendered. If the color information, light and shadow information, illumination information and the like of the model to be rendered meet the user's requirements, no adjustment is needed; otherwise, the user can adjust the material parameters corresponding to the material file through the SP software. For example, the user may adjust the material parameters shown in fig. 8, where base, grunge01, grunge02, shadow, top, light, reflection, ao, deepine and the like represent different charts, and each chart can be expanded to reveal its material parameters; for instance, when the user clicks the shadow chart in fig. 8, the electronic device displays the material parameters (e.g., color block color, color block range) included in the shadow chart, as shown in fig. 9.
It should be noted that the parameter file provides the user with the material parameters capable of adjusting the material mapping, and provides the user with the freedom of operation, thereby meeting the requirements of different virtual models on the material mapping to the maximum extent. Moreover, the user can freely adjust the light angle of the virtual model through the parameter file so as to meet the light and shadow expression of different shapes of the virtual model.
In an optional embodiment, before adjusting the material parameter corresponding to the parameter file, the electronic device first needs to obtain the parameter file corresponding to the material file of the preset material type. The electronic device can make the parameter file through SD software. Specifically, the electronic device determines the volume relationship of the model to be rendered according to the space coordinate information of the model to be rendered to obtain a volume parameter, determines the color information of the model to be rendered to obtain a color parameter, then determines the structure information and the light and shadow information of the model to be rendered, and finally generates a parameter file based on the volume parameter, the color parameter, the structure information and the light and shadow information. The light and shadow information at least comprises an illumination parameter of the model to be rendered, and the structure information at least comprises one of the following: light and shade information of the model to be rendered, three-dimensional hierarchical information of the model to be rendered, and target highlight information of the model to be rendered.
In the process of determining the volume relation of the model to be rendered according to the model structure information corresponding to the model to be rendered and obtaining the volume parameter, the electronic equipment obtains a position map corresponding to the model to be rendered and obtains an environment mask map corresponding to the model to be rendered, and then the position map and the environment mask map are mixed to obtain the volume parameter. The position map at least comprises data information of color gradient gradual change of the model to be rendered according to a preset direction; the environment mask map at least comprises illumination information corresponding to the distance between the preset light source and the model to be rendered.
It should be noted that the position map corresponding to the model to be rendered represents the overall bottom-to-top gradient relationship in the map corresponding to the model to be rendered. The environment mask map (AO map for short) can simulate the shadows cast by the model to be rendered; using the environment mask map can add volume to the model to be rendered even when no light is applied, that is, without considering lighting at all, because the environment mask map simulates the effect of real illumination based on the phenomenon that the more a surface region is occluded from the light source, the weaker the reflected illumination it receives.
In addition, it should be noted that after the position map and the environment mask map corresponding to the model to be rendered are obtained, the electronic device may perform a blending process on the position map and the environment mask map, so as to obtain the black-white-gray relationship (i.e., the above-mentioned volume relationship) corresponding to the material map. The SD software exposes the three black, white and gray levels, so that the user can adjust the basic color of the material map in the SP software.
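The blending step described above can be sketched as follows. This is a minimal illustrative Python/NumPy sketch, not the actual SD node graph: the function names, the multiply-style blend and the three-level quantization are assumptions.

```python
import numpy as np

def position_map(height, width):
    """Bottom-to-top grayscale gradient, mimicking the gradient that the
    position map encodes along the preset direction."""
    grad = np.linspace(0.0, 1.0, height).reshape(height, 1)
    return np.repeat(grad, width, axis=1)

def blend_volume(position, ao, ao_weight=0.5):
    """Blend the position gradient with the ambient occlusion (AO) map and
    quantize the result into the black / gray / white volume levels."""
    mixed = position * (1.0 - ao_weight + ao_weight * ao)
    # 0 = black (dark), 1 = gray (midtone), 2 = white (bright)
    levels = np.digitize(mixed, bins=[1.0 / 3.0, 2.0 / 3.0])
    return mixed, levels

pos = position_map(4, 4)
ao = np.ones((4, 4))  # a fully unoccluded surface leaves the gradient intact
mixed, levels = blend_volume(pos, ao)
```

With a uniform AO map the blend degenerates to the bare gradient; darker AO values pull the corresponding texels toward the black level, which is the volume-adding behavior the paragraph describes.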
Optionally, in the basic volume parameter interface shown in fig. 10, the overall volume relationship and the color relationship corresponding to the model to be rendered may be made by setting a color, where color01 corresponds to a dark portion color, color02 corresponds to a gray scale color, and color03 corresponds to a bright portion color. Fig. 11 is a mapping effect obtained after rendering the model to be rendered based on the volume parameter.
In an optional embodiment, in the process of determining the color information of the model to be rendered and obtaining the color parameter, the electronic device first determines first color information of the model to be rendered, performs superposition processing on the color information corresponding to the volume parameter and the first color information to obtain second color information, then determines a normal direction of a surface normal of the model to be rendered from the normal map, and determines third color information of the model to be rendered according to the normal direction. The normal map at least comprises normal information of a surface normal of the model to be rendered, the color parameters comprise second color information and third color information, the first color information is used for representing gray information of the model to be rendered, and the third color information is used for representing color information of the model to be rendered.
It should be noted that a normal map is an image texture mapped to the surface of the model to be rendered, similar to a color texture; unlike a color texture, however, each texel in the normal map can also represent an offset from a flat (or smooth) polygonal surface in the direction of the surface normal. The normal map thus represents variations of the surface normal of the model to be rendered, storing in a texture the information describing how the surface normal changes.
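The color-parameter pipeline above (a grayscale first color, the overlaid volume color as the second color, and a normal-direction-driven third color) can be sketched as follows; the linear overlay formula and the axis-based palette are illustrative assumptions, not the patent's actual formulas.

```python
import numpy as np

def overlay(base_gray, volume_color, opacity=0.5):
    """Superpose the color carried by the volume parameter onto the grayscale
    first color information (simple linear blend) to get the second color."""
    return base_gray * (1.0 - opacity) + volume_color * opacity

def tint_from_normal(normal):
    """Derive a per-texel tint (third color information) from the dominant
    axis of the surface normal read out of the normal map."""
    axis = np.argmax(np.abs(normal), axis=-1)   # 0 = x, 1 = y, 2 = z
    palette = np.array([[1.0, 0.2, 0.2],        # x-facing -> reddish
                        [0.2, 1.0, 0.2],        # y-facing -> greenish
                        [0.2, 0.2, 1.0]])       # z-facing -> bluish
    return palette[axis]
```

A z-facing texel picks up the bluish entry of the palette, illustrating how the normal direction selects the red, green and blue information mentioned below.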
Optionally, as shown in the interface diagram of the hand-drawn effect random color block layer 01 shown in fig. 12, a user may obtain a required hand-drawn effect by adjusting the intensity parameter.
It should be noted that, in practical applications, the electronic device may provide different color block materials in the SP software to meet practical requirements; for example, the user may select the actually required color block style from the color block materials shown in fig. 12 in the SP software to perform color superposition processing. Fig. 13 shows the random color block effect obtained based on the above operations.
Optionally, after obtaining the second color information, the electronic device extracts, based on the normal of the model to be rendered, a random color block associated with the model structure in the grunge02 layer shown in fig. 14. Finally, the electronic device determines the color information (i.e., the third color information) of the model to be rendered according to the determined normal direction, such as the red, green and blue information of the model to be rendered.
It should be noted that, the parameters of the SP software shown in fig. 14 show four different types of color block materials, where each color block material corresponds to one normal direction, and a user can select the color block material according to actual needs. Fig. 15 shows rendering map effects obtained after rendering the model to be rendered based on the color parameters.
In an optional embodiment, the electronic device determines, according to a world normal of the model to be rendered, illumination information corresponding to the model to be rendered, to obtain a brightness parameter and a darkness parameter, where the world normal is normal information of the model to be rendered in world space, and the light and shade information includes the brightness parameter and the darkness parameter.
It should be noted that the world normal represents normal information of front, back, left, right, up, down, etc. directions of the model to be rendered.
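Adding a light node onto the world normal, as in the paragraphs that follow, amounts to thresholding the dot product n·l. A minimal sketch, in which the light direction and the zero threshold are assumptions:

```python
import numpy as np

def light_masks(world_normal, light_dir, threshold=0.0):
    """Split texels into bright-face and dark-face masks by the sign of the
    dot product between the world normal and the light direction."""
    l = np.asarray(light_dir, dtype=float)
    l = l / np.linalg.norm(l)       # normalize the light direction
    n_dot_l = world_normal @ l      # dot product over the last axis
    bright = n_dot_l > threshold    # source of the brightness parameter
    dark = ~bright                  # source of the darkness parameter
    return bright, dark
```

An upward-facing texel lit from above lands in the bright mask; a downward-facing one lands in the dark mask, from which the dark-surface color block is then built.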
Optionally, the electronic device may add an illumination node to the world normal to generate a dark-surface color block, thereby obtaining the darkness parameter. The SD software provides two dark-part color blocks for the user to flexibly select; the Slope Blur node in the SD software can be used to produce irregular edges, and the color block corresponding to the darkness parameter can be obtained by overlaying texture color blocks with a hand-drawn feel. Fig. 16 shows the map effect obtained after rendering the model to be rendered based on the darkness parameter.
Optionally, the electronic device may add a light node to the world normal to generate a bright color block, thereby obtaining the luminance parameter. Fig. 17 shows an interface corresponding to a parameter menu of an SP bright-surface color block, and fig. 18 shows a mapping effect on a model to be rendered based on the above-mentioned brightness parameters.
Optionally, fig. 19 shows an SP reflectogram layer parameter interface, and fig. 20 shows a chartlet effect obtained after rendering the model to be rendered based on the above parameters.
In addition, it should be noted that, unlike the large dark areas produced by lighting, the color blocks in the reflection layer emphasize the modeling of local shapes of the model to be rendered; therefore, the electronic device may use the surfaces whose normal G channel faces downward to enrich the structural information of the model to be rendered.
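Selecting the downward-facing surfaces via the normal G channel can be sketched as a simple mask; the clamp-of-negated-G formulation below is an assumption, not the patent's exact operation.

```python
import numpy as np

def reflection_mask(normal_map):
    """Mask of surfaces whose green (G) channel points downward; only these
    under-facing texels feed the reflection layer's local-shape color block."""
    g = normal_map[..., 1]            # G channel of the normal map
    return np.clip(-g, 0.0, 1.0)      # downward (negative G) -> mask > 0
```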
In an optional embodiment, the electronic device further obtains height information corresponding to the model to be rendered from the normal map to obtain a height map, obtains curvature information corresponding to the model to be rendered from the normal map to obtain a curvature map, and then processes the height map and the curvature map to obtain a three-dimensional level parameter corresponding to the model to be rendered. Wherein the normal map at least contains normal information of the surface normal of the model to be rendered.
It should be noted that the three-dimensional layer parameters at least include a concave modeling parameter and an edge wear parameter, where the concave modeling parameter is used to add a concave effect to the model to be rendered, and the edge wear parameter is used to add a wear effect to the model to be rendered.
Optionally, the concave shape parameter may be calculated based on a height map of normal rotation and a curvature map of normal rotation, where the curvature map allows the electronic device to extract and store concave-convex information.
It should be noted that the color block corresponding to the concave modeling parameter is obtained by picking up the dark values of the height map or the curvature map and then deforming them. By comparison, the height map yields a blocky result while the curvature map yields a linear one. This part is divided into three switchable levels for the user to select: AO level (based on the AO map), concave shape 01 (based on the height map) and concave shape 02 (based on the curvature map). Fig. 21 shows the map effect obtained after rendering the model to be rendered based on the above-mentioned concave modeling parameters.
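The blocky-versus-linear distinction between the height-map and curvature-map pickups can be sketched as follows; the cutoff value and the hard-threshold/ramp pair are illustrative assumptions.

```python
import numpy as np

def concave_mask(src_map, mode="height", dark_cutoff=0.35):
    """Pick up the dark values of a height or curvature map for the concave
    modeling color block: hard threshold -> blocky, ramp -> linear."""
    src_map = np.asarray(src_map, dtype=float)
    if mode == "height":
        # Height-map pickup: hard cut, producing a blocky mask.
        return (src_map < dark_cutoff).astype(float)
    # Curvature-map pickup: linear ramp toward the dark values.
    return np.clip((dark_cutoff - src_map) / dark_cutoff, 0.0, 1.0)
```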
Optionally, the electronic device may combine the normal rotation height map with the curvature map to generate a thicker and heavier wear effect map that conforms to the hand-drawn style. Fig. 22 shows a mapping effect obtained after rendering the model to be rendered based on the edge wear parameters.
In an optional embodiment, the electronic device further obtains height information corresponding to the model to be rendered from the normal map to obtain a height map, obtains surface highlight information corresponding to the model to be rendered from the height map, then adds illumination to a world normal of the model to be rendered to obtain point highlight information corresponding to the model to be rendered, and finally determines target highlight information according to the surface highlight information and the point highlight information, where the normal map at least includes normal information of a surface normal of the model to be rendered, and the world normal is normal information of the model to be rendered in a world space.
It should be noted that, in practical applications, a user may select one or both of the surface highlight information and the point highlight information as the target highlight information according to actual needs.
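The user's choice of one or both highlight sources can be sketched as a pair of switches over a max-combine; the max-combine operator itself is an assumption, since the patent does not specify how the two sources are merged.

```python
import numpy as np

def target_highlight(face_hl, point_hl, use_face=True, use_point=True):
    """Combine surface (face) highlight and punctate (point) highlight into
    the target highlight; either source can be switched off by the user."""
    out = np.zeros_like(np.asarray(face_hl, dtype=float))
    if use_face:
        out = np.maximum(out, face_hl)
    if use_point:
        out = np.maximum(out, point_hl)
    return out
```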
Optionally, the electronic device may employ the height map to generate surface highlight information to enhance the volumetric impression of the model to be rendered. In the process of generating the parameter file, the SD software exposes a range parameter, an edge hardness parameter, a deformation strength parameter and the like for the user to adjust in the SP software, so as to adjust the surface highlight information of the model to be rendered. Fig. 23 shows the map effect obtained after rendering the model to be rendered based on the above-mentioned surface highlight information.
Optionally, the electronic device may add a light node to the world normal to generate point highlight information corresponding to the model to be rendered. Fig. 24 shows the map effect obtained after rendering the model to be rendered based on the point highlight information.
As can be seen from the above, in the present application, the parameter file corresponding to the material file of the hand-drawing texture is generated in the SD file, and the parameter file is called in the SP software to edit the material file, so as to generate the chartlet of the hand-drawing texture, thereby achieving the following effects:
(1) The production efficiency of the material map is improved. In the prior art, generating a material map depends on the masks, generators and built-in materials of the SP software; in the present application, the user can modify the material content in the SP software by means of the SD software, so that the modified material content better matches the actually required material style. Because the formulas and logic are preset, the user can obtain a satisfactory effect by simply adjusting a few parameters, avoiding the frequent trial and error and massive manual production of the prior art, which erode the advantages and significance of PBR (Physically Based Rendering) production.
(2) The size of the map file is reduced. Although the SD software works at the material level, it integrates the multilayer relationship into a single layer and generates the parameter file through a programmed flow, so the file size of a material map can even be kept to around a few megabytes, reducing the size of the map file.
(3) The standardization is stronger. Because the adjustable material parameters output by the electronic device in the present application are all classified according to preset levels, the user only needs to open the corresponding second-level menu and adjust the corresponding parameter to achieve the corresponding map effect. Both producers and reviewers can clearly understand the structural information of the material map, so the standardization is stronger.
(4) Modification is facilitated. Because the parameters in the parameter file are preset, for modifications of different effects the user only needs to open the fixed parameter menu and adjust the parameters, so the modification efficiency is higher.
(5) Outsourcing management is facilitated. Compared with the prior art, in the solution provided by the present application, both internal and external personnel have a clearer view of how the material map is produced, which improves the management and communication efficiency of outsourcing and reduces cost.
According to an embodiment of the present invention, an embodiment of a device for generating a texture map is further provided, where fig. 25 is a schematic diagram of the device for generating a texture map according to an embodiment of the present invention, and as shown in fig. 25, the device includes: an acquisition module 3101, an adjustment module 3103, and a generation module 3105.
The obtaining module 3101 is configured to obtain a parameter file corresponding to a material file of a preset material type, where the parameter file includes a material parameter for generating the material file; an adjusting module 3103, configured to receive an adjusting instruction, adjust the material parameter corresponding to the parameter file, and obtain an adjusted material parameter; a generating module 3105 for generating a texture map based on the adjusted texture parameters.
It should be noted that the acquiring module 3101, the adjusting module 3103 and the generating module 3105 correspond to steps S502 to S506 in the above embodiment; the three modules implement the same examples and application scenarios as the corresponding steps, but are not limited to the disclosure of the above embodiment.
Optionally, the generating device of the material map further includes: the device comprises a third acquisition module and a first generation module. The third obtaining module is used for obtaining the initial material parameters corresponding to the parameter file before receiving the adjusting instruction and adjusting the material parameters corresponding to the parameter file to obtain the adjusted material parameters; and the first generation module is used for generating an initial material map based on the initial material parameters.
Optionally, the adjusting module includes: the device comprises a first determining module and a first adjusting module. The first determining module is used for determining the material parameters to be adjusted corresponding to the parameter files and the parameter values corresponding to the material parameters to be adjusted according to the models to be rendered; and the first adjusting module is used for adjusting the material parameter to be adjusted based on the parameter value to obtain the adjusted material parameter.
Optionally, the second obtaining module includes: the device comprises a second determining module, a third determining module, a fourth determining module and a second generating module. The second determining module is used for determining the volume relation of the model to be rendered according to the space coordinate information of the model to be rendered to obtain a volume parameter; the third determining module is used for determining the color information of the model to be rendered to obtain color parameters; a fourth determining module, configured to determine structure information and light and shadow information of the model to be rendered, where the light and shadow information at least includes an illumination parameter of the model to be rendered, and the structure information at least includes one of the following: light and shade information of the model to be rendered, three-dimensional hierarchical information of the model to be rendered and target highlight information of the model to be rendered; and the second generation module is used for generating a parameter file based on the volume parameter, the color parameter, the structure information and the light and shadow information.
Optionally, the second determining module includes: the device comprises a fourth acquisition module, a fifth acquisition module and a mixing module. The fourth obtaining module is used for obtaining a position map corresponding to the model to be rendered, wherein the position map at least comprises data information of color gradient gradual change of the model to be rendered according to a preset direction; the fifth obtaining module is used for obtaining an environment mask map corresponding to the model to be rendered, wherein the environment mask map at least comprises illumination information corresponding to the distance between a preset light source and the model to be rendered; and the mixing module is used for mixing the position map and the environment mask map to obtain the volume parameter.
Optionally, the third determining module includes: the device comprises a fifth determination module, a superposition module, a sixth determination module and a seventh determination module. The fifth determining module is used for determining first color information of the model to be rendered, wherein the first color information is used for representing gray information of the model to be rendered; the superposition module is used for carrying out superposition processing on the color information corresponding to the volume parameter and the first color information to obtain second color information; a sixth determining module, configured to determine a normal direction of a surface normal of the model to be rendered from a normal map, where the normal map at least includes normal information of the surface normal of the model to be rendered; and the seventh determining module is used for determining third color information of the model to be rendered according to the normal direction, wherein the color parameters comprise the second color information and the third color information, and the third color information is used for representing the color information of the model to be rendered.
Optionally, the fourth determining module includes: an eighth determining module, configured to determine the illumination information corresponding to the model to be rendered according to the world normal of the model to be rendered to obtain a brightness parameter and a darkness parameter, where the world normal is normal information of the model to be rendered in world space, and the light and shade information includes the brightness parameter and the darkness parameter.
Optionally, the fourth determining module includes: the device comprises a seventh acquisition module, an eighth acquisition module and a first processing module. The seventh obtaining module is configured to obtain height information corresponding to the model to be rendered from the normal map to obtain a height map, where the normal map at least includes normal information of a surface normal of the model to be rendered; the eighth obtaining module is used for obtaining curvature information corresponding to the model to be rendered from the normal map to obtain a curvature map; and the first processing module is used for processing the height map and the curvature map to obtain the three-dimensional level parameters corresponding to the model to be rendered.
Optionally, the fourth determining module includes: the device comprises a ninth acquisition module, a tenth acquisition module, a second processing module and a ninth determination module. The ninth obtaining module is configured to obtain height information corresponding to the model to be rendered from the normal map to obtain a height map, where the normal map at least includes normal information of a surface normal of the model to be rendered; the tenth acquisition module is used for acquiring the surface highlight information corresponding to the model to be rendered from the height map; the second processing module is used for adding illumination to a world normal of the model to be rendered to obtain punctiform highlight information corresponding to the model to be rendered, wherein the world normal is normal information of the model to be rendered in a world space; and the ninth determining module is used for determining the target highlight information according to the surface highlight information and the point highlight information.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium, in which a computer program is stored, wherein the computer program is configured to execute the above-mentioned material map generation method when running.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including one or more processors; a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method for running a program, wherein the program is arranged to perform the above-described method for generating a texture map when run.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and decorations can be made without departing from the principle of the present invention, and these modifications and decorations should also be regarded as the protection scope of the present invention.

Claims (12)

1. A method for generating a material map is characterized by comprising the following steps:
acquiring a parameter file corresponding to a material file of a preset material type, wherein the parameter file comprises a material parameter for generating the material file;
receiving an adjusting instruction, and adjusting the material parameters corresponding to the parameter file to obtain adjusted material parameters;
and generating a material map based on the adjusted material parameters.
2. The method according to claim 1, wherein before receiving an adjustment instruction to adjust the material parameter corresponding to the parameter file to obtain the adjusted material parameter, the method further comprises:
acquiring initial material parameters corresponding to the parameter file;
and generating an initial material map based on the initial material parameters.
3. The method of claim 2, wherein receiving an adjustment instruction to adjust the material parameter corresponding to the parameter file to obtain an adjusted material parameter comprises:
determining a material parameter to be adjusted corresponding to the parameter file and a parameter value corresponding to the material parameter to be adjusted according to a model to be rendered;
and adjusting the material parameter to be adjusted based on the parameter value to obtain the adjusted material parameter.
4. The method of claim 3, wherein obtaining the parameter file corresponding to the material file of the predetermined material type comprises:
determining the volume relation of the model to be rendered according to the space coordinate information of the model to be rendered to obtain a volume parameter;
determining color information of the model to be rendered to obtain color parameters;
determining structure information and light and shadow information of the model to be rendered, wherein the light and shadow information at least comprises an illumination parameter of the model to be rendered, and the structure information at least comprises one of the following: the light and shade information of the model to be rendered, the three-dimensional hierarchical information of the model to be rendered and the target highlight information of the model to be rendered;
generating the parameter file based on the volume parameter, the color parameter, and the structure information and the light and shadow information.
5. The method of claim 4, wherein determining the volume relationship of the model to be rendered according to the spatial coordinate information of the model to be rendered to obtain a volume parameter comprises:
acquiring a position map corresponding to the model to be rendered, wherein the position map at least comprises data information of color gradient gradual change of the model to be rendered according to a preset direction;
acquiring an environment mask map corresponding to the model to be rendered, wherein the environment mask map at least comprises illumination information corresponding to the distance between a preset light source and the model to be rendered;
and mixing the position map and the environment mask map to obtain the volume parameter.
6. The method of claim 4, wherein determining color information of the model to be rendered, and obtaining color parameters comprises:
determining first color information of the model to be rendered, wherein the first color information is used for representing gray information of the model to be rendered;
superposing the color information corresponding to the volume parameter with the first color information to obtain second color information;
determining a normal direction of a surface normal of the model to be rendered from a normal map, wherein the normal map at least contains normal information of the surface normal of the model to be rendered;
and determining third color information of the model to be rendered according to the normal direction, wherein the color parameters comprise the second color information and the third color information, and the third color information is used for representing the color information of the model to be rendered.
7. The method of claim 4, wherein determining shadow information for the model to be rendered comprises:
and determining illumination information corresponding to the model to be rendered according to the world normal of the model to be rendered to obtain a brightness parameter and a darkness parameter, wherein the world normal is normal information of the model to be rendered in a world space, and the light and shade information comprises the brightness parameter and the darkness parameter.
8. The method of claim 4, wherein determining structural information of the model to be rendered comprises:
acquiring height information corresponding to the model to be rendered from a normal map to obtain a height map, wherein the normal map at least comprises normal information of a surface normal of the model to be rendered;
obtaining curvature information corresponding to the model to be rendered from the normal map to obtain a curvature map;
and processing the height map and the curvature map to obtain a three-dimensional level parameter corresponding to the model to be rendered.
9. The method of claim 4, wherein determining structural information of the model to be rendered comprises:
acquiring height information corresponding to the model to be rendered from a normal map to obtain a height map, wherein the normal map at least comprises normal information of a surface normal of the model to be rendered;
obtaining surface highlight information corresponding to the model to be rendered from the height map;
adding illumination to a world normal of the model to be rendered to obtain point highlight information corresponding to the model to be rendered, wherein the world normal is normal information of the model to be rendered in a world space;
and determining target highlight information according to the surface highlight information and the point highlight information.
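The two highlight sources of claim 9 can be sketched as below. Taking the surface highlight from raised regions of the height map, using a Blinn-Phong term for the point highlight, and merging with max() are all assumptions; the claim only says the two are combined:

```python
import numpy as np

def target_highlight(height_map, world_normal, light_dir, view_dir,
                     shininess=32.0):
    """Sketch of claim 9: surface highlight from the height map, point
    highlight from an (assumed) Blinn-Phong term on the world normal."""
    # Surface highlight information: bright where the surface is raised
    surface_hl = np.clip(height_map - height_map.mean(), 0.0, 1.0)
    # Point highlight information: Blinn-Phong specular on the world normal
    half_vec = np.asarray(light_dir, float) + np.asarray(view_dir, float)
    half_vec = half_vec / np.linalg.norm(half_vec)
    ndh = np.clip(np.sum(world_normal * half_vec, axis=-1), 0.0, 1.0)
    point_hl = ndh ** shininess
    # Target highlight information: the stronger of the two contributions
    return np.maximum(surface_hl, point_hl)
```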
10. A device for generating a material map, characterized by comprising:
the device comprises an acquisition module, an adjusting module and a generation module, wherein the acquisition module is used for acquiring a parameter file corresponding to a material file of a preset material type, and the parameter file comprises material parameters for generating the material file;
the adjusting module is used for receiving an adjusting instruction and adjusting the material parameters corresponding to the parameter file to obtain adjusted material parameters;
and the generation module is used for generating a material map based on the adjusted material parameters.
11. A computer-readable storage medium, in which a computer program is stored, wherein the computer program, when executed, is configured to perform the method for generating a material map according to any one of claims 1 to 9.
12. An electronic device, comprising: one or more processors; and storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to perform the method for generating a material map according to any one of claims 1 to 9.
CN202110976154.4A 2021-08-24 2021-08-24 Method and device for generating texture map and electronic equipment Active CN113838155B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110976154.4A CN113838155B (en) 2021-08-24 2021-08-24 Method and device for generating texture map and electronic equipment

Publications (2)

Publication Number Publication Date
CN113838155A true CN113838155A (en) 2021-12-24
CN113838155B CN113838155B (en) 2024-07-19

Family

ID=78961065

Country Status (1)

Country Link
CN (1) CN113838155B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114419233A (en) * 2021-12-31 2022-04-29 网易(杭州)网络有限公司 Model generation method and device, computer equipment and storage medium

Citations (11)

Publication number Priority date Publication date Assignee Title
CN104134230A (en) * 2014-01-22 2014-11-05 腾讯科技(深圳)有限公司 Image processing method, image processing device and computer equipment
US9202291B1 (en) * 2012-06-27 2015-12-01 Pixar Volumetric cloth shader
CN106023295A (en) * 2016-05-27 2016-10-12 美屋三六五(天津)科技有限公司 Three-dimensional model processing method and apparatus
CN107103638A (en) * 2017-05-27 2017-08-29 杭州万维镜像科技有限公司 A kind of Fast rendering method of virtual scene and model
CN110280014A (en) * 2019-05-21 2019-09-27 西交利物浦大学 The method of spinning sensation is reduced under a kind of reality environment
CN111275802A (en) * 2020-01-19 2020-06-12 杭州群核信息技术有限公司 VRAY-based PBR material rendering method and system
CN112037311A (en) * 2020-09-08 2020-12-04 腾讯科技(深圳)有限公司 Animation generation method, animation playing method and related device
CN112184880A (en) * 2020-09-03 2021-01-05 同济大学建筑设计研究院(集团)有限公司 Building three-dimensional model processing method and device, computer equipment and storage medium
CN112274934A (en) * 2020-11-19 2021-01-29 网易(杭州)网络有限公司 Model rendering method, device, equipment and storage medium
US20210192838A1 (en) * 2019-12-24 2021-06-24 Tencent Technology (Shenzhen) Company Limited Object rendering method and apparatus, storage medium, and electronic device
CN113052947A (en) * 2021-03-08 2021-06-29 网易(杭州)网络有限公司 Rendering method, rendering device, electronic equipment and storage medium

Non-Patent Citations (1)

Title
ZUO Hongliang: "Lighting setup for automobile models in a virtual three-dimensional environment" (虚拟三维环境中汽车模型的照明设置), Automobile Science & Technology (汽车科技), no. 02, 25 March 2005 (2005-03-25), pages 28-30 *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant