CN113658064A - Texture image generation method and device and electronic equipment - Google Patents

Texture image generation method and device and electronic equipment

Info

Publication number
CN113658064A
CN113658064A (application CN202110888025.XA)
Authority
CN
China
Prior art keywords
texture
noise
texture image
width value
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110888025.XA
Other languages
Chinese (zh)
Inventor
孟庆宇 (Meng Qingyu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110888025.XA
Publication of CN113658064A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/04 Texture mapping
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a texture image generation method and apparatus, and an electronic device. The method comprises the following steps: sampling a noise texture image in a preset direction in a texture space to obtain a noise result; superimposing the noise result onto texture coordinates in the texture space to obtain a disturbed texture space; and generating a texture image in the disturbed texture space based on a preset straight-line texture function, wherein the straight-line texture function is used to adjust texture parameters in the texture image, and the texture image at least comprises a waveform texture that changes over time. The method and apparatus solve the technical problem of low texture-image generation efficiency in the prior art.

Description

Texture image generation method and device and electronic equipment
Technical Field
The invention relates to the field of image processing, in particular to a texture image generation method and device and electronic equipment.
Background
Visual representations of sound waves, electrocardiograms, magnetic fields, and the like are common special-effect textures in games, and such special-effect textures are usually realized by means of noise disturbance. Specifically, a noise map is sampled to obtain noise information; the noise information is then superimposed onto the texture space of the model sheet to disturb the texture coordinates; and the disturbed texture coordinates are used to sample the original texture lines, yielding a disturbed waveform texture. During sampling of the noise texture, operations such as translation and stretching can be applied to the noise texture so that the noise value of each pixel changes in real time, making the final waveform texture change dynamically.
In the above process, the noise map is usually a single-channel (e.g., grayscale) image or a multi-channel image (e.g., containing RGB or RGBA information), while the disturbance information required by the texture space is a two-dimensional vector (i.e., disturbance amounts in the horizontal and vertical directions). When superimposing the noise information onto the texture space, values of two channels are therefore extracted from the noise map; in the single-channel case, the two required channel values are taken to be identical. This yields two values in the range 0 to 1, which are then mapped and combined into a two-dimensional vector that is superimposed as the disturbance information in the subsequent process.
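The channel-extraction and mapping step described above can be sketched in Python (an illustrative translation — the snippets later in this document are in Cg/HLSL, and the function name here is hypothetical):

```python
def noise_to_offset(r, g):
    """Map two noise-channel values in [0, 1] to a signed 2D
    disturbance vector in [-1, 1] x [-1, 1]. For a single-channel
    (grayscale) noise map, the same value is passed for both
    arguments, as described above."""
    return ((r - 0.5) * 2.0, (g - 0.5) * 2.0)
```

A mid-gray sample (0.5) maps to zero disturbance, while values toward 0 or 1 push the texture coordinates in opposite directions.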
However, in the above method of generating a dynamic waveform texture by noise disturbance, the obtained noise information disturbs the texture space in both the horizontal and vertical directions. For a waveform effect, disturbance is usually desired only in the direction perpendicular to the waveform line, and the disturbance amount in that direction should be identical, or at least substantially identical, for every point along the line; otherwise the disturbance may split a line into multiple closed lines, or stretch and widen it into a blurred result.
If a user needs a texture map with thinner waveform lines, a higher-resolution texture line map with thinner lines is required to keep the lines thin enough without blurring; likewise, a texture map with sharp line edges requires a higher-resolution texture line map. In addition, when producing a dynamic waveform texture effect, the corresponding noise maps must be specially modified and authored so that the lines are not stretched excessively by the noise disturbance: the variation in the vertical direction of the lines must be small, and the disturbance parallel to the line direction must be 0. In short, thin lines or sharp line edges demand higher-resolution maps, which increases the map capacity in the game package, increases the sampling load during rendering, and reduces the processing efficiency of the texture image.
In view of the above problems, no effective solution has been proposed.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The embodiment of the invention provides a method and a device for generating a texture image and electronic equipment, which are used for at least solving the technical problem of low generation efficiency of the texture image in the prior art.
According to an aspect of the embodiments of the present invention, there is provided a method for generating a texture image, including: sampling the noise texture image in a preset direction in a texture space to obtain a noise result; performing superposition processing on texture coordinates in the texture space according to the noise result to obtain a disturbed texture space; and generating a texture image in the disturbed texture space based on a preset linear texture function, wherein the linear texture function is used for adjusting texture parameters in the texture image, and the texture image at least comprises a waveform texture changing along with time.
Further, the method for generating the texture image further includes: and after generating a texture image in the disturbed texture space based on a preset straight line texture function, generating a waveform animation based on the texture image, wherein the waveform animation at least comprises waveform textures changing along with time.
Further, the method for generating the texture image further includes: sampling the noise texture image in a first direction on the texture space based on sampling density to obtain a first sampling result, wherein the sampling density at least comprises a first numerical value in the first direction, and the first numerical value is smaller than a preset value; and sampling the first sampling result in a second direction on the texture space to obtain a noise result, wherein the first direction is different from the second direction.
Further, the method for generating the texture image further includes: acquiring a first coordinate of each pixel of the noise texture image in a texture space in a first direction and a second coordinate of each pixel in a second direction, wherein the sampling density further comprises a second value in the second direction; calculating the product of the first coordinate and the first numerical value to obtain a third coordinate; calculating the product of the second coordinate and the second numerical value to obtain a fourth coordinate; calculating the product of the preset time parameter and the moving speed to obtain a product result, wherein the moving speed represents the speed of the moving noise texture image; and obtaining a first sampling result according to the third coordinate, the fourth coordinate and the product result.
Further, the method for generating the texture image further includes: acquiring coordinate information corresponding to the first sampling result; mapping the coordinate information to a preset coordinate range to obtain target noise information; and acquiring noise component information of the target noise information in the second direction to obtain a noise result.
Further, the method for generating the texture image further includes: and according to the noise component information, carrying out superposition processing on the coordinates of each pixel point in the texture space to obtain the disturbed texture space.
Further, the method for generating the texture image further includes: generating a texture image in a disturbed texture space according to the width value information of the wave texture in the texture image and/or the position information of the wave texture in the texture image and/or the background color information of the texture image and/or the color information of the wave texture, wherein the preset straight line texture function comprises one or more of the following information: width value information, position information, background color information, and color information.
Further, the method for generating the texture image further includes: acquiring a first width value and a second width value of a waveform texture, wherein the width value information of the waveform texture at least comprises the first width value and the second width value, the waveform texture comprises a display part and a transition part, the transition part represents a part of the waveform texture which transitions to the background color of the texture image, the first width value represents the difference value between the width of the display part and the width of the transition part, and the second width value represents the width of the display part; calculating a difference value between the first width value and the second width value to obtain a first difference value; calculating the distance between the position of the waveform texture in the texture image and the position of each pixel point in the waveform texture; obtaining a second difference value according to the distance and the first width value; obtaining a first ratio according to the first difference value and the second difference value; and adjusting the color information of the waveform texture and the background color information of the texture image according to the first ratio.
Further, the method for generating the texture image further includes: acquiring a first width value and a second width value of a waveform texture, wherein the width value information of the waveform texture at least comprises the first width value and the second width value, the waveform texture comprises a display part and a transition part, the transition part represents a part of the waveform texture which is transited to the background color of the texture image, the first width value represents the difference value between the width of the display part and the width of the transition part, and the second width value represents the width of the display part; obtaining a first difference value according to the first width value and the second width value; and adjusting the fuzzy degree of the waveform texture according to the first difference.
Further, the method for generating the texture image further includes: acquiring a first width value and a second width value of a waveform texture, wherein the width value information of the waveform texture at least comprises the first width value and the second width value, the waveform texture comprises a display part and a transition part, the transition part represents a part of the waveform texture which is transited to the background color of the texture image, the first width value represents the difference value between the width of the display part and the width of the transition part, and the second width value represents the width of the display part; and adjusting the width of the wave texture according to the size of the first width value and/or the second width value.
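The width-value computations in the paragraphs above can be illustrated with a small Python sketch. This is one plausible reading of the machine-translated description, not the patent's own implementation: it assumes the first width value (w1) marks the outer edge of the transition part, the second width value (w2) the fully displayed part, and that the first ratio is used to blend line and background colors.

```python
def clamp01(v):
    return max(0.0, min(1.0, v))

def line_pixel_color(dist, w1, w2, line_color, bg_color):
    """Evaluate one pixel of the straight-line texture. dist is the
    distance from the pixel to the line's position; w1 and w2 are the
    first and second width values (w1 > w2 assumed here)."""
    first_diff = w1 - w2        # first difference: between the width values
    second_diff = w1 - dist     # second difference: from distance and w1
    t = clamp01(second_diff / first_diff)  # first ratio, clamped to [0, 1]
    # Adjust the waveform color against the background color by the ratio.
    return tuple(b + (l - b) * t for l, b in zip(line_color, bg_color))
```

Pixels within w2 of the line are drawn fully in the line color, pixels beyond w1 show only the background, and the band in between fades smoothly — the transition part. Widening the gap between w1 and w2 softens the edge, matching the blur adjustment described above.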
According to another aspect of the embodiments of the present invention, there is also provided a texture image generating apparatus, including: the sampling module is used for sampling the noise texture image in a preset direction in a texture space to obtain a noise result; the processing module is used for performing superposition processing on texture coordinates in the texture space according to the noise result to obtain a disturbed texture space; and the generating module is used for generating a texture image in the disturbed texture space based on a preset linear texture function, wherein the linear texture function is used for adjusting texture parameters in the texture image, and the texture image at least comprises a waveform texture which changes along with time.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to execute the above-mentioned texture image generation method when running.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including one or more processors and a storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the above-mentioned texture image generation method.
In the embodiment of the invention, a method of disturbing a texture image is adopted, a noise result is obtained by sampling the noise texture image in a preset direction in a texture space, then, texture coordinates in the texture space are superposed according to the noise result to obtain a disturbed texture space, and finally, the texture image is generated in the disturbed texture space based on a preset linear texture function, wherein the linear texture function is used for adjusting texture parameters of textures in the texture image, and the texture image at least comprises waveform textures which change along with time.
In the above process, any noise texture image can be sampled to obtain a noise result; the texture image can be generated from any noise texture map without special processing or additional modification, so the generation efficiency of the texture image is improved. In addition, in this method, the texture space is first subjected to superposition processing to generate the disturbed texture space, and the texture image is then generated in the disturbed texture space based on the preset linear texture function, so that the texture is produced procedurally rather than by stretching an existing texture map. Moreover, the texture parameters in the texture image can be adjusted through the preset linear texture function, so the waveform texture can be adjusted without a higher-resolution texture image.
Therefore, the scheme provided by the present application achieves the purpose of generating the texture image, attains the technical effect of improving texture-image generation efficiency, and thus solves the technical problem of low texture-image generation efficiency in the prior art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow chart of a method for generating a texture image according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an alternative texture image according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a texture image of an alternative wave texture according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an alternative texture image in accordance with an embodiment of the present invention;
FIG. 5 is a schematic diagram of an alternative texture image in accordance with an embodiment of the present invention;
FIG. 6 is a schematic diagram of an alternative texture image in accordance with an embodiment of the present invention;
fig. 7 is a schematic diagram of a texture image generation apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, there is provided an embodiment of a method for generating a texture image, it is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer executable instructions, and that while a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
In addition, it should be noted that, in this embodiment, the pixel shader may be the execution subject of the method provided in this embodiment, where the pixel shader is a program used to compute the rendering of each rasterized pixel.
Fig. 1 is a flowchart of a method for generating a texture image according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
Step S102: sampling the noise texture image in a preset direction in a texture space to obtain a noise result.
In step S102, the texture space may be a UV space, the noise texture image may be any texture image, and the noise result at least includes noise information, where the noise information includes, but is not limited to, coordinate information and color information of the noise.
In an alternative embodiment, the pixel shader may sample the noise texture image according to a preset sampling density, and the pixel shader may sample the noise texture image in a preset direction according to the preset sampling density, where the preset direction at least includes a first direction and a second direction, and the first direction and the second direction are different directions, for example, the first direction is a longitudinal direction, and the second direction is a transverse direction. Optionally, in this embodiment, the first direction is taken as a longitudinal direction, and the second direction is taken as a transverse direction for example.
Step S104: performing superposition processing on texture coordinates in the texture space according to the noise result to obtain the disturbed texture space.
Optionally, in step S104, the pixel shader may superimpose the noise result onto the coordinate value of each pixel in the texture space, thereby disturbing the texture space and obtaining the disturbed texture space. The superposition can be performed by adding the noise result to the coordinate value of each pixel.
Step S106: generating a texture image in the disturbed texture space based on a preset linear texture function, wherein the texture image at least comprises a waveform texture that changes over time.
In step S106, the preset straight-line texture function is used to adjust texture parameters in the texture image, where the texture parameters include, but are not limited to, width value information of the wave texture, color information of the wave texture, and background color information of the texture image and position information of the wave texture in the texture image. That is, the pixel shader adjusts the texture parameters in the texture image according to the preset linear texture function.
In addition, the texture image is generated by a function in the disturbed texture space and at least comprises a wavy line (originally a straight line). Because the texture space has been disturbed, the texture generated in it presents a wavy curve in addition to changing dynamically; that is, the texture image contains a dynamically changing waveform texture.
It should be noted that, because the texture image is generated by a function in the pixel shader rather than obtained by stretching an existing texture, the problems of increased package volume and increased sampling load caused by high-precision maps can be avoided while the high precision of the texture image is preserved.
Based on the solutions defined in steps S102 to S106, it can be known that, in the embodiment of the present invention, a method of performing perturbation processing on a texture image is adopted, a noise result is obtained by sampling a noise texture image in a preset direction in a texture space, then, texture coordinates in the texture space are superimposed according to the noise result, so as to obtain a perturbed texture space, and finally, a texture image is generated in the perturbed texture space based on a preset linear texture function, where the linear texture function is used to adjust texture parameters in the texture image, and the texture image at least includes a waveform texture that changes with time.
It is easy to note that, in the above process, any noise texture image can be sampled to obtain a noise result; the texture image can be generated from any noise texture map without special processing or additional modification, so the generation efficiency of the texture image is improved. In addition, in this method, the texture space is first subjected to superposition processing to generate the disturbed texture space, and the texture image is then generated in the disturbed texture space based on the preset linear texture function, so that the texture is produced procedurally rather than by stretching an existing texture map. Moreover, the texture parameters in the texture image can be adjusted through the preset linear texture function, so the waveform texture can be adjusted without a higher-resolution texture image.
Therefore, the scheme provided by the present application achieves the purpose of generating the texture image, attains the technical effect of improving texture-image generation efficiency, and thus solves the technical problem of low texture-image generation efficiency in the prior art.
In an optional embodiment, after generating the texture image in the disturbed texture space based on the preset linear texture function, the pixel shader further generates a waveform animation based on the texture image, wherein the waveform animation at least comprises a waveform texture changing with time.
In the present application, it is possible to generate a texture image using an arbitrary noise texture image, and it is possible to easily create a waveform animation without special processing and modification.
In an alternative embodiment, before performing the overlay processing on the texture coordinates in the texture space according to the noise result, the pixel shader needs to perform sampling in a preset direction on the noise texture image in the texture space to obtain the noise result. Specifically, the pixel shader first performs sampling in a first direction on the noise texture image in the texture space based on the sampling density to obtain a first sampling result, and then performs sampling in a second direction on the first sampling result in the texture space to obtain a noise result, wherein the first direction is different from the second direction. The sampling density at least comprises a first value in the first direction, and the first value is smaller than a preset value.
It should be noted that, since the preset directions at least include a first direction and a second direction, and the first direction and the second direction are different directions, the pixel shader may sample in the first direction and in the second direction, respectively. Optionally, the pixel shader samples in the first direction first and then samples in the second direction. For example, the pixel shader first samples the noisy texture image in the vertical direction and then samples the noisy texture image in the horizontal direction.
In addition, it should be noted that, to minimize the variation of the waveform texture in the vertical direction of the lines and to minimize the disturbance parallel to the waveform texture (for example, making it 0), the longitudinal sampling density (i.e., the first value) needs to be reduced to an extremely low value close to, or even equal to, 0; for example, the first value is adjusted to a value smaller than a preset value. This ensures that, visually, the waveform disturbance intensity varies only in the transverse direction. When the texture space is then subjected to superposition processing, taking only the longitudinal disturbance component ensures that the texture space is not disturbed in the transverse direction, which would blur the lines.
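The effect of shrinking the longitudinal sampling density can be demonstrated numerically. The Python sketch below uses a stand-in procedural noise function (hypothetical, not from the patent) in place of sampling a noise texture:

```python
import math

def noise(u, v):
    # Stand-in 2D noise; in the actual method a noise texture
    # would be sampled at (u, v) instead.
    return 0.5 + 0.5 * math.sin(12.9898 * u + 78.233 * v)

def sample_with_density(u, v, x):
    # Scale the longitudinal coordinate by the first value x before
    # sampling, mirroring newNoiseUV = UV * float2(1, x).
    return noise(u, v * x)

# With x close to 0, two pixels that differ only in v receive almost
# the same noise value, so the disturbance is constant along the line.
a = sample_with_density(0.3, 0.1, 0.001)
b = sample_with_density(0.3, 0.9, 0.001)
```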
In addition, when the texture space is subjected to superposition processing, the texture space is disturbed only by adopting the disturbance component in a single direction, so that the problem of fuzzy texture caused by disturbance of the texture space in multiple directions is avoided.
In an optional embodiment, in the process of sampling the noise texture image in the first direction on the texture space based on the sampling density to obtain the first sampling result, the pixel shader first obtains, for each pixel of the noise texture image in the texture space, a first coordinate in the first direction and a second coordinate in the second direction. It then calculates the product of the first coordinate and the first value to obtain a third coordinate, the product of the second coordinate and the second value to obtain a fourth coordinate, and the product of the preset time parameter and the moving speed to obtain a product result; finally, it obtains the first sampling result from the third coordinate, the fourth coordinate, and the product result. The sampling density further comprises a second value in the second direction, and the moving speed represents the speed at which the noise texture image is moved.
optionally, when sampling the noise texture image in the default manner, the corresponding sampling code is as follows:
float2 newNoiseUV = UV + time * speed;
fixed4 noiseColor = tex2D(noiseTexture, newNoiseUV);
It should be noted that this code implements translation during sampling of the noise texture image, where newNoiseUV represents the coordinate information of the noise obtained after sampling, UV represents the coordinate corresponding to each pixel in the original texture space, time represents the current sampling time (i.e., the preset time parameter), and speed is a two-dimensional vector (i.e., the moving speed) representing the translation speed of the sampling; noiseColor represents the color information of the noise, noiseTexture represents the noise texture image, and tex2D() is the sampling function.
In this embodiment, the pixel shader samples the noise texture image by multiplying the sampling ordinate (i.e. the first coordinate in the first direction) by a value with a very small absolute value, that is, the sampling of the noise texture image is performed by the following function in this application:
float2 newNoiseUV = UV * float2(1, x) + time * speed;
fixed4 noiseColor = tex2D(noiseTexture, newNoiseUV);
in the above code, UV represents a coordinate value composed of a first coordinate and a second coordinate, for example, (a, b), where a represents a second numerical value and b represents a first numerical value; x represents a first value, 1 represents a second value, namely a is multiplied by 1 to obtain a third coordinate, and b is multiplied by x to obtain a fourth coordinate. The third coordinate and the fourth coordinate constitute the above-mentioned sample coordinate newNoiseUV.
Further, after the noise texture image is sampled in the texture space in the first direction based on the sampling density, the pixel shader samples the first sampling result in the texture space in the second direction to obtain a noise result. Specifically, the pixel shader first obtains coordinate information corresponding to the first sampling result, maps the coordinate information to a preset coordinate range to obtain target noise information, and then obtains noise component information of the target noise information in the second direction to obtain a noise result.
Optionally, the pixel shader zeroes the horizontal component of the noise information, which is implemented by the following code:
float2 noise = (noiseColor.xy - float2(0.5, 0.5)) * 2;
float2 noiseNew = noise * float2(0, 1);
In the above code, noiseColor.xy represents the coordinate information corresponding to the noise information, and its components lie in the range 0 to 1. The first line maps them from the range 0 to 1 to the range -1 to 1 (i.e., the preset coordinate range), so that the noise values take both positive and negative signs: the transverse disturbance can point either left or right, and the longitudinal disturbance either up or down. The two mapped values then form a two-dimensional vector, yielding the target noise information noise, which is superimposed in the subsequent process as disturbance information.
In addition, in this embodiment, the pixel shader obtains the noise component information of the target noise information in the second direction by multiplying the target noise information by a mask vector (i.e., float2(0, 1) described above); scaling this mask as a whole also adjusts the noise disturbance intensity.
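The remapping and masking steps can be sketched as follows (a Python approximation of the two shader lines above; the function name is an assumption for illustration):

```python
def perturbation_from_noise(noise_rg):
    """Map sampled noise channels (noiseColor.xy) from [0, 1] to [-1, 1],
    then keep only the component selected by the float2(0, 1) mask."""
    nx, ny = noise_rg
    # (noiseColor.xy - 0.5) * 2: signed noise in [-1, 1]
    noise = ((nx - 0.5) * 2.0, (ny - 0.5) * 2.0)
    # noise * float2(0, 1): zero the first (horizontal) component
    return (noise[0] * 0.0, noise[1] * 1.0)
```

Scaling the (0, 1) mask uniformly, e.g. to (0, 0.5), would scale the disturbance intensity in the same way as described above.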
Furthermore, after the noise result is obtained, the pixel shader performs superposition processing on texture coordinates in the texture space according to the noise result to obtain a disturbed texture space. Optionally, the pixel shader performs superposition processing on the coordinate of each pixel point in the texture space according to the noise component information, so as to obtain a disturbed texture space, which can be implemented by the following codes:
float2 newUV = UV + noiseNew;
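Applied per pixel, the superposition is a simple coordinate offset (again a Python sketch with an assumed function name):

```python
def perturb_uv(uv, noise_new):
    """float2 newUV = UV + noiseNew: offset the pixel's texture
    coordinate by the masked noise vector, producing the disturbed
    texture space."""
    return (uv[0] + noise_new[0], uv[1] + noise_new[1])
```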
It should be noted that, after the disturbed texture space is generated, the pixel shader generates a linear texture in the disturbed texture space by using a function to obtain the target texture image; because the texture space has been disturbed, the linear texture finally presents a naturally and dynamically changing waveform curve.
Specifically, the pixel shader first determines width value information of a waveform texture in the texture image and/or position information of the waveform texture in the texture image and/or background color information of the texture image and/or color information of the waveform texture, and then generates the texture image in a disturbed texture space according to the width value information of the waveform texture in the texture image and/or the position information of the waveform texture in the texture image and/or the background color information of the texture image and/or the color information of the waveform texture, wherein the preset linear texture function includes one or more of the following information: width value information, position information, background color information, and color information.
It should be noted that the values of the above-mentioned width information of the waveform texture, position of the waveform texture in the texture image, background color of the texture image, and color of the waveform texture may be exposed to the user, so that the user can adjust them to obtain waveform lines of various thicknesses, colors, and degrees of blur at different height positions of the model slice. When the noise intensity in the noise texture image is 0, the noise information does not disturb the texture space, and the texture in the texture image generated in the texture space is a straight-line texture; for example, in the texture image shown in fig. 2, a white line represents the waveform texture. When the noise intensity in the noise texture image is greater than 0, the waveform texture is distorted and jittered along with the noise, yielding a dynamic waveform texture, such as the one shown in fig. 3.
In an optional embodiment, in the process of adjusting the color information of the wave texture and the background color information of the texture image, the pixel shader first obtains a first width value and a second width value of the wave texture, calculates a difference between the first width value and the second width value to obtain a first difference, then calculates a distance between a position of the wave texture in the texture image and a position of each pixel point in the wave texture, obtains a second difference according to the distance and the first width value, obtains a first ratio according to the first difference and the second difference, and finally adjusts the color information of the wave texture and the background color information of the texture image according to the first ratio. The width value information of the wave texture at least comprises a first width value and a second width value, the wave texture comprises a display part and a transition part, the transition part represents a part of the wave texture which is transited to the background color of the texture image, the first width value represents the difference value of the width of the display part and the width of the transition part, and the second width value represents the width of the display part.
Alternatively, a texture image with a ground color of color01 and a waveform line color of color02 may be obtained by color = lerp(color01, color02, lineMask). For example, taking color01 as (1,0,0,1) (pure red) and color02 as (0,0,1,1) (pure blue) yields a blue waveform texture on a pure red ground. If a transparent ground is required, color01 can be made transparent by setting its alpha component to 0.
It should be noted that, the distance between the position of the above-mentioned wave texture in the texture image and the position of each pixel point in the wave texture can be calculated by the following formula:
distance=|newUV.y-linePosition|
where distance represents the above distance, newUV.y represents the longitudinal coordinate of each pixel point in the waveform texture, and linePosition represents the longitudinal position of the waveform texture in the texture image, usually taken as the longitudinal coordinate of the center of the waveform texture; for example, in the texture image shown in fig. 4, linePosition is the longitudinal position of the center of the waveform texture.
lineMask = 1 - saturate((distance - A)/(B - A)), where B > A.
In the above equation, lineMask denotes the line mask of the waveform texture; A denotes one half of the width of the most solid part of the waveform texture color, that is, A = (first width value - second width value)/2; B denotes one half of the width at which the waveform texture has fully transitioned to the ground color (i.e., where the line edge is widest), that is, B = first width value/2.
Optionally, each pixel on the virtual model generates a lineMask value. From the above formula:
when distance = A, lineMask = 1 - saturate((A - A)/(B - A)) = 1 - 0 = 1;
when distance = B, lineMask = 1 - saturate((B - A)/(B - A)) = 1 - 1 = 0.
Thus 0 ≤ lineMask ≤ 1. Specifically:
when distance ≥ B, lineMask = 0; when distance ≤ A, lineMask = 1;
when A < distance < B, lineMask transitions linearly from 1 to 0 as distance increases.
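The endpoint values derived above can be checked numerically with a small Python stand-in for the shader (saturate mirrors the HLSL built-in; the concrete values of linePosition, A and B are illustrative assumptions):

```python
def saturate(v):
    """Clamp to [0, 1], like HLSL saturate()."""
    return max(0.0, min(1.0, v))

def line_mask(new_uv_y, line_position, a, b):
    """lineMask = 1 - saturate((distance - A)/(B - A)), with
    distance = |newUV.y - linePosition| and B > A."""
    distance = abs(new_uv_y - line_position)
    return 1.0 - saturate((distance - a) / (b - a))

# Endpoints match the derivation: 1 at distance A, 0 at distance B,
# and still 1 at the line center inside the solid core.
assert line_mask(0.5625, 0.5, 0.0625, 0.25) == 1.0  # distance == A
assert line_mask(0.75, 0.5, 0.0625, 0.25) == 0.0    # distance == B
assert line_mask(0.5, 0.5, 0.0625, 0.25) == 1.0     # on the line center
```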
Through the above process, lineMask varies with each pixel's distance from linePosition: the closer the pixel, the closer the value is to 1, and the farther the pixel, the closer the value is to 0. This forms the information required for generating the waveform texture, which is then linearly mapped to colors:
color = lerp(color01, color02, lineMask);
This yields a texture image whose ground color is color01 and whose waveform line color is color02.
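The final color mapping can be sketched the same way (lerp mirrors the HLSL intrinsic; the red and blue values follow the earlier example):

```python
def lerp(c0, c1, t):
    """Component-wise linear interpolation, like HLSL lerp()."""
    return tuple(a + (b - a) * t for a, b in zip(c0, c1))

color01 = (1.0, 0.0, 0.0, 1.0)  # ground color (pure red)
color02 = (0.0, 0.0, 1.0, 1.0)  # line color (pure blue)

# lineMask is 0 far from the line, 1 on the line, in between at the edge.
assert lerp(color01, color02, 0.0) == color01
assert lerp(color01, color02, 1.0) == color02
assert lerp(color01, color02, 0.5) == (0.5, 0.0, 0.5, 1.0)
```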
In an alternative embodiment, the pixel shader may also adjust the degree of blur of the waveform texture. Specifically, the pixel shader first obtains the first width value and the second width value of the waveform texture, obtains a first difference value from them, and then adjusts the degree of blur of the waveform texture according to the first difference value. For example, in fig. 5, when the values of A and B are close, the line edge of the waveform texture is sharp, and lineMask transitions rapidly from 1 to 0 within a narrow region; when the difference between A and B is large, the line edge of the waveform texture is blurred.
In an alternative embodiment, the pixel shader may also adjust the width of the waveform texture. Specifically, the pixel shader obtains the first width value and the second width value of the waveform texture and adjusts the width of the waveform texture according to the size of the first width value and/or the second width value. For example, in fig. 6, when both A and B are small, the lines of the waveform texture are thin; when both A and B are large, the lines are thick.
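Both adjustments above reduce to choosing A and B: A sets the half-width of the solid core, and B - A sets the width of the soft edge. A minimal Python sketch (restating the mask so the example is self-contained; the concrete A/B values are assumptions):

```python
def saturate(v):
    # Clamp to [0, 1], mirroring HLSL saturate().
    return max(0.0, min(1.0, v))

def line_mask(distance, a, b):
    # lineMask = 1 - saturate((distance - A)/(B - A)); the edge softens
    # over a band of width B - A, while A sets the solid half-width.
    return 1.0 - saturate((distance - a) / (b - a))

# Sharp, thin line: small A with B close to A (narrow transition band).
sharp = [line_mask(d, 0.0625, 0.125) for d in (0.0625, 0.09375, 0.125)]
# Soft, thick line: larger A with a wide transition band.
soft = [line_mask(d, 0.25, 0.5) for d in (0.25, 0.375, 0.5)]
# Both profiles fall from 1 through 0.5 to 0, but over bands of
# width 0.0625 and 0.25 respectively.
```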
According to the above method, a dynamic waveform effect can be produced rapidly from any noise map without an additional authoring process, and the sharpness of the waveform texture can be adjusted, so the method is flexible, controllable, and precise. In addition, the scheme provided by the application can omit the step of sampling a texture-line map, reducing the rendering load as well as the authoring of texture-line maps and the size they would add to the game's total package.
According to an embodiment of the present invention, there is further provided an embodiment of a texture image generating apparatus, where fig. 7 is a schematic diagram of a texture image generating apparatus according to an embodiment of the present invention, and as shown in fig. 7, the apparatus includes: a sampling module 701, a processing module 703 and a generating module 705.
The sampling module 701 is configured to perform sampling in a preset direction on a noise texture image in a texture space to obtain a noise result; the processing module 703 is configured to perform superposition processing on the texture coordinates in the texture space according to the noise result to obtain a disturbed texture space; a generating module 705, configured to generate a texture image in the disturbed texture space based on a preset straight line texture function, where the straight line texture function is used to adjust texture parameters in the texture image, and the texture image at least includes a waveform texture that changes with time.
It should be noted that the sampling module 701, the processing module 703 and the generating module 705 correspond to steps S102 to S106 in the above embodiment; the three modules implement the same examples and application scenarios as the corresponding steps, but are not limited to the disclosure of the above embodiment.
Optionally, the texture image generating device further includes: and the first generation module is used for generating a waveform animation based on the texture image after the texture image is generated in the disturbed texture space based on the preset linear texture function, wherein the waveform animation at least comprises a waveform texture changing along with time.
Optionally, the sampling module includes a first sampling module and a second sampling module. The first sampling module is configured to sample the noise texture image in a first direction in the texture space based on a sampling density to obtain a first sampling result, where the sampling density at least includes a first numerical value in the first direction, and the first numerical value is smaller than a preset value; the second sampling module is configured to sample the first sampling result in a second direction in the texture space to obtain the noise result, where the first direction is different from the second direction.
Optionally, the first sampling module includes: the device comprises a first acquisition module, a first calculation module, a second calculation module, a third calculation module and a fourth calculation module. The first obtaining module is configured to obtain a first coordinate of each pixel of the noise texture image in the texture space in a first direction and a second coordinate of each pixel in the texture space in a second direction, where the sampling density further includes a second value in the second direction; the first calculation module is used for calculating the product of the first coordinate and the first numerical value to obtain a third coordinate; the second calculation module is used for calculating the product of the second coordinate and the second numerical value to obtain a fourth coordinate; the third calculation module is used for calculating the product of the preset time parameter and the moving speed to obtain a product result, wherein the moving speed represents the speed of the moving noise texture image; and the fourth calculation module is used for obtaining a first sampling result according to the third coordinate, the fourth coordinate and the product result.
Optionally, the second sampling module includes: the device comprises a second acquisition module, a mapping module and a third acquisition module. The second acquisition module is used for acquiring coordinate information corresponding to the first sampling result; the mapping module is used for mapping the coordinate information to a preset coordinate range to obtain noise information; and the third acquisition module is used for acquiring the noise component information of the noise information in the second direction to obtain a noise result.
Optionally, the processing module includes: and the first processing module is used for performing superposition processing on the coordinates of each pixel point in the texture space according to the noise component information to obtain the disturbed texture space.
Optionally, the generating module includes: a second generating module, configured to generate a texture image in the disturbed texture space according to width value information of a wave texture in the texture image and/or position information of the wave texture in the texture image and/or background color information of the texture image and/or color information of the wave texture, where the preset linear texture function includes one or more of the following information: width value information, position information, background color information, and color information.
Optionally, the texture image generating device further includes: the device comprises a fourth obtaining module, a fifth calculating module, a sixth calculating module, a seventh calculating module, an eighth calculating module and a first adjusting module. The fourth obtaining module is configured to obtain a first width value and a second width value of the waveform texture, where the width value information of the waveform texture at least includes the first width value and the second width value, the waveform texture includes a display portion and a transition portion, the transition portion represents a portion where the waveform texture transitions to a background color of the texture image, the first width value represents a difference between a width of the display portion and a width of the transition portion, and the second width value represents a width of the display portion; the fifth calculation module is used for calculating a difference value between the first width value and the second width value to obtain a first difference value; the sixth calculation module is used for calculating the distance between the position of the waveform texture in the texture image and the position of each pixel point in the waveform texture; the seventh calculation module is used for obtaining a second difference value according to the distance and the first width value; the eighth calculating module is used for obtaining a first ratio according to the first difference and the second difference; and the first adjusting module is used for adjusting the color information of the waveform texture and the background color information of the texture image according to the first ratio.
Optionally, the texture image generating device further includes: the device comprises a fifth acquisition module, a ninth calculation module and a second adjustment module. The fifth obtaining module is configured to obtain a first width value and a second width value of a waveform texture, where the width value information of the waveform texture at least includes the first width value and the second width value, the waveform texture includes a display portion and a transition portion, the transition portion represents a portion where the waveform texture transitions to a background color of the texture image, the first width value represents a difference between a width of the display portion and a width of the transition portion, and the second width value represents a width of the display portion; the ninth calculation module is used for obtaining a first difference value according to the first width value and the second width value; and the second adjusting module is used for adjusting the fuzzy degree of the waveform texture according to the first difference value.
Optionally, the texture image generating device further includes: a sixth obtaining module and a third adjusting module. The sixth obtaining module is configured to obtain a first width value and a second width value of a waveform texture, where the width value information of the waveform texture at least includes the first width value and the second width value, the waveform texture includes a display portion and a transition portion, the transition portion represents a portion where the waveform texture transitions to a background color of the texture image, the first width value represents a difference between a width of the display portion and a width of the transition portion, and the second width value represents a width of the display portion; and the third adjusting module is used for adjusting the width of the waveform texture according to the size of the first width value and/or the second width value.
According to another aspect of the embodiments of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, wherein the computer program is configured to execute the method for generating a texture image in the above-mentioned embodiments when running.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out a method for running a program, wherein the program is arranged to carry out the method for generating a texture image in the above-described embodiments when run.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that, for those skilled in the art, various improvements and modifications can be made without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (13)

1. A method for generating a texture image, comprising:
sampling the noise texture image in a preset direction in a texture space to obtain a noise result;
superposing texture coordinates in the texture space according to the noise result to obtain a disturbed texture space;
and generating a texture image in the disturbed texture space based on a preset linear texture function, wherein the linear texture function is used for adjusting texture parameters in the texture image, and the texture image at least comprises a waveform texture changing along with time.
2. The method of claim 1, wherein after generating the texture image in the perturbed texture space based on a preset linear texture function, the method further comprises:
generating a waveform animation based on the texture image, wherein the waveform animation comprises at least the waveform texture over time.
3. The method of claim 1, wherein sampling the noise texture image in a preset direction in a texture space to obtain a noise result comprises:
sampling the noise texture image in a first direction on the texture space based on sampling density to obtain a first sampling result, wherein the sampling density at least comprises a first numerical value in the first direction, and the first numerical value is smaller than a preset value;
and sampling the first sampling result in a second direction on the texture space to obtain the noise result, wherein the first direction is different from the second direction.
4. The method of claim 3, wherein sampling the noise texture image in a first direction in the texture space based on a sampling density to obtain a first sampling result comprises:
acquiring a first coordinate of each pixel of the noise texture image in the texture space in the first direction and a second coordinate of each pixel in the texture space in the second direction, wherein the sampling density further comprises a second value in the second direction;
calculating the product of the first coordinate and the first numerical value to obtain a third coordinate;
calculating the product of the second coordinate and the second numerical value to obtain a fourth coordinate;
calculating the product of a preset time parameter and a moving speed to obtain a product result, wherein the moving speed represents the speed of moving the noise texture image;
and obtaining the first sampling result according to the third coordinate, the fourth coordinate and the product result.
5. The method of claim 3, wherein sampling the first sampling result in a second direction in the texture space to obtain the noise result comprises:
acquiring coordinate information corresponding to the first sampling result;
mapping the coordinate information to a preset coordinate range to obtain target noise information;
and acquiring noise component information of the target noise information in the second direction to obtain the noise result.
6. The method according to claim 5, wherein the superimposing the texture coordinates in the texture space according to the noise result to obtain a disturbed texture space comprises:
and according to the noise component information, carrying out superposition processing on the coordinates of each pixel point in the texture space to obtain the disturbed texture space.
7. The method of claim 1, wherein generating a texture image in the perturbed texture space based on a predetermined linear texture function comprises:
generating the texture image in the disturbed texture space according to the width value information of the wave texture in the texture image and/or the position information of the wave texture in the texture image and/or the background color information of the texture image and/or the color information of the wave texture, wherein the preset straight line texture function comprises one or more of the following information: the width value information, the position information, the background color information, and the color information.
8. The method of claim 7, further comprising:
acquiring a first width value and a second width value of the wave texture, wherein the width value information of the wave texture at least comprises the first width value and the second width value, the wave texture comprises a display part and a transition part, the transition part represents a part of the wave texture which is transited to the background color of the texture image, the first width value represents a difference value between the width of the display part and the width of the transition part, and the second width value represents the width of the display part;
calculating a difference value between the first width value and the second width value to obtain a first difference value;
calculating the distance between the position of the wave texture in the texture image and the position of each pixel point in the wave texture;
obtaining a second difference value according to the distance and the first width value;
obtaining a first ratio according to the first difference and the second difference;
and adjusting the color information of the waveform texture and the background color information of the texture image according to the first ratio.
9. The method of claim 7, further comprising:
acquiring a first width value and a second width value of the wave texture, wherein the width value information of the wave texture at least comprises the first width value and the second width value, the wave texture comprises a display part and a transition part, the transition part represents a part of the wave texture which is transited to the background color of the texture image, the first width value represents a difference value between the width of the display part and the width of the transition part, and the second width value represents the width of the display part;
obtaining a first difference value according to the first width value and the second width value;
and adjusting the fuzzy degree of the waveform texture according to the first difference.
10. The method of claim 7, further comprising:
acquiring a first width value and a second width value of the wave texture, wherein the width value information of the wave texture at least comprises the first width value and the second width value, the wave texture comprises a display part and a transition part, the transition part represents a part of the wave texture which is transited to the background color of the texture image, the first width value represents a difference value between the width of the display part and the width of the transition part, and the second width value represents the width of the display part;
and adjusting the width of the wave texture according to the size of the first width value and/or the second width value.
11. An apparatus for generating a texture image, comprising:
the sampling module is used for sampling the noise texture image in a preset direction in a texture space to obtain a noise result;
the processing module is used for performing superposition processing on the texture coordinates in the texture space according to the noise result to obtain a disturbed texture space;
and the generating module is used for generating a texture image in the disturbed texture space based on a preset linear texture function, wherein the linear texture function is used for adjusting texture parameters in the texture image, and the texture image at least comprises a waveform texture changing along with time.
12. A computer-readable storage medium, in which a computer program is stored, wherein the computer program is arranged to execute the method for generating a texture image as claimed in any one of claims 1 to 10 when executed.
13. An electronic device, wherein the electronic device comprises one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method for running a program, wherein the program is arranged to perform the method for generating a texture image of any one of claims 1 to 10 when run.
CN202110888025.XA 2021-08-03 2021-08-03 Texture image generation method and device and electronic equipment Pending CN113658064A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110888025.XA CN113658064A (en) 2021-08-03 2021-08-03 Texture image generation method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110888025.XA CN113658064A (en) 2021-08-03 2021-08-03 Texture image generation method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN113658064A true CN113658064A (en) 2021-11-16

Family

ID=78478357

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110888025.XA Pending CN113658064A (en) 2021-08-03 2021-08-03 Texture image generation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113658064A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114339448A (en) * 2021-12-31 2022-04-12 深圳万兴软件有限公司 Method and device for manufacturing light beam video special effect, computer equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2227595A1 (en) * 1995-08-04 1997-02-20 David Sarnoff Research Center, Inc. Method and apparatus for generating image textures
WO1999063489A1 (en) * 1998-06-05 1999-12-09 Evans & Sutherland Computer Corporation Method and system for antialiased procedural solid texturing
US20050273712A1 (en) * 1999-03-04 2005-12-08 Smith Jeffrey A Method and system for transmitting texture information through communications networks
WO2006095481A1 (en) * 2005-03-07 2006-09-14 Sony Computer Entertainment Inc. Texture processing device, drawing processing device, and texture processing method
US20080117214A1 (en) * 2006-11-22 2008-05-22 Michael Perani Pencil strokes for vector based drawing elements
US20130016112A1 (en) * 2007-07-19 2013-01-17 Disney Enterprises, Inc. Methods and apparatus for multiple texture map storage and filtering including irregular texture maps
WO2019178037A1 (en) * 2018-03-13 2019-09-19 Google Llc Mixed noise and fine texture synthesis in lossy image compression
CN111402124A (en) * 2020-03-24 2020-07-10 支付宝(杭州)信息技术有限公司 Method and device for generating texture image and synthetic image


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIU, DONGCHANG; WANG, XINGANG; LI, RUI: "A sheet-count detection algorithm based on texture analysis", Journal of Image and Graphics, no. 03, 16 March 2011 (2011-03-16), pages 413 - 419 *
JIANG, JULANG; XUE, FENG; ZHENG, JIANGYUN; HUANG, ZHONG: "A fast solid texture generation algorithm based on exemplars", Journal of Computer-Aided Design & Computer Graphics, no. 08, 15 August 2011 (2011-08-15), pages 1311 - 1318 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114339448A (en) * 2021-12-31 2022-04-12 深圳万兴软件有限公司 Method and device for manufacturing light beam video special effect, computer equipment and storage medium
CN114339448B (en) * 2021-12-31 2024-02-13 深圳万兴软件有限公司 Method and device for manufacturing special effects of beam video, computer equipment and storage medium

Similar Documents

Publication Publication Date Title
US10554857B2 (en) Method for noise-robust color changes in digital images
US9813614B2 (en) Method and system for analog/digital image simplification and stylization
TW200842758A (en) Efficient 2-D and 3-D graphics processing
TW201127057A (en) Image processing method for enhancing the resolution of image boundary
Grundland et al. Cross dissolve without cross fade: Preserving contrast, color and salience in image compositing
KR100860968B1 (en) Image-resolution-improvement apparatus and method
CN114049420B (en) Model training method, image rendering method, device and electronic equipment
US8072464B2 (en) 3-dimensional graphics processing method, medium and apparatus performing perspective correction
CN113658064A (en) Texture image generation method and device and electronic equipment
Northam et al. Stereoscopic 3D image stylization
JP6558365B2 (en) Image processing apparatus, image processing method, and program
KR20100122381A (en) Apparatus and method for painterly rendering
CN114862729A (en) Image processing method, image processing device, computer equipment and storage medium
US9619864B2 (en) Image processing apparatus and method for increasing sharpness of images
JP2018028710A (en) Image generator, image generating method, and program
CN103514593B (en) Image processing method and device
Aksoy et al. Interactive 2D-3D image conversion for mobile devices
JP4009289B2 (en) Method for determining a weighting factor for color-calculating a texel color value associated with a footprint
US8358867B1 (en) Painterly filtering
WO2015144563A1 (en) Image processing system and method
Chen et al. Importance-driven composition of multiple rendering styles
Abebe et al. Application of radial basis function interpolation for content aware image retargeting
US11928757B2 (en) Partially texturizing color images for color accessibility
KR20010014320A (en) Image interpolation
Huang et al. Stereoscopic oil paintings from RGBD images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination