CN110648372B - Method and system for determining color of pixel - Google Patents


Info

Publication number
CN110648372B
CN110648372B CN201910251106.1A
Authority
CN
China
Prior art keywords
coordinates
color
dot
pixel
virtual space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910251106.1A
Other languages
Chinese (zh)
Other versions
CN110648372A (en
Inventor
李毅
项维康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd filed Critical Perfect World Beijing Software Technology Development Co Ltd
Priority to CN201910251106.1A priority Critical patent/CN110648372B/en
Publication of CN110648372A publication Critical patent/CN110648372A/en
Application granted granted Critical
Publication of CN110648372B publication Critical patent/CN110648372B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/50: Lighting effects
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a method and a system for realizing spherical color mapping. Embodiments of the invention combine the attenuation characteristic of a light source with the concept of color mapping to realize a spherical color mapping technique whose illumination effect resembles that of a point light source, so that a scene producer can adjust the color of selected spherical regions in a virtual space when adding post-processing effects, thereby reproducing the light attenuation effect of a point light source.

Description

Method and system for determining color of pixel
Technical Field
The present invention relates to computer graphics technology, and more particularly, to a method and system for combining light source characteristics and color mapping characteristics.
Background
Point light sources are commonly placed in scenes such as games or animations. A point light source is a light source at a fixed position in a scene that emits light in all directions, with the light intensity decreasing gradually with distance. When a computer renders a scene, the illumination calculation for a point light source is time-consuming.
In general, after a post-processing effect such as fog is added to a scene, the lighting effect of the rendered point light sources changes or no longer matches what the producer expects. The scene producer then has to readjust the parameters of the point light sources already placed in the scene, and those parameter changes in turn affect how the objects illuminated by those light sources appear. The producer therefore has to iterate between adjusting the light-source parameters and the parameters of post-processing effects such as fog until the rendered picture achieves the intended result.
Prior-art techniques such as color mapping can modify or adjust the overall color of a picture relatively quickly to produce a specific effect. Color mapping is commonly used as a post-processing technique that color-transforms the original picture on the screen according to a reference image, thereby changing the colors of the original picture. The basic principle of color mapping is described at https://en.wikipedia.org/wiki/Color_mapping. A scene rendered with a color mapping technique is generally faster to render than one lit only by point light sources.
However, conventional color mapping can only adjust the overall color of the original picture through a single reference image; that is, colors can only be modified in the two-dimensional space of the screen, not within a selected three-dimensional region of the space containing the virtual scene. For example, to convey a mysterious atmosphere in the distant view of a forest by adjusting the color tone of a particular region of the picture, such an atmosphere adjustment within a three-dimensional region cannot be achieved with conventional color mapping alone.
Disclosure of Invention
Embodiments of the present invention combine the characteristics of a light source having attenuation characteristics (e.g., a point light source) with color mapping. The embodiments realize a spherical color mapping technique whose illumination effect resembles that of a point light source, so that a scene producer can adjust the color of selected spherical regions in a virtual space when adding post-processing effects, thereby reproducing the light attenuation effect of a point light source. Embodiments of the present invention are also applicable to combining the characteristics of any other type of light source having attenuation characteristics with the characteristics of color mapping.
In the present application, a space in which a virtual scene is located is referred to as a "virtual space", and all objects (e.g., light sources, models, color mapping objects, etc.) in the virtual scene are located in the virtual space.
One aspect of the invention is a method for determining attenuation coefficients of pixels on a screen relative to an object used to simulate a light source having attenuation characteristics, comprising: setting a position of a center of the object in a scene; and determining the attenuation coefficient according to the coordinates of the pixel relative to the center.
Embodiments of the invention may determine the attenuation coefficient using the following formula: Atten = RadialAtten(LightSpacePos, FalloutExp) * RadialAtten((LightPos - CameraPos) / GlobalAttenDistance, GlobalAtten), where Atten represents the attenuation coefficient, LightSpacePos represents the coordinates of the pixel in the local space of the object, LightPos represents the coordinates of the center in the virtual space of the scene, CameraPos represents the coordinates of the camera in the virtual space, GlobalAttenDistance represents the distance over which the attenuation extends in the direction from the camera coordinates to the center coordinates, GlobalAtten represents the attenuation intensity in the direction from the camera coordinates to the center coordinates, and FalloutExp represents the attenuation intensity in all directions from the center, where RadialAtten is a monotonically decreasing function over the [0,1] interval.
The RadialAtten function defined as follows may be used in embodiments of the present invention: RadialAtten(x, y) = pow(1 - saturate(dot(x, x)), y), where pow is a power function, so the expression evaluates to (1 - saturate(dot(x, x)))^y, and saturate is a function that clamps dot(x, x), defined as follows: saturate(dot(x, x)) returns 1 when dot(x, x) is greater than or equal to 1, returns 0 when dot(x, x) is less than or equal to 0, and returns dot(x, x) when dot(x, x) is greater than 0 and less than 1, where dot(x, x) is a function computing the dot product of the vector x with itself.
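As a rough illustration, the RadialAtten and saturate functions described above can be sketched in plain Python (function names follow the text; this is an illustrative sketch, not the patented implementation):

```python
def saturate(a):
    """Clamp a scalar to [0, 1], per the piecewise definition in the text."""
    return 0.0 if a <= 0.0 else (1.0 if a >= 1.0 else a)

def radial_atten(x, y):
    """RadialAtten(x, y) = (1 - saturate(dot(x, x))) ** y for a vector x."""
    d = sum(c * c for c in x)  # dot(x, x)
    return (1.0 - saturate(d)) ** y
```

At the center (x = (0, 0, 0)) the function returns 1, and it decreases monotonically toward 0 as dot(x, x) approaches 1, consistent with the monotonicity requirement stated above.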
Another aspect of the present invention is a method of calculating a final color of a pixel, comprising: determining an attenuation coefficient for the pixel; performing color mapping on the source color of the pixel to obtain a target color; and weighting the source color and the target color by using the attenuation coefficient to determine the final color.
Yet another aspect of the invention is a system for determining attenuation coefficients of pixels on a screen relative to an object used to simulate a light source having attenuation characteristics, comprising: means for setting a position of a center of the object in a scene; and means for determining the attenuation coefficient from the coordinates of the pixel with respect to the center.
Yet another aspect of the invention is a system for calculating a final color of a pixel, comprising: means for determining an attenuation coefficient for the pixel; means for color mapping a source color of the pixel to obtain a target color; and means for determining the final color by weighting the source and target colors with the attenuation coefficients.
Yet another aspect of the present invention discloses a computer-readable medium having computer-readable instructions stored thereon which, when executed by a computer, are capable of performing a method according to an embodiment of the present invention.
Drawings
FIG. 1 shows a flow diagram of a production scenario according to an embodiment of the present invention.
FIG. 2 shows a flow diagram for computing pixel colors in a spherical object, according to an embodiment of the invention.
Detailed Description
The content of the invention will now be discussed with reference to a number of exemplary embodiments. It is to be understood that these examples are discussed only to enable those of ordinary skill in the art to better understand and thus implement the teachings of the present invention, and are not meant to imply any limitations on the scope of the invention.
As used herein, the term "include" and its variants are to be read as open-ended terms meaning "including, but not limited to". The term "based on" is to be read as "based, at least in part, on". The terms "one embodiment" and "an embodiment" are to be read as "at least one embodiment". The term "another embodiment" is to be read as "at least one other embodiment". The term "coordinate" is to be read as corresponding to the position of an object in space (e.g., two-dimensional or three-dimensional space); the term "coordinate" is thus equivalent in meaning to the term "position".
Embodiments of the present invention may combine the characteristics of a light source with attenuation characteristics (e.g., a point source) with the characteristics of a color map. For example, embodiments of the present invention propose a spherical color mapping model to simulate the attenuation characteristics of a spherical point light source and combine the characteristics of color mapping. Embodiments of the present invention may also simulate the attenuation characteristics of other types of light sources with attenuation characteristics (e.g., surface light sources) and combine the characteristics of color mapping.
For example, in an embodiment simulating a spherical point light source, for each pixel to be rendered on the screen, the center (for a spherical object, the sphere center) of an object simulating the spherical point light source (hereinafter a "spherical object") is first set according to the region of the scene to be adjusted, and an attenuation coefficient for the pixel is then calculated from the distance and direction of the pixel relative to that center. The calculated attenuation coefficient is combined with color mapping to determine the final color of the pixel.
FIG. 1 shows a flowchart of producing a scene according to an embodiment of the present invention, which includes the following steps:
(1) A spherical object according to an embodiment of the invention is added while the scene is edited, and the parameters of the object are set.
The parameters that can be set include: the Name of the spherical object displayed in the scene, the Layer on which the object is displayed, whether the object is visible, the position LightPos of the sphere center of the object in the virtual space, the Radius of the object, the scaling X/Y/Z Axis Scale of the object along the three axes of the Cartesian coordinate system of the virtual space, the distance GlobalAttenDistance over which the attenuation extends in the direction from the camera coordinates to the sphere-center coordinates, the attenuation intensity GlobalAtten in the direction from the camera coordinates to the sphere-center coordinates, the attenuation intensity FalloutExp in all directions from the sphere center, and so on. According to an embodiment of the present invention, the types of these parameters may be defined as follows: mathematically, LightPos is a point in space, which can be represented in a computer by a Vector. A vector may be a custom data structure that contains floating-point values (a three-dimensional vector contains three floating-point values; an N-dimensional vector contains N) together with operator functions defined for the usual mathematical vector operations. The other parameters listed above may be floating-point numbers (real numbers).
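For illustration only, the parameter list above might be collected into a structure like the following (field names and default values are hypothetical; they merely mirror the parameter names used in the text):

```python
from dataclasses import dataclass

@dataclass
class SphericalColorMapObject:
    """Hypothetical parameter container for the spherical object."""
    name: str = "SphereColorMap"          # name shown in the scene
    layer: int = 0                        # layer on which the object is displayed
    visible: bool = True                  # whether the object is visible
    light_pos: tuple = (0.0, 0.0, 0.0)    # LightPos: sphere center in virtual space
    radius: float = 1.0                   # Radius of the object
    axis_scale: tuple = (1.0, 1.0, 1.0)   # X/Y/Z Axis Scale
    global_atten_distance: float = 100.0  # GlobalAttenDistance
    global_atten: float = 1.0             # GlobalAtten
    fallout_exp: float = 1.0              # FalloutExp
```

A scene editor could expose exactly these fields for checking, filling in, and adjusting, as the text goes on to describe.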
A camera is a virtual object that observes the virtual space from within it and is responsible for projecting the observed scene onto a screen or other display device. Both the camera coordinates and the sphere-center coordinates are coordinates in the virtual space. The parameter GlobalAttenDistance influences the attenuation effect of the object toward the camera; for example, if the distance between the local space covered by the spherical object and the camera is less than or equal to the value of GlobalAttenDistance, the local space covered by the object can be observed by the camera; otherwise it cannot. Local space generally refers to the space in which an object or model is modeled within the virtual space; for a spherical object, its local space is the space represented by a Cartesian coordinate system whose origin is the sphere center of the object and whose axes coincide with those of the virtual space. The parameters GlobalAtten and FalloutExp describe the rate at which the color decays.
According to an embodiment of the present invention, a color parameter of the spherical object may also be set or adjusted; this parameter is generally related to the color of the color map employed by the embodiment. For example, a scene producer may adjust the color parameters of spherical objects per color channel (e.g., the luminance channel RGB, the red-cyan channel R, the green-magenta channel G, and the blue-yellow channel B).
According to an embodiment of the invention, the parameters and colors of the object can be set through a software interface. Operations that may be implemented via the interface include, but are not limited to: selecting the parameters to set, entering specific parameter values, and adjusting or setting the color of the object.
(2) Fog parameters in the scene are set, along with parameters of various other post-processing effects (e.g., tone mapping).
Setting up post-processing effects such as fog is a routine operation in the art and is not described here.
The order of the steps shown in fig. 1 may be interchanged, according to an embodiment of the invention. Embodiments of the present invention also include apparatuses that, in various building blocks, perform the steps contained and/or implied in fig. 1.
After the scene is produced, a computer renders it. FIG. 2 shows a flowchart for computing the color of a pixel using a spherical object according to an embodiment of the invention, comprising the following steps:
(1) The coordinates of the pixel in the virtual space are calculated from the screen coordinates and the screen-space depth of the pixel.
In a prior-art rendering pipeline, the coordinates of an object in the virtual space are transformed into camera-space coordinates by the camera's View Matrix, then into normalized device coordinates by the perspective Projection Matrix and the perspective divide, and finally the screen coordinates and screen-space depth of the corresponding pixels are obtained through the viewport transformation (generally, one object in the virtual space corresponds to multiple pixels on the screen). The screen coordinates are the coordinates of the pixel in screen space (corresponding to the pixel's location on the screen), and the screen-space depth may be saved to a depth buffer.
The first step in FIG. 2 reverses this process, according to an embodiment of the present invention. First, the inverse of the viewport transformation yields the coordinates of the pixel in normalized device space from its screen coordinates and screen-space depth; then the inverse of the perspective divide and the inverse of the perspective projection matrix yield the coordinates of the pixel in camera space; finally, the inverse of the camera view matrix yields the coordinates of the pixel in the virtual space.
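The inverse chain just described can be sketched with 4x4 matrices as follows (OpenGL-style conventions with normalized device coordinates in [-1, 1] are assumed here for illustration; the text does not fix a particular convention):

```python
import numpy as np

def screen_to_world(px, py, depth, width, height, proj, view):
    """Recover a pixel's virtual-space position from screen coords + depth."""
    # Inverse viewport transform: pixel coordinates -> normalized device coords
    ndc = np.array([2.0 * px / width - 1.0,
                    2.0 * py / height - 1.0,
                    2.0 * depth - 1.0,
                    1.0])
    # Inverse projection; dividing by w undoes the perspective divide
    cam = np.linalg.inv(proj) @ ndc
    cam /= cam[3]
    # Inverse view matrix: camera space -> virtual (world) space
    world = np.linalg.inv(view) @ cam
    return world[:3]
```

Projecting a known point forward through the same matrices and then calling this function recovers the original virtual-space position, which is exactly the round trip the step relies on.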
(2) The coordinates of the pixel in the local space of the spherical object are calculated from the coordinates of the pixel in the virtual space and the position information, in the virtual space, of the spherical object added to the scene.
The position information of the spherical object in the virtual space may include: the coordinates of the sphere center in the virtual space, the rotation angle of the object in the virtual space, and the scaling of the object along the three X/Y/Z axes. According to a preferred embodiment of the present invention, the spherical object may be given other shapes, such as an ellipsoid, by adjusting the scaling along the three X/Y/Z axes independently. A transformation matrix from the virtual space to the local space of the spherical object can be computed from the position information of the object in the virtual space, and the coordinates of the pixel in the local space of the spherical object can then be computed with this matrix.
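Under the simplifying assumptions of no rotation and coordinates normalized by the per-axis scale times the radius (so that dot(p, p) = 1 on the object's surface, matching the saturate clamp in the attenuation formula), such a transform might look like this sketch (names and conventions are hypothetical):

```python
import numpy as np

def world_to_light_space(world_pos, light_pos, axis_scale, radius):
    """Hypothetical virtual-space -> local-space transform for the spherical
    object: translate to the sphere center, then normalize each axis by the
    per-axis scale times the radius. Rotation is omitted for simplicity."""
    p = np.asarray(world_pos, dtype=float) - np.asarray(light_pos, dtype=float)
    return p / (np.asarray(axis_scale, dtype=float) * radius)
```

With unequal per-axis scales, the unit sphere in local space corresponds to an ellipsoid in the virtual space, which is how the ellipsoidal variant mentioned above arises.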
(3) The attenuation coefficient applicable to the pixel is calculated.
The attenuation coefficient is used together with the color mapping to determine the final color of the pixel. In embodiments using spherical objects that simulate spherical point light sources, this coefficient achieves an effect similar to the intensity attenuation of a point light source. In an embodiment according to the invention, the previously calculated coordinates of the pixel in the local space of the spherical object reflect the distance and direction of the pixel relative to the sphere center of the spherical object; these two factors essentially determine the attenuation coefficient applicable to the pixel.
Various models may be used to specifically calculate the attenuation coefficient. One embodiment of the invention uses the following formula to calculate the attenuation coefficient:
Atten = RadialAtten(LightSpacePos, FalloutExp) * RadialAtten((LightPos - CameraPos) / GlobalAttenDistance, GlobalAtten)
where Atten denotes the calculated attenuation coefficient, LightSpacePos denotes the coordinates of the pixel in the local space of the spherical object, LightPos denotes the coordinates of the sphere center of the spherical object in the virtual space, and CameraPos denotes the coordinates of the camera in the virtual space. LightSpacePos and CameraPos have the same type as LightPos.
The actual formula of the RadialAtten function can be chosen according to the desired effect, but it should generally be monotonically decreasing over the interval [0,1]. An example of a RadialAtten function is:
RadialAtten(x,y)=pow(1-saturate(dot(x,x)),y)
where pow is a power function, with pow(a, b) representing a^b. saturate is a function that clamps its input parameter, defined as: when the input parameter a is greater than or equal to 1, saturate(a) evaluates to 1; when a is less than or equal to 0, saturate(a) evaluates to 0; when a is greater than 0 and less than 1, saturate(a) evaluates to a. dot is a function that computes the dot product of vectors.
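Putting the pieces together, a sketch of the full attenuation computation (assuming the formula and function definitions above; this is an illustrative sketch, not the shader code itself) could read:

```python
import numpy as np

def saturate(a):
    # Clamp a scalar to [0, 1]
    return min(max(a, 0.0), 1.0)

def radial_atten(x, y):
    # RadialAtten(x, y) = pow(1 - saturate(dot(x, x)), y)
    x = np.asarray(x, dtype=float)
    return (1.0 - saturate(float(np.dot(x, x)))) ** y

def attenuation(light_space_pos, light_pos, camera_pos,
                global_atten_distance, global_atten, fallout_exp):
    # Atten = RadialAtten(LightSpacePos, FalloutExp)
    #       * RadialAtten((LightPos - CameraPos) / GlobalAttenDistance, GlobalAtten)
    local_term = radial_atten(light_space_pos, fallout_exp)
    toward = (np.asarray(light_pos, dtype=float)
              - np.asarray(camera_pos, dtype=float)) / global_atten_distance
    return local_term * radial_atten(toward, global_atten)
```

The first factor attenuates radially from the sphere center; the second attenuates along the camera-to-center direction and vanishes once that distance exceeds GlobalAttenDistance.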
(4) The source color of the pixel is color mapped using a color map to obtain a target color of the pixel.
Mapping or transforming the color of the pixel using the color map is a technique known in the art and will not be described in detail herein. In the present invention, the scene producer can select or set the parameters of the required color map as required.
(5) Based on the source color and the target color, the final color of the pixel is calculated using the attenuation coefficient as a weighting factor.
For example, the final color of a pixel may be calculated using the following formula:
Color=SrcColor*(1-Atten)+DestColor*Atten
where Color represents the final color of the pixel, SrcColor represents the source color, DestColor is the target color obtained by sampling the color map in the fourth step, and Atten is the attenuation coefficient calculated in the third step. Other weighting schemes may also be used.
The effect of changing certain parameters on the final color of the pixel is described below. For example, when GlobalAttenDistance is much larger than the distance between the camera and the spherical object, Atten approaches 1 and the final color computed by the color mapping approaches DestColor; conversely, when GlobalAttenDistance is much smaller than that distance, the final color approaches SrcColor.
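As a minimal sketch, the linear blend above (applied per color channel) is simply:

```python
def final_color(src_color, dest_color, atten):
    """Color = SrcColor * (1 - Atten) + DestColor * Atten, per channel."""
    return tuple(s * (1.0 - atten) + d * atten
                 for s, d in zip(src_color, dest_color))
```

With Atten near 1 the result approaches DestColor, and with Atten near 0 it stays at SrcColor, matching the behavior described for GlobalAttenDistance.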
The method and apparatus of the embodiments of the present invention may be implemented, as required, as a pure software module (for example, a program written in C++ and HLSL), as a pure hardware module (for example, a dedicated ASIC or FPGA chip), or as a module combining software and hardware (for example, a firmware system storing fixed code).
Another aspect of the invention is a computer-readable medium having computer-readable instructions stored thereon that, when executed, perform a method of embodiments of the invention.
Embodiments of the present invention combine the attenuation characteristics of a light source with the characteristics of a color map. For a point light source, the embodiments combine its attenuation characteristic with color mapping, so that a local three-dimensional region of the scene whose atmosphere needs adjusting can be selected flexibly and the final color of each pixel can be computed quickly by color mapping.
It will be appreciated by persons skilled in the art that the foregoing description is only exemplary of the invention and is not intended to limit the invention. The present invention may include various modifications and variations. Any modifications and variations within the spirit and scope of the present invention should be included within the scope of the present invention.

Claims (15)

1. A method for determining an attenuation coefficient of a pixel on a screen relative to an object used to model a light source having attenuation characteristics, comprising:
setting a position of a center of the object in a scene; and
determining the attenuation coefficient from coordinates of the pixel relative to the center, wherein the attenuation coefficient is determined according to the following formula:
Atten=RadialAtten(LightSpacePos,FalloutExp)*RadialAtten((LightPos-CameraPos)/GlobalAttenDistance,GlobalAtten),
wherein Atten represents the attenuation coefficient, LightSpacePos represents coordinates of the pixel in a local space of the object, LightPos represents coordinates of the center in a virtual space of the scene, CameraPos represents coordinates of a camera in the virtual space, GlobalAttenDistance represents a distance over which the attenuation extends in a direction from the coordinates of the camera to the coordinates of the center, GlobalAtten represents an attenuation intensity in the direction from the coordinates of the camera to the coordinates of the center, and FalloutExp represents an attenuation intensity in all directions from the center,
wherein RadialAtten is a monotonically decreasing function over the [0,1] interval.
2. The method of claim 1, wherein the RadialAtten function is defined as follows:
RadialAtten(x,y)=pow(1-saturate(dot(x,x)),y),
where pow is a power function, so the expression represents (1 - saturate(dot(x, x)))^y,
wherein saturate is a function that clamps dot(x, x), defined as follows: saturate(dot(x, x)) returns 1 when dot(x, x) is greater than or equal to 1, returns 0 when dot(x, x) is less than or equal to 0, and returns dot(x, x) when dot(x, x) is greater than 0 and less than 1, where dot(x, x) is a function computing the dot product of the vector x with itself.
3. The method of claim 1, further comprising:
calculating the coordinates of the pixels in the virtual space according to the screen coordinates and the screen space depth of the pixels; and
calculating the LightSpacePos according to the coordinates of the pixel in the virtual space and the position information of the object in the virtual space.
4. The method of claim 3, wherein the location information comprises: a rotation angle of the object in the virtual space and a degree of scaling of the object in directions of respective coordinate axes of the virtual space.
5. The method of claim 1, wherein the light source is a point light source.
6. A method of calculating a final color of a pixel, comprising:
determining an attenuation coefficient applicable to the pixel using a method according to any one of claims 1-4;
performing color mapping on the source color of the pixel to obtain a target color; and
weighting the source color and the target color with the attenuation coefficient to determine the final color.
7. The method of claim 6, wherein the step of determining the final color uses the following formula:
Color=SrcColor*(1-Atten)+DestColor*Atten
wherein Color represents the final Color of the pixel, SrcColor represents the source Color, DestColor represents the target Color, and Atten represents the attenuation coefficient.
8. A system for determining attenuation coefficients of pixels on a screen relative to an object used to model a light source having attenuation characteristics, comprising:
means for setting a position of a center of the object in a scene; and
means for determining the attenuation coefficient from the coordinates of the pixel relative to the center, wherein the means for determining the attenuation coefficient from the coordinates of the pixel relative to the center determines the attenuation coefficient according to the formula:
Atten=RadialAtten(LightSpacePos,FalloutExp)*RadialAtten((LightPos-CameraPos)/GlobalAttenDistance,GlobalAtten),
wherein Atten represents the attenuation coefficient, LightSpacePos represents coordinates of the pixel in a local space of the object, LightPos represents coordinates of the center in a virtual space of the scene, CameraPos represents coordinates of a camera in the virtual space, GlobalAttenDistance represents a distance over which the attenuation extends in a direction from the coordinates of the camera to the coordinates of the center, GlobalAtten represents an attenuation intensity in the direction from the coordinates of the camera to the coordinates of the center, and FalloutExp represents an attenuation intensity in all directions from the center,
wherein RadialAtten is a monotonically decreasing function over the [0,1] interval.
9. The system of claim 8, wherein the RadialAtten function is defined as follows:
RadialAtten(x,y)=pow(1-saturate(dot(x,x)),y),
where pow is a power function, so the expression represents (1 - saturate(dot(x, x)))^y,
wherein saturate is a function that clamps dot(x, x), defined as follows: saturate(dot(x, x)) returns 1 when dot(x, x) is greater than or equal to 1, returns 0 when dot(x, x) is less than or equal to 0, and returns dot(x, x) when dot(x, x) is greater than 0 and less than 1, where dot(x, x) is a function computing the dot product of the vector x with itself.
10. The system of claim 8, further comprising:
means for calculating coordinates of the pixel in the virtual space from the screen coordinates and screen space depth of the pixel; and
means for calculating the LightSpacePos according to the coordinates of the pixel in the virtual space and the position information of the object in the virtual space.
11. The system of claim 10, wherein the location information comprises: a rotation angle of the object in the virtual space and a degree of scaling of the object in directions of respective coordinate axes of the virtual space.
12. The system of claim 8, wherein the light source is a point light source.
13. A system for calculating a final color of a pixel, comprising:
means for determining an attenuation coefficient applicable to the pixel using a method according to any one of claims 1 to 4;
means for color mapping a source color of the pixel to obtain a target color; and
means for determining the final color by weighting the source and target colors with the attenuation coefficients.
14. The system of claim 13, wherein the means for weighting the source and target colors with the attenuation coefficients determines the final color using the formula:
Color=SrcColor*(1-Atten)+DestColor*Atten
wherein Color represents the final Color of the pixel, SrcColor represents the source Color, DestColor represents the target Color, and Atten represents the attenuation coefficient.
15. A computer readable medium having computer readable instructions stored thereon which, when executed by a computer, are capable of performing the method of any one of claims 1-7.
CN201910251106.1A 2019-03-29 2019-03-29 Method and system for determining color of pixel Active CN110648372B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910251106.1A CN110648372B (en) 2019-03-29 2019-03-29 Method and system for determining color of pixel


Publications (2)

Publication Number Publication Date
CN110648372A CN110648372A (en) 2020-01-03
CN110648372B true CN110648372B (en) 2022-04-22

Family

ID=69009346

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910251106.1A Active CN110648372B (en) 2019-03-29 2019-03-29 Method and system for determining color of pixel

Country Status (1)

Country Link
CN (1) CN110648372B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102158524A (en) * 2010-12-30 2011-08-17 北京像素软件科技股份有限公司 Rendering-based distributed behavior control system
CN102542612A (en) * 2010-12-27 2012-07-04 新奥特(北京)视频技术有限公司 Method for setting light source parameters and calculating light color based on OpenGL core mode
CN103279633A (en) * 2013-03-26 2013-09-04 浙江工业大学 Brain fiber three-dimensional display method based on diffusion-weighted magnetic resonance data
CN108921810A (en) * 2018-06-20 2018-11-30 厦门美图之家科技有限公司 A color transfer method and computing device
CN109448098A (en) * 2018-09-29 2019-03-08 北京航空航天大学 A method for reconstructing virtual scene light sources based on a single night-scene image of a building


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
OpenGL Color Blending; ZinanJau; CSDN; 2013-09-23; pp. 1-5 *


Similar Documents

Publication Publication Date Title
JP7395577B2 (en) Motion smoothing of reprojected frames
US8803879B1 (en) Omnidirectional shadow texture mapping
CN112316420B (en) Model rendering method, device, equipment and storage medium
JP7007348B2 (en) Image processing equipment
US20070139408A1 (en) Reflective image objects
KR100567204B1 (en) An improved method and apparatus for per pixel mip mapping and trilinear filtering
US6922193B2 (en) Method for efficiently calculating texture coordinate gradient vectors
JPH0757117A (en) Forming method of index to texture map and computer control display system
JP2022511273A (en) Generate and modify object representations in augmented or virtual reality scenes
KR19990045321A (en) Image processing in which polygons are divided
US6529194B1 (en) Rendering method and apparatus, game system, and computer readable program product storing program for calculating data relating to shadow of object in virtual space
US8072464B2 (en) 3-dimensional graphics processing method, medium and apparatus performing perspective correction
US20230230311A1 (en) Rendering Method and Apparatus, and Device
US7576746B1 (en) Methods and systems for rendering computer graphics
US7071937B1 (en) Dirt map method and apparatus for graphic display system
US20040257364A1 (en) Shadow casting within a virtual three-dimensional terrain model
CN110648372B (en) Method and system for determining color of pixel
US20210090322A1 (en) Generating and Modifying Representations of Objects in an Augmented-Reality or Virtual-Reality Scene
US20030025706A1 (en) System and method for rendering a texture map utilizing an illumination modulation value
JP2007272847A (en) Lighting simulation method and image composition method
KR101071952B1 (en) Method and system for interactively editing lighting effects for 3d rendered images
KR100848687B1 (en) 3-dimension graphic processing apparatus and operating method thereof
US9514566B2 (en) Image-generated system using beta distribution to provide accurate shadow mapping
GB2432499A (en) Image generation of objects distant from and near to a virtual camera
KR20060082736A (en) Method and apparatus for 3 dimension rendering processing using the monochromatic lighting

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant