CN101271588A - Reconstructable geometry shadow map method - Google Patents


Info

Publication number: CN101271588A
Authority: CN (China)
Application number: CNA2008100961357A
Other languages: Chinese (zh)
Other versions: CN101271588B
Inventors: 戴庆华, 杨宝光
Assignee (original and current): Via Technologies Inc
Application filed by Via Technologies Inc
Publication of CN101271588A; application granted; publication of CN101271588B
Legal status: Granted; currently Active

Landscapes

  • Image Generation (AREA)

Abstract

The invention provides a reconstructable geometry shadow map (RGSM) method. Standard shadow maps offer a fast and convenient way to render shadows in a scene. The invention provides a new algorithm that reconstructs geometry from the shadow map: instead of sampling a depth value from a point in a depth map, the depth value is reconstructed exactly from the geometric surface. Moreover, whereas most shadow map techniques require a constant depth-offset (bias) value, the RGSM algorithm can use a much smaller depth offset and still produce correct images, free of false "self-shadowing" and false "non-shadowing". By reducing both perspective aliasing and projective aliasing, the method produces accurate shadow edges.

Description

Reconstructable geometry shadow map method
Technical field
The invention relates to graphics processing, and particularly to shadow rendering.
Background
In computer graphics, shadow mapping and shadow volumes are two commonly used real-time shadow techniques. Shadow volumes, a technique proposed by Frank Crow in 1977, uses geometric methods to compute the light-occluded regions of three-dimensional (3-D) objects. The algorithm uses a stencil buffer to determine whether a given test pixel lies in shadow. The main advantage of shadow volumes is that they are pixel-accurate, whereas the accuracy of shadow mapping depends on the texture memory size and on how the shadow is projected. However, shadow volumes require a large amount of hardware fill time, so they tend to run slower than shadow mapping, especially for large, complex geometric scenes.
Shadow mapping is a technique for adding shadows to 3-D computer images, proposed by Lance Williams in 1978. The algorithm is widely used both in pre-rendered scenes and in real-time applications. The depths of the occluder and of the test pixel are compared from the light source's point of view; that is, the test determines whether a given test pixel is visible to the light source, in order to establish whether the pixel is shadowed. Shadow mapping is a simple and effective image-space method, and it is one of the shadow-rendering techniques most often applied where high speed is required. However, it suffers from aliasing errors and depth-bias issues, and overcoming these two shortcomings is an active research topic in shadow rendering.
Aliasing errors in shadow maps fall into two classes: perspective aliasing errors and projective aliasing errors. Perspective aliasing occurs when the shadow edge is magnified. Projective aliasing occurs when light is nearly parallel to the geometric surface and stretches beyond the depth range. Another problem with most shadow maps is the depth-bias problem. To avoid false "self-shadowing", Williams introduced a constant depth-offset technique, which adds an offset to the sampled depth before comparing it with the true surface. Unfortunately, too large an offset can cause false "non-shadowing" (the shadow appears to float above the light-receiving object) and push the shadow back too far. In practice, it is very difficult to choose the offset directly, and no single value is acceptable for every scene.
Summary of the invention
The invention provides a reconstructable geometry shadow map method that reduces the two classes of aliasing errors, "perspective aliasing" and "projective aliasing", and solves the false "self-shadowing" and false "non-shadowing" problems caused by depth bias.
The invention proposes a reconstructable geometry shadow map method. First, with the light source as the viewpoint, the geometric information of a plurality of front-facing occluding geometric primitives of an object is stored. A consistency test is performed on a test pixel to find, among the occluding primitives, the one that corresponds to the test pixel; this occluding primitive contains an occluding point that overlaps the test pixel when viewed from the light source. Using the geometric information of the occluding primitive and the position of the test pixel, the depth value of the occluding point is reconstructed. Finally, the depth value of the occluding point is compared with the depth value of the test pixel to complete the shadow determination for the test pixel.
In one embodiment of the invention, the geometric information may comprise the vertex coordinates or a primitive index of the geometric primitives. The consistency test may comprise the following steps. First, one of the primitives is selected, and its geometric information is read; the geometric information contains the primitive's vertex coordinates (v0.x, v0.y, v0.z), (v1.x, v1.y, v1.z) and (v2.x, v2.y, v2.z). Next, the equation [p.x p.y 1] = [w1 w2 w3] * M, where M = [[v0.x, v0.y, 1], [v1.x, v1.y, 1], [v2.x, v2.y, 1]], is solved for the barycentric coordinates (w1, w2, w3) of the occluding point, (p.x, p.y, p.z) being the coordinates of the test pixel. Whether the selected primitive is consistent is judged from the barycentric coordinates (w1, w2, w3) of the occluding point. If the selected primitive tests consistent, it is the occluding primitive.
In one embodiment of the invention, reconstructing the depth value of the occluding point comprises evaluating the equation T.z = [w1 w2 w3] * [v0.z v1.z v2.z]^T to obtain the depth value T.z of the occluding point.
In one embodiment of the invention, reconstructing the depth value of the occluding point comprises evaluating the equation T.z = [w1 w2 w3] * M * M^(-1) * [v0.z v1.z v2.z]^T = [p.x p.y 1] * M^(-1) * [v0.z v1.z v2.z]^T, where M = [[v0.x, v0.y, 1], [v1.x, v1.y, 1], [v2.x, v2.y, 1]], to obtain the depth value T.z of the occluding point.
In one embodiment of the invention, reconstructing the depth value of the occluding point comprises evaluating the equation T.z = v0.z + (∂z/∂x)(p.x - v0.x) + (∂z/∂y)(p.y - v0.y) = v0.z + [p.x - v0.x  p.y - v0.y] * A^(-1) * [v1.z - v0.z  v2.z - v0.z]^T, where A = [[v1.x - v0.x, v1.y - v0.y], [v2.x - v0.x, v2.y - v0.y]], to obtain the depth value T.z of the occluding point.
Because the geometric information of a plurality of front-facing primitives of the object is stored with the light source as the viewpoint, the invention can reconstruct the depth value of the occluding point from the position of the test pixel and the stored geometric information. Once the depth value of the occluding point is obtained, it can be compared with the depth value of the test pixel to complete the shadow determination for that test pixel.
By reducing perspective aliasing and projective aliasing, the reconstructable geometry shadow map method of the invention produces accurate shadow edges.
Brief description of the drawings
Fig. 1 is a flowchart of a reconstructable geometry shadow map method according to an embodiment of the invention.
Fig. 2 illustrates the spatial relationship among the shadow map, an object surface (in part) and a test pixel according to an embodiment of the invention.
Fig. 3A illustrates two adjacent triangles TR0 and TR1.
Fig. 3B illustrates the rasterized regions AR0 and AR1 of triangles TR0 and TR1 of Fig. 3A.
Fig. 3C illustrates example patterns of two sampling kernels according to the invention.
Fig. 4A illustrates the projective aliasing errors produced by a standard shadow map.
Fig. 4B illustrates the projective aliasing result produced by a reconstructable geometry shadow map according to an embodiment of the invention.
Fig. 5A illustrates a test scene produced by a standard shadow map with the constant depth-offset technique (depth offset 1e-3).
Fig. 5B illustrates a test scene produced by a standard shadow map with the constant depth-offset technique (depth offset 1e-6).
Fig. 5C illustrates the depth-offset test scene produced by a reconstructable geometry shadow map according to an embodiment of the invention (depth offset 1e-6).
Embodiment
To make the above features and advantages of the invention more comprehensible, preferred embodiments are described in detail below with reference to the accompanying drawings.
Those skilled in the art can implement the invention with reference to the following embodiments. The following embodiments may also be realized as a computer program stored on a computer-readable storage medium, which a computer executes to carry out the reconstructable geometry shadow map method.
Fig. 1 is a flowchart of a reconstructable geometry shadow map method according to an embodiment of the invention. The embodiment can handle multiple light sources; for simplicity and clarity, the method is illustrated below with a single light source. In a computer-drawn image, an object surface can be composed of a plurality of geometric primitives (for example triangles, or other shapes); this embodiment assumes the object surface is composed of a plurality of triangles. Those of ordinary skill in the art can render the object surface with any technique.
Fig. 2 illustrates the spatial relationship among the shadow map, an object surface (in part) and a test pixel according to an embodiment of the invention. The scene can be drawn from the light's point of view. For a point light source, this view can use a perspective projection; for a directional light, an orthographic projection can be used. As shown in Fig. 2, the occluding surface comprises triangles TR0, TR1, TR2 and TR3. From this rendering, the information of each occluding triangle TR0~TR3 can be captured and stored in a geometry shadow map. That is, with the light source as the viewpoint, the geometric information of a plurality of front-facing primitives of an object is stored (step S110). In this embodiment, the geometric information may comprise the vertex coordinates of each primitive, for example the vertex coordinates of occluding triangles TR0~TR3, or a primitive index of each primitive. In the light's canonical view volume and in light view space, the linearity of the triangles allows these occluding triangles to be reconstructed, for a point light source and a directional light alike.
Next, in step S120, a consistency test is performed on the test pixel to find an occluding primitive among all the primitives. The occluding primitive contains the occluding point (in the geometry shadow map taken from the light's viewpoint, the test pixel and the occluding point overlap). Step S120 can draw the scene from the camera viewpoint by applying the geometry shadow map. This processing has three main parts. For each test pixel of the object (for example test pixel P in Fig. 2), the pixel's coordinates (p.x, p.y, p.z) as seen from the light source are found first. The (p.x, p.y) values correspond to a position in the geometry map texture and are used in triangle consistency tests to find the occluding triangle.
Suppose step S120 finds that the occluding triangle of test pixel P is TR0. Next, in step S130, the geometric information of the occluding primitive and the position of the test pixel are used to reconstruct the depth value of the occluding point; that is, the geometric information stored in step S110 is used to reconstruct the occluding-point depth of pixel P (for example the depth value of occluding point Pd in Fig. 2).
Next, in step S140, the depth value of occluding point Pd is compared with the depth value of test pixel P to complete the shadow determination for P. The reconstructed depth on occluding triangle TR0 is compared against the z value of test pixel P (a depth taken in the light's canonical view volume) to decide whether P is shadowed. Finally, the tested pixel is drawn either in shadow or in light. If there are multiple light sources, a separate geometry shadow map is used for each light source.
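The four steps S110~S140 can be tied together in a short sketch. The following Python sketch is illustrative only, under assumed light-space coordinates; the function and variable names (det3, barycentric, rgsm_shadow_test, the sample triangle) are not from the patent:

```python
# Illustrative sketch of RGSM steps S110-S140 for one test pixel.
# All coordinates are assumed to be in the light's canonical view volume,
# with larger z meaning farther from the light.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def barycentric(p, v0, v1, v2):
    """Solve equation 1, (p.x, p.y, 1) = (w1, w2, w3) * M, by Cramer's rule."""
    a = [[v0[0], v1[0], v2[0]],   # transposed system: M^T * w = (p.x, p.y, 1)^T
         [v0[1], v1[1], v2[1]],
         [1.0,   1.0,   1.0]]
    b = [p[0], p[1], 1.0]
    d = det3(a)
    w = []
    for col in range(3):
        ai = [row[:] for row in a]
        for r in range(3):
            ai[r][col] = b[r]
        w.append(det3(ai) / d)
    return w

def rgsm_shadow_test(p, occluders, bias=1e-6, eps=1e-9):
    """p: (x, y, z) of the test pixel; occluders: stored (v0, v1, v2)
    triangles (step S110).  Returns True if p is in shadow."""
    nearest = None
    for v0, v1, v2 in occluders:
        w = barycentric(p, v0, v1, v2)
        if all(-eps <= wi <= 1.0 + eps for wi in w):       # S120: consistency
            t_z = w[0]*v0[2] + w[1]*v1[2] + w[2]*v2[2]     # S130: equation 2
            nearest = t_z if nearest is None else min(nearest, t_z)
    return nearest is not None and p[2] > nearest + bias   # S140: compare

# A triangle at depth 0.5 shadows a pixel behind it:
tri = ((0.0, 0.0, 0.5), (1.0, 0.0, 0.5), (0.0, 1.0, 0.5))
print(rgsm_shadow_test((0.25, 0.25, 0.8), [tri]))  # True: pixel behind occluder
print(rgsm_shadow_test((0.25, 0.25, 0.3), [tri]))  # False: pixel in front
```

A real implementation would read the stored triangle from the geometry shadow map texel under the pixel rather than loop over all occluders as this sketch does.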
Those of ordinary skill in the art can implement the present embodiment from the above description.
Detailed example implementations of each step of Fig. 1 are described below; the invention is not, however, limited to these implementations. Fig. 2 depicts the scene in the light's canonical view volume, converted from the light's view space for a point light source or a directional light. Suppose the scene in the light's canonical view volume consists of four adjacent triangles TR0, TR1, TR2 and TR3.
First (step S110), triangles TR0~TR3 are projected and rasterized into their corresponding regions AR0, AR1, AR2 and AR3 of the geometry shadow map. Each texel in each region AR0~AR3 contains the geometric information (vertex coordinates, in this embodiment) of its corresponding triangle; for example, a texel in region AR0 contains the vertex coordinates (v0.x, v0.y, v0.z), (v1.x, v1.y, v1.z) and (v2.x, v2.y, v2.z) of triangle TR0. Except that geometric information is stored in the shadow map (the prior art stores a depth value instead), the operation of step S110 is almost identical to a standard shadow map. For a point light source, the scene is transformed into the light's canonical view volume, and the triangle's three vertex coordinates are stored in the rasterized region of its shadow map. Alternatively, vertex coordinates can also be taken from adjacent triangles; in Fig. 2, for example, the six vertex coordinates of triangles TR1, TR2 and TR3 adjacent to TR0 are all stored in the rasterized region of triangle TR0. For a directional light, vertex coordinates in the canonical view volume space of a specified "in-process" light viewpoint are stored, with the light rays in that space parallel to the z axis.
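Step S110 can be pictured as a tiny software rasterizer that stores triangle vertices, not depths, per texel. This is a hypothetical sketch under assumed [0,1]^2 light-space coordinates; the names and the crude centroid-based depth ordering are illustrative, not the patent's method:

```python
# Hypothetical sketch of step S110: rasterize triangles into a "geometry
# shadow map" whose texels store vertex coordinates rather than a depth.
# Grid resolution, the [0,1]^2 texel domain, and all names are illustrative.

def point_in_triangle(x, y, v0, v1, v2):
    """2-D barycentric point-in-triangle test (x/y components only)."""
    d = (v1[0]-v0[0])*(v2[1]-v0[1]) - (v2[0]-v0[0])*(v1[1]-v0[1])
    if d == 0.0:
        return False  # degenerate triangle
    w1 = ((x-v0[0])*(v2[1]-v0[1]) - (v2[0]-v0[0])*(y-v0[1])) / d
    w2 = ((v1[0]-v0[0])*(y-v0[1]) - (x-v0[0])*(v1[1]-v0[1])) / d
    return w1 >= 0.0 and w2 >= 0.0 and w1 + w2 <= 1.0

def build_geometry_shadow_map(triangles, res):
    """Return {(ix, iy): (v0, v1, v2)}: per texel, the nearest covering triangle."""
    gmap = {}
    for ix in range(res):
        for iy in range(res):
            x, y = (ix + 0.5) / res, (iy + 0.5) / res  # texel center
            best = None
            for v0, v1, v2 in triangles:
                if point_in_triangle(x, y, v0, v1, v2):
                    z = (v0[2] + v1[2] + v2[2]) / 3.0  # crude front-most pick
                    if best is None or z < best[0]:
                        best = (z, (v0, v1, v2))
            if best is not None:
                gmap[(ix, iy)] = best[1]
    return gmap

tri = ((0.0, 0.0, 0.5), (1.0, 0.0, 0.5), (0.0, 1.0, 0.5))
gmap = build_geometry_shadow_map([tri], 4)
print((0, 0) in gmap)  # True: texel center (0.125, 0.125) lies inside tri
print((3, 3) in gmap)  # False: texel center (0.875, 0.875) lies outside
```

A GPU implementation would rasterize each triangle once and write its vertex coordinates into the covered texels, rather than testing every triangle at every texel as this sketch does.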
Next, the visible pixel P in eye space is converted to coordinates (p.x, p.y, p.z) in the light's canonical view volume. The consistency test of step S120 may comprise selecting one of the primitives (for example one of triangles TR0~TR3) and reading its geometric information (for example, if triangle TR0 is selected, the geometric information of region AR0 is read from the geometry shadow map). This geometric information may contain the vertex coordinates of the primitive, for example the vertex coordinates (v0.x, v0.y, v0.z), (v1.x, v1.y, v1.z) and (v2.x, v2.y, v2.z) of triangle TR0. With the two-dimensional (2-D) coordinates (p.x, p.y), the corresponding sampled point T can be found in the geometry shadow map. Step S120 may comprise evaluating equation 1:
[p.x p.y 1] = [w1 w2 w3] * M,  where M = [[v0.x, v0.y, 1], [v1.x, v1.y, 1], [v2.x, v2.y, 1]]    (equation 1)
to obtain the three-dimensional (3-D) barycentric coordinates (w1, w2, w3) of occluding point Pd with respect to the vertices of triangle TR0. Whether the selected primitive (triangle TR0) is consistent is then judged from the barycentric coordinates (w1, w2, w3) of Pd.
For each visible pixel P, the occluding triangle TR0 must be correctly located so that the depth value of the occluding point Pd can then be reconstructed from the geometric information stored in the geometry shadow map. This processing is the so-called triangle "consistency test". Sampling the texture maps at the test pixel coordinates (x, y) does not necessarily return the information of the triangle TR0 that actually occludes test pixel P. If the three barycentric coordinates (w1, w2, w3) obtained from equation 1 all lie in the range [0, 1] (meaning the triangle occludes the test pixel), the triangle test is said to be consistent; otherwise the test is inconsistent. If the selected primitive tests consistent, that primitive (triangle TR0) is the occluding primitive of test pixel P.
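As a concrete illustration of the consistency test, equation 1 can be solved in closed form: the last column of M forces w1 + w2 + w3 = 1, leaving only a 2x2 system for w2 and w3. A minimal Python sketch, with made-up triangle and pixel coordinates:

```python
# Sketch of the triangle consistency test of step S120 (equation 1).
# Example coordinates only; not the patent's implementation.

def consistency_test(p, v0, v1, v2, eps=1e-9):
    """Return (consistent, (w1, w2, w3)) for test pixel p against one triangle.

    Equation 1: (p.x, p.y, 1) = (w1, w2, w3) * M, with M rows (vi.x, vi.y, 1).
    The last column gives w1 + w2 + w3 = 1, so only a 2x2 system remains."""
    ax, ay = v0[0], v0[1]
    d = (v1[0] - ax) * (v2[1] - ay) - (v2[0] - ax) * (v1[1] - ay)  # det(M)
    w2 = ((p[0] - ax) * (v2[1] - ay) - (v2[0] - ax) * (p[1] - ay)) / d
    w3 = ((v1[0] - ax) * (p[1] - ay) - (p[0] - ax) * (v1[1] - ay)) / d
    w1 = 1.0 - w2 - w3
    w = (w1, w2, w3)
    return all(-eps <= wi <= 1.0 + eps for wi in w), w

v0, v1, v2 = (0.0, 0.0, 0.2), (1.0, 0.0, 0.4), (0.0, 1.0, 0.6)
print(consistency_test((0.25, 0.25, 0.9), v0, v1, v2)[0])  # True: P is occluded
print(consistency_test((0.90, 0.90, 0.9), v0, v1, v2)[0])  # False: outside
```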
Because of the limited resolution of the shadow map, a triangle test may come out inconsistent; the lower the resolution of the texture map, the more likely this is. Fig. 3A illustrates two adjacent triangles TR0 and TR1 under limited resolution. Fig. 3B illustrates the rasterized regions AR0 and AR1 of triangles TR0 and TR1 of Fig. 3A: under limited resolution, region AR0 is the rasterized region of TR0 and region AR1 that of TR1. Point T is the sampled point, with the same (x, y) coordinates as the tested visible pixel P. Through sampled point T, however, the embodiment accesses texel A, which holds the geometric information of triangle TR0. As shown in Fig. 3B, sampled point T should lie in the rasterized region of triangle TR1, yet because of the limited resolution the information of TR0 may lead to an erroneous depth reconstruction (wrongly treating sampled point T as an occluding point of TR0). Sampled point T' in Fig. 3B has a similar problem.
With the geometric information of adjacent triangles, the occluding triangle of test pixel P can be found by sampling the points around T. However, when two adjacent regions have been rasterized separately, the geometric information of an adjacent triangle cannot be used directly. To solve this problem, the embodiment adds sampled points so as to cover the geometric information of more triangles, thereby increasing the chance of finding a consistent triangle test. Fig. 3C illustrates example patterns of two sampling kernels T and T' according to the invention. If the test pixel P is occluded by multiple layers of geometry, the kernel can also sort the depth results of all consistent triangle tests and take the minimum as the final depth value of the occluding point. Taking the kernel pattern of T as an example: besides accessing the texel holding region AR0's information to compute sampled point T, the texel holding region AR0's information is also accessed to compute the depth of sampled point T2, the texel holding region AR2's information to compute the depth of sampled point T1, and the texels holding region AR1's information to compute the depths of sampled points T3 and T4. The depth results of all consistent triangle tests (the depth values of T, T1, T2, T3 and T4) are then sorted, and the minimum is taken as the final depth value of occluding point Pd.
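The kernel idea can be sketched as follows. The cross-shaped offset pattern, the texel lookup, and all names are illustrative assumptions, not the exact Fig. 3C patterns:

```python
# Sketch of the kernel-sampling idea: gather the triangles stored at several
# texels around the sampled point, keep those whose consistency test passes,
# and take the minimum reconstructed depth as the occluding-point depth.

def tri_weights(p, v0, v1, v2):
    """Barycentric weights of p (x/y only) for one triangle (equation 1)."""
    d = (v1[0]-v0[0])*(v2[1]-v0[1]) - (v2[0]-v0[0])*(v1[1]-v0[1])
    w2 = ((p[0]-v0[0])*(v2[1]-v0[1]) - (v2[0]-v0[0])*(p[1]-v0[1])) / d
    w3 = ((v1[0]-v0[0])*(p[1]-v0[1]) - (p[0]-v0[0])*(v1[1]-v0[1])) / d
    return (1.0 - w2 - w3, w2, w3)

def kernel_depth(p, gmap, res, eps=1e-9):
    """gmap: {(ix, iy): (v0, v1, v2)} geometry shadow map; p in [0,1]^2."""
    ix, iy = int(p[0] * res), int(p[1] * res)
    depth = None
    for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):  # cross kernel
        tri = gmap.get((ix + dx, iy + dy))
        if tri is None:
            continue
        w = tri_weights(p, *tri)
        if all(-eps <= wi <= 1.0 + eps for wi in w):          # consistent?
            t_z = sum(wi * v[2] for wi, v in zip(w, tri))     # equation 2
            depth = t_z if depth is None else min(depth, t_z)
    return depth  # None means no consistent occluder was found

# Two texels hold two different triangles of a plane at z = 0.5:
t0 = ((0.0, 0.0, 0.5), (1.0, 0.0, 0.5), (0.0, 1.0, 0.5))
t1 = ((1.0, 0.0, 0.5), (1.0, 1.0, 0.5), (0.0, 1.0, 0.5))
gmap = {(0, 0): t0, (1, 0): t1}
print(kernel_depth((0.9, 0.45), gmap, 2))  # 0.5: t1 is found via the kernel
```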
Selecting an appropriate kernel pattern is important for accuracy. A larger kernel usually gives higher accuracy than a smaller one, but a kernel containing many sampled points may hurt performance. The special kernel patterns shown in Fig. 3C achieve comparable accuracy with fewer samples. The number of samples can be reduced further by capping the total number of triangle consistency tests for a given test pixel.
For a tested pixel P, when the texture resolution is subcritical (meaning some occluding triangles cannot be stored in the shadow map), the corresponding triangle tests are necessarily inconsistent. In that case, the triangle tests are ordered by their weighted distance to the central triangle, and the triangle information with the "closest-distance" weighting is used for the reconstruction. A reasonable assumption is that the occluding point being reconstructed lies on the same plane as the "closest-distance" triangle, and the weighted distance can be computed as a Euclidean distance.
Once the correct triangle information has been obtained, the occluding-point depth of the tested pixel can be reconstructed. Through triangle interpolation, the depth value of occluding point Pd within occluding triangle TR0 is reconstructed. After the weights are computed from equation 1, the depth value T.z of occluding point Pd can be reconstructed in step S130 with the following formula:
T.z = [w1 w2 w3] * [v0.z v1.z v2.z]^T    (equation 2)
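Since z varies affinely over a triangle's plane, equation 2 reconstructs the plane's depth at the sampled point exactly. A small Python check with an arbitrary example plane (the plane coefficients and sample point are made up):

```python
# Sketch of equation 2: with the barycentric weights from equation 1, the
# occluding depth T.z is the weighted sum of the vertex depths.  For vertices
# on a plane z = c + a*x + b*y, interpolation reproduces the plane exactly.

def plane_z(x, y):
    return 0.1 + 0.2 * x + 0.3 * y        # arbitrary example plane

# Three non-collinear vertices lifted onto the plane:
v0 = (0.0, 0.0, plane_z(0.0, 0.0))
v1 = (1.0, 0.0, plane_z(1.0, 0.0))
v2 = (0.0, 1.0, plane_z(0.0, 1.0))

# Barycentric weights of p = (0.3, 0.4) for this triangle (equation 1):
px, py = 0.3, 0.4
w2 = px              # for this right triangle, the weights reduce to p.x, p.y
w3 = py
w1 = 1.0 - w2 - w3

t_z = w1 * v0[2] + w2 * v1[2] + w3 * v2[2]   # equation 2
print(abs(t_z - plane_z(px, py)) < 1e-12)    # True: exact reconstruction
```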
Alternatively, combining equation 1 and equation 2 yields equation 3:
T.z = [w1 w2 w3] * M * M^(-1) * [v0.z v1.z v2.z]^T
    = [p.x p.y 1] * M^(-1) * [v0.z v1.z v2.z]^T    (equation 3)
where M = [[v0.x, v0.y, 1], [v1.x, v1.y, 1], [v2.x, v2.y, 1]] as in equation 1.
In step S130, the depth value T.z of occluding point Pd can be reconstructed using equation 3. Equation 3, however, requires the inverse of a 3x3 matrix, which current graphics processing unit (GPU) hardware does not support directly; the inversion must therefore be decomposed into ordinary arithmetic and logic unit (ALU) instructions. Such ALU instruction sequences cannot guarantee the accuracy of the inversion result and may introduce correlated errors that affect the final reconstructed depth value.
To mitigate this problem, the embodiment rewrites equation 3 into the following equivalent form:
T.z = v0.z + (∂z/∂x)(p.x - v0.x) + (∂z/∂y)(p.y - v0.y)
    = v0.z + [p.x - v0.x  p.y - v0.y] * A^(-1) * [v1.z - v0.z  v2.z - v0.z]^T    (equation 4)
where A = [[v1.x - v0.x, v1.y - v0.y], [v2.x - v0.x, v2.y - v0.y]].
Thus, in step S130, the depth value T.z of occluding point Pd can also be reconstructed using equation 4, which requires only a 2x2 matrix inverse.
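The equivalence of equations 3 and 4 can be checked numerically: the first route solves equation 1 for the weights and applies equation 2 (which is what equation 3 amounts to), while the second inverts only the 2x2 edge matrix of equation 4. A sketch with arbitrary example values:

```python
# Sketch comparing equation 3 (3x3 route: weights from equation 1, then the
# weighted sum of vertex depths) with equation 4 (2x2 route: plane gradient).
# The triangle and pixel below are arbitrary example values.

v0 = (0.1, 0.2, 0.5)
v1 = (0.9, 0.1, 0.3)
v2 = (0.4, 0.8, 0.7)
px, py = 0.45, 0.35

# Equation 3 route: solve equation 1 for (w1, w2, w3), then dot with depths.
d3 = (v1[0]-v0[0])*(v2[1]-v0[1]) - (v2[0]-v0[0])*(v1[1]-v0[1])  # det(M)
w2 = ((px-v0[0])*(v2[1]-v0[1]) - (v2[0]-v0[0])*(py-v0[1])) / d3
w3 = ((v1[0]-v0[0])*(py-v0[1]) - (px-v0[0])*(v1[1]-v0[1])) / d3
w1 = 1.0 - w2 - w3
tz_eq3 = w1*v0[2] + w2*v1[2] + w3*v2[2]

# Equation 4 route: invert the 2x2 edge matrix A to get (dz/dx, dz/dy).
a, b = v1[0]-v0[0], v1[1]-v0[1]
c, e = v2[0]-v0[0], v2[1]-v0[1]
det = a*e - b*c
dz1, dz2 = v1[2]-v0[2], v2[2]-v0[2]
dzdx = ( e*dz1 - b*dz2) / det
dzdy = (-c*dz1 + a*dz2) / det
tz_eq4 = v0[2] + dzdx*(px-v0[0]) + dzdy*(py-v0[1])

print(abs(tz_eq3 - tz_eq4) < 1e-12)  # True: the two routes agree
```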
Finally, comparing the canonical-volume depth values of occluding point Pd and pixel P in the light's canonical view volume, that is, comparing T.z with p.z, completes the shadow determination for pixel P (step S140).
Fig. 4A illustrates the projective aliasing errors produced by a standard shadow map. The scene of Fig. 4A is a rectangular plate suspended above a ground plane, so the plate casts a band-shaped shadow on the plane. The lower-left corner of Fig. 4A shows a magnified portion of that shadow, in which the projective aliasing produced by the traditional standard shadow map is clearly visible. By contrast, Fig. 4B illustrates the projective aliasing result produced according to an embodiment of the invention; that is, Fig. 4B uses the new algorithm introduced by the embodiments above, the Reconstructable Geometry Shadow Map (RGSM), as the solution to the aliasing problem. The scene of Fig. 4B is identical to that of Fig. 4A, and Fig. 4B shows clearly that the projective aliasing produced by the RGSM algorithm of the embodiment is significantly reduced.
Another problem with most shadow maps is the depth-bias problem. The depth-offset test scenes of Figs. 5A, 5B and 5C are identical: a house and a railing. Fig. 5A illustrates the test scene produced by a standard shadow map with the constant depth-offset technique (depth offset 1e-3) to avoid false self-shadowing; that is, the depth offset is added to the sampled depth before it is compared with the true surface. Because the depth offset of Fig. 5A is too large, it causes false "non-shadowing" (the shadow appears to float above the light-receiving object) and pushes the shadow back too far. In practice it is very difficult to choose the offset directly, and no single value is acceptable for every scene. For example, Fig. 5B illustrates the test scene produced by a standard shadow map with the constant depth-offset technique using a small depth offset (1e-6) to reduce false "non-shadowing"; although the "non-shadowing" is improved, false "self-shadowing" appears instead (as shown in Fig. 5B). Fig. 5C illustrates the depth-offset test scene produced according to an embodiment of the invention; that is, Fig. 5C uses the RGSM algorithm introduced above as the solution to the depth-bias problem. The depth offset of Fig. 5C is the same as in Fig. 5B, namely 1e-6. Fig. 5C shows clearly that the RGSM algorithm of the embodiment can use a very small depth offset without producing false "self-shadowing".
In summary, the present embodiment guarantees pixel-wise depth accuracy and has the following advantages:
1. By reducing perspective aliasing and projective aliasing, it produces accurate shadow edges; it also removes the "jittering" of shadow edges in dynamic scenes.
2. Compared with other shadow maps, the embodiment can use a very small depth-offset value. With a single fixed offset, a programmer using RGSM can meet the needs of most applications and produce correct images, avoiding both false "self-shadowing" and false "non-shadowing".
3. While delivering the same output shadow quality at high execution speed, it uses only a modest amount of memory relative to the standard shadow map.
The above are merely preferred embodiments of the invention and are not intended to limit its scope. Anyone familiar with the art may make further improvements and variations without departing from the spirit and scope of the invention; the protection scope of the invention is therefore defined by the appended claims.
Brief description of the reference symbols in the drawings:
A, B: texels
AR0, AR1, AR2, AR3: corresponding regions in the geometry shadow map
P: test pixel
Pd: occluding point
S110~S140: steps of the reconstructable geometry shadow map method according to an embodiment of the invention
TR0, TR1, TR2, TR3: triangles of the occluding surface
T, T': sampled points.

Claims (10)

1. A reconstructable geometry shadow map method, characterized by comprising:
storing, with a light source as the viewpoint, geometric information of a plurality of occluding geometric primitives of a front surface of an object;
performing a consistency test on a test pixel to find, among the plurality of occluding primitives, an occluding primitive that corresponds to the test pixel;
reconstructing a depth value of an occluding point; and
completing a shadow determination for the test pixel.
2. The reconstructable geometry shadow map method of claim 1, characterized in that the geometric primitives comprise triangles.
3. The reconstructable geometry shadow map method of claim 1, characterized in that the geometric information comprises vertex coordinates or a geometry index of the geometric primitives.
4. The reconstructable geometry shadow map method of claim 1, characterized in that the coordinates of the test pixel are (p.x, p.y, p.z) and the consistency test comprises:
selecting one of the geometric primitives;
reading the geometric information of the selected primitive, the geometric information containing the vertex coordinates (v0.x, v0.y, v0.z), (v1.x, v1.y, v1.z) and (v2.x, v2.y, v2.z) of the primitive;
evaluating the equation [p.x p.y 1] = [w1 w2 w3] * M, where M = [[v0.x, v0.y, 1], [v1.x, v1.y, 1], [v2.x, v2.y, 1]], to obtain the barycentric coordinates (w1, w2, w3) of the occluding point;
judging from the barycentric coordinates (w1, w2, w3) of the occluding point whether the selected primitive is consistent; and
if the selected primitive tests consistent, taking that primitive as the occluding primitive.
5. The Recreatable geometric shade pattern method according to claim 4, characterized in that reconstructing the depth value of the occluding point comprises:
calculating the equation T.z = w1·v0.z + w2·v1.z + w3·v2.z, to obtain the depth value T.z of the occluding point.
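As a sketch (my naming, not the patent's), claim 5's depth reconstruction is a single dot product of the barycentric weights with the three vertex depths:

```python
def reconstruct_depth(w, z0, z1, z2):
    # T.z = w1*v0.z + w2*v1.z + w3*v2.z
    return w[0] * z0 + w[1] * z1 + w[2] * z2
```

With weights (0.5, 0.25, 0.25) and vertex depths (1.0, 2.0, 3.0), the reconstructed depth is 1.75.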
6. The Recreatable geometric shade pattern method according to claim 4, characterized in that reconstructing the depth value of the occluding point comprises:
calculating the equation T.z = [w1 w2 w3] · M · M⁻¹ · [v0.z v1.z v2.z]ᵀ = [p.x p.y 1] · M⁻¹ · [v0.z v1.z v2.z]ᵀ, where M is the 3×3 matrix whose rows are (v0.x, v0.y, 1), (v1.x, v1.y, 1) and (v2.x, v2.y, 1), to obtain the depth value T.z of the occluding point.
7. The Recreatable geometric shade pattern method according to claim 4, characterized in that reconstructing the depth value of the occluding point comprises:
calculating the equation T.z = v0.z + (∂z/∂x)(p.x − v0.x) + (∂z/∂y)(p.y − v0.y) = v0.z + [p.x − v0.x  p.y − v0.y] · E⁻¹ · [v1.z − v0.z  v2.z − v0.z]ᵀ, where E is the 2×2 matrix whose rows are (v1.x − v0.x, v1.y − v0.y) and (v2.x − v0.x, v2.y − v0.y), to obtain the depth value T.z of the occluding point.
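Claim 7 reaches the same T.z by evaluating the triangle's plane equation directly: the depth gradient (∂z/∂x, ∂z/∂y) is recovered from the two edge vectors, then applied to the pixel's offset from v0. A hedged sketch under that reading, with illustrative names:

```python
import numpy as np

def reconstruct_depth_plane(p, v0, v1, v2):
    # Edge matrix E: rows are the (x, y) edge vectors from v0.
    E = np.array([[v1[0] - v0[0], v1[1] - v0[1]],
                  [v2[0] - v0[0], v2[1] - v0[1]]])
    dz = np.array([v1[2] - v0[2], v2[2] - v0[2]])
    # The gradient g = (dz/dx, dz/dy) satisfies E @ g = dz, since the plane
    # must pass through all three vertices.
    g = np.linalg.solve(E, dz)
    # T.z = v0.z + (dz/dx)*(p.x - v0.x) + (dz/dy)*(p.y - v0.y)
    return v0[2] + g[0] * (p[0] - v0[0]) + g[1] * (p[1] - v0[1])
```

For the plane z = 1 + x + 2y through the vertices (0,0,1), (1,0,2) and (0,1,3), the point (0.25, 0.25) reconstructs to 1.75, matching the barycentric form of claim 5.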
8. The Recreatable geometric shade pattern method according to claim 1, characterized in that the occluding geometric primitive contains the occluding point; with the light source as the viewpoint, the test pixel and the occluding point overlap.
9. The Recreatable geometric shade pattern method according to claim 1, characterized in that reconstructing the depth value of the occluding point uses the geometric information of the occluding geometric primitive and the positional information of the test pixel.
10. The Recreatable geometric shade pattern method according to claim 1, characterized in that the shadow decision for the test pixel is completed by comparing the depth value of the occluding point with the depth value of the test pixel.
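Claim 10's shadow decision reduces to a depth comparison. A minimal sketch follows; the bias value is purely illustrative, since the claims do not specify one (the abstract only states that this method tolerates a smaller depth offset than conventional shadow maps):

```python
def in_shadow(pixel_z, occluder_z, bias=1e-4):
    # The pixel is shadowed when the reconstructed occluder depth is
    # closer to the light than the pixel's own depth, beyond a small bias.
    return pixel_z > occluder_z + bias
```

Because the occluder depth is reconstructed from the triangle surface rather than point-sampled from a depth map, the bias can be kept small without producing false self-shadowing or false non-shadowing.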
CN2008100961357A 2007-10-26 2008-05-06 Recreatable geometric shade pattern method Active CN101271588B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US98272707P 2007-10-26 2007-10-26
US60/982,727 2007-10-26

Publications (2)

Publication Number Publication Date
CN101271588A true CN101271588A (en) 2008-09-24
CN101271588B CN101271588B (en) 2012-01-11

Family

ID=40005539

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008100961357A Active CN101271588B (en) 2007-10-26 2008-05-06 Recreatable geometric shade pattern method

Country Status (2)

Country Link
CN (1) CN101271588B (en)
TW (1) TWI417808B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882324A (en) * 2010-05-19 2010-11-10 北京航空航天大学 Soft shadow real-time rendering method based on bidirectional half-shadow graph
CN104205173A (en) * 2012-03-29 2014-12-10 汤姆逊许可公司 Method for estimating the opacity level in a scene and corresponding device
CN104966313A (en) * 2015-06-12 2015-10-07 浙江大学 Geometric shadow map method for triangle reconstruction
US10074211B2 (en) 2013-02-12 2018-09-11 Thomson Licensing Method and device for establishing the frontier between objects of a scene in a depth map
CN109712211A (en) * 2018-12-21 2019-05-03 西安恒歌数码科技有限责任公司 Active isomer shadow generation method based on OSG

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9083960B2 (en) * 2013-01-30 2015-07-14 Qualcomm Incorporated Real-time 3D reconstruction with power efficient depth sensor usage
CN104966297B (en) * 2015-06-12 2017-09-12 浙江大学 A kind of method that general echo generates shade

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5870097A (en) * 1995-08-04 1999-02-09 Microsoft Corporation Method and system for improving shadowing in a graphics rendering system
US6208361B1 (en) * 1998-06-15 2001-03-27 Silicon Graphics, Inc. Method and system for efficient context switching in a computer graphics system
US6903741B2 (en) * 2001-12-13 2005-06-07 Crytek Gmbh Method, computer program product and system for rendering soft shadows in a frame representing a 3D-scene

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882324A (en) * 2010-05-19 2010-11-10 北京航空航天大学 Soft shadow real-time rendering method based on bidirectional half-shadow graph
CN101882324B (en) * 2010-05-19 2012-03-28 北京航空航天大学 Soft shadow real-time rendering method based on bidirectional half-shadow graph
CN104205173A (en) * 2012-03-29 2014-12-10 汤姆逊许可公司 Method for estimating the opacity level in a scene and corresponding device
CN104205173B (en) * 2012-03-29 2017-03-29 汤姆逊许可公司 For estimating the method and corresponding equipment of the opacity level in scene
US10074211B2 (en) 2013-02-12 2018-09-11 Thomson Licensing Method and device for establishing the frontier between objects of a scene in a depth map
CN104966313A (en) * 2015-06-12 2015-10-07 浙江大学 Geometric shadow map method for triangle reconstruction
CN104966313B (en) * 2015-06-12 2017-09-19 浙江大学 The geometrical shadow drawing method that triangle is rebuild
CN109712211A (en) * 2018-12-21 2019-05-03 西安恒歌数码科技有限责任公司 Active isomer shadow generation method based on OSG
CN109712211B (en) * 2018-12-21 2023-02-10 西安恒歌数码科技有限责任公司 Efficient body shadow generation method based on OSG

Also Published As

Publication number Publication date
TWI417808B (en) 2013-12-01
TW200919369A (en) 2009-05-01
CN101271588B (en) 2012-01-11

Similar Documents

Publication Publication Date Title
US7280121B2 (en) Image processing apparatus and method of same
JP3390463B2 (en) Shadow test method for 3D graphics
CN101271588B (en) Recreatable geometric shade pattern method
US6618047B1 (en) Visibility calculations for 3d computer graphics
CN104183005B (en) Graphics processing unit and rendering intent based on segment
US9965892B2 (en) Rendering tessellated geometry with motion and defocus blur
Zhang et al. Conservative voxelization
US7362332B2 (en) System and method of simulating motion blur efficiently
US10529117B2 (en) Systems and methods for rendering optical distortion effects
TWI517090B (en) Hierarchical motion blur rasterization
US20020190982A1 (en) 3D computer modelling apparatus
US10699467B2 (en) Computer-graphics based on hierarchical ray casting
KR100833842B1 (en) Method for processing pixel rasterization at 3-dimensions graphic engine and device for processing the same
CN107657655A (en) The method and system sorted out in the graphics system based on segment
EP1519317B1 (en) Depth-based antialiasing
KR20150034296A (en) Hit testing method and apparatus
US8605088B2 (en) Method for reconstructing geometry mapping
US7379599B1 (en) Model based object recognition method using a texture engine
Rauwendaal Hybrid computational voxelization using the graphics pipeline
Hoppe et al. Adaptive meshing and detail-reduction of 3D-point clouds from laser scans
Svensson Occlusion Culling on the GPU: Inner Conservative Occluder Rasterization
Boldt et al. Selfintersections with cullide
Borodavka et al. The Hardware Adapted Ray Tracing Algorithm
Ernst et al. Entkerner: A System for Removal of Globally Invisible Triangles from Large Meshes.
Forstmann et al. Visualizing run-length-encoded volume data on modern GPUs

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant