CN104103092A - Real-time dynamic shadowing realization method based on projector lamp - Google Patents


Publication number
CN104103092A
CN104103092A (application CN201410354846.5A)
Authority
CN
China
Prior art keywords
space
scene
camera
light
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410354846.5A
Other languages
Chinese (zh)
Inventor
张翼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Fantian Infotech Share Co., Ltd.
Original Assignee
Wuxi Fantian Infotech Share Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Fantian Infotech Share Co., Ltd.
Priority to CN201410354846.5A
Publication of CN104103092A
Legal status: Pending

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method for realizing real-time dynamic shadows based on a spotlight. The method mainly comprises the following steps: the list of visible objects in the scene is traversed, and only the objects that are fully or partially visible are rendered; the scene is rendered in the light space to obtain a depth map recording, for every pixel, the depth of the surface nearest to the light source; during the normal rendering pass, each pixel is transformed from camera space into light space and the depth value of its new position is compared against the depth map, the pixel being in shadow when its new depth value is greater than the stored depth; finally, the shadow is color-blended with the scene and projected onto the screen. The method overcomes the defects of the prior art, namely a narrow range of application, proneness to error, large memory consumption and slow computation, and thereby achieves wide applicability, robustness against error, a small memory footprint and fast computation.

Description

Method for realizing real-time dynamic shadows based on a spotlight
Technical field
The present invention relates to the technical field of image processing, and in particular to a method for realizing real-time dynamic shadows based on a spotlight.
Background art
With the development of computer graphics and the continuing spread of programmable hardware, the virtual three-dimensional worlds of games increasingly simulate the appearance of reality, and most of a game's effort goes into visual presentation. Shadows have received growing research attention in recent years: they add depth and solidity to the rendered image, and computing shadows in real time remains an advanced problem in the field of computer graphics.
Games currently rely on an efficient Z-buffer rendering algorithm: a new coordinate system is established centered on the light source, with its Z axis through that center; every object point in space is projected onto the XY plane of the new coordinate system and rounded to discrete positions, and the Z-buffer hidden-surface method is then used to judge which points are lit. This algorithm has a narrow range of application, is error-prone, occupies a large amount of memory, and is slow. These problems motivate generating the spotlight shadow effect in screen space.
In the course of making the present invention, the inventor found that the prior art suffers at least from a narrow range of application, proneness to error, large memory consumption, and slow computation.
Summary of the invention
The object of the present invention is, in view of the above problems, to propose a method for realizing real-time dynamic shadows based on a spotlight that is widely applicable, robust against error, light on memory, and fast.
To achieve this object, the technical solution adopted by the present invention is a method for realizing real-time dynamic shadows based on a spotlight, mainly comprising:
a. traversing the list of visible objects in the scene, and rendering only the objects that are fully or partially visible;
b. rendering the scene in the light space to obtain a depth map recording, for every pixel, the depth of the surface nearest to the light source;
c. during the normal rendering pass, transforming each pixel from camera space into light space and comparing the depth value of its new position against the depth map; when the new depth value is greater than the stored depth, the pixel is in shadow; the shadow is then color-blended with the scene and projected onto the screen.
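The two-pass structure of steps a-c can be illustrated with a deliberately tiny CPU sketch: pass 1 writes, per texel, the smallest depth among the points projecting there; pass 2 declares a query point shadowed when its own depth exceeds the stored value. This is our own illustrative code (an orthographic "light" on an integer grid), not the patent's GPU implementation, and all names are our own.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// A point already projected into the light's view: integer texel (x, y)
// plus its depth z along the light direction.
struct Pt { int x, y; float z; };

struct ShadowMap {
    int w, h;
    std::vector<float> depth;
    ShadowMap(int w_, int h_) : w(w_), h(h_), depth(w_ * h_, 1e9f) {}

    // Pass 1: keep, per texel, the depth nearest to the light source.
    void render(const std::vector<Pt>& occluders) {
        for (const Pt& p : occluders)
            depth[p.y * w + p.x] = std::min(depth[p.y * w + p.x], p.z);
    }

    // Pass 2: a point is shadowed when it lies farther from the light
    // than the nearest occluder recorded for its texel.
    bool inShadow(const Pt& p) const { return p.z > depth[p.y * w + p.x]; }
};
```

The real method performs the same comparison per screen pixel after transforming the pixel from camera space into the spotlight's projective space.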
Further, step b specifically comprises:
b1. placing the camera at the position of the light source to render the scene, thereby constructing a light space;
b2. in the light space, converting the world positions into the view-frustum projection space and normalizing them, obtaining the depth map in projection space;
b3. during the normal rendering pass, converting each screen-space pixel (U, V) into world coordinates and then into the light space, obtaining the depth value of the pixel's new position in that space.
Further, step b1 specifically comprises:
first transforming the model space into world coordinate space; this is realized in Direct3D, and the world-space matrix produced by the hardware coordinate transform is denoted WorldMatrix;
every camera in any scene is based on world space; when the camera is placed at the light source position to render the scene, a light-source space is first constructed by aligning the camera's orientation with the direction of the light source and taking the camera's viewing direction as the Z axis, so that the camera position and orientation determine a light space.
Further, the operation of determining a light space from the camera position and orientation is specifically:
realizing LightViewMatrix through the following DirectX interface:
D3DXMATRIX * D3DXMatrixLookAtLH(
__inout D3DXMATRIX *pOut,
__in const D3DXVECTOR3 *pEye,
__in const D3DXVECTOR3 *pAt,
__in const D3DXVECTOR3 *pUp
);
The above call generates the light-space matrix LightViewMatrix; this matrix only encodes the local position of the camera.
Further, step b2 specifically comprises:
based on step b1, the matrix that determines the exact extent seen by the camera is referred to in graphics as the projection matrix of the light space;
in the light space described above, a projection matrix LightProjMatrix is built from the preset corner points of the visible view frustum and their bounding box;
the light-space positions are transformed into the view-frustum projection space; the combined matrix is denoted ViewMatrix, computed as
ViewMatrix = LightViewMatrix * LightProjMatrix;
ViewMatrix is uploaded to the GPU and the projection coordinates of every point are computed; the Z value of the projection coordinate is the depth value (the shadow map), which records, for every pixel, the depth of the projection nearest to the light source;
these depth values are saved into a two-dimensional render target and then normalized, scaling the coordinates to X ∈ [-1, 1], Y ∈ [-1, 1], Z ∈ [0, 1].
Further, step b3 specifically comprises:
during the normal rendering pass, rendering from camera space a texture map containing the objects and their depth values, and first converting it into world-space positions;
the light space is also based on world space, so to perform the depth comparison the world-space positions must be transformed into the light space.
Further, in step a, the visibility of an object in the scene's object list is one of: (1) fully visible; (2) fully invisible; (3) partially visible.
The method for realizing real-time dynamic shadows based on a spotlight according to the various embodiments of the present invention mainly comprises: traversing the list of visible objects in the scene and rendering only the fully or partially visible objects; rendering the scene in the light space to obtain a depth map recording, for every pixel, the projected depth nearest to the light source; during the normal rendering pass, transforming each pixel from camera space into light space and comparing the depth value of its new position against the depth map, the pixel being in shadow when its new depth value is greater than the stored depth; and then color-blending the shadow with the scene and projecting it onto the screen. It thereby overcomes the prior art's narrow range of application, proneness to error, large memory consumption and slow computation, achieving wide applicability, robustness against error, a small memory footprint and fast computation.
Other features and advantages of the present invention will be set forth in the following description, will in part become apparent from the description, or may be learned by practicing the invention.
The technical solution of the present invention is described in further detail below with reference to the drawings and embodiments.
Description of the drawings
The accompanying drawings are provided for a further understanding of the present invention and form part of the specification; together with the embodiments they serve to explain the invention and are not to be construed as limiting it. In the drawings:
Fig. 1(a) is a top view of the bounding box of a model in a scene according to the spotlight-based real-time dynamic shadow method of the present invention, and Fig. 1(b) is a side view of the visibility of each model's bounding box in the camera;
Fig. 2 is a schematic diagram of the transformation from camera space to projection space in the method;
Fig. 3 is a schematic diagram of the transformation from camera space to light space in the method;
Fig. 4 is a flow chart of the spotlight-based real-time dynamic shadow method of the present invention.
Detailed description
Preferred embodiments of the present invention are described below with reference to the drawings; it should be understood that the preferred embodiments described here serve only to describe and explain the invention and do not limit it.
According to an embodiment of the present invention, as shown in Figs. 1 to 4, a method for realizing real-time dynamic shadows based on a spotlight is provided.
In games the GPU frequently draws large numbers of shadow effects, and game shadows are becoming ever more complex, while overly complex shadows exceed what the graphics card can support. In reality a shadow arises where an object blocks the light; the same effect can be reached by modeling in a 3D environment, and when solid objects or lights in the scene move, dynamic shadow techniques are required. This is an advanced technique and at the same time a focus of computer graphics research.
First, an analysis of the light source: a spotlight is light focused by a condensing lens or reflector; its spot size is adjustable and it has little diffuse spill at the sides. Compared with other light sources, a spotlight illuminates in a single direction, its beam forming a cone.
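Because the spotlight's beam is a cone, whether a point receives the light at all reduces to an angle test against the cone's axis. A minimal sketch of that test (our own illustrative code, not from the patent; comparing cosines avoids an acos call):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// A point lies inside the spotlight's cone when the angle between the
// light's axis and the direction from the light to the point does not
// exceed the cone's half-angle. lightDir must be normalized;
// cosCutoff = cos(half-angle of the cone).
bool insideSpotCone(const Vec3& lightPos, const Vec3& lightDir,
                    float cosCutoff, const Vec3& p) {
    Vec3 toP = normalize({ p.x - lightPos.x, p.y - lightPos.y, p.z - lightPos.z });
    return dot(toP, lightDir) >= cosCutoff;
}
```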
Step 1: all objects in the scene are traversed and each object is tested for visibility in the camera. Three cases arise: (1) fully visible; (2) fully invisible; (3) partially visible. Fig. 1(a) shows a top view of a model's bounding box in the scene, and Fig. 1(b) shows the visibility of each model's bounding box in the camera. Only the visible and partially visible objects are rendered.
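The three-way classification above is commonly done by testing each object's bounding box against the planes of the camera frustum. The following sketch is our own illustrative code (not the patent's), assuming plane normals point into the frustum:

```cpp
#include <cassert>
#include <vector>

struct Vec3  { float x, y, z; };
struct Plane { Vec3 n; float d; };   // points p with n·p + d >= 0 are inside
struct AABB  { Vec3 min, max; };

enum class Visibility { FullyInvisible, PartiallyVisible, FullyVisible };

// Signed distance to the plane of the box corner farthest along the
// normal (farthest = true) or farthest against it (farthest = false).
static float cornerDist(const AABB& b, const Plane& p, bool farthest) {
    Vec3 v{ (p.n.x >= 0) == farthest ? b.max.x : b.min.x,
            (p.n.y >= 0) == farthest ? b.max.y : b.min.y,
            (p.n.z >= 0) == farthest ? b.max.z : b.min.z };
    return p.n.x * v.x + p.n.y * v.y + p.n.z * v.z + p.d;
}

Visibility classify(const AABB& box, const std::vector<Plane>& frustum) {
    bool fully = true;
    for (const Plane& pl : frustum) {
        if (cornerDist(box, pl, true) < 0)      // most-inside corner is still out:
            return Visibility::FullyInvisible;  // box entirely behind one plane
        if (cornerDist(box, pl, false) < 0)     // some corner is behind this plane
            fully = false;
    }
    return fully ? Visibility::FullyVisible : Visibility::PartiallyVisible;
}
```

A real frustum has six planes; the test is plane-count agnostic.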
Step 2: the model space is first transformed into world coordinate space; this transform is realized in Direct3D, and the world-space matrix produced by the hardware coordinate transform is denoted WorldMatrix. Every camera in any scene is based on world space. When the camera is placed at the light source position to render the scene, a light-source space is first constructed: the camera's orientation is aligned with the direction of the light source and the camera's viewing direction is taken as the Z axis, so that the camera position and orientation determine a light space. The present invention realizes LightViewMatrix through the following DirectX interface:
D3DXMATRIX * D3DXMatrixLookAtLH(
__inout D3DXMATRIX *pOut,
__in const D3DXVECTOR3 *pEye,
__in const D3DXVECTOR3 *pAt,
__in const D3DXVECTOR3 *pUp
);
The above call generates the light-space matrix LightViewMatrix. This matrix only encodes the local position of the camera; it does not yet give an exact extent for the objects seen. To determine an exact extent we must also fix the distance from the object to the camera, which we may call the "focal length": it determines how large a region around the object the camera sees.
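D3DXMatrixLookAtLH builds the left-handed view matrix from the eye position, a look-at target and an up vector. As a self-contained illustration of the computation such a call performs (our own reimplementation following Direct3D's documented row-vector convention, not the D3DX source):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static Vec3 normalize(const Vec3& v) {
    float len = std::sqrt(dot(v, v));
    return { v.x/len, v.y/len, v.z/len };
}

// 4x4 row-major matrix for row vectors, as Direct3D uses: out = v * M.
struct Mat4 { float m[4][4]; };

Mat4 lookAtLH(const Vec3& eye, const Vec3& at, const Vec3& up) {
    Vec3 z = normalize({ at.x-eye.x, at.y-eye.y, at.z-eye.z });  // view direction
    Vec3 x = normalize(cross(up, z));                            // camera right
    Vec3 y = cross(z, x);                                        // camera up
    return {{ { x.x, y.x, z.x, 0 },
              { x.y, y.y, z.y, 0 },
              { x.z, y.z, z.z, 0 },
              { -dot(x, eye), -dot(y, eye), -dot(z, eye), 1 } }};
}

// Transform a point as a row vector with w = 1.
Vec3 mulPoint(const Vec3& v, const Mat4& M) {
    return { v.x*M.m[0][0] + v.y*M.m[1][0] + v.z*M.m[2][0] + M.m[3][0],
             v.x*M.m[0][1] + v.y*M.m[1][1] + v.z*M.m[2][1] + M.m[3][1],
             v.x*M.m[0][2] + v.y*M.m[1][2] + v.z*M.m[2][2] + M.m[3][2] };
}
```

In the shadow pass, eye is the light position and at - eye is the spotlight direction.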
Step 3: based on the above, the matrix that fixes the exact extent seen by the camera is what graphics calls the projection matrix of the light space. In the light space described above, a projection matrix LightProjMatrix is built from the 8 corner points of the visible view frustum and their bounding box, as shown in Fig. 1.
The light-space positions are transformed into the view-frustum projection space; the combined matrix is denoted ViewMatrix, computed as
ViewMatrix = LightViewMatrix * LightProjMatrix
ViewMatrix is uploaded to the GPU and the projection coordinates of every point are computed. The Z value of the projection coordinate is the depth value (the shadow map); it records, for every pixel, the depth of the projection nearest to the light source. These depth values are saved into a two-dimensional render target and then normalized, scaling the coordinates to X ∈ [-1, 1], Y ∈ [-1, 1], Z ∈ [0, 1], as shown in Fig. 2.
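After multiplying by ViewMatrix each point sits in homogeneous clip space; the perspective divide yields exactly the normalized ranges quoted above (X, Y in [-1, 1] and Z in [0, 1] under the Direct3D depth convention), and a further remap turns X, Y into texture coordinates for the later shadow-map lookup. A sketch, with our own illustrative names:

```cpp
#include <cassert>

// Homogeneous clip-space coordinate produced by the projection transform.
struct Clip { float x, y, z, w; };

// Normalized result: x, y in [-1, 1]; z in [0, 1] (D3D depth convention);
// u, v in [0, 1] for sampling the shadow map (v is flipped because texture
// coordinates grow downward while NDC y grows upward).
struct Ndc { float x, y, z, u, v; };

Ndc normalizeClip(const Clip& c) {
    float x = c.x / c.w, y = c.y / c.w, z = c.z / c.w;  // perspective divide
    return { x, y, z, 0.5f * x + 0.5f, -0.5f * y + 0.5f };
}
```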
Step 4: during the normal rendering pass, a texture map containing the objects and their depth values is rendered from camera space and first converted into world-space positions. The light space is likewise based on world space, so to compare depths the world-space positions must be transformed into the light space. For example, a pixel at any point a of camera space has depth value s2, while in the light space the same point has depth value s1, as shown in Fig. 3.
Step 5: the depth value of the pixel's new position, obtained by transforming from camera space into light space, is compared with the depth map (shadow map). When the depth is greater than the depth stored in the depth map, i.e. s2 > s1, the pixel is in shadow; the shadow is then color-blended with the scene and projected onto the screen, achieving a vivid shadow effect.
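The per-pixel decision in step 5 is thus a single comparison, after which the scene color is attenuated where the pixel is shadowed. In practice a small depth bias is customarily added to avoid self-shadowing acne; the bias and the blend factor below are our own illustrative choices, not values from the patent:

```cpp
#include <cassert>

// s2: depth of the pixel's new position in light space (the camera-space
// pixel transformed into light space); s1: depth stored in the shadow map.
bool inShadow(float s2, float s1, float bias = 0.001f) {
    return s2 > s1 + bias;   // farther from the light than the nearest occluder
}

struct Color { float r, g, b; };

// Darken the scene color where the pixel is shadowed; shadowFactor = 1
// leaves the color untouched, smaller values darken.
Color shade(const Color& scene, bool shadowed, float shadowFactor = 0.4f) {
    float k = shadowed ? shadowFactor : 1.0f;
    return { scene.r * k, scene.g * k, scene.b * k };
}
```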
The technical solution of the present invention applies only to focused light sources such as spotlights. It exploits the programmability of the GPU to the full for efficient, fast drawing, occupies little memory, is simple to implement, and has good applicability.
Finally it should be noted that the foregoing are merely preferred embodiments of the present invention and do not limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions recorded in the embodiments or substitute equivalents for some of their technical features. Any modification, equivalent substitution or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (7)

1. A method for realizing real-time dynamic shadows based on a spotlight, characterized in that it mainly comprises:
a. traversing the list of visible objects in the scene, and rendering only the objects that are fully or partially visible;
b. rendering the scene in the light space to obtain a depth map recording, for every pixel, the depth of the surface nearest to the light source;
c. during the normal rendering pass, transforming each pixel from camera space into light space and comparing the depth value of its new position against the depth map, the pixel being in shadow when its new depth value is greater than the stored depth; and then color-blending the shadow with the scene and projecting it onto the screen.
2. The method for realizing real-time dynamic shadows based on a spotlight according to claim 1, characterized in that step b specifically comprises:
b1. placing the camera at the position of the light source to render the scene, thereby constructing a light space;
b2. in the light space, converting the world positions into the view-frustum projection space and normalizing them, obtaining the depth map in projection space;
b3. during the normal rendering pass, converting each screen-space pixel (U, V) into world coordinates and then into the light space, obtaining the depth value of the pixel's new position in that space.
3. The method for realizing real-time dynamic shadows based on a spotlight according to claim 2, characterized in that step b1 specifically comprises:
first transforming the model space into world coordinate space, realized in Direct3D, the world-space matrix produced by the hardware coordinate transform being denoted WorldMatrix;
every camera in any scene being based on world space, and, when the camera is placed at the light source position to render the scene, first constructing a light-source space by aligning the camera's orientation with the direction of the light source and taking the camera's viewing direction as the Z axis, so that the camera position and orientation determine a light space.
4. The method for realizing real-time dynamic shadows based on a spotlight according to claim 3, characterized in that the operation of determining a light space from the camera position and orientation is specifically:
realizing LightViewMatrix through the following DirectX interface:
D3DXMATRIX * D3DXMatrixLookAtLH(
__inout D3DXMATRIX *pOut,
__in const D3DXVECTOR3 *pEye,
__in const D3DXVECTOR3 *pAt,
__in const D3DXVECTOR3 *pUp
);
The above call generates the light-space matrix LightViewMatrix; this matrix only encodes the local position of the camera.
5. The method for realizing real-time dynamic shadows based on a spotlight according to claim 3, characterized in that step b2 specifically comprises:
based on step b1, the matrix determining the exact extent seen by the camera being referred to in graphics as the projection matrix of the light space;
in the light space described above, building a projection matrix LightProjMatrix from the preset corner points of the visible view frustum and their bounding box;
transforming the light-space positions into the view-frustum projection space, the combined matrix being denoted ViewMatrix, computed as
ViewMatrix = LightViewMatrix * LightProjMatrix;
uploading ViewMatrix to the GPU and computing the projection coordinates of every point, the Z value of the projection coordinate being the depth value (the shadow map), which records, for every pixel, the depth of the projection nearest to the light source;
saving these depth values into a two-dimensional render target and then normalizing them, scaling the coordinates to X ∈ [-1, 1], Y ∈ [-1, 1], Z ∈ [0, 1].
6. The method for realizing real-time dynamic shadows based on a spotlight according to claim 5, characterized in that step b3 specifically comprises:
during the normal rendering pass, rendering from camera space a texture map containing the objects and their depth values, and first converting it into world-space positions;
the light space likewise being based on world space, so that to perform the depth comparison the world-space positions must be transformed into the light space.
7. The method for realizing real-time dynamic shadows based on a spotlight according to any one of claims 1-6, characterized in that in step a the visibility of an object in the scene's object list is one of: (1) fully visible; (2) fully invisible; (3) partially visible.
CN201410354846.5A 2014-07-24 2014-07-24 Real-time dynamic shadowing realization method based on projector lamp Pending CN104103092A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410354846.5A CN104103092A (en) 2014-07-24 2014-07-24 Real-time dynamic shadowing realization method based on projector lamp


Publications (1)

Publication Number Publication Date
CN104103092A true CN104103092A (en) 2014-10-15

Family

ID=51671209

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410354846.5A Pending CN104103092A (en) 2014-07-24 2014-07-24 Real-time dynamic shadowing realization method based on projector lamp

Country Status (1)

Country Link
CN (1) CN104103092A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105677395A (en) * 2015-12-28 2016-06-15 珠海金山网络游戏科技有限公司 Game scene pixel blanking system and method
CN105913472A (en) * 2015-08-28 2016-08-31 深圳市彬讯科技有限公司 Indoor scene rendering method and device thereof
CN106846455A (en) * 2017-01-10 2017-06-13 努比亚技术有限公司 A kind of dynamic shadow processing terminal and method
CN108038897A (en) * 2017-12-06 2018-05-15 北京像素软件科技股份有限公司 Shadow map generation method and device
CN109658494A (en) * 2019-01-07 2019-04-19 北京达美盛科技有限公司 A kind of Shading Rendering method in three-dimensional visualization figure
CN109993823A (en) * 2019-04-11 2019-07-09 腾讯科技(深圳)有限公司 Shading Rendering method, apparatus, terminal and storage medium
CN111028357A (en) * 2018-10-09 2020-04-17 北京嘀嘀无限科技发展有限公司 Soft shadow processing method and device of augmented reality equipment
CN112184878A (en) * 2020-10-15 2021-01-05 洛阳众智软件科技股份有限公司 Method, device and equipment for automatically generating and rendering three-dimensional night scene light
CN112750188A (en) * 2019-10-29 2021-05-04 福建天晴数码有限公司 Method and terminal for automatically rendering object
CN116109758A (en) * 2023-04-07 2023-05-12 北京渲光科技有限公司 Method and device for positioning projection position of light source and rendering scene

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020018063A1 (en) * 2000-05-31 2002-02-14 Donovan Walter E. System, method and article of manufacture for shadow mapping
US6593923B1 (en) * 2000-05-31 2003-07-15 Nvidia Corporation System, method and article of manufacture for shadow mapping
CN101840566A (en) * 2010-04-16 2010-09-22 中山大学 Real-time shadow generating method based on GPU parallel calculation and system thereof
US20110032256A1 (en) * 2009-08-06 2011-02-10 Samsung Electronics Co., Ltd. Image processing apparatus and method
CN102768765A (en) * 2012-06-25 2012-11-07 南京安讯网络服务有限公司 Real-time soft shadow rendering method for point light sources


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
LANCE WILLIAMS: "Casting curved shadows on curved surfaces", ACM SIGGRAPH Computer Graphics *
叶至军: "Visual C++/DirectX 9 3D Game Development Guide", 28 February 2006, Posts & Telecom Press *
曾晓一: "Research and Implementation of Real-Time Dynamic Shadow Algorithms", China Masters' Theses Full-text Database, Information Science and Technology *
李恋: "Research and Implementation of Shadow Algorithms in Graphics Engines", China Masters' Theses Full-text Database, Information Science and Technology *
董江亥 et al.: "View-Frustum Culling Optimization Based on Scene Graphs", Computer Engineering Application Technology *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105913472A (en) * 2015-08-28 2016-08-31 深圳市彬讯科技有限公司 Indoor scene rendering method and device thereof
CN105677395A (en) * 2015-12-28 2016-06-15 珠海金山网络游戏科技有限公司 Game scene pixel blanking system and method
CN105677395B (en) * 2015-12-28 2019-09-10 珠海金山网络游戏科技有限公司 A kind of system and method for scene of game pixel blanking
CN106846455A (en) * 2017-01-10 2017-06-13 努比亚技术有限公司 A kind of dynamic shadow processing terminal and method
CN108038897A (en) * 2017-12-06 2018-05-15 北京像素软件科技股份有限公司 Shadow map generation method and device
CN108038897B (en) * 2017-12-06 2021-06-04 北京像素软件科技股份有限公司 Shadow map generation method and device
CN111028357B (en) * 2018-10-09 2020-11-17 北京嘀嘀无限科技发展有限公司 Soft shadow processing method and device of augmented reality equipment
CN111028357A (en) * 2018-10-09 2020-04-17 北京嘀嘀无限科技发展有限公司 Soft shadow processing method and device of augmented reality equipment
CN109658494A (en) * 2019-01-07 2019-04-19 北京达美盛科技有限公司 A kind of Shading Rendering method in three-dimensional visualization figure
CN109658494B (en) * 2019-01-07 2023-03-31 北京达美盛软件股份有限公司 Shadow rendering method in three-dimensional visual graph
CN109993823A (en) * 2019-04-11 2019-07-09 腾讯科技(深圳)有限公司 Shading Rendering method, apparatus, terminal and storage medium
CN109993823B (en) * 2019-04-11 2022-11-25 腾讯科技(深圳)有限公司 Shadow rendering method, device, terminal and storage medium
CN112750188A (en) * 2019-10-29 2021-05-04 福建天晴数码有限公司 Method and terminal for automatically rendering object
CN112750188B (en) * 2019-10-29 2023-11-24 福建天晴数码有限公司 Method and terminal for automatically rendering object
CN112184878A (en) * 2020-10-15 2021-01-05 洛阳众智软件科技股份有限公司 Method, device and equipment for automatically generating and rendering three-dimensional night scene light
CN112184878B (en) * 2020-10-15 2023-08-25 洛阳众智软件科技股份有限公司 Method, device and equipment for automatically generating and rendering three-dimensional night scene lamplight
CN116109758A (en) * 2023-04-07 2023-05-12 北京渲光科技有限公司 Method and device for positioning projection position of light source and rendering scene
CN116109758B (en) * 2023-04-07 2023-06-16 北京渲光科技有限公司 Method and device for positioning projection position of light source and rendering scene

Similar Documents

Publication Publication Date Title
CN104103092A (en) Real-time dynamic shadowing realization method based on projector lamp
CN102768765B (en) Real-time soft shadow rendering method for point light sources
CN104331918B (en) Based on earth's surface occlusion culling and accelerated method outside depth map real-time rendering room
Raskar et al. Table-top spatially-augmented reality: bringing physical models to life with projected imagery
CN107341853B (en) Virtual-real fusion method and system for super-large virtual scene and dynamic screen shooting
JP5531093B2 (en) How to add shadows to objects in computer graphics
US20070126864A1 (en) Synthesizing three-dimensional surround visual field
US11488348B1 (en) Computing virtual screen imagery based on a stage environment, camera position, and/or camera settings
CN102243768B (en) Method for drawing stereo picture of three-dimensional virtual scene
JP2008090617A (en) Device, method and program for creating three-dimensional image
CN105513112A (en) Image processing method and device
US20070126932A1 (en) Systems and methods for utilizing idle display area
Widmer et al. An adaptive acceleration structure for screen-space ray tracing
CN104299257B (en) A kind of method that real-time dynamic shadow is realized based on outdoor sunlight
CN104217461B (en) A parallax mapping method based on a depth map to simulate a real-time bump effect
US20230281912A1 (en) Method and system for generating a target image from plural multi-plane images
WO2014170757A2 (en) 3d rendering for training computer vision recognition
US20240087219A1 (en) Method and apparatus for generating lighting image, device, and medium
CN105976423B (en) A kind of generation method and device of Lens Flare
US20140306953A1 (en) 3D Rendering for Training Computer Vision Recognition
CN109829962B (en) Object space hidden line elimination calculation acceleration method using OPENGL
CN103093491A (en) Three-dimensional model high sense of reality virtuality and reality combination rendering method based on multi-view video
CN116468845A (en) Shadow mapping method and device
Oishi et al. An instant see-through vision system using a wide field-of-view camera and a 3d-lidar
JP2024523729A (en) SYSTEM AND METHOD FOR REAL-TIME RAY TRACING IN A 3D ENVIRONMENT - Patent application

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20141015

RJ01 Rejection of invention patent application after publication