KR20140013292A - Ray-tracing arithmetic operation method and system - Google Patents

Ray-tracing arithmetic operation method and system

Info

Publication number
KR20140013292A
Authority
KR
South Korea
Prior art keywords
ray tracing
ray
operation system
intersection
information
Prior art date
Application number
KR1020120079914A
Other languages
Korean (ko)
Inventor
박진홍
윤형민
정철호
신철호
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020120079914A
Publication of KR20140013292A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/06 Ray-tracing
    • G06T 15/50 Lighting effects
    • G06T 15/503 Blending, e.g. for anti-aliasing
    • G06T 15/80 Shading
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Generation (AREA)

Abstract

The present invention relates to a ray tracing arithmetic operation system. The ray tracing arithmetic operation system according to the present invention, which is configured to calculate ray tracing to generate a three-dimensional image reflecting optical effects, comprises: a ray tracing unit performing ray generation, which generates rays from a pixel on the screen toward an object, data traversal, which searches along the direction of the rays to identify which polygon the generated rays intersect, and an intersection test, which generates intersected point data including location information within the polygon intersecting the rays; and a graphic processing unit which generates color information of the pixel by calculating the color and texture of the intersected point. Through this configuration, the ray tracing arithmetic operation system according to the present invention can reduce the size of the ray tracing unit by performing the ray shading operation on an existing graphics processing unit (GPU) and can enhance the utilization of the GPU, thereby reducing the cost of building a ray tracing arithmetic operation system.

Description

RAY-TRACING ARITHMETIC OPERATION METHOD AND SYSTEM

The present invention relates to a ray tracing arithmetic system and, more particularly, to a ray tracing arithmetic system having a hybrid structure that uses a conventional graphics processing unit (GPU) to perform ray tracing arithmetic efficiently.

Three-dimensional graphics technology is a graphics technology that uses a three-dimensional representation of geometric data stored in a computer, and is widely used today in various industries, including the media and game industries. In general, three-dimensional graphics technology requires a separate high-performance graphics processor due to its large amount of computation.

In particular, with recent advances in processors, ray tracing technology capable of generating highly realistic three-dimensional graphics has been studied. Ray tracing technology can simulate various optical effects, including reflection, refraction, and shadows.

An existing graphics processing unit (GPU) may be used to generate 3D graphics with ray tracing technology. However, because ray tracing requires a large number of operations, processing it on the GPU may slow down other graphics processing.

Adding a new graphics processing unit dedicated to the ray tracing operation may also be considered, but this incurs the additional cost of a new processor.

The present invention has been made to solve the problems of the prior art described above, and an object of the present invention is to provide a ray tracing calculation system configured to efficiently calculate ray tracing at a low cost.

In order to achieve the above object, the present invention provides a ray tracing calculation system configured to calculate ray tracing for generating a three-dimensional image reflecting optical effects, the system comprising: a ray tracing unit configured to perform ray generation, which generates rays from a pixel in the screen toward an object; data traversal, which searches along the direction of the rays to determine which polygon the generated rays intersect; and an intersection test, which generates intersected point data including position information in the polygon intersecting the rays; and a graphic processing unit configured to calculate the color and texture of the intersection point using the intersected point data and to generate color information of the pixel.

In addition, the present invention includes an embodiment in which the ray tracing unit performs an object structure build for generating the object on the screen.

The present invention also includes an embodiment further including an intersection data buffer including the intersection data generated by the ray tracing unit.

In addition, the present invention includes an embodiment in which the intersection data includes at least one of coordinates, normals, colors, and texture information of intersections.

The present invention also includes an embodiment further including a memory including information of the object.

The present invention also includes an embodiment in which the memory includes at least one of structure information, geometry information, and texture information of the object.

The present invention also includes an embodiment in which the memory includes color information of the pixel generated in the graphic processing unit.

The present invention also includes an embodiment in which the graphic processing unit performs an operation for generating a 2D image.

The present invention also includes an embodiment in which the graphic processing unit generates color information of the pixel by performing at least one of Fog, Alpha-Blending, and Anti-Aliasing operations of the intersection.

The present invention also includes an embodiment in which the graphic processing unit generates color information of the pixel using a shader program.

It will be apparent to those of ordinary skill in the art that various changes in form and details may be made without departing from the spirit and scope of the invention as defined by the appended claims.

Through the above configuration, the ray tracing calculation system according to the present invention processes the ray shading operation in an existing graphics processing unit (GPU), thereby reducing the size of the ray tracing unit (RTU) and improving the utilization of the GPU. Accordingly, the present invention can reduce the cost of building the ray tracing calculation system.

FIG. 1 is a view for explaining ray tracing according to an embodiment of the present invention.
FIG. 2 is a hardware configuration diagram of a ray tracing calculation system according to an embodiment of the present invention.
FIG. 3 is a flowchart of a ray tracing calculation method according to an exemplary embodiment of the present invention.
FIG. 4 is a hardware configuration diagram of a ray tracing calculation system reflecting a ray tracing calculation method according to an embodiment of the present invention.
FIG. 5 is a software block diagram of a ray tracing calculation system according to an exemplary embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In the following description, the same constituent elements are given the same names and the same reference symbols for convenience of explanation.

The terms used in the present invention are selected from general terms that are widely used at present. In some cases, however, a term has been arbitrarily selected by the applicant, in which case its meaning is described in detail in the corresponding description of the invention. The present invention should therefore be understood according to the meaning of each term rather than its mere name.

The suffix "module" and " part "for the components used in the following description are given or mixed in consideration of ease of specification, and do not have their own meaning or role.

Terms such as “first” and “second” are intended to distinguish one component from another component, and the scope of rights should not be limited by these terms. For example, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.

The term "and/or" should be understood to include all possible combinations of one or more related items. For example, "first item, second item and/or third item" means not only the first, second, or third item individually, but also any combination of two or more of the first, second, and third items.

When an element is referred to as being "connected" to another element, it may be directly connected to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" to another element, it should be understood that no intervening elements are present. Other expressions describing the relationship between components, such as "between" and "immediately between" or "neighboring" and "directly neighboring", should be interpreted in the same manner.

Singular expressions should be understood to include plural expressions unless the context clearly indicates otherwise. The terms "include" or "have" are intended to specify the presence of stated features, numbers, steps, operations, components, parts, or combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.

Each step may take place differently from the stated order unless explicitly stated in a specific order in the context. That is, each step may occur in the same order as described, may be performed substantially concurrently, or may be performed in reverse order.

All terms used herein, unless otherwise defined, have the same meaning as commonly understood by one of ordinary skill in the art to which the disclosed technology belongs. Terms defined in commonly used dictionaries should be interpreted as consistent with their meaning in the context of the relevant art and cannot be construed as having an ideal or overly formal meaning unless expressly defined in the present application.

FIG. 1 is a diagram for explaining ray tracing according to an exemplary embodiment of the present invention.

Ray tracing is one of the techniques used in three-dimensional computer graphics. It forms the image of each object by tracking the paths along which light from a virtual light source is reflected off the surfaces of the various objects.

In other words, ray tracing finds the rays that enter an observer's eye, displays the color carried by each ray on the screen, and generates a 3D image by determining the colors of the rays entering the observer's eye from various directions.

There are two ways to track the rays. One is forward ray tracing, which subdivides and traces the direction of light emitted by the light source, and the other is backward ray tracing, which traces the gaze from the observer's eye toward the light source.

Forward ray tracing has the advantage of simulating natural phenomena more faithfully than backward ray tracing, but it is inefficient in that it must track all the rays leaving the light source in every direction.

That is, even if all of the rays subdivided in countless directions are traced, most of them never reach the eye, and those rays contribute nothing to the image. To make a sufficient number of rays reach the eye (screen), the rays leaving the light source would have to be subdivided almost infinitely. Since the light leaving the light source cannot be subdivided without limit, it is practically impossible to simulate natural phenomena in this way.

Backward ray tracing finds the object that emits or reflects light by following the observer's line of sight. Because the line of sight hits an object and the light from that object enters the eye along that direction, tracing the light backwards along the line of sight traces only the rays that actually enter the eye. Therefore, backward ray tracing, which traces light back from the eye, is more efficient than forward ray tracing, which must find, among the infinitely many rays leaving the light source, the ones that will enter the eye.

Backward ray tracing refers to this backtracking method: when the path followed from the eye arrives at an object that scatters light, the color and brightness of the ray are determined using a shading model. In general, "ray tracing" refers to this backward method, which traces from the eye toward the light source, rather than the forward method of tracing rays from the light source.

Rays starting from the eye are set up by placing the screen between the eye and the objects and associating each pixel with a line of sight. When the line of sight followed from the eye meets a mirror, its direction is changed by reflection, and the next object it meets is found. When the line of sight reaches an object that scatters light, such as cloth, it is difficult to trace it further, so the color of the cloth is assigned to the pixel. In this case, the relationship between the light source and the surface of the object is used to add brightness to the color.

In ray tracing, the line of sight is represented by a parametric line equation, the intersections of this line with the objects are computed, and the closest intersection point among the various objects is taken as the place where the line of sight arrives, as sketched below.
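
A minimal sketch of this idea, not taken from the patent: the line of sight is a parametric ray p(t) = origin + t * dir, each object is intersected with it, and only the closest positive-t hit is kept. A sphere primitive is used here purely for illustration; the patent itself works with polygons.

```cpp
#include <cmath>
#include <cstdio>
#include <optional>
#include <vector>

struct Vec3 { double x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Ray    { Vec3 origin, dir; };        // dir assumed normalized
struct Sphere { Vec3 center; double radius; };

// Smallest positive t at which the ray hits the sphere, if any.
std::optional<double> intersect(const Ray& r, const Sphere& s) {
    Vec3 oc = sub(r.origin, s.center);
    double b = 2.0 * dot(oc, r.dir);
    double c = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - 4.0 * c;
    if (disc < 0.0) return std::nullopt;    // the line misses the object
    double t = (-b - std::sqrt(disc)) / 2.0;
    if (t > 1e-6) return t;
    t = (-b + std::sqrt(disc)) / 2.0;
    if (t > 1e-6) return t;
    return std::nullopt;
}

int main() {
    Ray eye{{0, 0, 0}, {0, 0, 1}};
    std::vector<Sphere> scene = {{{0, 0, 5}, 1.0}, {{0, 0, 3}, 0.5}};
    double closest = 1e30;
    int hitIndex = -1;
    for (int i = 0; i < static_cast<int>(scene.size()); ++i) {
        // Keep only the nearest intersection along the line of sight.
        if (auto t = intersect(eye, scene[i]); t && *t < closest) {
            closest = *t;
            hitIndex = i;
        }
    }
    std::printf("closest hit: object %d at t = %.2f\n", hitIndex, closest);
    return 0;
}
```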

When light strikes a smooth surface, some of it is reflected and some of it travels into the object. The reflected light leaves at the same angle as the angle of incidence, while the direction of the light traveling inward depends on the refractive index of the material making up the object. The two rays separated at the surface strike other surfaces and are separated again, and as reflections and refractions are repeated many times, the light that finally travels toward the viewpoint is what is actually observed. That is, the brightness observed at the viewpoint is given as the sum of the reflected light and the light transmitted from inside the object.
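
A minimal sketch, assuming unit-length vectors, of how the reflected and refracted directions at such a surface are commonly computed: the reflected ray leaves at the incidence angle, and the refracted direction follows Snell's law through the ratio of refractive indices eta = n1 / n2. The helper types and the choice of formulas are illustrative, not the patent's specification.

```cpp
#include <cmath>
#include <optional>

struct Vec3 { double x, y, z; };
static Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, double s){ return {a.x * s, a.y * s, a.z * s}; }
static double dot(Vec3 a, Vec3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Incident direction i points toward the surface, n is the outward unit normal.
Vec3 reflect(Vec3 i, Vec3 n) {
    // Mirror reflection: the outgoing angle equals the angle of incidence.
    return add(i, scale(n, -2.0 * dot(i, n)));
}

// Refracted direction by Snell's law, or nothing on total internal reflection.
std::optional<Vec3> refract(Vec3 i, Vec3 n, double eta) {
    double cosi = -dot(i, n);
    double k = 1.0 - eta * eta * (1.0 - cosi * cosi);
    if (k < 0.0) return std::nullopt;   // total internal reflection
    return add(scale(i, eta), scale(n, eta * cosi - std::sqrt(k)));
}
```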

When a conventional shading model is used and the repeated reflections and refractions are computed starting from the light source, only an extremely small fraction of the light ever reaches the viewpoint, and much of the computation is unnecessary for the graphic representation. Accordingly, ray tracing works in the reverse direction, starting from the view direction, and obtains color information only for the pixel points of the two-dimensional image.

Accordingly, ray tracing consists of generating and emitting a ray from the viewpoint toward each pixel point of the image plane, finding the intersection point of the ray and an object, and obtaining the color information of the object at that intersection point. If the ray connecting the pixel point and the pinhole does not intersect any object, the pixel takes the background color (for example, black). If it intersects an object and the object is opaque, the shading model calculates the brightness of that point.
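
A minimal sketch of that per-pixel loop: a ray is traced through every pixel of the image plane, and the pixel keeps the background color when nothing is hit. traceClosestHit() and shade() are hypothetical stand-ins for the traversal/intersection and shading stages described later in this document.

```cpp
#include <cstdint>
#include <optional>
#include <vector>

struct Color { std::uint8_t r, g, b; };
struct Hit   { double t; int objectId; };

// Stand-in: search the scene for the closest hit along the pixel's ray.
std::optional<Hit> traceClosestHit(int px, int py) {
    (void)px; (void)py;
    return std::nullopt;                 // empty scene in this sketch
}

// Stand-in: evaluate the shading model at the hit point.
Color shade(const Hit& hit) { (void)hit; return {255, 255, 255}; }

std::vector<Color> render(int width, int height) {
    const Color background{0, 0, 0};     // e.g. black when no object is hit
    std::vector<Color> frame(static_cast<std::size_t>(width) * height, background);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            if (auto hit = traceClosestHit(x, y)) {
                frame[static_cast<std::size_t>(y) * width + x] = shade(*hit);
            }                            // otherwise keep the background color
        }
    }
    return frame;
}
```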

Hereinafter, the ray tracing calculation system 100 configured to calculate the above ray tracing will be described.

FIG. 2 is a hardware configuration diagram of a ray tracing calculation system 100 according to an embodiment of the present invention.

The ray tracing operation system 100 includes a ray tracing unit (RTU) 10, a data buffer 20, a graphic processing unit (GPU) 30, and a memory (DRAM) 40. The ray tracing operation system 100 may also include software (SW) 50 for driving the above-described components.

The Ray Tracing Unit (RTU) 10 performs the ray tracing operation described above.

The specific ray tracing calculation method includes an object structure build (S100), ray generation (S200), data traversal (S300), an intersection test (S400), and ray shading (S500). This will be described later in detail with reference to FIG. 3.

Here, the ray tracing unit 10 is configured to perform the object structure build (S100), the ray generation (S200), the data traversal (S300), and the intersection test (S400).

First, the ray tracing unit 10 performs the object structure build (S100): objects are created on the screen using the information of the objects stored in the memory 40.

The ray tracing unit 10 performs ray generation (S200).

The ray tracing unit 10 generates rays in order to determine the color information of the pixels in the screen through ray tracing. The ray initially emitted from each pixel toward the object is called the eye ray (primary ray). When a ray meets an object, shadow, reflection, and refraction rays are generated, which are referred to as secondary rays.
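
A minimal sketch, not the RTU's actual hardware logic, of generating the primary (eye) ray for one pixel of a width x height screen with a pinhole camera at the origin looking down +z. The field of view and the pixel-to-camera-plane mapping are illustrative assumptions; secondary (shadow, reflection, refraction) rays would later be spawned at the hit points found for these primary rays.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };
struct Ray  { Vec3 origin, dir; };

Ray generateEyeRay(int px, int py, int width, int height, double fovDeg) {
    const double kPi = 3.14159265358979323846;
    double aspect  = static_cast<double>(width) / height;
    double tanHalf = std::tan(fovDeg * 0.5 * kPi / 180.0);
    // Map the pixel center to the camera plane at z = 1.
    double x = (2.0 * (px + 0.5) / width - 1.0) * aspect * tanHalf;
    double y = (1.0 - 2.0 * (py + 0.5) / height) * tanHalf;
    // Normalize the direction so t measures distance along the ray.
    double len = std::sqrt(x * x + y * y + 1.0);
    return {{0.0, 0.0, 0.0}, {x / len, y / len, 1.0 / len}};
}
```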

The ray tracing unit 10 performs data traversal S300.

To determine which polygon each generated ray intersects, the ray tracing unit 10 uses a scene graph in which the objects to be rendered are arranged according to their coordinates. The scene graph is a tree-type data structure, and each node represents a polygon to be rendered. The ray tracing unit 10 searches the nodes of the tree along the direction of the generated ray.
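
A minimal sketch of that traversal idea: the scene is organized as a tree, and only subtrees whose bounding volumes the ray passes through are visited. The patent does not specify the node layout; an axis-aligned bounding box per node and a standard slab test are assumptions made here purely for illustration.

```cpp
#include <utility>
#include <vector>

struct Vec3 { double x, y, z; };
struct Ray  { Vec3 origin, dir; };          // dir components assumed non-zero
struct Aabb { Vec3 lo, hi; };

struct Node {
    Aabb box;
    std::vector<Node> children;
    int polygonId = -1;                     // leaves carry a polygon id
};

// Slab test: does the ray pass through the axis-aligned box?
static bool hitsBox(const Ray& r, const Aabb& b) {
    double tmin = 0.0, tmax = 1e30;
    const double o[3]  = {r.origin.x, r.origin.y, r.origin.z};
    const double d[3]  = {r.dir.x, r.dir.y, r.dir.z};
    const double lo[3] = {b.lo.x, b.lo.y, b.lo.z};
    const double hi[3] = {b.hi.x, b.hi.y, b.hi.z};
    for (int a = 0; a < 3; ++a) {
        double inv = 1.0 / d[a];
        double t0 = (lo[a] - o[a]) * inv;
        double t1 = (hi[a] - o[a]) * inv;
        if (t0 > t1) std::swap(t0, t1);
        if (t0 > tmin) tmin = t0;
        if (t1 < tmax) tmax = t1;
        if (tmax < tmin) return false;
    }
    return true;
}

// Collect the ids of candidate polygons lying along the ray direction.
void traverse(const Node& node, const Ray& ray, std::vector<int>& candidates) {
    if (!hitsBox(ray, node.box)) return;    // prune subtrees the ray misses
    if (node.polygonId >= 0) candidates.push_back(node.polygonId);
    for (const Node& child : node.children) traverse(child, ray, candidates);
}
```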

The ray tracing unit 10 performs an intersection test S400.

The ray tracing unit 10 determines whether the ray intersects a polygon contained in the found node, and calculates and collects information about the point where the ray and the polygon intersect.

That is, the ray tracing unit 10 generates rays (eye rays and secondary rays) for each pixel of the screen (ray generation), traverses the three-dimensional model (geometry) for the generated rays (data traversal), and performs the intersection test.
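
A minimal sketch of an intersection test between a ray and a triangular polygon. The Möller-Trumbore algorithm is used here as a representative example; the patent does not prescribe a specific intersection algorithm. The returned distance and barycentric coordinates locate the point inside the polygon.

```cpp
#include <cmath>
#include <optional>

struct Vec3 { double x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y,
                                            a.z * b.x - a.x * b.z,
                                            a.x * b.y - a.y * b.x}; }
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Ray         { Vec3 origin, dir; };
struct TriangleHit { double t, u, v; };     // distance and barycentric coordinates

std::optional<TriangleHit> intersectTriangle(const Ray& r,
                                             Vec3 v0, Vec3 v1, Vec3 v2) {
    const double eps = 1e-9;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p = cross(r.dir, e2);
    double det = dot(e1, p);
    if (std::fabs(det) < eps) return std::nullopt;   // ray parallel to the polygon
    double invDet = 1.0 / det;
    Vec3 s = sub(r.origin, v0);
    double u = dot(s, p) * invDet;
    if (u < 0.0 || u > 1.0) return std::nullopt;
    Vec3 q = cross(s, e1);
    double v = dot(r.dir, q) * invDet;
    if (v < 0.0 || u + v > 1.0) return std::nullopt;
    double t = dot(e2, q) * invDet;
    if (t <= eps) return std::nullopt;               // intersection behind the origin
    return TriangleHit{t, u, v};                     // point lies inside the polygon
}
```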

The ray tracing unit 10 generates intersected point data and stores it in the data buffer 20. Intersected Point Data includes coordinates, normals, colors, texture coordinates, and the like of the intersection points.

Accordingly, the ray tracing unit 10 performs the ray tracing operation using the object structure, geometry object, and texture information stored in the memory 40, and stores the result of the calculation in the buffer 20.

The data buffer 20 stores intersected point data generated by the ray tracing unit 10.

The intersected point data includes intersected polygon information, the address of the polygon in the geometry, the texture ID of the intersected polygon, fog information, and the like.
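
A minimal sketch of a record that the data buffer might hold per intersection. The field names mirror the items listed above and in the previous paragraphs, but the exact layout and types are assumptions for illustration, not taken from the patent.

```cpp
#include <cstdint>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

struct IntersectedPointData {
    Vec3 position;                 // coordinates of the intersection point
    Vec3 normal;                   // surface normal at the intersection
    Vec3 color;                    // base color of the intersected polygon
    Vec2 texCoord;                 // texture coordinates at the intersection
    std::uint32_t polygonAddress;  // address of the polygon in geometry memory
    std::uint32_t textureId;       // texture ID of the intersected polygon
    float fogFactor;               // fog information for the pixel
};
```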

As shown in the drawing, the data buffer 20 may be implemented as a buffer located separately from the ray tracing unit 10 and the graphic processing unit 30. In addition, the data buffer 20 may be implemented to be located in the ray tracing unit 10, the graphics processing unit 30, or the memory 40.

The graphic processing unit (GPU) 30 is configured to perform ray shading S500.

That is, the graphic processing unit 30 calculates the color of the intersection point using a shader (lighting calculation), calculates the texture value of the intersection point (texture mapping), and performs fog, color blending, and anti-aliasing calculations to generate the color information of the final pixel. The graphic processing unit 30 stores the color information of the generated final pixel in the frame buffer 44.
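
A minimal CPU-side sketch of the arithmetic such a shader might perform for one intersection record: a simple Lambert lighting calculation, modulation by a sampled texture color, and a linear fog blend. The specific lighting and fog models are illustrative assumptions, not the patent's specification of ray shading.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static Vec3 mul(Vec3 a, Vec3 b)    { return {a.x * b.x, a.y * b.y, a.z * b.z}; }
static Vec3 lerp(Vec3 a, Vec3 b, float t) {
    return {a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t, a.z + (b.z - a.z) * t};
}

Vec3 shadeIntersection(Vec3 normal, Vec3 toLight, Vec3 lightColor,
                       Vec3 textureColor, float fogFactor, Vec3 fogColor) {
    // Lighting calculation: diffuse term clamped to [0, 1].
    float diffuse = std::max(0.0f, dot(normal, toLight));
    // Texture mapping result modulated by the computed lighting.
    Vec3 lit = mul(scale(lightColor, diffuse), textureColor);
    // Fog: blend toward the fog color according to the fog factor.
    return lerp(lit, fogColor, std::clamp(fogFactor, 0.0f, 1.0f));
}
```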

The data storage (DRAM) 40 includes an object structure memory 41, a geometry object memory 42, a texture memory 43, and a frame buffer 44.

The object structure memory 41 stores structure information of objects.

The Geometry Object Memory 42 stores geometric information in the screen of objects.

The texture memory 43 includes texture information of objects.

The frame buffer 44 stores the color information of the pixels for which ray tracing has been calculated through the ray tracing unit 10 and the graphic processing unit 30.

The software (Software, SW) 50 generates a shader program for the graphics processing unit 30 to perform ray shading S500 and transmits the shader program to the graphics processing unit 30.

The ray tracing arithmetic apparatus 100 described above jointly uses the ray tracing unit 10, which is hardware dedicated to ray tracing calculation, and the existing graphic processing unit 30 in order to calculate ray tracing.

Accordingly, the ray tracing arithmetic apparatus 100 processes the ray shading operation in the existing graphic processing unit 30, which reduces the size of the ray tracing unit 10 and increases the utilization of the graphic processing unit 30. In addition, the construction cost of the ray tracing operation system 100 may be reduced.

FIG. 3 is a flowchart of a ray tracing calculation method according to an exemplary embodiment of the present invention.

The ray tracing calculation method includes an object structure build (S100), ray generation (S200), data traversal (S300), an intersection test (S400), and ray shading (S500).

The object structure build (S100) generates objects on the screen by using the information of the objects stored in the memory 40.

The object structure build S100 is performed in the ray tracing unit 10.

Ray generation (S200) generates rays in order to determine the color information of the pixels in the screen through ray tracing. The ray initially emitted from each pixel toward the object is called the eye ray (primary ray). When a ray meets an object, shadow, reflection, and refraction rays are generated, which are referred to as secondary rays.

The ray generation S200 is performed by the ray tracing unit 10.

Data traversal (S300) determines which polygon each generated ray intersects. To do so, the ray tracing unit 10 uses a scene graph in which the objects to be rendered are arranged according to their coordinates. The scene graph is a tree-type data structure, and each node represents a polygon to be rendered. The nodes of the tree are then searched along the direction of the generated rays.

The data traversal S300 is performed by the ray tracing unit 10.

The intersection test (S400) determines whether the ray intersects a polygon contained in the found node, and the information about the point where the ray and the polygon intersect is calculated and collected.

The intersection test S400 is performed in the ray tracing unit 10.

Ray shading (S500) uses a shader to calculate the color of the intersection point (lighting calculation), calculate the texture of the intersection point (texture mapping), and perform fog, color blending, and anti-aliasing calculations to generate the color information of the final pixel. The color information of the generated final pixel is stored in the frame buffer 44.

Ray shading S500 is performed in the graphic processing unit 30.

FIG. 4 is a hardware configuration diagram of a ray tracing calculation system 100 reflecting the ray tracing calculation method according to an embodiment of the present invention.

The ray tracing calculation system 100 may include a ray tracing unit (RTU) 10, an intersection point data buffer 20, a graphic processing unit (GPU) 30, and a memory (DRAM) 40. The ray tracing operation system 100 may also include software (SW) 50 for driving the above-described components.

As described above, the ray tracing unit (RTU) 10 performs a ray tracing operation.

In detail, the ray tracing unit 10 may perform the object structure build (S100), the ray generation (S200), the data traversal (S300), the intersection test (S400), and the ray shading (S500). Each step is described above with reference to FIGS. 2 and 3.

The ray tracing unit 10 generates intersected point data and stores it in the data buffer 20. Intersected Point Data includes coordinates, normals, colors, texture coordinates, and the like of the intersection points.

The intersection point data buffer 20 stores the intersection point data generated by the ray tracing unit 10.

As described above, the intersected point data may include intersected polygon information, the address of the polygon in the geometry, the texture ID of the intersected polygon, fog information, and the like.

As illustrated, the intersection data buffer 20 may be implemented as a buffer located separately from the ray tracing unit 10 and the graphic processing unit 30. In addition, the intersection data buffer 20 may be implemented to be located in the ray tracing unit 10, the graphics processing unit 30, or the memory 40.

The graphic processing unit (GPU) 30 is configured to perform the above-described ray shading S500.

The data storage 40 includes an object structure memory 41, a geometry object memory 42, a texture memory 43, and a frame buffer 44.

The software (Software, SW) 50 generates a shader program for the graphics processing unit 30 to perform ray shading S500 and transmits the shader program to the graphics processing unit 30.

FIG. 5 is a software configuration diagram 200 of a ray tracing calculation system 100 according to an embodiment of the present invention.

In the ray tracing operation system 100, the above-described software (SW) 50 may include a ray tracing API 50.

The ray tracing API 50 generates a shader program that performs the lighting calculation, texture mapping, fog, alpha-blending, anti-aliasing calculation, and the like for each pixel, using the data in the intersection data buffer 20 as parameters, and the generated shader program is input to the graphic processing unit 30.

The graphic processing unit 30 then calculates the color of the intersection point using the shader program (lighting calculation), calculates the texture value of the intersection point (texture mapping), and performs fog, alpha-blending, and anti-aliasing calculations to generate the color information of the final pixel. The graphic processing unit 30 stores the color information of the generated final pixel in the frame buffer 44.

In addition, the ray tracing API 50 stores the intersected point data calculated by the ray tracing unit 10 in the intersection data buffer 20, and the graphic processing unit 30, running the shader program, may store the color information of the generated final pixel in the frame buffer 44.
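
A minimal sketch of the flow coordinated by this software layer: every record in the intersection data buffer is handed to the shading stage and the resulting color is written into the frame buffer. runShader() is a hypothetical stand-in for launching the generated shader program on the GPU; the record and buffer types are illustrative assumptions.

```cpp
#include <cstddef>
#include <vector>

struct Color { float r, g, b; };
struct IntersectionRecord { int pixelX, pixelY; /* plus the fields listed above */ };

// Stand-in for the GPU executing the generated shader program on one record.
Color runShader(const IntersectionRecord& rec) {
    (void)rec;
    return {1.0f, 1.0f, 1.0f};
}

void shadeAndResolve(const std::vector<IntersectionRecord>& intersectionBuffer,
                     std::vector<Color>& frameBuffer, int width) {
    for (const IntersectionRecord& rec : intersectionBuffer) {
        // The final pixel color produced by shading goes to the frame buffer.
        frameBuffer[static_cast<std::size_t>(rec.pixelY) * width + rec.pixelX] =
            runShader(rec);
    }
}
```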

While the present invention has been described in connection with what is presently considered to be practical exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments and may, of course, be variously modified without departing from the spirit and scope of the invention.

Claims (10)

A ray tracing operation system configured to calculate ray tracing for generating a three-dimensional image in which an optical effect is reflected, the system comprising:
a ray tracing unit configured to perform ray generation, which generates rays from pixels in the screen toward an object; data traversal, which searches along the direction of the rays to determine which polygon the generated rays intersect; and an intersection test, which generates intersected point data including position information in the polygon that intersects the rays; and
a graphic processing unit configured to calculate the color and texture of the intersection point using the intersected point data and to generate color information of the pixels.
The ray tracing operation system of claim 1,
wherein the ray tracing unit
performs an object structure build for generating the object on the screen.
The ray tracing operation system of claim 1,
further comprising an intersection point data buffer including the intersected point data generated by the ray tracing unit.
The ray tracing operation system of claim 1,
wherein the intersected point data
includes at least one of coordinates, normals, colors, and texture information of the intersection point.
The ray tracing operation system of claim 1,
further comprising a memory containing information of the object.
The ray tracing operation system of claim 5,
wherein the memory
includes at least one of structure information, geometry information, and texture information of the object.
The ray tracing operation system of claim 5,
wherein the memory
includes color information of the pixel generated in the graphic processing unit.
The ray tracing operation system of claim 1,
wherein the graphic processing unit
performs a calculation for generating a two-dimensional image.
The ray tracing operation system of claim 1,
wherein the graphic processing unit
generates color information of the pixel by performing at least one of fog, alpha-blending, and anti-aliasing operations on the intersection point.
The ray tracing operation system of claim 9,
wherein the graphic processing unit
generates color information of the pixel using a shader program.
KR1020120079914A 2012-07-23 2012-07-23 Ray-tracing arithmetic operation method and system KR20140013292A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020120079914A KR20140013292A (en) 2012-07-23 2012-07-23 Ray-tracing arithmetic operation method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120079914A KR20140013292A (en) 2012-07-23 2012-07-23 Ray-tracing arithmetic operation method and system

Publications (1)

Publication Number Publication Date
KR20140013292A true KR20140013292A (en) 2014-02-05

Family

ID=50263749

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120079914A KR20140013292A (en) 2012-07-23 2012-07-23 Ray-tracing arithmetic operation method and system

Country Status (1)

Country Link
KR (1) KR20140013292A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180021817A (en) * 2015-06-26 2018-03-05 일렉트로닉 아트 아이엔씨. Simplify small mesh components with redundant backplanes
WO2017099557A3 (en) * 2015-12-09 2018-03-08 삼성전자 주식회사 Method and device for determining illumination of 3d virtual scene
US10510184B2 (en) 2015-12-09 2019-12-17 Samsung Electronics Co., Ltd. Method and device for determining illumination of 3D virtual scene

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application