CN107633549B - Real-time rendering method and device based on ambient illumination probe - Google Patents

Real-time rendering method and device based on ambient illumination probe

Info

Publication number
CN107633549B
CN107633549B (application CN201710950129.2A)
Authority
CN
China
Prior art keywords
probe
illumination
rendered
distance
rendered object
Prior art date
Legal status
Active
Application number
CN201710950129.2A
Other languages
Chinese (zh)
Other versions
CN107633549A (en)
Inventor
刘捷
郝展
陆利民
柳尧顺
Current Assignee
Suzhou Snail Digital Technology Co Ltd
Original Assignee
Suzhou Snail Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Snail Digital Technology Co Ltd filed Critical Suzhou Snail Digital Technology Co Ltd
Priority to CN201710950129.2A
Publication of CN107633549A
Application granted
Publication of CN107633549B
Legal status: Active


Landscapes

  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a real-time rendering method based on ambient light probes. A set of ambient light probes each acquires illumination information; occlusion-influence correction is then applied to each piece of illumination information; finally, the corrected illumination information is used in the rendering calculation of the rendered object. The occlusion-influence correction is as follows: for each ambient light probe, connect the probe by straight line segments to a number of preset position points on the rendered object; assign each segment a value according to whether it intersects any other object (0 if it intersects, 1 otherwise); the ratio of the sum of the segment values to the total number of segments is the occlusion-influence correction coefficient, and the product of this coefficient and the illumination information acquired by the probe is that probe's occlusion-corrected illumination information. The invention also discloses a real-time rendering device based on ambient light probes. The invention makes the rendering effect approach the real situation.

Description

Real-time rendering method and device based on ambient illumination probe
Technical Field
The invention relates to an image rendering method, in particular to a real-time rendering method based on an ambient light probe, and belongs to the technical field of computer image processing.
Background
An ambient light probe (light probe) is a lighting rendering technique. Its basic principle is, as the name implies, to place a series of "sampling points" (the ambient light probes) in the scene, each recording the lighting intensity received from all surrounding directions, and to apply the collected lighting information to the model to be rendered. Each probe records the illumination information of its surroundings, and during real-time rendering this recorded information influences the final lighting of the rendered object. Unlike global real-time illumination, the technique does not consume excessive resources, so rendered objects blend with static objects and static scenes in real time. For the basic principles of how an ambient light probe generates lighting information and applies it to rendered objects, see Chapter 10, "Real-Time Computation of Dynamic Irradiance Environment Maps", in GPU Gems 2, and the other references attached later in this document.
However, existing ambient-light-probe implementations do not consider whether the probe and the rendered object are occluded from each other. For example, suppose a red light source and the rendered object are separated by a wall: the rendered object is on the right side of the wall, the red light source is on the left, and a probe is also on the left. The illumination data recorded by the probe includes the light of the red source, so the rendered object is affected by it; in the real world, however, the wall blocks the red light, which should never reach the rendered object.
It is therefore necessary to improve the existing ambient-light-probe real-time rendering technique to fully account for occlusion, so that probe-based rendering results are more realistic and credible while still meeting the performance requirements of real-time rendering.
Disclosure of Invention
The technical problem solved by the invention is to overcome the defects of the prior art and provide a real-time rendering method based on ambient light probes, in which the illumination information of each probe is corrected according to the occlusion situation, adjusting the degree to which the probe influences the lighting of the rendered object so that the final rendering effect is closer to the real environment.
The invention specifically adopts the following technical scheme to solve the technical problems:
a real-time rendering method based on an environment illumination probe comprises the steps of utilizing a group of environment illumination probes to respectively acquire a group of illumination information of a rendered object, wherein the group of illumination information corresponds to the environment illumination probes one to one, then respectively correcting shading influence of the group of illumination information, and finally utilizing the corrected illumination information to conduct rendering calculation on the rendered object; the occlusion impact correction is specifically as follows: for each environment illumination probe, respectively connecting the environment illumination probe with a plurality of preset different position points on the rendered object by straight line segments; assigning each line segment according to whether the line segment intersects with other objects: intersecting with other objects, and assigning a value of 0 to the straight line segment, or assigning a value of 1 to the straight line segment; and taking the ratio of the sum of the assigned values of the straight-line segments of the environment illumination probe to the total number of the straight-line segments as a shielding influence correction coefficient of the environment illumination probe, wherein the product of the shielding influence correction coefficient and the illumination information acquired by the environment illumination probe is the illumination information after the shielding influence of the environment illumination probe is corrected.
Further, the method also performs distance-influence correction on the occlusion-corrected illumination information, specifically as follows: let the average distance between the ambient light probes and the rendered object be D0, and let the distance between the i-th probe and the rendered object be Di; take 2·D0/(D0 + Di) as the distance-influence correction coefficient and multiply it by the occlusion-corrected illumination information of the i-th probe. Alternatively, let the maximum distance between the ambient light probes and the rendered object be Dmax and the distance between the i-th probe and the rendered object be Di; take (Dmax - Di)/Dmax as the distance-influence correction coefficient and multiply it by the occlusion-corrected illumination information of the i-th probe.
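Both distance-correction modes are simple formulas over scalar probe-to-object distances; a minimal Python sketch (an illustration, not part of the patent):

```python
def distance_coeff_avg(d_i, all_distances):
    """Mode 1: H_i = 2*D0 / (D0 + D_i), with D0 the average probe distance."""
    d0 = sum(all_distances) / len(all_distances)
    return 2.0 * d0 / (d0 + d_i)

def distance_coeff_max(d_i, all_distances):
    """Mode 2: H_i = (Dmax - D_i) / Dmax, with Dmax the maximum probe distance."""
    dmax = max(all_distances)
    return (dmax - d_i) / dmax
```

In the first mode, a probe closer than the average distance gets a coefficient above 1 and a farther probe below 1; in the second mode, the farthest probe's coefficient is exactly 0.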
Preferably, the plurality of preset different position points on the rendered object comprise a plurality of different position points on the outline of the rendered object.
Preferably, the plurality of different position points on the contour line of the rendered object substantially equally divide a circle centered at the center point of the rendered object.
Preferably, the plurality of preset different location points on the rendered object further includes a center point of the rendered object.
Preferably, the plurality of preset different position points on the rendered object include a center point of each block of the rendered object after the rendered object is split into the plurality of blocks.
Preferably, in the rendering calculation, the illumination information E of the object to be rendered is calculated by using the following formula:
E = (E′1 + E′2 + … + E′M) / (W1 + W2 + … + WM)

wherein M is the total number of ambient light probes, Wi is the total correction coefficient of the i-th ambient light probe, and E′i is the corrected illumination information of the i-th ambient light probe.
According to the same inventive concept, the following technical scheme can be obtained:
an apparatus for real-time rendering based on an ambient light probe, the apparatus comprising:
a set of ambient light probes, used to acquire a set of illumination information of the rendered object in one-to-one correspondence with the probes;
the occlusion-influence correction module, used to apply occlusion-influence correction to the set of illumination information acquired by the ambient light probes, as follows: for each ambient light probe, connect the probe by straight line segments to a plurality of preset position points on the rendered object; assign each segment a value according to whether it intersects any other object: 0 if it intersects another object, 1 otherwise; take the ratio of the sum of the segment values to the total number of segments as the occlusion-influence correction coefficient of the probe, the product of which with the illumination information acquired by the probe is the probe's occlusion-corrected illumination information;
and the rendering calculation module is used for performing rendering calculation on the object to be rendered by using the corrected illumination information.
Further, the apparatus further comprises:
the distance influence correction module is used for performing distance influence correction on the illumination information after the shading influence correction according to the following method: let the average distance between the ambient light probe and the rendered object be D0The distance between the ith ambient light probe and the rendered object is DiIn 2D0/(D0+Di) As the distance-influence correction coefficient, multiplying the distance-influence correction coefficient by the second factorShielding the i ambient illumination probes to influence the corrected illumination information; or, the maximum distance between the environment illumination probe and the rendered object is set as DmaxThe distance between the ith ambient light probe and the rendered object is DiThen, with (D)max-Di)/DmaxAnd as the distance influence correction coefficient, multiplying the distance influence correction coefficient by the illumination information after the shading influence correction of the ith environment illumination probe.
Preferably, the plurality of preset different position points on the rendered object comprise a plurality of different position points on the outline of the rendered object.
Preferably, the plurality of different position points on the contour line of the rendered object substantially equally divide a circle centered at the center point of the rendered object.
Preferably, the plurality of preset different location points on the rendered object further includes a center point of the rendered object.
Preferably, the plurality of preset different position points on the rendered object include a center point of each block of the rendered object after the rendered object is split into the plurality of blocks.
Preferably, the rendering calculation module calculates the illumination information E of the object to be rendered by using the following formula:
Figure BDA0001432672480000031
wherein M is the total number of ambient light probes;
Figure BDA0001432672480000032
Withe total correction factor for the ith ambient light probe; and E' is the corrected illumination information of the ith environment illumination probe.
Compared with the prior art, the technical scheme and the further improved technical scheme of the invention have the following beneficial effects:
the method fully considers the influence degree of the illumination information acquired by the environment illumination probe on the illumination condition of the rendered object and the relation of the shielding condition between the environment illumination probe and the rendered object, accurately detects the shielding condition between the environment illumination probe and the rendered object by using a simple and convenient algorithm, and corrects the original illumination information acquired by the environment illumination probe according to the detection result, so that the final rendering effect approaches to the actual condition to a greater extent.
The invention further corrects the distance influence of the illumination information according to the distance between the environment illumination probe and the object to be rendered, improves the influence degree of the illumination information acquired by the environment illumination probe with a short distance on the rendering calculation result, and reduces the influence degree of the illumination information acquired by the environment illumination probe with a long distance on the rendering calculation result, thereby further improving the reality of the rendering effect.
The method has the advantages of simple algorithm, easy realization, low requirements on software and hardware of the system, almost no extra resource consumption and capability of fully meeting the real-time requirement of rendering.
Detailed Description
The existing ambient-light-probe real-time rendering technique does not consider the occlusion between the probe and the rendered object. The invention addresses this by detecting that occlusion with a simple method and correcting the probe's original illumination information according to the detection result, so that the final rendering effect approaches the actual situation as closely as possible.
Specifically, a set of ambient light probes is used to acquire a set of illumination information of the rendered object in one-to-one correspondence with the probes; occlusion-influence correction is then applied to each piece of illumination information; finally, the corrected illumination information is used for the rendering calculation of the rendered object. The occlusion-influence correction is as follows: for each ambient light probe, connect the probe by straight line segments to a plurality of preset position points on the rendered object; assign each segment a value according to whether it intersects any other object: 0 if it intersects another object, 1 otherwise; take the ratio of the sum of the segment values to the total number of segments as the occlusion-influence correction coefficient of the probe; the product of this coefficient and the illumination information acquired by the probe is the probe's occlusion-corrected illumination information.
The preset position points on the rendered object can be chosen flexibly according to the actual situation: on the one hand, all occluders should be detected as far as possible; on the other hand, as few points as possible should be selected to avoid consuming too many computing resources. For example, several position points may be selected on the contour line of the rendered object, preferably points that substantially equally divide a circle centered at the object's center point, so that they are roughly evenly distributed along the contour. The center point of the rendered object can be added on top of these. Alternatively, split the rendered object into several blocks and select the center point of each block; for example, a human figure may be split into six parts (trunk, head, and four limbs), taking the center point of each part as a preset position point.
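One concrete way to build such a point set is the center plus a ring of equally spaced contour points. A minimal Python sketch (an illustration, not part of the patent; laying the ring out in the x-y plane is a simplification of sampling the actual contour line):

```python
import math

def preset_points(center, radius, n):
    """Return n preset position points: the object's center point plus n-1
    points that equally divide a circle of the given radius around it.
    Points are 3-tuples; the ring lies in the x-y plane at the center's z.
    """
    cx, cy, cz = center
    ring = []
    for k in range(n - 1):
        a = 2.0 * math.pi * k / (n - 1)  # equal angular spacing
        ring.append((cx + radius * math.cos(a), cy + radius * math.sin(a), cz))
    return [center] + ring
```

Keeping n small (for example 8 ring points plus the center) limits the number of intersection tests per probe while still catching most occluders.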
To further improve the realism of the rendering effect, distance-influence correction is applied to the illumination information on top of the occlusion-influence correction, according to the distance between each ambient light probe and the rendered object. This increases the influence of nearby probes on the rendering result and reduces that of distant probes. The specific form of the distance correction can be constructed flexibly; the invention gives two preferred modes:
the first way, let D be the average distance between the ambient light probe and the object to be rendered0The distance between the ith ambient light probe and the rendered object is DiIn 2D0/(D0+Di) And as the distance influence correction coefficient, multiplying the distance influence correction coefficient by the illumination information after the shading influence correction of the ith environment illumination probe to obtain the illumination information of the environment illumination probe after the distance influence correction.
In the second mode, let the maximum distance between the ambient light probes and the rendered object be Dmax and the distance between the i-th probe and the rendered object be Di. Take (Dmax - Di)/Dmax as the distance-influence correction coefficient and multiply it by the occlusion-corrected illumination information of the i-th probe to obtain that probe's distance-corrected illumination information.
To aid understanding, the real-time rendering method of the invention is described in further detail below through a specific embodiment.
The real-time rendering device in this embodiment includes:
a set of ambient light probes, used to acquire a set of illumination information of the rendered object in one-to-one correspondence with the probes;
the shielding influence correction module is used for respectively correcting the shielding influence of the group of illumination information acquired by the environmental illumination probe;
the distance influence correction module is used for carrying out distance influence correction on the illumination information subjected to shading influence correction;
and the rendering calculation module is used for performing rendering calculation on the object to be rendered by using the corrected illumination information.
The real-time rendering process of the device specifically comprises the following steps:
step S1, the ambient illumination probe acquires ambient illumination information, and when the rendered object is within the influence range of the ambient illumination probe, the occlusion judgment between the rendered object and the probe is triggered by the occlusion influence correction module;
the ambient light probe in this embodiment is provided by the prior art, and for brevity, will not be described herein again. Step S2, for each environment illumination probe, the shading influence correction module connects the environment illumination probe with a plurality of preset different position points on the rendered object by straight line segments respectively;
suppose that n different location points are preset on the rendered object, including the central point of the rendered object and the center point of the rendered objectN-1 position points on the contour line of the rendered object, which are basically equally divided by the circumference with the center point as the center of a circle, are generated in total, and are respectively marked as L1,L2,L3,…,Ln-1,Ln
Step S3, the shading influence correction module assigns a value to each straight line segment according to whether the straight line segment intersects with other objects;
if any object intersects with the straight line segment, the fact that the space between the ambient light probe and the rendered object is blocked in the direction of the straight line segment is represented, and the straight line segment is assigned to be 0; conversely, if the straight line segment does not intersect any object, it indicates that the ambient light probe and the rendered object are not occluded in the direction of the straight line segment, and is assigned a value of 1. Each straight line segment has only two possibilities, occluded and unoccluded, and therefore has a value of either 0 or 1.
Step S4, the shading influence correction module calculates the shading influence correction coefficient of each environment illumination probe;
for the ith of these ambient illumination probes, assume that the raw illumination information it acquires is represented as E by a vectoriThe ratio of the sum of the assignments of the n straight line segments to n is the shading influence correction coefficient of the environmental illumination probe and is recorded as Bi. As can be seen, BiHas a value range of [0, 1 ]]When B is presentiWhen the value of (1) is 0, the environment illumination probe and the object to be rendered are completely shielded; when B is presentiWhen the value of (1) is 1, the environment illumination probe and the object to be rendered are completely free from shielding; when B is presentiWhen the value of (A) is between 0 and 1, it indicates that there is partial occlusion between the ambient light probe and the object to be rendered, BiThe smaller the value of (A), the more severe the occlusion degree is.
Step S5, the shading influence correction module calculates the lighting information of each environment lighting probe after shading influence correction;
for the ith environment illumination probe, the illumination information after the shading influence correction is the product of the shading influence correction coefficient and the original illumination information acquired by the environment illumination probe, namely E'i=Ei×Bi. Thus, the device is provided withThe illumination information corrected by the shielding influence of each ambient illumination probe can be obtained.
Step S6, the distance influence correction module calculates the distance influence correction coefficient of each environment illumination probe and corrects the distance influence;
let the average distance between the ambient light probe and the rendered object be D0The distance between the ith ambient light probe and the rendered object is DiIn 2D0/(D0+Di) As a distance-influence correction coefficient Hi(ii) a Or, the maximum distance between the environment illumination probe and the rendered object is set as DmaxThe distance between the ith ambient light probe and the rendered object is DiThen, with (D)max-Di)/DmaxAs a distance-influence correction coefficient Hi
After the distance-influence correction coefficient of each ambient light probe is obtained, multiply it by that probe's occlusion-corrected illumination information to obtain the distance-corrected illumination information; that is, the illumination information vector of the i-th probe after both occlusion-influence and distance-influence correction is:
E″i = E′i × Hi = Ei × Bi × Hi
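Steps S4 to S6 combine multiplicatively per probe. A minimal Python sketch treating the illumination information as an RGB tuple (the storage format here is an assumption for illustration; the patent also allows light maps or spherical harmonics):

```python
def fully_corrected(E_i, B_i, H_i):
    """E''_i = E_i * B_i * H_i, applied componentwise to the probe's
    illumination vector E_i, with occlusion coefficient B_i and distance
    coefficient H_i both scalars."""
    w = B_i * H_i
    return tuple(c * w for c in E_i)
```

Because the two corrections are independent scalar multiplications, B_i and H_i can be computed in either order, or in parallel, before this step.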
Step S7, the rendering calculation module uses the corrected illumination information to perform rendering calculation of the object to be rendered;
the rendering calculation of the rendered object by utilizing the illumination information of the environment illumination probe is the existing mature technology, and the final illumination result of the rendered object is calculated by utilizing the existing standard rendering calculation method based on the illumination texture or the spherical harmonic function according to the illumination information storage mode used by the environment illumination probe. Various existing improved illumination probe rendering technologies can also be utilized, for example, "a rendering method for dynamically calculating indirect reflection highlight based on optical probe interpolation" disclosed in chinese patent application CN106210741A (published: 2016/12/7) ".
When performing rendering calculation, the illumination information E of the object to be rendered can be calculated by using the following simple weighting formula:
E = (E″1 + E″2 + … + E″M) / (W1 + W2 + … + WM)

wherein M is the total number of ambient light probes and Wi is the total correction coefficient of the i-th ambient light probe; in this embodiment, since both occlusion-influence correction and distance-influence correction are performed, Wi = Bi × Hi.
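The weighted formula above normalizes by the sum of the total correction coefficients, so a heavily occluded or distant probe contributes proportionally less. A minimal Python sketch with RGB tuples (the zero-weight guard for the fully occluded case is an added safeguard, not stated in the patent):

```python
def blend(raw_E, B, H):
    """Final illumination E = sum_i(E''_i) / sum_i(W_i), with W_i = B_i * H_i
    and E''_i = W_i * E_i: a weighted average of the probes' original
    illumination vectors raw_E under occlusion (B) and distance (H) weights."""
    weights = [b * h for b, h in zip(B, H)]
    total = sum(weights)
    if total == 0.0:
        return tuple(0.0 for _ in raw_E[0])  # every probe fully occluded
    return tuple(
        sum(w * e[c] for w, e in zip(weights, raw_E)) / total
        for c in range(len(raw_E[0]))
    )
```

With two probes where the second is half occluded, the result leans toward the unoccluded probe's color in a 2:1 ratio.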
In the above embodiment, both the occlusion-influence correction and the distance-influence correction simply multiply the original illumination information by the corresponding correction coefficient. The computation of the two coefficients therefore has no ordering dependency: they can be computed one after the other, or in parallel to improve overall rendering efficiency.

Claims (14)

1. A real-time rendering method based on ambient light probes, characterized in that: a set of ambient light probes is used to acquire a set of illumination information of the rendered object in one-to-one correspondence with the probes; occlusion-influence correction is then applied to each piece of illumination information; finally, the corrected illumination information is used for the rendering calculation of the rendered object; the occlusion-influence correction is specifically as follows: for each ambient light probe, connect the probe by straight line segments to a plurality of preset position points on the rendered object; assign each segment a value according to whether it intersects any other object: 0 if it intersects another object, 1 otherwise; take the ratio of the sum of the segment values to the total number of segments as the occlusion-influence correction coefficient of the probe, the product of which with the illumination information acquired by the probe is the probe's occlusion-corrected illumination information.
2. The method of claim 1, further comprising performing distance-influence correction on the occlusion-corrected illumination information, specifically as follows: let the average distance between the ambient light probes and the rendered object be D0 and the distance between the i-th probe and the rendered object be Di; take 2·D0/(D0 + Di) as the distance-influence correction coefficient and multiply it by the occlusion-corrected illumination information of the i-th probe; or, let the maximum distance between the ambient light probes and the rendered object be Dmax and the distance between the i-th probe and the rendered object be Di; take (Dmax - Di)/Dmax as the distance-influence correction coefficient and multiply it by the occlusion-corrected illumination information of the i-th probe.
3. The method of claim 1, wherein the plurality of preset different position points on the rendered object comprises a plurality of different points located on a contour line of the rendered object.
4. The method of claim 3, wherein the points located on the contour line of the rendered object substantially equally divide a circle centered at the center point of the rendered object.
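One way to produce the equally dividing contour points of claim 4, sketched in 2-D (a 3-D scene would sample a sphere or the object's actual silhouette instead; the radius and point count are free parameters, not values from the patent):

```python
import math

def circle_sample_points(center, radius, n):
    """n points spaced at equal angles on a circle around the object's center."""
    cx, cy = center
    return [(cx + radius * math.cos(2.0 * math.pi * k / n),
             cy + radius * math.sin(2.0 * math.pi * k / n))
            for k in range(n)]
```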
5. The method of claim 3 or 4, wherein the plurality of preset different position points on the rendered object further comprises the center point of the rendered object.
6. The method of claim 1, wherein the plurality of preset different position points on the rendered object comprises the center point of each of a plurality of segments into which the rendered object is divided.
7. The method of claim 1 or 2, wherein in the rendering calculation the illumination information E of the rendered object is calculated using the following formula:

E = (Σ_{i=1}^{M} E'_i) / (Σ_{i=1}^{M} W_i)

wherein M is the total number of ambient illumination probes;

E'_i = W_i · E_i

W_i is the total correction factor of the i-th ambient illumination probe, E_i is the illumination information acquired by the i-th probe, and E'_i is the corrected illumination information of the i-th probe.
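A hedged sketch of the combination step in claim 7, assuming the probes' corrected illumination values are blended as a weighted average normalized by the sum of the total correction factors; this reading is an assumption, since the original claim presents the formula as equation images:

```python
def combined_illumination(raw_illums, correction_factors):
    """Blend per-probe illumination E_i with total correction factors W_i:
    E = sum(W_i * E_i) / sum(W_i)  (assumed form of claim 7's formula)."""
    corrected = [w * e for w, e in zip(correction_factors, raw_illums)]
    return sum(corrected) / sum(correction_factors)
```

A probe with correction factor 0 (fully occluded) contributes nothing, while the normalization keeps the result on the same scale as the individual probe values.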
8. A real-time rendering device based on an ambient light probe, the device comprising:
a group of ambient illumination probes, configured to acquire a group of illumination information for a rendered object, one item of illumination information per probe;
an occlusion-influence correction module, configured to apply occlusion-influence correction to each item of illumination information acquired by the ambient illumination probes, as follows: for each probe, connect the probe by straight line segments to a plurality of preset, mutually different position points on the rendered object; assign each segment a value according to whether it intersects any other object: 0 if it intersects another object, 1 otherwise; take the ratio of the sum of the segment values to the total number of segments as the occlusion-influence correction coefficient of that probe; the product of this coefficient and the illumination information acquired by the probe is that probe's occlusion-corrected illumination information;
and a rendering calculation module, configured to render the rendered object using the corrected illumination information.
9. The apparatus of claim 8, further comprising:
a distance-influence correction module, configured to apply distance-influence correction to the occlusion-corrected illumination information, as follows: let the average distance between the ambient illumination probes and the rendered object be D_0 and the distance between the i-th probe and the rendered object be D_i; use 2·D_0/(D_0 + D_i) as the distance-influence correction coefficient and multiply it by the occlusion-corrected illumination information of the i-th probe. Alternatively, let the maximum distance between the ambient illumination probes and the rendered object be D_max and the distance between the i-th probe and the rendered object be D_i; use (D_max - D_i)/D_max as the distance-influence correction coefficient and multiply it by the occlusion-corrected illumination information of the i-th probe.
10. The apparatus of claim 8, wherein the plurality of preset different position points on the rendered object comprises a plurality of different points located on a contour line of the rendered object.
11. The apparatus of claim 10, wherein the points located on the contour line of the rendered object substantially equally divide a circle centered at the center point of the rendered object.
12. The apparatus of claim 10 or 11, wherein the plurality of preset different position points on the rendered object further comprises the center point of the rendered object.
13. The apparatus of claim 8, wherein the plurality of preset different position points on the rendered object comprises the center point of each of a plurality of segments into which the rendered object is divided.
14. The apparatus of claim 8 or 9, wherein the rendering calculation module calculates the illumination information E of the rendered object using the following formula:

E = (Σ_{i=1}^{M} E'_i) / (Σ_{i=1}^{M} W_i)

wherein M is the total number of ambient illumination probes;

E'_i = W_i · E_i

W_i is the total correction factor of the i-th ambient illumination probe, E_i is the illumination information acquired by the i-th probe, and E'_i is the corrected illumination information of the i-th probe.
CN201710950129.2A 2017-10-13 2017-10-13 Real-time rendering method and device based on ambient illumination probe Active CN107633549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710950129.2A CN107633549B (en) 2017-10-13 2017-10-13 Real-time rendering method and device based on ambient illumination probe


Publications (2)

Publication Number Publication Date
CN107633549A CN107633549A (en) 2018-01-26
CN107633549B true CN107633549B (en) 2021-02-09

Family

ID=61105470


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110193193B (en) * 2019-06-10 2022-10-04 网易(杭州)网络有限公司 Rendering method and device of game scene
CN110557092B (en) * 2019-09-06 2021-05-25 中国计量科学研究院 Irradiance compensation method for photoelectric performance test of solar cell
CN112712582B (en) * 2021-01-19 2024-03-05 广州虎牙信息科技有限公司 Dynamic global illumination method, electronic device and computer readable storage medium
CN112755535B (en) * 2021-02-05 2022-07-26 腾讯科技(深圳)有限公司 Illumination rendering method and device, storage medium and computer equipment

Citations (4)

Publication number Priority date Publication date Assignee Title
CN101606181B (en) * 2006-07-24 2012-05-30 迈克尔·邦内尔 System and methods for real-time rendering of deformable geometry with global illumination
CN103995700A (en) * 2014-05-14 2014-08-20 无锡梵天信息技术股份有限公司 Method for achieving global illumination of 3D game engine
CN106204701A (en) * 2016-06-22 2016-12-07 浙江大学 A kind of rendering intent based on light probe interpolation dynamic calculation indirect reference Gao Guang
CN107093204A (en) * 2017-04-14 2017-08-25 苏州蜗牛数字科技股份有限公司 It is a kind of that the method for virtual objects effect of shadow is influenceed based on panorama

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN203178434U (en) * 2013-03-06 2013-09-04 深圳市大族激光科技股份有限公司 Probe testing device
US11064891B2 (en) * 2014-09-05 2021-07-20 Canon Kabushiki Kaisha Object information acquiring apparatus
JP6704760B2 (en) * 2016-03-14 2020-06-03 株式会社東芝 Ultrasonic diagnostic device and biopsy device
CN106452362B (en) * 2016-11-11 2018-09-21 苏州阿特斯阳光电力科技有限公司 A kind of QE test devices and test method for solar cell


Non-Patent Citations (1)

Title
基于Unity3D与VR头盔的虚拟现实体感游戏开发 [Development of virtual-reality motion-sensing games based on Unity3D and a VR headset]; Zhang Yang et al.; Software Guide (《软件导刊》); 2017-08-31; Vol. 16, No. 8; pp. 119-122 *


Similar Documents

Publication Publication Date Title
CN107633549B (en) Real-time rendering method and device based on ambient illumination probe
CN104598915B (en) A kind of gesture identification method and device
CN108205803B (en) Image processing method, and training method and device of neural network model
CN110574371A (en) Stereo camera depth determination using hardware accelerators
CN113610889B (en) Human body three-dimensional model acquisition method and device, intelligent terminal and storage medium
US20230419610A1 (en) Image rendering method, electronic device, and storage medium
EP0919042A1 (en) Integration of monocular cues to improve depth perception
EP1229499A2 (en) System and method for creating real-time shadows of transparent objects
CN108564551A (en) Fish eye images processing method and fish eye images processing unit
CN109087325A (en) A kind of direct method point cloud three-dimensional reconstruction and scale based on monocular vision determines method
CN111382618B (en) Illumination detection method, device, equipment and storage medium for face image
CN114638950A (en) Method and equipment for drawing virtual object shadow
CN110935171A (en) Method for loading, optimizing and unitizing live-action three-dimensional model in game engine
CN109461197B (en) Cloud real-time drawing optimization method based on spherical UV and re-projection
DE102021104310A1 (en) RESERVOIR-BASED SPATIO-TIME RESAMPLING BY IMPORTANCE USING A GLOBAL LIGHTING DATA STRUCTURE
CN113538704A (en) Method and equipment for drawing virtual object shadow based on light source position
US20230351555A1 (en) Using intrinsic functions for shadow denoising in ray tracing applications
CN110838167B (en) Model rendering method, device and storage medium
CN109509246B (en) Photon map clustering method based on self-adaptive sight division
CN116645291A (en) Self-adaptive Gamma correction glare suppression method based on regional brightness perception
CN111870953A (en) Height map generation method, device, equipment and storage medium
CN109934777B (en) Image local invariant feature extraction method, device, computer equipment and storage medium
CN116310060A (en) Method, device, equipment and storage medium for rendering data
CN114565537B (en) Infrared imaging device based on local information entropy
US11308684B2 (en) Ray-tracing for auto exposure

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant