CN115359172A - Rendering method and related device - Google Patents

Rendering method and related device

Info

Publication number
CN115359172A
CN115359172A (application CN202211004101.7A)
Authority
CN
China
Prior art keywords
light source
scene
target
determining
rendered
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211004101.7A
Other languages
Chinese (zh)
Inventor
孙翌峰
李旻昊
邹良辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.) 2022-08-22
Filing date 2022-08-22
Publication date 2022-11-18
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211004101.7A
Publication of CN115359172A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/06 Ray-tracing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The application provides a rendering method and a related device. The method includes: determining, from among a plurality of light sources in a scene picture of a scene to be rendered, a target light source whose distance from a reference position is smaller than a preset distance, where the reference position is the position, in the scene picture, of the main camera viewport representing a scene camera, the scene camera is a control for a plurality of camera viewports in an editing interface of the scene to be rendered, and the main camera viewport is the camera viewport, among the plurality of camera viewports, that simulates the main viewpoint from which a user observes the scene picture; determining a target object illuminated by the target light source; performing ray tracing calculation on the target object to obtain a ray tracing calculation result; and creating a target image of the scene to be rendered according to the ray tracing calculation result. In this way, hardware pressure and power consumption can be reduced, a better rendered picture can be obtained, and user experience is improved.

Description

Rendering method and related device
Technical Field
The application belongs to the field of image rendering, and particularly relates to a rendering method and a related device.
Background
Image rendering is the process of converting three-dimensional light energy transport into a two-dimensional image. Scenes and entities represented in three-dimensional form are closer to the real world and are convenient to manipulate and transform. However, because the computing power of current mobile terminals is limited, an image either cannot be rendered in fine detail or incurs increased hardware power consumption, resulting in a poor user experience.
Disclosure of Invention
The embodiment of the application provides a rendering method and a related device, so that on the basis of ensuring a rendering effect, hardware pressure and power consumption are reduced, and user experience is improved.
In a first aspect, an embodiment of the present application provides a rendering method, where the method includes:
determining, from among a plurality of light sources in a scene picture of a scene to be rendered, a target light source whose distance from a reference position is smaller than a preset distance, where the reference position is the position, in the scene picture of the scene to be rendered, of the main camera viewport representing a scene camera, the scene camera is a control for a plurality of camera viewports in an editing interface of the scene to be rendered, and the main camera viewport is the camera viewport, among the plurality of camera viewports, that simulates the main viewpoint from which a user observes the scene picture;
determining a target object illuminated by the target light source;
performing ray tracing calculation on the target object to obtain a ray tracing calculation result;
and creating a target image of the scene to be rendered according to the ray tracing calculation result.
In a second aspect, an embodiment of the present application provides a rendering apparatus, including:
a first determining unit, configured to determine, from among a plurality of light sources in a scene picture of a scene to be rendered, a target light source whose distance from a reference position is smaller than a preset distance, where the reference position is the position, in the scene picture of the scene to be rendered, of the main camera viewport representing a scene camera, the scene camera is a control for a plurality of camera viewports in an editing interface of the scene to be rendered, and the main camera viewport is the camera viewport, among the plurality of camera viewports, that simulates the main viewpoint from which a user observes the scene picture;
a second determination unit for determining a target object illuminated by the target light source;
the calculation unit is used for carrying out ray tracing calculation on the target object to obtain a ray tracing calculation result;
and the creating unit is used for creating a target image of the scene to be rendered according to the ray tracing calculation result.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing the steps in the first aspect of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a computer storage medium storing a computer program for electronic data exchange, where the computer program causes a computer to perform some or all of the steps described in the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiments of the present application, a target light source whose distance from a reference position is smaller than a preset distance is first determined among a plurality of light sources in a scene picture of a scene to be rendered, where the reference position is the position, in the scene picture, of the main camera viewport representing a scene camera, the scene camera is a control for a plurality of camera viewports in an editing interface of the scene to be rendered, and the main camera viewport is the camera viewport, among the plurality of camera viewports, that simulates the main viewpoint from which a user observes the scene picture; a target object illuminated by the target light source is then determined, ray tracing calculation is performed on the target object to obtain a ray tracing calculation result, and finally a target image of the scene to be rendered is created according to the ray tracing calculation result. In this way, the scene information of the light sources requiring ray tracing calculation can be determined selectively, which reduces the number of rays to be calculated, reduces hardware pressure and power consumption, yields a better rendered picture, and improves user experience.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1a is a schematic structural diagram of an image processing system according to an embodiment of the present application;
Fig. 1b is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2a is a schematic flowchart of a rendering method according to an embodiment of the present application;
FIG. 2b is a schematic architecture diagram provided by an embodiment of the present application;
FIG. 2c is a schematic view of an object occlusion provided by an embodiment of the present application;
fig. 3 is a block diagram of functional units of a rendering apparatus according to an embodiment of the present disclosure;
fig. 4 is a block diagram of functional units of another rendering apparatus according to an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein may be combined with other embodiments.
The following explains terms referred to in the present application.
A camera viewport: every scene needs a viewpoint when it is rendered, and the user observes the scene from that viewpoint position. 3D graphics systems typically use a scene camera to define the relative position and orientation of the user and the scene. Like cameras in the real world, cameras in the 3D world have properties such as the size of the field of view, which determines the perspective (for example, distant scenes appear smaller). The various attributes of the camera combine to project the final rendered 3D scene onto a 2D viewport, which is determined by a browser window or other element. The plane corresponding to the camera viewport is the rendered 2D image finally seen on the screen.
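For illustration, the following minimal Python sketch shows how a simple scene camera could map a 3D point to 2D viewport pixel coordinates. It is not taken from the patent; the fixed +z viewing direction, the 60-degree field of view, and all names are illustrative assumptions, and aspect ratio is ignored for brevity.

    import math

    def project_to_viewport(point, cam_pos, viewport_w, viewport_h, fov_deg=60.0):
        """Project a world-space point onto the 2D viewport of a simple camera."""
        # Move into camera space (camera assumed to look down +z with no rotation).
        x, y, z = (p - c for p, c in zip(point, cam_pos))
        if z <= 0:
            return None  # behind the camera: not visible in this viewport
        # The field of view sets the perspective scale: distant points shrink.
        f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
        ndc_x, ndc_y = f * x / z, f * y / z  # normalized device coordinates
        # Map [-1, 1] NDC onto viewport pixels (screen y grows downward).
        return ((ndc_x + 1.0) * 0.5 * viewport_w,
                (1.0 - (ndc_y + 1.0) * 0.5) * viewport_h)

    print(project_to_viewport((0.0, 1.0, 5.0), (0.0, 0.0, 0.0), 1920, 1080))
    # -> approximately (960.0, 352.9)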
At present, when a picture in a scene is rendered, ray tracing calculation is performed directly on all objects in the renderable area, which requires a large amount of calculation and places heavy hardware pressure and power consumption on the electronic device. In a game scene in particular, this rendering mode also causes the temperature of the electronic device to rise rapidly while the user is playing, resulting in a poor user experience.
In view of the foregoing problems, embodiments of the present application provide a rendering method and a related apparatus, and the following describes embodiments of the present application in detail with reference to the accompanying drawings.
Referring to fig. 1a, fig. 1a is a schematic structural diagram of an image processing system according to an embodiment of the present disclosure. As shown, the image processing system 10 includes a calculation engine 101 and an image processing engine 102. The image processing engine 102 may obtain the data to be calculated and send it to the calculation engine 101 for calculation; the calculation engine may include a ray tracing engine, that is, an engine configured to perform ray tracing calculation on the obtained data, and then send the calculation result back to the image processing engine 102, which performs image rendering according to the obtained calculation result. The components of the image processing system 10 may be located in the same electronic device or in different electronic devices.
Referring to fig. 1b, fig. 1b is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown, the electronic device 100 includes a processor 120, a memory 130, a communication interface 140, and one or more programs 131, wherein the one or more programs 131 are stored in the memory 130 and configured to be executed by the processor 120, and the one or more programs 131 include instructions for performing any of the steps of the method embodiments described below. In a specific implementation, the processor 120 is configured to perform any one of the steps performed by the electronic device in the method embodiments described below, and when performing data transmission such as sending, optionally invokes the communication interface 140 to complete the corresponding operation. For example, the processor 120 may be a game engine, and the rendering and corresponding calculations may be performed by the processor 120 while the electronic device is in a game scene, and the communication interface may be a graphics API.
The electronic device according to the embodiments of the present application may be an electronic device with communication capability, and may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem, and various forms of user equipment (UE), mobile stations (MS), terminal devices, and the like.
Referring to fig. 2a, fig. 2a is a schematic flowchart illustrating a rendering method according to an embodiment of the present disclosure. As shown in the figure, the rendering method includes the following steps:
s201, determining a target light source, of a plurality of light sources of a scene picture of a scene to be rendered, of which the distance from a reference position is smaller than a preset distance.
The reference position is the position, in the scene picture of the scene to be rendered, of the main camera viewport representing the scene camera; the scene camera is a control for a plurality of camera viewports in an editing interface of the scene to be rendered, and the main camera viewport is the camera viewport, among the plurality of camera viewports, that simulates the main viewpoint from which a user observes the scene picture. The main camera viewport is the window responsible for presenting the information of the main scene picture on the screen, and can simply be understood as the camera that presents the picture of objects in three-dimensional space. The content to be rendered may have multiple cameras, that is, it can be viewed from multiple viewpoints, so different camera viewports can be switched to through different scene cameras. The main camera viewport can be set and changed according to the user's needs; that is, when the user's viewing angle changes within the same scene, the main camera viewport changes accordingly. For example, as shown in fig. 2b, which is a schematic architecture diagram provided in this embodiment of the present application, the position of the camera in the figure is the reference position in this scheme, that is, the position of the main camera viewport. When determining the target light source, the relative position between each light source and the main camera viewport can be determined according to the coordinates of the main camera viewport and the coordinates of all light sources in the scene, so as to determine the target light source. There may be only one target light source or a plurality of them. The target light source may also be the light source closest to the reference position; for example, among the 3 light sources in fig. 2b, the calculation may identify the single light source closest to the reference position, in which case only one target light source is determined.
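As a minimal sketch of step S201 (not the patent's implementation), the following Python fragment assumes each light source is a plain (x, y, z) coordinate and that Euclidean distance is used; the function and variable names are illustrative.

    import math

    def select_target_light_sources(light_positions, reference_position, preset_distance):
        """Keep the lights whose distance to the reference position is below the preset."""
        def dist(p, q):
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
        return [p for p in light_positions if dist(p, reference_position) < preset_distance]

    lights = [(0, 0, 2), (10, 0, 0), (1, 1, 1)]
    camera = (0, 0, 0)  # reference position: coordinates of the main camera viewport
    print(select_target_light_sources(lights, camera, preset_distance=3.0))
    # -> [(0, 0, 2), (1, 1, 1)]

The closest-light variant described above would instead return min(light_positions, key=lambda p: dist(p, reference_position)).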
S202, determining the target object irradiated by the target light source.
The target object is an object that can be irradiated by the target light source, that is, the object on which ray tracing and similar calculations are subsequently performed. After the target light source is determined, the target light source and the scene information that it can irradiate need to be transmitted to the corresponding calculation engine for calculation so as to render the scene; this scene information includes the target object.
In one possible example, the determining the target object illuminated by the target light source comprises: determining the irradiation range of the target light source; and determining a target object from the scene picture according to the irradiation range.
The illumination range can be determined according to the current scene and the type of the target light source.
Therefore, determining the target object according to the irradiation range of the target light source allows the target object to be found quickly, which improves rendering efficiency.
In one possible example, the determining the illumination range of the target light source includes: determining a light source type of the target light source; and determining an illumination range according to the type of the light source.
The light source type may refer to the type of light the source emits, such as a point light source, parallel (directional) light, or a spotlight, and may also refer to the type of object emitting the light, such as a flashlight or a light bar. The irradiation range is located within the range that the main camera viewport can render; as shown in fig. 2b, the region enclosed by the solid and dashed lines is the renderable region corresponding to the main camera viewport, and the intersection of the irradiation range corresponding to a target light source and the renderable range is the final irradiation range of the target light source.
Therefore, in the example, the types of the light sources are distinguished to determine the illumination range, so that the object to be rendered can be determined more accurately, and the calculation amount and the power consumption of the equipment are reduced.
In one possible example, the determining the illumination range according to the light source type includes: determining a light source brightness of the target light source; determining environmental information where the target light source is located; and determining an illumination range according to the light source type, the light source brightness and the environment information.
The environment information indicates the state of the environment where the target light source is located, such as the weather conditions and the scene it is in. For example, the current weather of the scene to be rendered may be rainy or foggy, or the target light source may be underwater; different environment information can influence the range that the target light source is able to illuminate. Specifically, determining the irradiation range according to the light source type, the light source brightness, and the environment information includes: determining a theoretical irradiation range according to the light source type and the light source brightness; determining the environment category and degree of influence of the scene where the target light source is located according to the environment information; and adjusting the theoretical irradiation range according to the degree of influence to obtain the irradiation range. Different environment categories influence the irradiation range to different degrees; for example, the irradiation range on a heavy fog day differs from that on a sunny day. Of course, the degree of influence is also determined according to the level of the environment category; for example, on a foggy day, different fog concentrations in the environment correspond to different degrees of influence.
Therefore, in this example, information such as the environment of the scene to be rendered and the brightness of the target light source is fully considered when determining the irradiation range of the target light source, so that the object to be rendered can be determined more accurately, reducing the calculation load and power consumption of the device while improving the fineness of the rendered picture.
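A hedged sketch of this two-stage estimate follows, assuming an illustrative radius formula per light source type and an assumed influence table per environment category; none of these constants come from the patent.

    # Base illuminable radius per light source type (illustrative formulas only).
    THEORETICAL_RADIUS = {
        "point": lambda brightness: 2.0 * brightness ** 0.5,
        "spot": lambda brightness: 3.0 * brightness ** 0.5,
    }
    # Degree of influence per environment category (assumed values, not from the patent).
    ENV_INFLUENCE = {"clear": 1.0, "rain": 0.7, "underwater": 0.5, "heavy_fog": 0.3}

    def illumination_radius(light_type, brightness, environment):
        """Theoretical radius from type and brightness, then scaled by the environment."""
        theoretical = THEORETICAL_RADIUS[light_type](brightness)
        return theoretical * ENV_INFLUENCE.get(environment, 1.0)

    print(illumination_radius("point", brightness=100.0, environment="heavy_fog"))
    # -> 6.0 (a theoretical radius of 20.0 reduced to 30% by heavy fog)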
In one possible example, the light source type is a point light source, and the determining the illumination range according to the light source type, the light source brightness and the environment information includes: determining an illuminable radius according to the light source brightness and the environment information; drawing a sphere according to the illuminable radius by taking the point light source as a center to obtain a sphere space; and determining the range contained in the sphere space as the irradiation range.
A point light source is a light source that emits light uniformly from one point into the surrounding space. Therefore, for a point light source, a sphere is drawn with the point light source as the center and the illuminable radius as the radius; the objects inside the sphere are the objects on which ray tracing needs to be performed. Accordingly, the radius and the coordinates of the point light source can be obtained, the objects in the scene traversed, and the distance between each object and the point light source judged. Of course, the irradiation range also needs to lie within the renderable range.
Therefore, in the example, the irradiation range of the target light source is accurately determined according to the characteristics of the light source, so that the object to be rendered can be more accurately determined, and the calculation amount and the power consumption of the device are reduced.
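A minimal sketch of the point-light case, under the simplifying assumption that each object is represented by its center coordinate; in practice the resulting range would still be intersected with the renderable region of the main camera viewport.

    import math

    def objects_in_point_light_range(objects, light_pos, radius):
        """objects: mapping of object name -> (x, y, z) center coordinate."""
        def dist(p, q):
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
        # Keep the objects whose centers fall inside the sphere around the light.
        return [name for name, center in objects.items()
                if dist(center, light_pos) <= radius]

    scene = {"chair": (1, 0, 1), "lamp": (0, 2, 0), "tree": (8, 0, 8)}
    print(objects_in_point_light_range(scene, light_pos=(0, 1, 0), radius=3.0))
    # -> ['chair', 'lamp']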
In one possible example, the determining the target object from the scene picture according to the illumination range includes: determining all objects in the scene to be rendered, which are positioned in the illumination range, as candidate objects; determining object coordinates of each candidate object; determining light source coordinates of the target light source; determining whether a target candidate object is shielded by other candidate objects according to the object coordinates and the light source coordinates, wherein the other candidate objects are the candidate objects except the target candidate object in all the candidate objects, and the target candidate object is any one of the candidate objects; and determining that the candidate object which is not occluded in all the candidate objects is the target object.
Whether an object located within the irradiation range of the target light source is occluded by other objects can be determined according to the coordinates of each object; if an object is occluded, it does not require ray tracing calculation. It should be noted that "occluded" here means completely occluded, that is, the light of the target light source cannot reach the object at all.
Therefore, in the example, the shielded object is determined according to the coordinates of the object, ray tracing calculation is not performed on the shielded object, the calculation amount and power consumption of the equipment can be reduced, and the rendering effect can be ensured at the same time.
In one possible example, the determining whether the target candidate object is occluded by other candidate objects according to the object coordinates and the light source coordinates includes: obtaining a coordinate set corresponding to the target candidate object according to the object coordinates of the target candidate object and the light source coordinates, wherein the coordinate set comprises a plurality of surrounding coordinates, and the position indicated by each surrounding coordinate is located on the path from the light of the target light source to the target object; determining whether object coordinates of the other candidate objects exist in the set of coordinates; if yes, acquiring a first shape of the other candidate object and a second shape of the target candidate object; determining a degree of coincidence of the first shape and the second shape on the path; and determining whether the target candidate object is shielded by other candidate objects according to the contact ratio.
The path of the light ray to the target object may be the path from the light of the target light source to the center point of the target object. The first shape and the second shape characterize the appearance of each candidate object, such as its length, width, and height. The degree of coincidence characterizes whether the target candidate object is completely occluded; for example, if the degree of coincidence is 100%, the target candidate object is not the target object. As shown in fig. 2c, which is a schematic diagram of object occlusion provided in the embodiment of the present application, object A in the figure is the target candidate object, object B is another candidate object, and the dotted lines are light paths of the light emitted by the target light source (only part of the light paths are shown). The light rays corresponding to dashed lines 1 and 2 are the boundaries of the light that the target light source would theoretically cast on object A; that is, all coordinates in the interval between the two dashed lines are the surrounding coordinates in the coordinate set, and the coordinates of object B lie within that interval. Because the shape of object B completely blocks object A on the path, that is, object A and object B completely coincide on the path, the target light source cannot irradiate object A through object B, and therefore object A is not the target object.
In a specific implementation, if for a target candidate object the coordinates of several other candidate objects exist in the corresponding coordinate set, the determination can be made according to the position of each other candidate object relative to the target candidate object and the degree of coincidence between each other candidate object and the target candidate object. For example, suppose there is also an object C, and object C and object B each partly occlude object A, but neither degree of coincidence is 100%. In this case, the positions of object B and object C relative to object A must be determined, and whether object A is occluded by object B and object C together is judged from those relative positions. For example, if object B blocks one part of object A and object C blocks the remaining part, object A is not the target object even though neither degree of coincidence is 100%.
Therefore, in the example, the shielded object is determined, ray tracing calculation is not carried out on the shielded object, the calculation amount and the power consumption of the equipment can be reduced, and the rendering effect can be ensured at the same time.
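The occlusion test can be sketched as follows, under the strong simplifying assumption that candidates are bounding spheres: a blocker counts as fully occluding the target only when it lies on the light-to-target path and its silhouette, as seen from the light, covers the target's (the degree of coincidence reaching 100%). This is an illustrative approximation, not the patent's method, and all names are assumptions.

    import math

    def _sub(a, b):
        return tuple(x - y for x, y in zip(a, b))

    def _dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def _norm(a):
        return math.sqrt(_dot(a, a))

    def fully_occluded(light, target_center, target_radius, blockers):
        """blockers: list of (center, radius) bounding spheres of other candidates."""
        path = _sub(target_center, light)
        path_len = _norm(path)
        direction = tuple(c / path_len for c in path)
        for center, radius in blockers:
            t = _dot(_sub(center, light), direction)  # blocker's distance along the path
            if not (0.0 < t < path_len):
                continue  # the blocker is not between the light and the target
            closest = tuple(l + t * d for l, d in zip(light, direction))
            miss = _norm(_sub(center, closest))  # how far the blocker sits off the path
            # Crude "100% coincidence" test: the blocker overlaps the path and its
            # angular size seen from the light is at least the target's.
            if miss < radius and radius / t >= target_radius / path_len:
                return True
        return False

    print(fully_occluded((0, 0, 0), (0, 0, 10), 1.0, [((0, 0, 5), 1.0)]))  # -> True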
S203, performing ray tracing calculation on the target object to obtain a ray tracing calculation result.
Ray tracing generates an image by tracing the path of light per pixel in the image plane and simulating the effects of its encounters with virtual objects; it can produce soft shadows, depth of field, motion blur, caustics, ambient occlusion, and indirect illumination. It works by tracing a path from an imaginary eye through each pixel in a virtual screen and calculating the color of the object visible through it. The ray tracing calculation result includes shadow, reflection, refraction, and ambient occlusion map texture data. When performing ray tracing calculation on the target object, the light source coordinates, color, intensity, and irradiation range information of the target light source, as well as the coordinates, vertices, and material information of the target object, may be obtained first.
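To make these inputs concrete, here is a minimal single-light direct-lighting sketch with one shadow-ray query per surface point. The Lambert term and inverse-square falloff are standard, but the function shapes and the in_shadow callback are illustrative assumptions.

    import math

    def direct_light(surface_point, surface_normal, light_pos, light_intensity, in_shadow):
        """Scalar radiance at surface_point from one point light, or 0 if occluded."""
        to_light = tuple(l - p for l, p in zip(light_pos, surface_point))
        dist = math.sqrt(sum(c * c for c in to_light))
        if in_shadow(surface_point, light_pos):
            return 0.0  # the shadow ray hit an occluder: the point is in shadow
        direction = tuple(c / dist for c in to_light)
        lambert = max(0.0, sum(n * d for n, d in zip(surface_normal, direction)))
        return light_intensity * lambert / (dist * dist)  # inverse-square falloff

    # No occluders in this toy call, so the shadow query always answers "lit".
    print(direct_light((0, 0, 0), (0, 1, 0), (0, 2, 0), 100.0, lambda p, l: False))
    # -> 25.0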
And S204, creating a target image of the scene to be rendered according to the ray tracing calculation result.
In one possible example, the creating a target image of the scene to be rendered according to the ray tracing calculation result includes: performing rasterization calculation on the scene to be rendered to obtain a rasterization calculation result; and creating a target image of the scene to be rendered according to the ray tracing calculation result and the rasterization calculation result.
After the ray tracing calculation result is obtained, the corresponding data can be merged into the original rendering pipeline, yielding a rendering scheme that is theoretically closest to physically real rendering while remaining generally applicable, mainly manifested in the image effects of soft shadows, off-screen reflections, and ambient occlusion. The original rendering pipeline performs rasterization calculation on the scene picture of the scene to be rendered; that is, the ray tracing calculation result is combined with the shadow, reflection, and ambient occlusion map data obtained by rasterization to produce the final rendering result.
Therefore, in this example, ray tracing calculation and rasterization calculation are both performed on the scene to be rendered, so the rendering effect can be ensured without increasing the calculation pressure and power consumption of the device hardware.
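A hedged per-pixel compositing sketch of this merge, in which the rasterized base color is modulated by a ray-traced shadow factor and blended with a ray-traced reflection; the 0.8 blend weight and the dictionary layout are illustrative assumptions, not the patent's pipeline.

    def composite_pixel(raster, rt):
        """raster: {'color': (r, g, b)}; rt: {'shadow': 0..1, 'reflection': (r, g, b)}."""
        # The ray-traced shadow factor darkens the rasterized base color.
        shadowed = tuple(c * rt["shadow"] for c in raster["color"])
        # Blend the ray-traced reflection on top of the shadowed base color.
        w = 0.8  # assumed base-color weight; a real pipeline would use material data
        return tuple(w * s + (1 - w) * r for s, r in zip(shadowed, rt["reflection"]))

    print(composite_pixel({"color": (0.5, 0.5, 0.5)},
                          {"shadow": 0.5, "reflection": (1.0, 0.0, 0.0)}))
    # -> approximately (0.4, 0.2, 0.2)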
It can be seen that, in the embodiments of the present application, a target light source whose distance from a reference position is smaller than a preset distance is first determined among a plurality of light sources in a scene picture of a scene to be rendered, where the reference position is the position, in the scene picture, of the main camera viewport representing a scene camera, the scene camera is a control for a plurality of camera viewports in an editing interface of the scene to be rendered, and the main camera viewport is the camera viewport, among the plurality of camera viewports, that simulates the main viewpoint from which a user observes the scene picture; a target object illuminated by the target light source is then determined, ray tracing calculation is performed on the target object to obtain a ray tracing calculation result, and finally a target image of the scene to be rendered is created according to the ray tracing calculation result. In this way, the scene information of the light sources requiring ray tracing calculation can be determined selectively, which reduces the number of rays to be calculated, reduces hardware pressure and power consumption, yields a better rendered picture, and improves the user's experience.
Referring to fig. 3, fig. 3 is a block diagram illustrating functional units of a rendering apparatus according to an embodiment of the present disclosure. The rendering apparatus 30 includes: a first determining unit 301, configured to determine, from among a plurality of light sources in a scene picture of a scene to be rendered, a target light source whose distance from a reference position is smaller than a preset distance, where the reference position is the position, in the scene picture of the scene to be rendered, of the main camera viewport representing a scene camera, the scene camera is a control for a plurality of camera viewports in an editing interface of the scene to be rendered, and the main camera viewport is the camera viewport, among the plurality of camera viewports, that simulates the main viewpoint from which a user observes the scene picture; a second determining unit 302, configured to determine a target object illuminated by the target light source; a calculating unit 303, configured to perform ray tracing calculation on the target object to obtain a ray tracing calculation result; and a creating unit 304, configured to create a target image of the scene to be rendered according to the ray tracing calculation result.
In one possible example, in the aspect of creating the target image of the scene to be rendered according to the ray tracing calculation result, the creating unit 304 is specifically configured to: performing rasterization calculation on the scene to be rendered to obtain a rasterization calculation result; and creating a target image of the scene to be rendered according to the ray tracing calculation result and the rasterization calculation result.
In a possible example, in the aspect of determining the irradiation range of the target light source, the second determining unit 302 is specifically configured to: determining a light source type of the target light source; and determining an illumination range according to the type of the light source.
In a possible example, in the aspect of determining the illumination range according to the light source type, the second determining unit 302 is specifically configured to: determining a light source brightness of the target light source; determining environmental information where the target light source is located; and determining an illumination range according to the light source type, the light source brightness and the environment information.
In a possible example, in the aspect that the light source type is a point light source, and the determining the irradiation range according to the light source type, the light source brightness, and the environment information, the second determining unit 302 is specifically configured to: determining an illuminable radius according to the light source brightness and the environment information; drawing a sphere according to the illuminable radius by taking the point light source as a center to obtain a sphere space; and determining the range contained in the sphere space as the irradiation range.
In one possible example, in the aspect of determining the target object from the scene picture according to the illumination range, the second determining unit 302 is specifically configured to: determining all objects in the scene to be rendered, which are positioned in the illumination range, as candidate objects; determining object coordinates for each candidate object; determining light source coordinates of the target light source; determining whether a target candidate object is shielded by other candidate objects according to the object coordinates and the light source coordinates, wherein the other candidate objects are the candidate objects except the target candidate object in all the candidate objects, and the target candidate object is any one of the candidate objects; and determining that the candidate object which is not occluded in all the candidate objects is the target object.
In a possible example, in the aspect of determining whether the target candidate object is occluded by other candidate objects according to the object coordinates and the light source coordinates, the second determining unit 302 is specifically configured to: obtaining a coordinate set corresponding to the target candidate object according to the object coordinates and the light source coordinates of the target candidate object, wherein the coordinate set comprises a plurality of surrounding coordinates, and the position indicated by each surrounding coordinate is located on the path from the light of the target light source to the target object; determining whether object coordinates of the other candidate objects exist in the set of coordinates; if yes, acquiring a first shape of the other candidate object and a second shape of the target candidate object; determining a degree of coincidence of the first shape and the second shape on the path; and determining whether the target candidate object is shielded by other candidate objects according to the contact ratio.
It can be understood that, since the method embodiments and the apparatus embodiments are different presentation forms of the same technical concept, the content of the method embodiment portion of the present application also applies to the apparatus embodiment portion, and is not repeated here.
In the case of using an integrated unit, please refer to fig. 4, where fig. 4 is a block diagram of a functional unit of another rendering apparatus according to an embodiment of the present application. In fig. 4, the rendering apparatus 400 includes: a processing module 412 and a communication module 411. The processing module 412 is used to control and manage actions of the rendering device, for example, to perform the steps of the first determining unit 301, the second determining unit 302, the calculating unit 303, and the creating unit 304, and/or other processes for performing the techniques described herein. The communication module 411 is used for interaction between the rendering apparatus and other devices. As shown in fig. 4, the rendering apparatus may further include a storage module 413, and the storage module 413 is used to store program codes and data of the rendering apparatus.
The processing module 412 may be a processor or a controller, for example, a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein. The processor may also be a combination of computing functions, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 411 may be a transceiver, an RF circuit, a communication interface, or the like. The storage module 413 may be a memory.
For all relevant content of each scenario involved in the above method embodiments, reference may be made to the functional description of the corresponding functional module; details are not repeated here. The rendering apparatus 400 may perform the rendering method shown in fig. 2a.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It is understood that the electronic device includes hardware structures and software modules for performing the respective functions in order to realize the functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments provided herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Embodiments of the present application further provide a chip, where the chip includes a processor, configured to call and run a computer program from a memory, so that a device in which the chip is installed performs some or all of the steps described in the electronic device in the above method embodiments.
Embodiments of the present application further provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enables a computer to execute part or all of the steps of any one of the methods as described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for instance, the division of the units is only a logical functional division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling, direct coupling, or communication connection may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable memory if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps of the methods of the above embodiments may be implemented by a program stored in a computer-readable memory, where the memory may include a flash memory disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
The foregoing embodiments are described in detail above, and specific examples are used herein to explain the principles and implementations of the present application. The above description of the embodiments is only intended to help understand the method and core idea of the present application; meanwhile, a person skilled in the art may, according to the idea of the present application, make changes to the specific implementation and the application scope. In summary, the content of this specification should not be construed as limiting the present application.
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications may be made by those skilled in the art without departing from the spirit and scope of the present invention; different functions, combinations of implementation steps, and software and hardware implementations all fall within the scope of the present invention.

Claims (11)

1. A method of rendering, the method comprising:
determining, from among a plurality of light sources in a scene picture of a scene to be rendered, a target light source whose distance from a reference position is smaller than a preset distance, where the reference position is the position, in the scene picture of the scene to be rendered, of the main camera viewport representing a scene camera, the scene camera is a control for a plurality of camera viewports in an editing interface of the scene to be rendered, and the main camera viewport is the camera viewport, among the plurality of camera viewports, that simulates the main viewpoint from which a user observes the scene picture;
determining a target object illuminated by the target light source;
performing ray tracing calculation on the target object to obtain a ray tracing calculation result;
and creating a target image of the scene to be rendered according to the ray tracing calculation result.
2. The method of claim 1, wherein creating the target image of the scene to be rendered according to the ray tracing calculation result comprises:
performing rasterization calculation on the scene to be rendered to obtain a rasterization calculation result;
and creating a target image of the scene to be rendered according to the ray tracing calculation result and the rasterization calculation result.
3. The method of claim 1 or 2, wherein the determining the target object illuminated by the target light source comprises:
determining an illumination range of the target light source;
and determining a target object from the scene picture according to the irradiation range.
4. The method of claim 3, wherein the determining the illumination range of the target light source comprises:
determining a light source type of the target light source;
and determining an illumination range according to the type of the light source.
5. The method of claim 4, wherein determining an illumination range from the light source type comprises:
determining a light source brightness of the target light source;
determining environmental information of the target light source;
and determining an illumination range according to the light source type, the light source brightness and the environment information.
6. The method of claim 5, wherein the light source type is a point light source, and wherein determining the illumination range according to the light source type, the light source brightness, and the environmental information comprises:
determining an illuminable radius according to the light source brightness and the environment information;
drawing a sphere according to the illuminable radius by taking the point light source as a center to obtain a sphere space;
and determining the range contained in the sphere space as the irradiation range.
7. The method of claim 3, wherein determining the target object from the scene based on the illumination range comprises:
determining all objects in the scene to be rendered, which are positioned in the illumination range, as candidate objects;
determining object coordinates of each candidate object;
determining light source coordinates of the target light source;
determining whether a target candidate object is shielded by other candidate objects according to the object coordinates and the light source coordinates, wherein the other candidate objects are candidate objects except the target candidate object in all the candidate objects, and the target candidate object is any one of the candidate objects;
and determining that the candidate object which is not occluded in all the candidate objects is the target object.
8. The method of claim 7, wherein determining whether the target object candidate is occluded by other object candidates according to the object coordinates and the light source coordinates comprises:
obtaining a coordinate set corresponding to the target candidate object according to the object coordinates of the target candidate object and the light source coordinates, wherein the coordinate set comprises a plurality of surrounding coordinates, and the position indicated by each surrounding coordinate is located on the path from the light of the target light source to the target object;
determining whether object coordinates of the other candidate objects exist in the set of coordinates;
if yes, acquiring a first shape of the other candidate object and a second shape of the target candidate object;
determining a degree of coincidence of the first shape and the second shape on the path;
and determining whether the target candidate object is shielded by other candidate objects according to the contact ratio.
9. A rendering apparatus, characterized in that the apparatus comprises:
a first determining unit, configured to determine, from among a plurality of light sources in a scene picture of a scene to be rendered, a target light source whose distance from a reference position is smaller than a preset distance, where the reference position is the position, in the scene picture of the scene to be rendered, of the main camera viewport representing a scene camera, the scene camera is a control for a plurality of camera viewports in an editing interface of the scene to be rendered, and the main camera viewport is the camera viewport, among the plurality of camera viewports, that simulates the main viewpoint from which a user observes the scene picture;
a second determination unit for determining a target object illuminated by the target light source;
the calculation unit is used for carrying out ray tracing calculation on the target object to obtain a ray tracing calculation result;
and the creating unit is used for creating a target image of the scene to be rendered according to the ray tracing calculation result.
10. An electronic device comprising a processor, memory, and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing the steps in the method of any of claims 1-8.
11. A computer-readable storage medium storing a computer program/instructions, wherein the computer program/instructions, when executed by a processor, implement the steps of the method according to any one of claims 1-8.
CN202211004101.7A 2022-08-22 2022-08-22 Rendering method and related device Pending CN115359172A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211004101.7A CN115359172A (en) 2022-08-22 2022-08-22 Rendering method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211004101.7A CN115359172A (en) 2022-08-22 2022-08-22 Rendering method and related device

Publications (1)

Publication Number Publication Date
CN115359172A (en) 2022-11-18

Family

ID=84002940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211004101.7A Pending CN115359172A (en) 2022-08-22 2022-08-22 Rendering method and related device

Country Status (1)

Country Link
CN (1) CN115359172A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116912382A (en) * 2023-09-14 2023-10-20 成都帆点创想科技有限公司 Rendering method and device, electronic equipment and storage medium
CN116912382B (en) * 2023-09-14 2023-12-29 成都帆点创想科技有限公司 Rendering method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
WO2022111619A1 (en) Image processing method and related apparatus
CN111369655B (en) Rendering method, rendering device and terminal equipment
CN113674389B (en) Scene rendering method and device, electronic equipment and storage medium
CN111968215A (en) Volume light rendering method and device, electronic equipment and storage medium
CN112184873B (en) Fractal graph creation method, fractal graph creation device, electronic equipment and storage medium
Arief et al. Realtime estimation of illumination direction for augmented reality on mobile devices
CN114419240B (en) Illumination rendering method and device, computer equipment and storage medium
US20230230311A1 (en) Rendering Method and Apparatus, and Device
CN112258610B (en) Image labeling method and device, storage medium and electronic equipment
CN115830208A (en) Global illumination rendering method and device, computer equipment and storage medium
CN110930497B (en) Global illumination intersection acceleration method and device and computer storage medium
CN114758051A (en) Image rendering method and related equipment thereof
CN112734896A (en) Environment shielding rendering method and device, storage medium and electronic equipment
CN116758208A (en) Global illumination rendering method and device, storage medium and electronic equipment
CN115359172A (en) Rendering method and related device
CN112819940B (en) Rendering method and device and electronic equipment
WO2018202435A1 (en) Method and device for determining lighting information of a 3d scene
KR20100075351A (en) Method and system for rendering mobile computer graphic
KR20230013099A (en) Geometry-aware augmented reality effects using real-time depth maps
US20230090732A1 (en) System and method for real-time ray tracing in a 3d environment
US20240203030A1 (en) 3d model rendering method and apparatus, electronic device, and storage medium
WO2023197689A1 (en) Data processing method, system, and device
CN109685882B (en) Rendering a light field as a better background
CN118172459A (en) Oblique photography model rendering method and rendering system
CN117036577A (en) Scene rendering method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination