WO2022100437A1 - 虚拟环境中的天气渲染方法、装置、设备、介质及程序 - Google Patents

虚拟环境中的天气渲染方法、装置、设备、介质及程序 Download PDF

Info

Publication number
WO2022100437A1
WO2022100437A1 PCT/CN2021/126846 CN2021126846W WO2022100437A1 WO 2022100437 A1 WO2022100437 A1 WO 2022100437A1 CN 2021126846 W CN2021126846 W CN 2021126846W WO 2022100437 A1 WO2022100437 A1 WO 2022100437A1
Authority
WO
WIPO (PCT)
Prior art keywords
weather
map
raindrop
scene
texture
Prior art date
Application number
PCT/CN2021/126846
Other languages
English (en)
French (fr)
Inventor
马晓霏
张佳伟
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Publication of WO2022100437A1 publication Critical patent/WO2022100437A1/zh
Priority to US17/965,323 priority Critical patent/US20230039131A1/en

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/53Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/217Input arrangements for video game devices characterised by their sensors, purposes or types using environment-related information, i.e. information generated otherwise than by the player, e.g. ambient temperature or humidity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35Details of game servers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/822Strategy games; Role-playing games
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • G06T13/203D [Three Dimensional] animation
    • G06T13/603D [Three Dimensional] animation of natural phenomena, e.g. rain, snow, water or plants
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/663Methods for processing data by generating or executing the game program for rendering three dimensional images for simulating liquid objects, e.g. water, gas, fog, snow, clouds

Definitions

  • the present application relates to the technical field of image processing, and in particular, to a weather rendering method, apparatus, device, medium and program in a virtual environment.
  • 3D virtual environment such as MMORPG
  • the weather in the real world is simulated, so that the user has a more realistic experience when controlling the virtual character to play the game.
  • Embodiments of the present application provide a weather rendering method, apparatus, device, medium, and program in a virtual environment.
  • the technical solution is as follows:
  • a weather rendering method in a virtual environment is provided, which is applied to a computer device, and the method includes:
  • the first weather map includes weather maps in the virtual environment that are located outside the field of view of the current virtual character;
  • the weather scene is rendered according to the second weather map.
  • a weather rendering apparatus in a virtual environment comprising:
  • an acquisition module for acquiring at least one weather map of the weather scene in the virtual environment
  • a processing module configured to remove the first weather map in the at least one weather map to obtain the remaining second weather map;
  • the first weather map includes the weather in the virtual environment that is outside the field of view of the current virtual character map;
  • a rendering module configured to render the weather scene according to the second weather map.
  • a computer device comprising: a processor and a memory, wherein the memory stores at least one instruction, at least a piece of program, a code set or an instruction set, the at least one The instructions, the at least one piece of program, the code set or the instruction set are loaded and executed by the processor to implement the weather rendering method in the virtual environment as described in the above aspects.
  • a computer-readable storage medium having a computer program stored therein, the computer program being loaded and executed by a processor to implement the weather in a virtual environment as described in the above aspect render method.
  • a computer program product or computer program comprising computer instructions stored in a computer readable storage medium.
  • a processor of a computer device reads the computer instructions from the computer-readable storage medium, the processor executes the computer instructions, causing the computer device to perform the weather rendering method in a virtual environment as described above.
  • the first weather map is a weather map in the virtual environment that is outside the field of view of the current virtual character.
  • a weather map is missing, it does not affect the rendering of the weather scene in the current virtual character's field of view, nor does it affect the rendering of the overall virtual environment, so that the computer equipment used by the user can ensure the normal display of the weather scene when loading the virtual environment picture. , which can reduce the number of texture samples by reducing the number of weather textures, so as to avoid the performance of computer equipment being greatly reduced when running applications that support virtual environments.
  • FIG. 1 is a block diagram of a terminal provided by an exemplary embodiment of the present application.
  • FIG. 2 is a flowchart of a weather rendering method in a virtual environment provided by an exemplary embodiment of the present application
  • FIG. 3 is a frame diagram of a weather system provided by an exemplary embodiment of the present application.
  • FIG. 4 is a schematic diagram of a scattering map corresponding to the sky provided by an exemplary embodiment of the present application.
  • FIG. 5 is a flowchart of a weather rendering method in a virtual environment provided by another exemplary embodiment of the present application.
  • FIG. 6 is a schematic diagram of a setting interface for weather configuration parameters provided by an exemplary embodiment of the present application.
  • FIG. 7 is a schematic diagram of a setting interface for weather configuration parameters provided by another exemplary embodiment of the present application.
  • FIG. 8 is a schematic diagram of texture layer division provided by an exemplary embodiment of the present application.
  • FIG. 9 is a schematic diagram of a simulation effect of a rain scene provided by an exemplary embodiment of the present application.
  • FIG. 10 is a working schematic diagram of a particle emitter provided by an exemplary embodiment of the present application.
  • FIG. 11 is a schematic diagram of a particle distribution box provided by an exemplary embodiment of the present application.
  • FIG. 12 is a screen diagram of a rain scene corresponding to a virtual environment screen provided by an exemplary embodiment of the present application.
  • FIG. 13 is a schematic diagram of a splash generated in a virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 14 is a screen diagram of a rain scene corresponding to a virtual environment screen provided by another exemplary embodiment of the present application.
  • 15 is a screen diagram of a rain scene corresponding to a virtual environment screen provided by another exemplary embodiment of the present application.
  • 16 is a flowchart of a weather rendering method in a virtual environment provided by another exemplary embodiment of the present application.
  • FIG. 17 is a schematic diagram of a scattering map corresponding to the sky provided by another exemplary embodiment of the present application.
  • FIG. 18 is a schematic diagram of the effect of sky rendering provided by an exemplary embodiment of the present application.
  • FIG. 19 is a flowchart of a weather rendering method in a virtual environment provided by another exemplary embodiment of the present application.
  • FIG. 20 is a schematic diagram of rendering a texture cloud map provided by an exemplary embodiment of the present application.
  • FIG. 21 is a schematic diagram of rendering a texture cloud map provided by another exemplary embodiment of the present application.
  • FIG. 22 is a schematic diagram of the effect of cloud rendering provided by an exemplary embodiment of the present application.
  • FIG. 23 is a flowchart of a weather rendering method in a virtual environment provided by another exemplary embodiment of the present application.
  • FIG. 24 is a flow chart of a weather scene transition provided by an exemplary embodiment of the present application.
  • FIG. 25 is a block diagram of a weather rendering apparatus in a virtual environment provided by an exemplary embodiment of the present application.
  • FIG. 26 is a schematic structural diagram of a computer device provided by an exemplary embodiment of the present application.
  • Virtual environment is the virtual environment displayed (or provided) by the application when it is run on the terminal.
  • the virtual environment may be a simulated environment of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application.
  • the following embodiments are exemplified by the virtual environment being a three-dimensional virtual environment.
  • Virtual character refers to the movable object in the virtual scene.
  • the movable objects may be virtual characters, virtual animals, cartoon characters, etc., such as characters, animals, plants, oil barrels, walls, stones, etc. displayed in the three-dimensional virtual scene.
  • the virtual character is a three-dimensional solid model created based on animation skeleton technology.
  • Each virtual character has its own shape and volume in the three-dimensional virtual scene, and occupies a part of the space in the three-dimensional virtual scene.
  • a virtual character generally refers to one or more virtual characters in a virtual scene. Taking a game application as an example, a virtual character is a movable object that the user controls when participating in the game.
  • Massively Multiplayer Online Role-Playing Game refers to an online game that supports multiplayer online. Different clients can play in the same scene, and different clients can cooperate to complete a game. Tasks, clients can communicate online, and clients can also interact with non-player characters (NPCs, Non-Player Characters) in the game. Usually, the user controls the virtual role by logging in the user account on the client side, and the virtual role corresponds to the user account (ID, Identity) one-to-one. The virtual characters controlled by the user play different roles in the virtual environment, such as generals, mages, scholar, dancers and so on. Massively multiplayer online games include strategy, action, adventure, simulation, sports, racing, role-playing and other types. The following embodiments are described by taking the client as a game client as an example.
  • the method provided in this application can be applied to 3D map programs, military simulation programs, First-Person Shooting Game (FPS), Multiplayer Online Battle Arena Games (MOBA), MMOPRG games, A virtual reality application (Virtual Reality, VR), an augmented reality application (Augmented Reality, AR), etc., the following embodiments are exemplified by game applications.
  • a game based on a virtual scene consists of one or more maps of the game world.
  • the virtual environment in the game simulates the real world environment.
  • Users can control the virtual characters in the game to walk, run, jump, shoot, fight, Driving, being attacked by other avatars (other avatars are avatars controlled by other users), being hurt in the virtual environment, attacking other avatars, etc., are highly interactive, and multiple users can team up to compete online game.
  • the virtual environment corresponds to changing weather, such as sunny, rainy, snowy, sandstorm, thunderstorm, heavy rain, blizzard, etc.
  • the virtual environment is divided into various time periods, and each time period corresponds to different time periods.
  • the same light and shadow effects for example, 6:00 to 7:00 am is the sunrise time, the virtual environment corresponds to the sunrise light, 4:00 pm to 5:00 pm is the sunset time, and the virtual environment corresponds to the dusk light.
  • An embodiment of the present application provides a weather rendering method in a virtual environment.
  • a first weather map in at least one weather map of a weather scene is eliminated, and the first weather map does not affect the field of view of the current virtual character when the first weather map is missing. Therefore, when the terminal runs the game application, the number of texture sampling times can be reduced, and the performance of the terminal can be avoided.
  • FIG. 1 shows a block diagram of a terminal provided by an exemplary embodiment of the present application.
  • the weather rendering method in a virtual environment provided by the embodiment of the present application is applied to a terminal 100.
  • the terminal 100 is installed with an application program supporting virtual scene running corresponding to
  • the application includes 3D map program, military simulation program, FPS game, MOBA game, MMOPRG game, VR application, AR application.
  • the following embodiments take the application as a game application for illustration.
  • the terminal includes at least one of a smart phone, a tablet computer, an MP3 player, an MP4 player, a laptop computer, a desktop computer, and a notebook computer.
  • a smart phone as an example. .
  • the client corresponds to a weather system 110 including a sky light and shadow system 111 , a raindrop subsystem 112 , a cloud subsystem 113 and a transition subsystem 114 .
  • the sky light and shadow system 111 is used to simulate various lighting scenes in the virtual environment, such as the lighting scene when the sun rises, the lighting scene at dusk, and the lighting scene of the moon at night in the virtual environment.
  • the sky light and shadow system 111 calculates the scattering map corresponding to the sky of the virtual environment according to the weather configuration information, stores the scattering map offline, and directly loads the lighting scene in the virtual environment by acquiring the scattering map when the game client runs.
  • the scatter map does not include the first scatter map of the virtual element in the spatial dimension, that is, does not include the first scatter map at a sky height higher than the specified height.
  • Texture sampling refers to the process in which the graphics processor (Graphics Processing Unit, GPU) of the terminal accesses (obtains) textures to load the weather scene during the running process of the game client.
  • the raindrop subsystem 112 is used to simulate various rain scenarios in the virtual environment, such as at least one of a light rain scenario, a heavy rain scenario, a heavy rain scenario, and a thunderstorm scenario.
  • the raindrop subsystem 112 calculates the raindrop texture according to the weather configuration information.
  • the virtual environment When the virtual environment is in a raining scene, it simulates the raining scene by constructing a biconical model as shown in FIG. 9 , and the biconical model will be bound with the virtual character. Surrounded by the camera model of the The picture containing raindrops is also constantly changing.
  • the raindrop subsystem 112 divides the viewing angle range of the virtual character along the line of sight, obtains texture layers with a preset number of layers, and adds raindrop textures with a preset number of layers according to the position of the texture layer, that is, attaches a preset number of raindrop textures to the surface of the biconical model.
  • a raindrop map with layers is set.
  • the B channel in the raindrop map stores the depth information of the raindrop, and the B channel is the B channel in the red (Red, R) green (Green, G) blue (Blue, B) color mode.
  • the raindrop depth information represents the distance between the virtual character (or the camera model bound to the virtual character) and the raindrop texture.
  • the raindrop textures with different raindrop depth information simulate the rain scene according to the Parallax motion (Parallax) principle, that is, close to the virtual character. Raindrops have a larger droplet size, and raindrops farther away from the avatar have a smaller droplet size.
  • Parallax Parallax motion
  • the raindrop subsystem 112 is also used to update the position of the water splash.
  • the water splash is used to represent the water splash generated when the raindrops splash on the ground of the virtual environment.
  • the terminal does not need to repeatedly generate the water splash, and the performance of the terminal when running the game client is improved.
  • the raindrop subsystem 112 is also used to generate a particle distribution box for the virtual character, the particle distribution box surrounds the virtual character, the particle distribution box is divided into m ⁇ n sub-distribution boxes, m and n are both positive integers, and m and n can be Equal or unequal, each subdistribution box has a corresponding particle emitter.
  • the sub-distribution box on the layer farthest from the virtual character will move to the front of the nearest sub-distribution box along the moving direction of the virtual character.
  • the sub-distribution boxes of the nearest layer are adjacent, and the sub-distribution boxes of the nearest layer are located in front of the visual field of the virtual character, and have the shortest distance from the virtual character. Due to the movement of the sub-distribution box, the virtual character is always located in the particle distribution box, that is, the virtual character is always covered by raindrops.
  • the particle distribution box is also used for simulating a snow scene, a hail scene, a sandstorm scene and other weather scenes that need to be represented by particles.
  • the cloud subsystem 113 is used to preprocess the two-dimensional cloud image.
  • the two-dimensional cloud image is a grayscale image with cloud contours and no lighting conditions.
  • Preprocessing refers to the ray marching algorithm (Raymarching).
  • Image processing that is, the image obtained by moving the pixels in the two-dimensional cloud image to a certain direction according to the preset step size, and the preprocessed at least two texture cloud maps according to the weight corresponding to the texture cloud map channel Mixing is performed to obtain cloud textures for loading virtual scenes, and there is no need for the terminal to repeatedly sample textures to load virtual environment pictures.
  • the transition subsystem 114 is configured to calculate the screen rendering parameters when the weather scene of the virtual environment is switched, and render the transition screen when the two weather scenes are switched according to the screen rendering parameters.
  • FIG. 2 shows a flowchart of a weather rendering method in a virtual environment provided by an exemplary embodiment of the present application, and the method can be applied to the terminal 100 shown in FIG.
  • the client corresponding to the application the client runs based on the weather system 110 .
  • the method includes the following steps:
  • Step 201 Obtain at least one weather map of a weather scene in a virtual environment.
  • the terminal acquires the weather configuration information, and the weather configuration information is used to render the weather scene in the virtual environment.
  • the weather configuration information can be used for the rendering of the weather scene in the initial virtual environment, and can also be used for the rendering of the weather scene when the weather scene is switched in the virtual environment.
  • an application program supporting virtual scene running is installed and running on the terminal used by the user.
  • the display screen of the terminal correspondingly displays the picture when the application program is used.
  • the application is run based on the weather system, and the weather scene is correspondingly displayed on the display screen of the terminal.
  • the display screen of the terminal displays a virtual environment picture in the game.
  • the virtual scene picture is a picture obtained by observing the virtual scene from the first-person perspective of the virtual character, or the virtual scene picture is a picture obtained by observing the virtual scene from the third-person perspective of the virtual character.
  • the game application program corresponds to a server
  • the server sends the weather configuration information to the terminal running the application program
  • the game application program performs screen rendering on the weather scene corresponding to the virtual environment according to the weather configuration information.
  • the weather configuration information is:
  • the server sends the current time of the virtual environment to the terminal, such as ten o'clock in the morning, and the game application loads the virtual environment screen according to the weather scene corresponding to ten in the morning.
  • the weather configuration parameters are obtained by obtaining the state parameters from the terminal.
  • the terminal is a smart phone, and the smart phone displays the current time as ten o'clock in the morning. Environment screen.
  • Weather configuration information refers to the configuration information of weather elements in the virtual environment; the weather configuration information includes the time period in which the virtual environment is located, the time lapse mode of the virtual environment (such as day and night changes or fixed in a time period), light source parameters, sky Parameters, fog parameters, cloud parameters, wind parameters, particle parameters (used to represent particles in weather scenes, such as raindrops, snowflakes, sandstorms, etc.), environmental special effects parameters (such as the virtual environment is affected when the virtual character releases skills) , at least one of water surface effect parameters, sun parameters, moon parameters, and lens flare (the lens of the camera model that moves with the virtual character).
  • the weather configuration information includes the time period in which the virtual environment is located, the time lapse mode of the virtual environment (such as day and night changes or fixed in a time period), light source parameters, sky Parameters, fog parameters, cloud parameters, wind parameters, particle parameters (used to represent particles in weather scenes, such as raindrops, snowflakes, sandstorms, etc.), environmental special effects parameters (such as the
  • a weather correspondence is pre-stored in the weather system, and the weather correspondence is used to represent the correspondence between weather configuration information and weather scenes (including scene effects).
  • the weather system is preset with a weather correspondence
  • the weather system is encapsulated in the game application.
  • the weather scene under the weather configuration information is calculated according to the weather configuration information and the weather correspondence.
  • the weather scene refers to the scene effect presented by the virtual environment in a certain weather scene.
  • the weather correspondence includes a comparison table relationship
  • the game application queries the corresponding weather scene from the comparison table according to the weather configuration information.
  • Table 1 shows the correspondence between weather configuration information and weather scenarios.
  • the embodiments of the present application do not limit the expression manner of the weather correspondence.
  • Weather maps are used to express weather scenes. There are two types of weather maps: first weather maps and second weather maps.
  • the first weather maps include weather maps in the virtual environment that are outside the current avatar's field of vision, and the second weather maps. Includes weather maps in the virtual environment that are outside the current avatar's field of view.
  • the weather system 110 is used for at least one of the following aspects: day and night changes in a virtual environment, simulating real-world weather performance, and smooth transitions between various weather scenarios.
  • the weather system 110 includes a circadian subsystem 101 and a weather subsystem 102 .
  • the day and night change subsystem 101 is used to change the scene picture in the virtual environment according to the weather configuration information, such as changing from daytime scene to dusk scene, from dusk scene to night scene, from night scene to daytime scene, etc.
  • the weather subsystem 102 is used to change the weather scene in the virtual environment according to the weather configuration information, such as changing the size and density of raindrops in the virtual environment according to raindrop parameters; changing the light intensity and light irradiation direction in the virtual environment according to light source parameters.
  • the circadian subsystem 101 includes a transition subsystem, and the weather subsystem 102 includes a sky light and shadow system, a raindrop subsystem and a cloud subsystem. It can be understood that the weather subsystem 102 also includes subsystems such as the wind subsystem and the Milky Way subsystem, and each subsystem operates independently. illustrate.
  • the texture sampling process refers to the process by which the game application obtains the weather texture from the Graphics Processing Unit (GPU). That is, the game application draws the weather map of the virtual elements in the weather scene through the GPU according to the weather configuration information.
  • a virtual element is an element in a virtual environment, and the virtual element includes at least one element of flat ground, river, lake, ocean, desert, sky, clouds, raindrops, snowflakes, plants, buildings, and vehicles.
  • Step 202 Eliminate the first weather texture in at least one weather texture to obtain the remaining second weather textures.
  • the removal of the first weather map does not affect the rendering of the virtual environment picture.
  • the first weather map includes at least one of the following maps:
  • the first weather map includes a first raindrop map that is located in front of the current avatar's field of vision and outside the preset distance of the current avatar.
  • the current avatar refers to the avatar controlled by the terminal. Taking the current avatar as the center, within the viewing angle of the current avatar, divide the texture layer along the line of sight of the current avatar, as shown in Figure 8, by adding the raindrop texture to the texture layer, so that the user can bind the current avatar through the current avatar.
  • the camera model sees raindrops located in front of the current virtual character's field of view, and the embodiment of the present application reduces the number of raindrop textures by adding raindrop textures with a preset number of texture layers within a preset distance range, thereby reducing the number of times the terminal performs texture sampling, Improve terminal performance when running game applications.
  • the first weather map includes a first scattering map at a sky height higher than a specified height, and the first scattering map is an image used to express the light scattering effect on the sky.
  • the scattering map is represented by a rectangle.
  • the pixels in the scattering map corresponding to the sky light and shadow effect include four-dimensional values.
  • C and coordinate axis D represent, among them, the coordinate axis A represents the angle between the line of sight of the virtual character and the highest point of the sun, the highest point of the sun represents the highest point of the sun in the sky, and the sky is the sky in the virtual environment;
  • the coordinate axis B represents the angle between any position of the sun in the sky and the highest point of the sun;
  • the coordinate axis C represents the distance of the virtual element in the space dimension, and in this embodiment of the application, the coordinate axis C represents the virtual character when the virtual character observes the earth in space , the distance between the virtual character and the virtual elements in the virtual environment on the earth, wherein the virtual environment formed by the earth and space is constructed by simulating the planetary environment in the real world, and the virtual environment on the earth is where the virtual character usually performs activities Area;
  • the coordinate axis D represents the angle
  • a part of the scattering map is reduced by removing the value represented by the coordinate axis C, that is, the scattering map on the sky height above the specified height is removed, and the coordinate axis A, the coordinate axis D, and the coordinate axis B are used as three pixel points.
  • This dimension reduces the number of scattering textures, thereby reducing the number of times the terminal performs texture sampling, and improving the performance of the terminal when running game applications.
  • the terminal when the weather scene includes cloud effects, the terminal directly obtains a two-dimensional grayscale map of the clouds; therefore, when the weather scene includes cloud effects, the second weather texture obtained by the terminal does not include a color three-dimensional map cloud image.
  • volumetric clouds are composed of three-dimensional cloud images with color. Therefore, in the embodiment of the present application, by not acquiring the three-dimensional cloud image with color, it has grayscale.
  • the two-dimensional cloud image of high degree can realize the cloud effect, which reduces the calculation amount of the terminal and improves the performance of the terminal when running the game application.
  • Step 203 rendering the weather scene in the virtual environment according to the second weather map.
  • Rendering refers to adding a weather map to the target location to make the target location appear richer in detail.
  • a raindrop texture excluding the first raindrop texture is added in front of the virtual character's field of vision, and the virtual environment is observed from the virtual character's perspective, and the virtual environment is in a raining scene.
  • the scattering map excluding the first scattering map is attached to the sky ball model, and the virtual environment picture is observed from the perspective of the virtual character, and the sky of the virtual environment presents the sky light and shadow effect corresponding to the sunny day scene.
  • the sky sphere model is a hemispherical model of the sky used to characterize the virtual environment.
  • a two-dimensional cloud image with grayscale is attached to the sky ball model, and the virtual environment picture is observed from the perspective of the virtual character, and the sky of the virtual environment presents the cloud effect corresponding to the sunny day scene.
  • the first weather map in the at least one weather map is removed, and the first weather map is located in the virtual environment.
  • the weather map outside the current virtual character's field of vision when the weather map is missing, does not affect the rendering of the weather scene within the current virtual character's field of view, nor does it affect the rendering of the overall virtual environment, thereby making the computer equipment used by the user load the virtual
  • the number of texture samples can be reduced by reducing the number of weather textures, so as to avoid the performance of the computer equipment being greatly reduced when running applications that support the virtual environment.
  • FIG. 5 shows a flowchart of a weather rendering method in a virtual environment provided by another exemplary embodiment of the present application.
  • the method can be applied to the terminal 100 as shown in FIG. 1 , the terminal 100 is installed with a client corresponding to an application program that supports running in a virtual environment, and the client runs based on the weather system 110 .
  • the method includes the following steps:
  • Step 501 Obtain weather configuration information, where the weather configuration information is used to render a weather scene in a virtual environment.
  • the server sends weather configuration information to the client, where the weather configuration information includes that the current time of the virtual environment is 3:00 pm, and the raindrop particle parameter is 10.
  • the client obtains the weather configuration information, it will The stored weather correspondence obtains the raindrop map related to the rainy day, and renders the rainy scene according to the raindrop map.
  • the operation of the weather system is related to the weather correspondence set by the game application during the development process, and the parameter information involved in the weather configuration information is set through the setting interface.
  • the weather configuration parameters and the weather correspondence are set in the setting interface.
  • the current time 25 of the virtual environment is manually input, or the current time 25 of the virtual environment is adjusted by clicking the time adjustment control.
  • the time lapse mode 26 it is set that the time in the virtual environment changes day and night, or the virtual environment runs in a fixed time period. For example, no matter when the terminal runs the game application, the scene displayed in the virtual environment is from 8:00 am to 9:00 am The corresponding scene, the scene is fixed.
  • the virtual environment is the same as the real world in 24-hour format, and 24 hours can be divided into 9 time periods (as shown in control 27 ) through the setting interface, and the time lengths corresponding to each time period are the same or different.
  • the technician may further perform more detailed settings, for example, for the time period in which the virtual environment is located in the morning, set the weather configuration parameters of the weather scene in the time period.
  • weather configuration parameters such as light source parameters, sky light and shadow parameters, fog parameters, cloud parameters, wind parameters, hue parameters, etc. in this time period.
  • Step 502 Acquire a second raindrop map corresponding to the rainy scene according to the weather configuration information.
  • the first raindrop map includes a raindrop map located in front of the current virtual character's field of view and beyond the preset distance of the current virtual character.
  • the second raindrop map includes a raindrop map that is located in front of the current virtual character's field of view and is within a preset distance from the current virtual character.
  • the second raindrop map is not included in the first raindrop map.
  • the terminal calculates the corresponding weather scene according to the weather configuration information, and obtains the second raindrop map according to the weather scene.
  • Step 503 When the weather scene includes a raindrop effect, acquire the texture layers divided in the direction of the current virtual character's sight line.
  • the terminal obtains the viewing angle range of the current avatar, and the viewing angle range includes texture layers divided along the line of sight of the avatar, and the number of texture layers is less than or equal to the preset number of layers.
  • the position of the avatar is represented by point A.
  • the preset number of layers of the texture layer is two layers.
  • a texture layer 1 and a texture layer 2 are divided, wherein the distance 28 between the texture layer 2 and the current virtual character is within a preset distance of the current virtual character.
  • the distances between the texture layers are equal, and the number of layers of the texture layers is controlled by the preset distance to be less than or equal to the preset number of layers.
  • the texture layer is used to determine the number of raindrop textures required to render a rainy scene, and each texture layer corresponds to a raindrop texture. In some embodiments, the distances between map layers may also be unequal.
  • Step 504 add a second raindrop texture to the texture layer, the channel of the second raindrop texture stores raindrop depth information, the raindrop depth information is used to represent the distance between the virtual character and the second raindrop texture, and the raindrop depth information and the raindrop size are positive. relationship.
  • the raindrop depth information is pre-stored in the B channel of the second raindrop map.
  • the raindrop depth information has a positive correlation with the raindrop size, and the B channel is red (Red, R) B channel in green (Green, G) blue (Blue, B) color mode. From this, it can be seen that the sizes of raindrops on the same second raindrop map are different.
  • the second raindrop map generally refers to one or more raindrop maps.
  • Step 505 Render raindrops conforming to the raindrop size in the rainy scene according to the raindrop depth information.
  • the terminal renders raindrops with a raindrop size matching the raindrop depth information in the rainy scene. That is, the raindrop size in the raining scene is rendered according to the depth information of the raindrops stored in the second raindrop map, so that the raindrops closer to the current avatar show a larger raindrop size, and the raindrops farther from the current avatar show a smaller size. Raindrop size. As shown in the right image of Figure 8, the raindrops have different sizes, showing a parallax effect.
  • the current virtual character is bound with a camera model, and the image obtained by shooting the virtual environment with the camera model represents the virtual environment image observed by the current virtual character, and a biconical model is constructed for the camera model, as shown in the left figure of Figure 9.
  • the cone model surrounds the camera model. Attach the second raindrop map to the surface of the biconical model. When the biconical model swings, the second raindrop map also swings with the biconical model. There is a certain angle between the camera model and the second raindrop map.
  • the raindrops in the rain image captured by the camera model, the raindrops present a certain inclination angle; when the two tips of the biconical model are perpendicular to the horizontal plane, the camera model is perpendicular to the second raindrop map, and the raindrops captured by the camera model are In the picture, raindrops fall vertically from top to bottom.
  • the camera model corresponding to the current virtual character is added to the biconical model, so as to simulate a rain scene in the real world. It should be noted that, during the running of the game application, neither the camera model nor the biconical model is displayed in the virtual environment picture, that is, the user cannot see the biconical model and the camera model, and FIG. 9 is for illustration only.
  • the raindrop map is displayed in the field of view of the virtual character according to the depth information, showing a visual phenomenon of "near big and far small", That is, the size of the raindrops close to the virtual character is larger, and the size of the raindrops far from the virtual character is smaller, so that the scene when it rains in the real world is realistically simulated.
  • particle emitters are usually used to simulate rain in the virtual environment
  • the weather rendering method in the virtual environment further includes the following steps:
  • Step 51 When the virtual character moves to the edge position of the enclosing range corresponding to the particle distribution box, obtain the moving direction of the current virtual character.
  • a particle emitter is usually arranged above the current virtual character 29 to realize that the current virtual character 29 is always located in the raining scene.
  • the virtual character corresponds to a particle distribution box.
  • the particle distribution box is a box-like model for emitting particles.
  • the particle distribution box is divided into m ⁇ n sub-distribution boxes.
  • the virtual character is located in the particle distribution box. Both m and n are positive integers.
  • Each subdistribution box corresponds to a particle emitter.
  • Particle distribution boxes are used to simulate weather scenes represented by particles through particle emitters.
  • m and n may be equal or unequal.
  • the particle distribution box is used to simulate rain, snow, sandstorm, hail and other weather scenarios that need to be represented by particles.
  • each surface of the particle distribution box 30 is divided into a 3 ⁇ 3 nine-square grid, and the particle distribution box is divided into sub-distribution boxes.
  • the distribution box corresponds to the particle emitter.
  • the current virtual character is located in the particle distribution box 30 .
  • the edge position refers to the position corresponding to the sub-distribution box of the edge layer of the particle distribution box that the current avatar is located. As shown in the left figure of Figure 11, the edge position is the position corresponding to the nearest layer of sub-distribution box 33, that is, the current avatar. The position corresponding to the first-layer sub-distribution box where 29 is located. The closest sub-distribution box 33 is the sub-distribution box of the first layer located in front of the current avatar's field of view and closest to the avatar.
  • Step 52 Move the layer of sub-distribution boxes farthest from the current virtual character to the front of the nearest layer of sub-distribution boxes along the moving direction, adjacent to the nearest layer of sub-distribution boxes, and the nearest layer of sub-distribution boxes.
  • the sub-distribution box is located in front of the current avatar's field of vision, and has the shortest distance from the current avatar.
  • the moving process of the particle distribution box is described from a top view, and the current virtual character 29 is located at the edge position of the enclosing range corresponding to the particle distribution box.
  • the moving direction of the current avatar is the direction indicated by the arrow, and the first layer of sub-distribution box located at the first position 31 is moved to the second position 32.
  • the first position 31 corresponds to the layer of sub-distribution box farthest from the current avatar.
  • the second position 32 is located in front of the nearest layer of sub-distribution box 33, and is adjacent to the nearest layer of sub-distribution box 33, the nearest layer of sub-distribution box 33 and the current virtual character
  • the distance between 29 is the shortest.
  • the particle distribution box is not displayed in the virtual environment screen, that is, the user cannot see the particle distribution box, and FIG. 11 is only for illustration.
  • the distribution box on the layer farthest from the virtual character It will move to the front of the virtual character's field of vision along the moving direction of the virtual character, so that the virtual character is always surrounded by the particle distribution box, so as to ensure that the virtual character is always surrounded by raindrops in a rainy scene, realizing the simulation of the real world.
  • the raindrops are blocked by using the depth information of the shielding element in the virtual environment to simulate the scene of people sheltering from the rain under the building in the real world, then the weather rendering method in the virtual environment It also includes the following steps:
  • Step 61 When the current avatar is located in the indoor environment, obtain depth information of the shielding object element, the depth information is obtained from the perspective of looking down on the virtual environment, and the shielding object element is used to provide the indoor environment for the current avatar.
  • the game application obtains the depth information of the occlusion element from the perspective of looking down on the virtual environment, that is, from the direction of the sky of the virtual environment to the ground of the virtual environment, to obtain the depth information of the occlusion element.
  • the shelter element includes at least one of a house element, a hut element, a pavilion element, a bridge hole element, a tunnel element, and a straw shed element.
  • the depth information of the occluder element is used to represent the distance between the highest point in the sky of the virtual environment (ie, the highest point of the sky sphere model) and the occluder element.
  • the depth information of occlusion elements is pre-stored in the weather system in offline state.
  • the current avatar can hide under cover to avoid raindrops in the virtual environment.
  • Step 62 Delete part of the second raindrop texture according to the depth information of the occluder element and the depth information corresponding to the avatar.
  • Part of the second raindrop texture refers to the second raindrop texture occluded by the occluder element in front of the current virtual character's field of view.
  • the depth information corresponding to the current avatar includes depth information corresponding to other avatars around the current avatar, and depth information of virtual elements within a preset range around the current avatar (virtual elements include shade elements, plant elements, and animal elements. at least one).
  • d represents the raindrop depth value
  • B represents the parameter corresponding to the raindrop depth information stored in the B channel (Blue channel) in the raindrop map
  • d range represents the depth range corresponding to the raindrop depth information
  • d base represents the virtual element in the virtual environment.
  • Base depth value, i represents the raindrop texture in the i-th texture layer.
  • the raindrop depth value is calculated by the above formula, that is, from the perspective of looking down on the virtual environment, when the raindrop falls to the ground, the distance between the raindrop and the highest point of the sky in the virtual environment.
  • h represents the height of the raindrop
  • d represents the raindrop depth value
  • the height at which the raindrop is located represents the height at which the raindrop falls from the sky of the virtual environment.
  • d represents the depth value of the raindrop
  • d scene represents the depth value of the occlusion element in the virtual environment
  • h represents the height value of the raindrop
  • h scene represents the height of the occlusion element in the virtual environment in the virtual environment
  • 1 represents the occlusion
  • the object element blocks the raindrops, and 0 means that the raindrops are not blocked.
  • the part of the second raindrop texture in front of the virtual character's field of vision is deleted, forming the raindrop that is "" Occlusion" phenomenon outside the occluder element.
  • the current avatar 29 moves to the indoor environment, the current avatar stands on the steps of the building, faces the outdoor environment, and raindrops are displayed in front of the current avatar 29 .
  • the current virtual character 29 is standing on a step 35 belonging to an indoor environment, and the other side of the step 35 is displayed with water splashes generated by raindrops 34.
  • the current virtual character 29 The ground in the standing position is dry, and the ground on the other side of the step 35 is wet.
  • the raindrop texture is the farthest away from the user, and the distance between the texture corresponding to the step element (indoor environment) and the user is less than The distance between the raindrop map and the user, use the map corresponding to the step element to block the raindrop map to achieve the effect shown in 12.
  • the position where the raindrop map appears in front of the virtual character is determined according to the depth information around the virtual character and the scene depth information of the raining scene, so that when the virtual character is in the In the indoor environment, the raindrops are effectively blocked to simulate the real world.
  • the weather rendering method in the virtual environment further includes the following steps:
  • Step 71 Obtain the initial position and the shooting position.
  • the initial position is the position where the water splash first appears on the ground of the virtual environment
  • the shooting position is the position where the camera model is located. .
  • the current virtual character corresponds to a camera model, and meshes are divided on the ground of the virtual environment, and each mesh corresponds to a splash.
  • the water splash is generated by raindrops falling on the ground of the virtual environment.
  • the position of the water splash changes dynamically.
  • the position of the water splash is updated by updating the position of the grid. Constantly updating the position of the grid will cause the performance consumption of the terminal, making the terminal surface temperature rises.
  • the graphics processing unit is used to update the position of the water spray at preset time intervals, and the position of the water spray moves with the camera model, thereby simulating the water spray effect in a rainy scene. Reduced terminal performance consumption, thereby avoiding terminal surface temperature rise.
  • Step 72 Calculate the i-th cycle position when the water splash is generated for the i-th time according to the initial position and the shooting position, where i is a positive integer.
  • P 0 represents the initial position of a single splash
  • P camera represents the shooting position of the camera model
  • P i represents the ith cycle to generate the splash.
  • Step 73 Obtain a position offset parameter, where the position offset parameter is used to represent the offset position when the spray cycle is generated.
  • Step 74 Calculate the i+1 th loop position when the i+1 th water splash is generated according to the position offset parameter and the ith loop position.
  • P i represents the generation of splash in the i-th cycle
  • represents the position offset parameter
  • t total represents the start time of the game to the current time
  • t cycle represents the time of one life cycle of the splash cycle, that is, the time from the generation to the disappearance of the splash.
  • Step 75 Repeat the above steps of generating the cyclic position of the water splash until the rain scene is switched.
  • the above two formulas are used to repeat the calculation to obtain the circulation position of the water splash.
  • the calculation of the circulation position of the generated water splash is stopped.
  • FIG. 13 the rendering of the water splash obtained by using the method provided in the embodiment of the present application is shown.
  • the method of this implementation calculates the first cycle position for generating the i-th spray by obtaining the shooting position of the camera model and the initial position of the spray on the virtual environment screen, so as to pass the first cycle position of the i-th spray.
  • Calculate the second loop position for generating the i+1th spray and use a fixed number of sprays to perform position offsets in the virtual environment, without the need for the terminal to generate new sprays, and improve the performance of the terminal when rendering sprays in rainy scenes.
  • the effect diagram of the rain scene in the virtual environment picture is obtained, which realizes a more realistic simulation of the real world.
  • the virtual environment picture is prone to layering phenomenon, and the area 47 in FIG. 15(a) represents the raindrop layering phenomenon that raindrops appear in the virtual environment.
  • the raindrop depth information stored in the channel of the second raindrop map is used, so that the raindrops are coherent and the layering phenomenon does not occur.
  • the game application obtains the weather configuration information , control the virtual environment screen to display a coherent raindrop effect, the raindrop effect is obtained after rendering the second raindrop map, and the second raindrop map is obtained according to the weather configuration information and the corresponding relationship between the weather (the second raindrop map is obtained through the above method. Embodiments are described, and will not be repeated here), raindrop depth information is stored in the channel of the second raindrop map, and the raindrop depth information represents the distance between the raindrop map and the virtual character.
  • f(x) represents the color of raindrops on each layer of raindrop map
  • R represents the raindrop shape information stored in the R channel (Red channel) in the raindrop map
  • g(x) represents the raindrop occlusion result in formula 3.
  • C represents the rendering color of raindrops
  • f(x) represents the color of raindrops on each layer of raindrop maps
  • i represents the raindrop maps of the i-th layer
  • n represents the n layers of raindrop maps.
  • the color of the raindrops on each layer of raindrop maps is calculated by using the raindrop occlusion results calculated in Formula 3 and the raindrop shape information stored in the R channel of the raindrop map, and the color of the raindrops on each layer of raindrop maps is accumulated by Formula 7, so that Finally, the overall color of the raindrops in the virtual environment picture tends to be consistent, so that the raindrops do not appear layered, as shown in (b) of Figure 15 .
  • FIG. 16 shows a flowchart of a method for rendering weather in a virtual environment provided by another exemplary embodiment of the present application.
  • the method can be applied to the terminal 100 shown in FIG. 1 , where the terminal 100 is installed with a client corresponding to an application program that supports virtual scene running.
  • the client operates based on the weather system 110 .
  • the method includes the following steps:
  • Step 1601 Obtain weather configuration information, where the weather configuration information is used to render a weather scene in a virtual environment.
  • step 1601 For the implementation of step 1601, reference may be made to the implementation of step 201 in the embodiment shown in FIG. 2 and the implementation of step 501 in the embodiment shown in FIG. 5 , and details are not repeated here.
  • Step 1602 Acquire a second scattering map corresponding to the sky according to the weather configuration information.
  • the first scatter map is a scatter map at a sky height above the specified height in the virtual environment; the second scatter map is a scatter map at a height below or equal to the specified height in the virtual environment; the first scatter map does not include the second scatter map.
  • the sky height can be represented by altitude.
  • the corresponding weather scene is calculated according to the weather configuration information and the weather correspondence, and the second scattering map is obtained according to the weather scene.
  • Step 1603 When the weather scene includes sky light and shadow effects, process the second scattering map corresponding to the sky to obtain a processed rendering map, and the resolution of the processed rendering map is smaller than that of the second scattering map.
  • Atmospheric scattering refers to the phenomenon that light interacts with atmospheric particles to redistribute incident energy in all directions with a certain law.
  • the calculation data about light scattering is stored offline in a scattering map (Inscattering Map), and the scattering map is directly sampled when the game is running to optimize the efficiency at runtime. That is, the game application calculates the scattering map of the sky in the virtual environment according to the weather correspondence and weather configuration information, stores the scattering map offline in the graphics processing unit, and directly obtains the scattering map to load the virtual environment when the game application is running. in the sky.
  • the embodiment of the present application optimizes the amount of calculation when rendering the sky, by rendering the scattering map of the sky part in the full-screen rendering screen 40 of the terminal into a rendering with a resolution of 64 ⁇ 64.
  • the calculation amount of the embodiment of the present application is reduced by about 500 times.
  • the rendering map 36 corresponding to the sky is attached to the sky ball model 39 to form a lighting scene corresponding to the rendered sky.
  • a cloud map 38 may be attached to the sky ball model 39 to enrich the rendered picture, and the cloud map 38 may also be rendered to a smaller resolution, so as to reduce the amount of computation brought by the rendering process.
  • a fog map 37 can be attached to the sky ball model 39 to enrich the rendered picture, and the fog map 37 can also be rendered to a smaller resolution to reduce the amount of calculation brought by the rendering process. .
  • Step 1604 Render a lighting scene corresponding to the sky of the virtual environment according to the processed rendering map.
  • the lighting scene corresponding to the sky of the virtual environment is rendered by the atmospheric scattering model, and the lighting scene corresponding to the full-screen rendering picture 40 is obtained by the method provided by the embodiment of the present application.
  • L(s, ⁇ ) represents the scattering result
  • L 0 represents the light intensity before entering the absorbing medium
  • F ex (s) represents the transmittance ratio
  • Lin (s, ⁇ ) represents the internal scattering result
  • represents the scattering angle.
  • the transmittance ratio is expressed by the following formula nine:
  • ⁇ R is the Rayleigh scattering coefficient
  • ⁇ M is the Mie scattering coefficient
  • s is the distance from the target to the light source.
  • ⁇ R is the Rayleigh scattering coefficient
  • ⁇ M is the Mie scattering coefficient
  • s is the distance from the target to the light source
  • is the scattering angle
  • E sun is the sunlight intensity.
  • the fog scene is rendered by the atmospheric scattering model.
  • the atmospheric scattering model when invoked to render the fog, when calculating the internal scattering result, the Rayleigh scattering coefficient and the Mie scattering coefficient are calculated.
  • the pixels corresponding to the calculation results of the light intensity part are rendered to a render map with a resolution of 64 ⁇ 64.
  • Figure 18 shows the rendered sky light and shadow effect obtained by the method provided by the embodiment of the present application, which shows the sky in the virtual environment in each time period of the day, and (a) of Figure 18 shows the virtual environment when the sun just rises.
  • the corresponding sky light and shadow effects Figure 18 (b) represents the sky light and shadow effects corresponding to the virtual environment in the morning to noon time period
  • Figure 18 (c) represents the sky light and shadow effects corresponding to the virtual environment when the sun is about to set
  • Figure 18 (d) represents the sky light and shadow effect corresponding to the virtual environment at night after the sun has completely set.
  • the terminal by processing the scattering map into a rendering map with a smaller resolution, and simulating a lighting scene corresponding to the sky of the virtual environment by using the rendering map with a smaller resolution, the terminal supports the virtual environment when running.
  • the number of times of texture sampling is reduced to avoid performance degradation of the terminal.
  • FIG. 19 shows a flowchart of a weather rendering method in a virtual environment provided by another exemplary embodiment of the present application.
  • the method can be applied to the terminal 100 as shown in FIG. 1 , the terminal 100 is installed with a client corresponding to an application program supporting the running of a virtual scene, and the client runs based on the weather system 110 .
  • the method includes the following steps:
  • Step 1901 when the weather scene includes cloud effects, obtain at least two first texture cloud maps, the first texture cloud maps include two-dimensional cloud images with grayscale, and the at least two first texture cloud maps are in the reference texture cloud map
  • the base texture cloud map includes two-dimensional cloud images under no-light conditions, obtained by offsetting the pixels in different directions.
  • the terminal obtains weather configuration information, and obtains at least two first texture cloud maps based on the weather configuration information.
  • the first texture cloud map does not include a three-dimensional cloud image with colored colors.
  • the embodiment of the present application realizes the rendering of clouds through a two-dimensional texture cloud map.
  • the two-dimensional texture cloud map is an image with grayscale. Obtained through the pre-stored benchmark texture cloud map in the weather system.
  • the benchmark texture cloud map is a two-dimensional cloud image with cloud outlines and under lighting conditions, that is, a cloud image that is not affected by any lighting conditions.
  • the benchmark texture cloud map is Image with grayscale.
  • the reference texture cloud map is preprocessed through a ray marching algorithm to obtain a first texture cloud map.
  • the process of the ray stepping algorithm is: First, n rays (n is a positive integer) are emitted from the sun, and the rays have a sampling step size. When the ray is in the cloud, one sample per step is taken to obtain the texture value in the cloud, where the white rectangle represents the texture value of the white value, and the rectangle with shadow represents the gray value of the texture value.
  • the acquisition method of the first texture cloud map is shown in the upper figure of FIG. 21 , and the pixels in the reference texture cloud map 41 are offset in three directions: left, right, and upward.
  • the upper offset steps are equal, and the left offset first texture cloud map 42 , the right offset first texture cloud map 43 , and the upper offset first texture cloud map 44 are obtained, respectively.
  • Step 1902 Mix at least two first texture cloud maps according to the weights corresponding to the channels of the first texture cloud map to obtain a second texture cloud map.
  • the weight corresponding to the channel of the first texture cloud map 42 offset to the left is 0.6
  • the weight corresponding to the channel of the first texture cloud map 43 offset to the right is 0.3
  • the first texture cloud map 43 offset to the right corresponds to the channel.
  • the weight corresponding to the channel of the first texture cloud map 44 with the upper offset is 0.1.
  • the three first texture cloud maps are rendered to obtain the rendered second texture cloud map 45 as shown in the lower part of FIG. 21 .
  • Step 1903 rendering the cloud under the light change according to the second texture cloud map.
  • the cloud under the change of the light 46 in the virtual environment is obtained by using the above method of obtaining the second texture cloud map 45, and the rendered cloud rendering is shown in FIG. 22 .
  • the two-dimensional texture cloud map is preprocessed, and the first texture cloud map is mixed according to the weights corresponding to the channels of the preprocessed first texture cloud map, so as to obtain a mixed
  • the terminal only needs to perform the process of sampling the two-dimensional texture cloud map once, which reduces the energy consumption of the terminal.
  • the following describes the transition between time periods and between scenes and between scenes in the virtual environment.
  • FIG. 23 shows a flowchart of a weather rendering method in a virtual environment provided by another exemplary embodiment of the present application. This method can be applied to the terminal 100 shown in Fig. 1, where the terminal 100 is installed with a client corresponding to an application program that supports virtual scene running. The client operates based on the weather system 110 .
  • the method includes the following steps:
  • Step 2301 obtain a first parameter, a second parameter, a third parameter and a fourth parameter, the first parameter is used to represent the texture parameter corresponding to the first weather scene, and the second parameter is used to represent the texture parameter corresponding to the second weather scene,
  • the third parameter is used to represent the corresponding transition coefficient when the time period is changed, and the fourth parameter is used to represent the corresponding transition coefficient when the two weather scenarios are switched.
  • a time period refers to a collection of all parameters that control the performance of the virtual environment at various time points in a day of the virtual environment.
  • the weather scene in the virtual environment is associated with the time period, that is, when the time in the virtual environment is switched from the time period A to the time period B, the weather scene is also switched from the weather scene corresponding to the time period A to the time period B in the corresponding weather scene.
  • the weather scene of the virtual environment is sunny, a linear transition is performed between adjacent periods.
  • adjacent time periods and weather transition together.
  • Weather switching is first the change of environmental elements, and then the switching of weather scenes. For example, in a rainy scene, the sky darkens first, and then it starts to rain. There can be smooth transitions between time periods, between time periods and weather scenes, and between weather scenes and weather scenes.
  • Step 2302 Obtain, according to the first parameter, the second parameter, the third parameter and the fourth parameter, the corresponding rendering picture parameters when the weather scene is switched.
  • Step 2303 Render a transition picture when the first weather scene is switched to the second weather scene according to the rendering picture parameter.
  • d represents the screen rendering parameter
  • d 1 represents the texture parameter corresponding to the first weather scene (or the first time period)
  • d 2 represents the texture parameter corresponding to the second weather scene (or the second time period)
  • d 3 represents the first Two map parameters corresponding to weather scenes
  • represents the corresponding transition coefficient when the two weather scenes are switched
  • the value range of ⁇ is 0 to 1
  • 1 means that the weather scene does not change (one weather scene remains unchanged)
  • 0 means that the weather scene occurs Change (two weather scenes are switched)
  • represents the corresponding transition coefficient when the time period is changed
  • the value of ⁇ ranges from 0 to 1
  • 0 represents the night time period
  • 1 represents the day time period.
  • the value of ⁇ changes from 1 to 0.
  • the ⁇ value changes; if the weather scene does not change, the ⁇ value does not change.
  • the picture rendering parameters corresponding to any time period can be calculated, so the virtual environment smoothly transitions from the scene corresponding to the day time period to the scene corresponding to the night time period.
  • the sunny day scene transitions to the rainy day scene (the second weather scene)
  • the current time period of the virtual environment is the daytime time period
  • the current weather scene is the sunny day scene
  • the screen rendering parameters corresponding to any time period can be calculated, and the corresponding screen rendering parameters can also be calculated when the weather scene changes in a certain period of time. Therefore, the virtual environment can be performed between time periods and time periods. The screen transitions smoothly when the scene is switched, and the screen can also be smoothly transitioned when any two weather scenes are switched.
  • Step 2303 can be replaced with the following steps:
  • Step S1 updating and calculating the rendering picture parameters at preset time intervals.
  • Step S2 rendering a transition picture when the first weather scene is switched to the second weather scene according to the rendering picture parameter.
  • the weather transition is optimized every frame.
  • FPS Frames Per Second
  • the number of frames per second is 30, only 15 update calculations need to be done per second, which makes the weather system consume more energy.
  • Small terminal performance Illustratively, two scene maps need to be sampled during the time period switching process, and the consumption of the terminal performance by the weather system is reduced by means of frame-by-frame optimization. Therefore, the switching between weather scenes in the virtual environment can be smoothly transitioned.
  • the first parameter, the second parameter, the third parameter and the fourth parameter are used to calculate the picture rendering parameters of the weather scene when the weather scene is cut, and the picture is rendered according to the picture rendering parameters, so that the virtual The weather scene of the environment can smoothly transition between any weather or any time period to ensure the smoothness of the display screen.
  • the screen rendering parameters are updated by updating the screen rendering parameters every preset time, so as to avoid frequent updating and calculation of the screen rendering parameters, which may reduce the performance of the terminal when loading the virtual environment screen.
  • FIG. 25 shows a schematic structural diagram of a weather rendering apparatus in a virtual environment provided by an exemplary embodiment of the present application.
  • the device can be implemented as all or a part of the terminal through software, hardware or a combination of the two, and the device includes:
  • an obtaining module 2510 configured to obtain at least one weather map of the weather scene in the virtual environment
  • the processing module 2520 is configured to remove the first weather map in at least one weather map to obtain the remaining second weather map; the first weather map includes the weather map in the virtual environment outside the field of view of the current virtual character;
  • the rendering module 2530 is configured to render the weather scene in the virtual environment according to the weather map.
  • the first weather map includes at least one of the following maps:
  • the first weather map includes a first raindrop map located in front of the current avatar's field of view and outside the preset distance of the current avatar;
  • the first weather map includes a first scattering map at a sky height higher than a specified height, and the first scattering map is an image for expressing the light scattering effect on the sky.
  • the obtaining module 2510 is configured to obtain the texture layer divided in the line of sight direction of the current virtual character when the weather scene includes a raindrop effect;
  • the processing module 2520 is used to add a second raindrop texture in the texture layer.
  • the second raindrop texture includes a raindrop texture located in front of the current virtual character's field of view and within a preset distance of the current virtual character.
  • the channel of the second raindrop texture Raindrop depth information is stored in the , and the raindrop depth information is used to represent the distance between the current virtual character and the second raindrop texture, and the raindrop depth information is positively correlated with the raindrop size;
  • the rendering module 2530 is configured to render raindrops with a raindrop size matching the raindrop depth information in the rain scene.
  • the obtaining module 2510 is configured to obtain the depth information of the shielding object element when the current virtual character is located in the indoor environment, the depth information is obtained from the perspective of looking down on the virtual environment, and the shielding object element is used for The current avatar provides an indoor environment;
  • the processing module 2520 is used to delete part of the second raindrop map according to the depth information of the shield element and the depth information corresponding to the current virtual character, and part of the second raindrop map refers to the second raindrop map that is blocked by the shield element in front of the field of view of the current virtual character. Raindrop texture.
  • the current virtual character corresponds to a camera model
  • the acquisition module 2510 is used to acquire the initial position and the shooting position, the initial position is the position where the water splash first appears on the ground of the virtual environment, the shooting position is the position where the camera model is located, and the water splash is used to represent the raindrops landing on the ground of the virtual environment produced splash;
  • the processing module 2520 is used to calculate the i-th cycle position when the water splash is generated for the i-th time according to the initial position and the shooting position, where i is a positive integer;
  • the obtaining module 2510 is used to obtain the position offset parameter, and the position offset parameter is used to represent the offset position when the spray cycle is generated;
  • the processing module 2520 is configured to calculate the i+1 th cycle position when the i+1 th water splash is generated according to the position offset parameter and the ith cycle position; repeat the above steps of generating the cyclic position of the water splash until the rain scene is switched.
  • the processing module 2520 is configured to process the second scattering map corresponding to the sky when the weather scene includes a sky light and shadow effect to obtain a processed rendering map with a resolution of the processed rendering map smaller than the resolution of the second scatter map, which includes the scatter map at sky heights below the specified height;
  • the rendering module 2530 is configured to render the lighting scene corresponding to the sky of the virtual environment according to the processed rendering map.
  • the obtaining module 2510 is configured to obtain at least two first texture cloud maps when the weather scene includes cloud effects, the first texture cloud maps include two-dimensional cloud images with grayscale, and at least two The first texture cloud map is obtained by offsetting pixels in the reference texture cloud map in different directions, and the reference texture cloud map includes a two-dimensional cloud image under no-light conditions;
  • a processing module 2520 configured to mix at least two first texture cloud maps according to the weights corresponding to the channels of the first texture cloud map to obtain a second texture cloud map;
  • the rendering module 2530 is configured to render the cloud under the light change according to the second texture cloud map.
  • the current virtual character corresponds to a particle distribution box
  • the particle distribution box is divided into m ⁇ n sub-distribution boxes, the current virtual character is located in the particle distribution box, m and n are both positive integers, each sub-distribution box
  • the distribution box corresponds to a particle emitter, and the particle distribution box is used to simulate the weather scene represented by particles through the particle emitter;
  • the obtaining module 2510 is used to obtain the moving direction of the current virtual character when the current virtual character moves to the edge position of the enclosing range corresponding to the particle distribution box;
  • the processing module 2520 is used to move the layer of sub-distribution boxes farthest from the current virtual character to the front of the nearest layer of sub-distribution boxes along the moving direction, adjacent to the nearest layer of sub-distribution boxes, and the nearest layer of sub-distribution boxes.
  • the sub-distribution box of the first layer is located in front of the current avatar's field of vision and has the shortest distance from the current avatar.
  • the obtaining module 2510 is configured to obtain a first parameter, a second parameter, a third parameter and a fourth parameter, the first parameter is used to represent the texture parameter corresponding to the first weather scene, and the second parameter It is used to represent the texture parameter corresponding to the second weather scene, the third parameter is used to represent the corresponding transition coefficient when the time period is changed, and the fourth parameter is used to represent the corresponding transition coefficient when the two weather scenes are switched;
  • the processing module 2520 is used to obtain the corresponding rendering picture parameters when the weather scene is switched according to the first parameter, the second parameter, the third parameter and the fourth parameter; Transition screen.
  • processing module 2520 is configured to update and calculate the rendering picture parameters at preset time intervals
  • the rendering module 2520 is configured to render a transition picture when the first weather scene is switched to the second weather scene according to the rendering picture parameter.
  • FIG. 26 shows a structural block diagram of a computer device 2600 provided by an exemplary embodiment of the present application.
  • the computer device 2600 can be a portable mobile terminal, such as: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III, moving picture expert compression standard audio layer 3), MP4 (Moving Picture Experts Group Audio Layer IV, Motion Picture Expert Compression Standard Audio Layer 4) Player.
  • Computer device 2600 may also be referred to by other names such as user equipment, portable terminal, and the like.
  • computer device 2600 includes: processor 2601 and memory 2602 .
  • the processor 2601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like.
  • the processor 2601 can use at least one hardware form among DSP (Digital Signal Processing, digital signal processing), FPGA (Field-Programmable Gate Array, field programmable gate array), PLA (Programmable Logic Array, programmable logic array) accomplish.
  • the processor 2601 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the wake-up state, also called CPU (Central Processing Unit, central processing unit); the coprocessor is A low-power processor for processing data in a standby state.
  • the processor 2601 may be integrated with a GPU (Graphics Processing Unit, image processor), and the GPU is used for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 2601 may further include an AI (Artificial Intelligence, artificial intelligence) processor, where the AI processor is used to process computing operations related to machine learning.
  • AI Artificial Intelligence, artificial intelligence
  • Memory 2602 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 2602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more disk storage devices, flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 2602 is used to store at least one instruction, and the at least one instruction is used to be executed by the processor 2601 to implement the virtual environment provided in the embodiments of the present application. weather rendering method.
  • FIG. 26 does not constitute a limitation to the computer device 2600, and may include more or less components than the one shown, or combine some components, or adopt different component arrangements.
  • An embodiment of the present application further provides a computer device, the computer device includes a processor and a memory, and the memory stores at least one instruction, at least one piece of program, code set or instruction set, the at least one instruction, the at least one piece of program, the at least one piece of program, the The code set or the instruction set is loaded and executed by the processor to implement the weather rendering method in the virtual environment provided by the above method embodiments.
  • Embodiments of the present application further provide a computer-readable storage medium, in which at least one instruction, at least one piece of program, code set or instruction set is stored, and the at least one instruction, at least one piece of program, code set or instruction set is processed by The server is loaded and executed to implement the weather rendering method in the virtual environment provided by the above method embodiments.
  • Embodiments of the present application further provide a computer program product or computer program, where the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • a processor of a computer device reads the computer instructions from the computer-readable storage medium, the processor executes the computer instructions, causing the computer device to perform the weather rendering method in a virtual environment as described above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Environmental & Geological Engineering (AREA)
  • Environmental Sciences (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

本申请公开了一种虚拟环境中的天气渲染方法、装置、设备、介质及程序,属于图像处理技术领域。该方法包括:获取虚拟环境中天气场景的至少一张天气贴图(201);剔除至少一张天气贴图中的第一天气贴图,得到剩余的第二天气贴图(202);根据第二天气贴图渲染虚拟环境中的天气场景(203)。该方法通过减少天气贴图的数量来减少贴图采样的次数,提高终端在运行支持虚拟环境的应用程序时的性能。

Description

虚拟环境中的天气渲染方法、装置、设备、介质及程序
本申请要求于2020年11月16日提交的申请号为202011280349.7、发明名称为“虚拟环境中的天气渲染方法、装置、设备及存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及图像处理技术领域,特别涉及一种虚拟环境中的天气渲染方法、装置、设备、介质及程序。
背景技术
在三维虚拟环境的应用程序中,如大型多人在线角色扮演游戏,呈现出模拟现实世界中的天气,使得用户在控制虚拟角色进行游戏时,具有更真实的体验。
以天气场景为下雨场景来举例说明,在渲染下雨场景时,各张雨滴贴图上的雨滴尺寸不同,需要将多张雨滴贴图按照一定顺序添加在虚拟角色的视野前方,来模拟真实世界中的下雨场景。
发明内容
本申请实施例提供了一种虚拟环境中的天气渲染方法、装置、设备、介质及程序。所述技术方案如下:
根据本申请的一个方面,提供了一种虚拟环境中的天气渲染方法,应用于计算机设备中,所述方法包括:
获取虚拟环境中天气场景的至少一张天气贴图;
剔除所述至少一张天气贴图中的第一天气贴图,得到剩余的第二天气贴图;所述第一天气贴图包括所述虚拟环境中位于当前虚拟角色的视野之外的天气贴图;
根据所述第二天气贴图渲染所述天气场景。
根据本申请的另一方面,提供了一种虚拟环境中的天气渲染装置,所述装置包括:
获取模块,用于获取虚拟环境中天气场景的至少一张天气贴图;
处理模块,用于剔除所述至少一张天气贴图中的第一天气贴图,得到剩余的第二天气贴图;所述第一天气贴图包括所述虚拟环境中位于当前虚拟角色的视野之外的天气贴图;
渲染模块,用于根据所述第二天气贴图渲染所述天气场景。
根据本申请的另一方面,提供了一种计算机设备,所述计算机设备包括:处理器和存储器,所述存储器中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如上方面所述的虚拟环境中的天气渲染方法。
根据本申请的另一方面,提供了一种计算机可读存储介质,所述存储介质中存储有计算机程序,所述计算机程序由处理器加载并执行以实现如上方面所述的虚拟环境中的天气渲染方法。
根据本申请的另一方面,提供了一种计算机程序产品或计算机程序,所述计算机程序产品或计算机程序包括计算机指令,所述计算机指令存储在计算机可读存储介质中。计算机设备的处理器从所述计算机可读存储介质读取所述计算机指令,所述处理器执行所述计算机指令,使得所述计算机设备执行如上方面所述的虚拟环境中的天气渲染方法。
本申请实施例提供的技术方案带来的有益效果至少包括:
在获取得到虚拟环境中天气场景的至少一张天气贴图之后,剔除至少一张天气贴图中的 第一天气贴图,第一天气贴图是虚拟环境中位于当前虚拟角色的视野之外的天气贴图,这一天气贴图在缺失时不影响当前虚拟角色的视野内天气场景的渲染,也不影响整体虚拟环境的渲染,进而使得用户使用的计算机设备在加载虚拟环境画面时,保证天气场景的正常显示的同时,能够通过减少天气贴图的数量来减少贴图采样的次数,从而避免计算机设备在运行支持虚拟环境的应用程序时,性能大幅度降低。
附图说明
图1是本申请一个示例性实施例提供的终端的框图;
图2是本申请一个示例性实施例提供的虚拟环境中的天气渲染方法的流程图;
图3是本申请一个示例性实施例提供的天气***的框架图;
图4是本申请一个示例性实施例提供的天空对应的散射贴图的示意图;
图5是本申请另一个示例性实施例提供的虚拟环境中的天气渲染方法的流程图;
图6是本申请一个示例性实施例提供的天气配置参数的设置界面示意图;
图7是本申请另一个示例性实施例提供的天气配置参数的设置界面示意图;
图8是本申请一个示例性实施例提供的贴图层划分的示意图;
图9是本申请一个示例性实施例提供的下雨场景的模拟效果示意图;
图10是本申请一个示例性实施例提供的粒子发射器的工作示意图;
图11是本申请一个示例性实施例提供的粒子分布盒的示意图;
图12是本申请一个示例性实施例提供的虚拟环境画面对应的下雨场景的画面图;
图13是本申请一个示例性实施例提供的虚拟环境中的水花生成的画面示意图;
图14是本申请另一个示例性实施例提供的虚拟环境画面对应的下雨场景的画面图;
图15是本申请另一个示例性实施例提供的虚拟环境画面对应的下雨场景的画面图;
图16是本申请另一个示例性实施例提供的虚拟环境中的天气渲染方法的流程图;
图17是本申请另一个示例性实施例提供的天空对应的散射贴图的示意图;
图18是本申请一个示例性实施例提供的天空渲染的效果示意图;
图19是本申请另一个示例性实施例提供的虚拟环境中的天气渲染方法的流程图;
图20是本申请一个示例性实施例提供的纹理云贴图的渲染示意图;
图21是本申请另一个示例性实施例提供的纹理云贴图的渲染示意图;
图22是本申请一个示例性实施例提供的云彩渲染的效果示意图;
图23是本申请另一个示例性实施例提供的虚拟环境中的天气渲染方法的流程图;
图24是本申请一个示例性实施例提供的天气场景过渡的流程框架图;
图25是本申请一个示例性实施例提供的虚拟环境中的天气渲染装置的框图;
图26是本申请一个示例性实施例提供的计算机设备的结构示意图。
具体实施方式
首先,对本申请实施例中涉及的名词进行介绍:
虚拟环境:是应用程序在终端上运行时显示(或提供)的虚拟环境。该虚拟环境可以是对真实世界的仿真环境,也可以是半仿真半虚构的环境,还可以是纯虚构的环境。虚拟环境可以是二维虚拟环境、2.5维虚拟环境和三维虚拟环境中的任意一种,本申请对此不加以限定。下述实施例以虚拟环境是三维虚拟环境来举例说明。
虚拟角色:是指虚拟场景中的可活动对象。该可活动对象可以是虚拟人物、虚拟动物、动漫人物等,比如:在三维虚拟场景中显示的人物、动物、植物、油桶、墙壁、石块等。可选地,虚拟角色是基于动画骨骼技术创建的三维立体模型。每个虚拟角色在三维虚拟场景中具有自身的形状和体积,占据三维虚拟场景中的一部分空间。虚拟角色泛指虚拟场景中的一个或多个虚拟角色。以游戏应用程序为例,虚拟角色是用户参与游戏时控制的可活动对象。
大型多人在线角色扮演游戏(MMOPRG,Massively Multiplayer Online Role-Playing Game):是指一种支持多人在线的网络游戏,不同客户端可在同一场景中进行游戏,不同客户端可协同完成一项任务,客户端之间可在线交流,客户端还可与游戏中的非玩家角色(NPC,Non-Player Character)进行互动。通常情况下,用户通过在客户端登录用户帐号控制虚拟角色,虚拟角色与用户帐号(ID,Identity)一一对应。用户控制的虚拟角色在虚拟环境中扮演不同的角色,比如,将军角色、法师角色、书生角色、舞女角色等。大型多人在线游戏包括战略类、动作类、冒险类、模拟类、运动类、赛车类、角色扮演类等类型。下述实施例以在客户端是游戏客户端为例说明。
本申请中提供的方法可以应用于三维地图程序、军事仿真程序、第一人称射击游戏(First-Person Shooting Game,FPS)、多人在线战术竞技游戏(Multiplayer Online Battle Arena Games,MOBA)、MMOPRG游戏、虚拟现实应用程序(Virtual Reality,VR)、增强现实应用程序(Augmented Reality,AR)等,下述实施例是以游戏应用程序来举例说明。
基于虚拟场景的游戏由一个或多个游戏世界的地图构成,游戏中的虚拟环境模拟真实世界的环境,用户可以操控游戏中的虚拟角色在虚拟环境中进行行走、跑步、跳跃、射击、格斗、驾驶、受到其他虚拟角色的攻击(其他虚拟角色是其他用户控制的虚拟角色)、受到虚拟环境中的伤害、攻击其他虚拟角色等动作,交互性较强,并且多个用户可以在线组队进行竞技游戏。
示意性的,虚拟环境对应有变换的天气,比如,晴天、雨天、雪天、沙尘暴、雷阵雨、暴雨、暴雪等,与现实世界一样,虚拟环境被划分为各个时间段,每个时间段对应不一样的光影效果,比如,上午6点至7点是日出的时间,虚拟环境对应日出的光线,下午4点至5点是落日的时间,虚拟环境对应黄昏的光线。
本申请实施例提供了一种虚拟环境中的天气渲染方法,该方法中剔除了天气场景的至少一张天气贴图中的第一天气贴图,第一天气贴图在缺失时不影响当前虚拟角色的视野内的天气场景渲染,因此,可以使得终端在运行游戏应用程序时,贴图采样的次数减少,避免了终端的性能降低。
图1示出了本申请一个示例性实施例提供的终端的框图,本申请实施例提供的虚拟环境中的天气渲染方法应用于终端100中,该终端100安装有支持虚拟场景运行的应用程序对应的客户端,该应用程序包括三维地图程序、军事仿真程序、FPS游戏、MOBA游戏、MMOPRG游戏、VR应用程序、AR应用程序。下述实施例以应用程序为游戏应用程序来举例说明。
在一些实施例中,终端包括智能手机、平板电脑、MP3播放器、MP4播放器、膝上便携式计算机、台式计算机、笔记本电脑中的至少一种,以下实施例以终端包括智能手机为例进行说明。
以客户端为游戏应用程序的客户端(简称游戏客户端)来举例说明,客户端对应有天气***110包括天空光影子***111、雨滴子***112、云彩子***113和过渡子***114。
天空光影子***111用于模拟虚拟环境中的各种光照场景,如虚拟环境中太阳初升时的光照场景,黄昏时的光照场景,夜晚时月亮的光照场景。天空光影子***111根据天气配置信息计算出虚拟环境的天空对应的散射贴图,将该散射贴图进行离线存储,在游戏客户端运行时,直接通过获取散射贴图加载虚拟环境中的光照场景。该散射贴图不包括虚拟元素在空间维度上的第一散射贴图,也即不包括高于指定高度的天空高度上的第一散射贴图。贴图采样是指游戏客户端在运行过程中,终端的图形处理器(Graphics Processing Unit,GPU)访问(获取)贴图来加载天气场景的过程。
雨滴子***112用于模拟虚拟环境中的各种下雨场景,如小雨场景、大雨场景、暴雨场景、雷阵雨场景中的至少一种。雨滴子***112根据天气配置信息计算出雨滴贴图,在虚拟 环境处于下雨场景时,通过构建如图9所示的双锥形模型模拟下雨场景,该双锥形模型将与虚拟角色绑定的摄像机模型包围,将雨滴贴图贴附在该双锥形模型的表面,当双锥形模型的角度不断变化时,雨滴贴图与摄像机模型之间的角度也在不断变化,因此摄像机模型拍摄到的含有雨滴的画面也在不断变化。雨滴子***112对虚拟角色的视角范围沿视线方向进行划分,得到预设层数的贴图层,根据贴图层的位置添加预设层数的雨滴贴图,即在双锥形模型的表面贴附预设层数的雨滴贴图,该雨滴贴图中的B通道存储有雨滴深度信息,B通道为红(Red,R)绿(Green,G)蓝(Blue,B)色彩模式下的B通道。该雨滴深度信息表征虚拟角色(或虚拟角色绑定的摄像机模型)与雨滴贴图之间的距离,具有不同雨滴深度信息的雨滴贴图根据视差运动(Parallax)原理模拟下雨场景,即靠近虚拟角色的雨滴的雨滴尺寸较大,远离虚拟角色的雨滴的雨滴尺寸较小。
雨滴子***112还用于更新水花的位置,水花用于表征雨滴溅落在虚拟环境的地面上时产生的水花,通过将水花的位置与虚拟角色对应的摄像机模型关联,当摄像机模型移动时,水花会跟随摄像机模型移动。通过偏移水花在虚拟环境中的位置来模拟下雨时地面上源源不断的水花,无需终端重复生成水花,提高终端在运行游戏客户端时的性能。
雨滴子***112还用于针对虚拟角色生成粒子分布盒,该粒子分布盒将虚拟角色包围,该粒子分布盒子被划分为m×n个子分布盒,m和n均为正整数,m和n可以相等或不等,每个子分布盒对应有粒子发射器。当虚拟角色移动至粒子分布盒对应的包围范围的边缘位置时,距离虚拟角色最远的一层子分布盒会沿着虚拟角色移动的方向移动至最近一层的子分布盒的前方,且与最近一层的子分布盒相邻,最近一层子分布盒位于虚拟角色的视野前方,且与虚拟角色之间的距离最短。由于子分布盒的移动,使得虚拟角色始终位于粒子分布盒中,即虚拟角色始终被雨滴笼罩。示意性的,该粒子分布盒还用于模拟下雪场景、下冰雹场景、沙尘暴场景等需要以粒子表现的天气场景。
云彩子***113用于对二维云彩图像进行预处理,二维云彩图像是具有云彩轮廓,且无光照条件下的灰度图像,预处理是指通过光线步进算法(Raymarching)对二维云彩图形进行处理,即将二维云彩图像中的像素点按照预设步长向某一方向移动后得到的图像,将经过预处理后的至少两张纹理云贴图按照理纹云贴图的通道对应的权重进行混合,从而得到用于加载虚拟场景的云彩贴图,无需终端重复进行贴图采样来加载虚拟环境画面。
过渡子***114用于计算虚拟环境的天气场景切换时的画面渲染参数,根据画面渲染参数渲染两种天气场景切换时的过渡画面。
图2示出了本申请一个示例性实施例提供的虚拟环境中的天气渲染方法的流程图,该方法可应用于如图1所示的终端100中,该终端100安装有支持虚拟环境运行的应用程序对应的客户端,该客户端基于天气***110运行。该方法包括如下步骤:
步骤201,获取虚拟环境中天气场景的至少一张天气贴图。
针对天气贴图的获取,可以采用如下步骤实现:
1)终端获取天气配置信息,天气配置信息用于渲染虚拟环境中的天气场景。
示意性的,天气配置信息可以用于初始虚拟环境中的天气场景的渲染,还可以用于虚拟环境中天气场景切换时的天气场景的渲染。
示意性的,用户使用的终端上安装和运行有支持虚拟场景运行的应用程序,当终端运行该应用程序时,终端的显示屏上对应显示在使用该应用程序时的画面。该应用程序是基于天气***运行的,在终端的显示屏上对应显示天气场景。
以游戏应用程序为例,在运行游戏应用程序时,终端的显示屏显示有游戏中的虚拟环境画面。示意性的,该虚拟场景画面是以虚拟角色的第一人称视角对虚拟场景进行观察得到的画面,或,该虚拟场景画面是以虚拟角色的第三人称视角对虚拟场景进行观察得到的画面。
示意性的,游戏应用程序对应有服务器,服务器将天气配置信息发送至运行有应用程序 的终端中,游戏应用程序根据天气配置信息对虚拟环境对应的天气场景进行画面渲染,如,天气配置信息为当前虚拟环境所在的时刻,服务器向终端发送虚拟环境当前所在的时刻,如上午十点,游戏应用程序根据上午十点对应的天气场景加载虚拟环境画面。
示意性的,通过从终端中获取状态参数来获取天气配置参数,比如,终端为智能手机,智能手机显示当前时刻为上午十点,游戏应用程序根据智能手机显示的当前时刻对应的天气场景加载虚拟环境画面。
天气配置信息是指虚拟环境中天气元素的配置信息;天气配置信息包括虚拟环境所处的时间段、虚拟环境的时间流逝方式(如昼夜变化或固定在一个时间段不变)、光源参数、天空参数、雾气参数、云彩参数、风参数、粒子参数(用于表示天气场景中的粒子,如雨滴、雪花、沙尘暴等)、环境特效参数(如虚拟角色在释放技能时,虚拟环境受到的影响)、水面效果参数、太阳参数、月亮参数、镜头光晕(跟随虚拟角色移动的摄像机模型的镜头)中的至少一种。
示意性的,天气***中预先存储有天气对应关系,该天气对应关系用于表示天气配置信息和天气场景(包括场景效果)之间的对应关系,可以理解的是,技术人员在开发游戏应用程序时,在天气***中预先设置有天气对应关系,将天气***封装在游戏应用程序中。游戏应用程序在运行的过程中,根据天气配置信息和天气对应关系计算在该天气配置信息下的天气场景。天气场景是指在某种天气场景下虚拟环境呈现的场景效果。
示意性的,天气对应关系包括函数关系,如,天气配置信息与天气场景符合函数y=kx+b*e x,k和b为常数,x表示天气配置信息对应的参数,y表示天气场景。
示意性的,天气对应关系包括对照表关系,游戏应用程序根据天气配置信息从对照表中查询对应的天气场景。以表一表示天气配置信息和天气场景之间的对应关系。
表一
天气配置信息 天气场景
下午三点,光照强度0.4 落日余晖
雨滴粒子参数10 暴雨
雾气密度60% 浓雾
本申请实施例对天气对应关系的表达方式不加以限定。
2)根据天气配置信息,获取天气场景的至少一张天气贴图。
天气贴图用于表现天气场景的贴图,存在第一天气贴图和第二天气贴图这两类天气贴图,第一天气贴图包括虚拟环境中位于当前虚拟角色的视野之外的天气贴图,第二天气贴图包括虚拟环境中位于当前虚拟角色的视野之外的天气贴图。
天气***110用于如下几个方面中的至少一个方面:虚拟环境中的昼夜变化、模拟现实世界的天气表现以及在各种天气场景间的平滑过渡。如图3所示,天气***110包括昼夜变化子***101和天气子***102。其中,昼夜变化子***101用于根据天气配置信息改变虚拟环境中的场景画面,如,从白天场景变换至黄昏场景,从黄昏场景变换至夜晚场景,从夜晚场景变换至白天场景等。天气子***102用于根据天气配置信息改变虚拟环境中的天气场景,如,根据雨滴参数改变虚拟环境中的雨滴尺寸和雨滴密度;根据光源参数改变虚拟环境中的光线强度和光线照射方向等。昼夜变化子***101包括过渡子***,天气子***102包括天空光影子***、雨滴子***和云彩子***。可以理解的是,天气子***102还包括如风子***、银河子***等子***,各个子***分别独立运行,本申请实施例以天空光影子***、雨滴子***和云彩子***为例进行说明。
在天气***渲染天气的过程中,通常需要大量的、多个维度表示天气场景的天气贴图,终端基于天气配置信息进行贴图采样。贴图采样过程是指游戏应用程序从图形处理单元(Graphics Processing Unit,GPU)中获取天气贴图的过程。也即是游戏应用程序通过GPU按照天气配置信息绘制天气场景中虚拟元素的天气贴图。虚拟元素是虚拟环境中的元素,虚 拟元素包括平地、河流、湖泊、海洋、沙漠、天空、云彩、雨滴、雪花、植物、建筑、交通工具中的至少一种元素。
步骤202,剔除至少一张天气贴图中的第一天气贴图,得到剩余的第二天气贴图。
其中,第一天气贴图去除后不影响虚拟环境画面的渲染。第一天气贴图包括如下贴图中的至少一种:
1、当天气场景包括雨滴效果时,第一天气贴图包括位于当前虚拟角色的视野前方,且位于当前虚拟角色预设距离之外的第一雨滴贴图。
当前虚拟角色是指终端控制的虚拟角色。以当前虚拟角色为中心,在当前虚拟角色的视角范围内,沿当前虚拟角色的视线方向划分贴图层,如图8所示,通过在贴图层中添加雨滴贴图,使得用户通过当前虚拟角色绑定的摄像机模型看到位于当前虚拟角色视野前方的雨滴,本申请实施例通过在预设距离范围内添加预设贴图层数的雨滴贴图来减少雨滴贴图的数量,从而减少终端进行贴图采样的次数,提高终端在运行游戏应用程序时的性能。
2、当天气场景包括天空光影效果时,第一天气贴图包括高于指定高度的天空高度上的第一散射贴图,第一散射贴图是用于表现天空上的光线散射效果的图像。
如图4所示,以矩形表示散射贴图,通常在表示天空光影效果对应的散射贴图中的像素点包括四个维度的数值,四个维度的数值分别以坐标轴A、坐标轴B、坐标轴C和坐标轴D表示,其中,坐标轴A表示虚拟角色的视线和太阳最高点之间的夹角度数,太阳最高点表示太阳在天空中的最高点,天空是虚拟环境中的天空;坐标轴B表示太阳在天空中的任意位置与太阳最高点之间的夹角度数;坐标轴C表示虚拟元素在空间维度上的距离,本申请实施例以坐标轴C表示虚拟角色在太空中观察地球时,虚拟角色与地球上的虚拟环境中的虚拟元素之间的距离,其中,地球和太空形成的虚拟环境是模拟现实世界中的星球环境构建的,地球上的虚拟环境是虚拟角色通常进行活动的区域;坐标轴D表示虚拟角色的视线和太阳在天空中的任意位置时对应的夹角度数。
本申请实施例通过去除坐标轴C表示的数值来减少一部分散射贴图,也即去除指定高度及以上的天空高度上的散射图,将坐标轴A、坐标轴D和坐标轴B作为像素点的三个维度,减少了散射贴图的数量,从而减少终端进行贴图采样的次数,提高终端在运行游戏应用程序时的性能。
在一些实施例中,当天气场景包括云彩效果时,终端直接获取云彩的二维灰度贴图;因此,当天气场景包括云彩效果时,终端获得的第二天气贴图中还不包括具有彩色的三维云彩图像。
通常游戏应用程序使用体积云来表现虚拟环境的中的云彩效果,体积云是由具有彩色色彩的三维云彩图像构成,因此本申请实施例通过不获取带有彩色色彩的三维云彩图像,以具有灰度的二维云彩图像实现云彩效果,使得终端的运算量减少,提高终端在运行游戏应用程序时的性能。
步骤203,根据第二天气贴图渲染虚拟环境中的天气场景。
渲染是指将天气贴图添加在目标位置上,使目标位置呈现更丰富的细节。
示意性的,在下雨场景中,将不包括第一雨滴贴图的雨滴贴图添加在虚拟角色的视野前方,以虚拟角色的视角观察虚拟环境画面,虚拟环境处于下雨场景。
示意性的,在晴天场景中,将不包括第一散射贴图的散射贴图贴附在天空球模型上,以虚拟角色的视角观察虚拟环境画面,虚拟环境的天空呈现晴天场景对应的天空光影效果。天空球模型是用于表征虚拟环境的天空半球形模型。
示意性的,在晴天场景中,将具有灰度的二维云彩图像贴附在天空球模型上,以虚拟角色的视角观察虚拟环境画面,虚拟环境的天空呈现晴天场景对应的云彩效果。
综上所述,本实施例提供的方法,在获取得到虚拟环境中天气场景的至少一张天气贴图之后,剔除至少一张天气贴图中的第一天气贴图,第一天气贴图是虚拟环境中位于当前虚拟 角色的视野之外的天气贴图,这一天气贴图在缺失时不影响当前虚拟角色的视野内天气场景的渲染,也不影响整体虚拟环境的渲染,进而使得用户使用的计算机设备在加载虚拟环境画面时,保证天气场景的正常显示的同时,能够通过减少天气贴图的数量来减少贴图采样的次数,从而避免计算机设备在运行支持虚拟环境的应用程序时,性能大幅度降低。
以天气场景包括雨滴效果、天空光影效果、云彩效果为例,分别对本申请实施例提供的虚拟环境中的天气渲染方法进行说明。
1、雨滴效果。
图5示出了本申请另一个示例性实施例提供的虚拟环境中的天气渲染方法的流程图。该方法可应用于如图1所示的终端100中,该终端100安装有支持虚拟环境运行的应用程序对应的客户端,该客户端基于天气***110运行。该方法包括如下步骤:
步骤501,获取天气配置信息,天气配置信息用于渲染虚拟环境中的天气场景。
示意性的,服务器向客户端发送天气配置信息,该天气配置信息包括虚拟环境当前的时刻为下午三点,雨滴粒子参数为10,客户端在获取到该天气配置信息时,根据天气***中预先存储的天气对应关系获取与雨天相关的雨滴贴图,根据雨滴贴图渲染下雨场景。
天气***的运行与游戏应用程序在开发过程中设置的天气对应关系有关,天气配置信息中涉及的参数信息是通过设置界面设置的。
如图6所示,在设置界面中设置天气配置参数和天气对应关系。示意性的,在设置界面中通过手动输入虚拟环境的当前时间25,或者点击时间调节控件调节虚拟环境的当前时间25。通过设置时间流逝方式26设置虚拟环境中的时间是昼夜变化的,或者虚拟环境以固定的时间段运行,如无论终端何时运行游戏应用程序,虚拟环境显示的场景均为上午八点至九点对应的场景,该场景是固定不变的。示意性的,虚拟环境与现实世界一样,均为24小时制,可通过设置界面将24小时划分为9个时间段(如控件27中所示),各个时间段对应的时间长度相同或不同。
如图7所示。技术人员还可进一步进行更精细地设置,比如,针对虚拟环境所处的时间段为早上,设置在该时间段下天气场景的天气配置参数。如在该时间段下的光源参数、天空光影参数、雾气参数、云彩参数、风参数、色调参数等。
步骤502,根据天气配置信息,获取下雨场景对应的第二雨滴贴图。
第一雨滴贴图包括位于当前虚拟角色的视野前方,且位于当前虚拟角色预设距离之外的雨滴贴图。第二雨滴贴图包括位于当前虚拟角色的视野前方、且与当前虚拟角色之间的距离在预设距离之内的雨滴贴图。第一雨滴贴图中不包括第二雨滴贴图。终端通过天气配置信息计算对应的天气场景,根据天气场景获取第二雨滴贴图。
步骤503,当天气场景包括雨滴效果时,获取当前虚拟角色的视线方向上划分的贴图层。
终端当天气场景包括雨滴效果时,获取当前虚拟角色的视角范围,视角范围包括沿虚拟角色的视线方向划分的贴图层,贴图层的层数小于或等于预设层数。
如图8的左图所示,以A点表示虚拟角色所在的位置,示意性的,贴图层的预设层数为两层,在当前虚拟角色的视角范围内,沿当前虚拟角色的视线方向划分有贴图层1和贴图层2,其中贴图层2与当前虚拟角色的距离28位于当前虚拟角色的预设距离内。示意性的,贴图层之间的距离相等,通过预设距离控制贴图层的层数小于或等于预设层数。贴图层用于确定渲染下雨场景时所需的雨滴贴图的数量,每层贴图层对应一张雨滴贴图。在一些实施例中,贴图层之间的距离也可以不相等。
步骤504,在贴图层添加第二雨滴贴图,第二雨滴贴图的通道中存储有雨滴深度信息,雨滴深度信息用于表示虚拟角色与第二雨滴贴图之间的距离,雨滴深度信息与雨滴尺寸呈正相关关系。
为了模拟现实世界中“近大远小”的现象,在第二雨滴贴图的B通道中预先存储有雨滴 深度信息,该雨滴深度信息与雨滴尺寸呈正相关关系,B通道为红(Red,R)绿(Green,G)蓝(Blue,B)色彩模式下的B通道。由此可知,在同一张第二雨滴贴图上的雨滴尺寸不同。第二雨滴贴图泛指一张或多张雨滴贴图。
步骤505,根据雨滴深度信息渲染在下雨场景中符合雨滴尺寸的雨滴。
终端渲染出下雨场景中与雨滴深度信息匹配的雨滴尺寸的雨滴。也即根据第二雨滴贴图中存储的雨滴深度信息渲染下雨场景中的雨滴尺寸,使得距离当前虚拟角色较近的雨滴呈现较大的雨滴尺寸,距离当前虚拟角色较远的雨滴呈现较小的雨滴尺寸。如图8的右图所示,雨滴尺寸大小不一,呈现出视差(Parallax)效果。
当前虚拟角色绑定有摄像机模型,通过摄像机模型拍摄虚拟环境得到的画面来表现当前虚拟角色观察到的虚拟环境画面,针对摄像机模型构建双锥形模型,如图9的左图所示,该双锥形模型包围摄像机模型。将第二雨滴贴图贴附在双锥形模型的表面,当双锥形模型摆动时,第二雨滴贴图也跟随双锥形模型进行摆动,摄像机模型与第二雨滴贴图之间存在一定的夹角,在摄像机模型拍摄到的下雨画面中,雨滴呈现一定的倾斜角度;当双锥形模型的两个尖端与水平面垂直时,摄像机模型与第二雨滴贴图垂直,在摄像机模型拍摄到的下雨画面中,雨滴自上而下垂直降落。
本申请实施例通过构建双锥形模型,将当前虚拟角色对应的摄像机模型添加在双锥形模型中,以模拟现实世界中的下雨场景。需要说明的是,在游戏应用程序运行的过程中,摄像机模型和双锥形模型均不显示在虚拟环境画面中,即用户无法看到双锥形模型和摄像机模型,图9仅为示意。
综上所述,本实施例提供的方法,通过在雨滴贴图中存储深度信息,使得雨滴贴图根据深度信息显示在虚拟角色的视野范围内,表现出一种“近大远小”的视觉现象,即靠近虚拟角色的雨滴尺寸较大,远离虚拟角色的雨滴尺寸较小,从而真实模拟了现实世界中下雨时的场景。无需在虚拟角色的视野前方添加多层的具有不同雨滴尺寸的雨滴贴图(一张雨滴贴图上的雨滴尺寸相同),从而减少了雨滴贴图的数量,使得终端进行采样贴图的次数减少,提高终端在运行游戏应用程序时的性能。
基于图5的可选实施例中,虚拟环境中通常利用粒子发射器来模拟下雨,则虚拟环境中的天气渲染方法还包括如下步骤:
步骤51,当虚拟角色移动至粒子分布盒对应的包围范围的边缘位置处时,获取当前虚拟角色的移动方向。
如图10所示,在模拟下雨场景时,通常在当前虚拟角色29的上方设置有粒子发射器,以实现当前虚拟角色29始终位于下雨场景中。
虚拟角色对应有粒子分布盒,粒子分布盒是用于发射粒子的盒状模型,粒子分布盒被划分为m×n个子分布盒,虚拟角色位于粒子分布盒中,m和n均为正整数,每个子分布盒对应有粒子发射器。粒子分布盒用于通过粒子发射器模拟以粒子表现的天气场景。
示意性的,m和n可以相等或不等。粒子分布盒用于模拟下雨场景、下雪场景、沙尘暴场景、下冰雹等需要以粒子表现的天气场景。
如图11的右图所示,通过对当前虚拟角色设置粒子分布盒30,该粒子分布盒30的每个表面被划分为3×3的九宫格,将粒子分布盒划分为子分布盒,每个子分布盒对应有粒子发射器。当前虚拟角色位于该粒子分布盒30中。
边缘位置是指当前虚拟角色位于粒子分布盒的边缘层子分布盒对应的位置,如图11的左图所示,边缘位置是最近处的一层子分布盒33对应的位置,即当前虚拟角色29所在的一层子分布盒对应的位置。最近处的一层子分布盒33是位于当前虚拟角色的视野前方,且距离虚拟角色最近的一层子分布盒。
步骤52,将距离当前虚拟角色最远处的一层子分布盒沿移动方向移动至最近处的一层子 分布盒的前方,与最近处的一层子分布盒相邻,最近处的一层子分布盒位于当前虚拟角色的视野前方,且与当前虚拟角色之间的距离最短。
如图11的左图所示,从俯视角度说明粒子分布盒的移动过程,当前虚拟角色29位于粒子分布盒对应的包围范围的边缘位置处。当前虚拟角色移动方向为箭头指示的方向,将位于第一位置31的一层子分布盒移动至第二位置32处,第一位置31为距离当前虚拟角色最远处的一层子分布盒对应的位置,第二位置32是位于最近处的一层子分布盒33的前方,且与最近处的一层子分布盒33相邻的位置,最近处的一层子分布盒33与当前虚拟角色29之间的距离最短。
需要说明的是,在游戏应用程序运行的过程中,粒子分布盒不显示在虚拟环境画面中,即用户无法看到粒子分布盒,图11仅为示意。
综上所述,本实施例提供的方法,通过为虚拟角色设置粒子分布盒,当虚拟角色移动至粒子分布盒对应的包围范围的预设位置时,距离虚拟角色最远处的一层分布盒子会沿着虚拟角色的移动方向移动至虚拟角色的视野前方,使得虚拟角色始终被粒子分布盒包围,从而保证虚拟角色在下雨场景下始终被雨滴包围,实现了对现实世界的模拟。通过移动粒子分布盒的方式循环使用粒子发射器,提高终端在加载多个粒子发射器时的性能。
基于图5的可选实施例中,利用虚拟环境中的遮蔽物元素的深度信息对雨滴进行遮挡,以实现模拟现实世界中人在建筑物下避雨的场景,则虚拟环境中的天气渲染方法还包括如下步骤:
步骤61,在当前虚拟角色位于室内环境时,获取遮蔽物元素的深度信息,深度信息是以俯视虚拟环境的视角得到的,遮蔽物元素用于为当前虚拟角色提供室内环境。
游戏应用程序以俯视虚拟环境的视角获取遮蔽物元素的深度信息,即从虚拟环境的天空向虚拟环境的地面的方向观察,获取遮蔽物元素的深度信息。遮蔽物元素包括房屋元素、茅屋元素、亭子元素、桥洞元素、隧道元素、草棚元素中的至少一种。遮蔽物元素的深度信息用于表示虚拟环境的天空最高点(即天空球模型的最高点)与遮蔽物元素之间的距离。遮蔽物元素的深度信息是在离线状态下预先存储在天气***中的。
当前虚拟角色可以躲在遮蔽物下方躲避虚拟环境中的雨滴。
步骤62,根据遮蔽物元素的深度信息和虚拟角色对应的深度信息,删除部分第二雨滴贴图,部分第二雨滴贴图是指当前虚拟角色的视野前方被遮蔽物元素遮挡的第二雨滴贴图。
当前虚拟角色对应的深度信息包括当前虚拟角色周围的其他虚拟角色对应的深度信息、当前虚拟角色周围预设范围内的虚拟元素的深度信息(虚拟元素包括遮蔽物元素、植物元素、动物元素中的至少一种)。
通过如下三个公式对遮蔽物遮挡雨滴的过程进行说明:
利用公式一计算雨滴深度值:
公式一:d=B×d range+(d base+i×d range);
其中,d表示雨滴深度值,B表示雨滴贴图中的B通道(Blue通道)存储的雨滴深度信息对应的参数,d range表示雨滴深度信息对应的深度范围,d base表示虚拟环境中的虚拟元素的基础深度值,i表示第i层贴图层中的雨滴贴图。
通过上述公式计算雨滴深度值,即从俯视虚拟环境的角度,雨滴降落至地面时,雨滴与虚拟环境的天空的最高点之间的距离。
利用公式二计算雨滴所在的高度:
公式二:h=k(d);
其中,h表示雨滴所在的高度,d表示雨滴深度值,h=k(x)表示雨滴深度值与雨滴高度之间的函数关系。
雨滴所在的高度表示雨滴从虚拟环境的天空降落时的高度。
利用公式三计算雨滴的遮挡结果:
公式三:
Figure PCTCN2021126846-appb-000001
其中,d表示雨滴深度值,d scene表示虚拟环境中的遮蔽物元素的深度值,h表示雨滴所在的高度值,h scene表示虚拟环境中的遮蔽物元素在虚拟环境中的高度,1表示遮蔽物元素对雨滴进行遮挡,0表示未对雨滴进行遮挡。
当雨滴深度值小于遮蔽物元素的深度值,且雨滴所在的高度大于所述遮蔽物元素在虚拟环境中的高度时,将位于虚拟角色视野前方的部分第二雨滴贴图进行删除,形成雨滴被“遮挡”在遮蔽物元素之外的现象。
在现实世界中,当人进入室内环境后,室内的地面处于干燥状态,室外的地面处于湿润状态,本申请实施例通过上述公式在虚拟环境中模拟该情况。
如图12的(a)所示,当前虚拟角色29移动至室内环境,当前虚拟角色站在建筑物的台阶上,面向室外环境,在当前虚拟角色29的前方显示有雨滴。
如图12的(b)所示,以侧视的角度为例,当前虚拟角色29站在属于室内环境的台阶35上,台阶35的另一侧显示有雨滴34产生的水花,当前虚拟角色29站立的位置地面是干燥的,台阶35的另一侧地面是湿润的,以用户的角度观察虚拟环境,雨滴贴图距离用户最远,台阶元素(室内环境)对应的贴图与用户之间的距离小于雨滴贴图与用户之间的距离,利用台阶元素对应的贴图对雨滴贴图进行遮挡以实现如12所示的效果。
综上所述,本实施例的方法,当虚拟角色位于室内环境时,通过虚拟角色周围的深度信息和下雨场景的场景深度信息确定雨滴贴图出现在虚拟角色前方的位置,使得当虚拟角色处于室内环境时,将雨滴有效地进行遮挡,从而实现对现实世界的模拟。
基于图5的可选实施例中,雨滴掉落在虚拟环境的地面上会产生水花,则虚拟环境中的天气渲染方法还包括如下步骤:
步骤71,获取初始位置和拍摄位置,初始位置是水花在虚拟环境的地面上初次出现时的位置,拍摄位置是摄像机模型所在的位置,水花用于表征雨滴降落在虚拟环境的地面上产生的水花。
当前虚拟角色对应有摄像机模型,在虚拟环境的地面上划分有网格(Mesh),每个网格中对应有一个水花。水花是雨滴溅落在虚拟环境的地面上生成的水花,水花的位置是动态变化的,通过更新网格的位置来更新水花的位置,不断更新网格的位置会造成终端的性能消耗,使得终端表面温度升高。而本实施例利用图形处理单元每隔预设时间间隔更新水花的位置,并且让水花的位置跟随摄像机模型移动,从而模拟下雨场景中的水花效果。降低了终端性能消耗,从而避免终端表面温度升高。
步骤72,根据初始位置和拍摄位置计算第i次生成水花时的第i循环位置,i为正整数。
利用公式四计算第i次生成水花的第i循环位置:
公式四:P i=P 0+P camera
其中,P 0表示单个水花的初始位置,P camera表示摄像机模型的拍摄位置,P i表示第i次循环生成水花。
步骤73,获取位置偏移参数,位置偏移参数用于表示水花循环生成时的偏移位置。
步骤74,根据位置偏移参数和第i循环位置计算第i+1次生成水花时的第i+1循环位置。
利用公式五计算第i+1次生成水花的第i+1循环位置:
公式五:
Figure PCTCN2021126846-appb-000002
其中,P i表示第i次循环生成水花,δ表示位置偏移参数,t total表示游戏开始时刻至当前时刻,t cycle表示水花循环一个生命周期的时间,即水花从生成到消失的时间。
步骤75,重复上述生成水花的循环位置的步骤,直到下雨场景被切换。
利用上述两个公式重复计算,得到水花的循环位置,直到下雨场景被切换为其他场景时,停止计算生成水花的循环位置。如图13所示,利用本申请实施例提供的方法渲染得到的水花的效果图。
综上所述,本实施的方法,通过获取摄像机模型的拍摄位置和水花在虚拟环境画面的初始位置来计算生成第i次水花的第一循环位置,从而通过第i次水花的第一循环位置计算生成第i+1次水花的第二循环位置,通过利用固定数量的水花在虚拟环境中进行位置偏移,无需终端生成新的水花,提高终端在渲染下雨场景中的水花时的性能。
如图14所示,利用本申请实施例提供的天气渲染方法所渲染的雨滴,得到虚拟环境画面中下雨场景的效果图,实现了对现实世界较为真实的模拟。
此外,在渲染下雨场景时,虚拟环境画面易于出现分层现象,如图15的(a)中的区域47表示雨滴在虚拟环境中出现的雨滴分层现象。
通过本申请实施例中提供的天气渲染方法渲染雨滴效果时,利用第二雨滴贴图的通道中存储的雨滴深度信息,使得雨滴连贯,不出现分层现象,当游戏应用程序获取到天气配置信息时,控制虚拟环境画面显示连贯的雨滴效果,该雨滴效果是经过第二雨滴贴图渲染后得到的,第二雨滴贴图是根据天气配置信息和天气对应关系得到的(第二雨滴贴图的获取方式通过上述实施例进行表述,此处不再赘述),在第二雨滴贴图的通道中存储有雨滴深度信息,该雨滴深度信息表示雨滴贴图与虚拟角色之间的距离。
利用如下两个公式对雨滴不分层现象进行说明:
通过公式六计算每层雨滴贴图上的雨滴的颜色:
公式六:f(x)=R×g(x);
其中,f(x)表示每层雨滴贴图上的雨滴的颜色,R表示雨滴贴图中的R通道(Red通道)存储的雨滴形状信息,g(x)表示公式三中的雨滴遮挡结果。
通过公式七计算雨滴的渲染颜色:
公式七:
Figure PCTCN2021126846-appb-000003
其中,C表示雨滴的渲染颜色,f(x)表示每层雨滴贴图上的雨滴的颜色,i表示第i层贴图层的雨滴贴图,n表示n层雨滴贴图。
通过利用公式三中计算得到的雨滴遮挡结果和雨滴贴图的R通道中存储的雨滴形状信息计算得到每层雨滴贴图上的雨滴的颜色,通过公式七累加每层雨滴贴图上的雨滴的颜色,使得最后虚拟环境画面中雨滴的整体颜色趋于一致,使得雨滴未出现分层现象,如图15的(b)所示。
需要说明的是,上述关于雨滴效果的方法实施例可分别单独实施,也可任意组合实施,本申请实施例对此不加以限定。
2、天空光影效果。
图16示出了本申请另一个示例性实施例提供的虚拟环境中天气渲染方法的流程图。该方法可应用于如图1所示的终端100中,该终端100安装有支持虚拟场景运行的应用程序对应的客户端。该客户端基于天气***110运行。该方法包括如下步骤:
步骤1601,获取天气配置信息,天气配置信息用于渲染虚拟环境中的天气场景。
步骤1601的实施方式可参见图2所示的实施例中的步骤201的实施方式和图5所示的实 施例中的步骤501的实施方式,此处不再赘述。
步骤1602,根据天气配置信息,获取天空对应的第二散射贴图。
第一散射贴图是虚拟环境中高于指定高度的天空高度上的散射贴图;第二散射贴图是虚拟环境中低于或等于指定高度上的散射贴图;第一散射贴图不包括第二散射贴图。示例性的,天空高度可以采用海拔高度表示。如图4所示。通过天气配置信息和天气对应关系计算对应的天气场景,根据天气场景获取第二散射贴图。
步骤1603,当天气场景包括天空光影效果时,对天空对应的第二散射贴图进行处理,得到处理后的渲染贴图,处理后的渲染贴图的分辨率小于第二散射贴图的分辨率。
在虚拟环境中模拟天空的光影效果需要考虑大气散射因素,大气散射是指光线和大气粒子发生相互作用,使入射能量以一定规律在各方向重新分布的现象。本申请实施例将关于光线散射的计算数据离线存储在散射贴图(Inscattering Map)中,在游戏运行时直接采样散射贴图优化运行时的效率。即游戏应用程序根据天气对应关系和天气配置信息计算出虚拟环境中天空的散射贴图,将该散射贴图离线存储在图形处理单元中,在游戏应用程序运行时,直接获取该散射贴图以加载虚拟环境中的天空。
当通过二维散射图像计算大气散射效果时,需要进行至少两次散射贴图采样,同时针对光影效果进行复杂运算。示意性的,如图17所示,本申请实施例在渲染天空时对计算量做了优化,通过将终端的全屏渲染画面40中关于天空部分的散射贴图渲染为分辨率为64×64的渲染贴图(Rendertexture)36,相比于全屏渲染画面40的分辨率1920×1080(该全屏渲染画面的分辨率与终端的分辨率有关),本申请实施例的计算量降低了约500倍。在渲染天空时,将天空对应的渲染贴图36贴附在天空球模型39中,以形成渲染后的天空对应的光照场景。在一些实施例中,还可通过在天空球模型39上贴附云彩贴图38来丰富渲染后的画面,同样将云彩贴图38渲染至较小的分辨率,以减少渲染过程带来的计算量。在另一些实施例中,还可通过在天空球模型39上贴附雾气贴图37来丰富渲染后的画面,同样将雾气贴图37渲染至较小的分辨率,以减少渲染过程带来的计算量。
步骤1604,根据处理后的渲染贴图渲染虚拟环境的天空对应的光照场景。
示意性的,通过大气散射模型来渲染虚拟环境的天空对应的光照场景,通过本申请实施例提供的方法得到全屏渲染画面40对应的光照场景。
通过如下公式八表示大气散射模型的渲染结果:
公式八:L(s,θ)=L 0F ex(s)+L in(s,θ);
其中,L(s,θ)表示散射结果,L 0表示进入吸收介质前的光强,F ex(s)表示透光比,L in(s,θ)表示内散射结果,θ表示散射角度。
透光比通过如下公式九表示:
公式九:
Figure PCTCN2021126846-appb-000004
其中,β R表示瑞丽散射系数,β M表示米氏散射系数,s表示目标到光源的距离。
内散射结果通过如下公式十:表示:
公式十:
Figure PCTCN2021126846-appb-000005
其中,β R表示瑞丽散射系数,β M表示米氏散射系数,s表示目标到光源的距离,θ表示散射角度,E sun表示太阳光强。
示意性的,本申请实施例通过大气散射模型对雾气场景进行渲染,需要说明的是,当调用大气散射模型对雾气进行渲染时,在计算内散射结果时,将瑞丽散射系数、米氏散射系数和光照强度这一部分的计算结果对应的像素点渲染至分辨率为64×64的渲染贴图。
通过本申请实施例提供的方法得到渲染后的天空光影效果如图18所示,图18表示一天中各个时间段内虚拟环境中的天空,图18的(a)表示太阳刚升起时虚拟环境对应的天空光影效果,图18的(b)表示处于上午至中午时间段的虚拟环境对应的天空光影效果,图18的(c)表示太阳即将落山时的虚拟环境对应的天空光影效果,图18的(d)表示太阳完全落山后,临近夜晚时的虚拟环境对应的天空光影效果。
综上所述,本实施例提供的方法,通过将散射贴图处理为分辨率较小的渲染贴图,通过较小分辨率的渲染贴图模拟虚拟环境的天空对应的光照场景,使得终端在运行支持虚拟环境的应用程序时,进行贴图采样时的次数减少,避免终端的性能降低。
3、云彩效果。
图19示出了本申请另一个示例性实施例提供的虚拟环境中的天气渲染方法的流程图。该方法可应用于如图1所示的终端100中,该终端100安装有支持虚拟场景运行的应用程序对应的客户端,该客户端基于天气***110运行。该方法包括如下步骤:
步骤1901,当天气场景包括云彩效果时,获取至少两张第一纹理云贴图,第一纹理云贴图包括具有灰度的二维云彩图像,至少两张第一纹理云贴图是基准纹理云贴图中的像素点沿不同方向偏移后得到的,基准纹理云贴图包括无光照条件下的二维云彩图像。
终端获取天气配置信息,基于天气配置信息获取至少两张第一纹理云贴图。该第一纹理云贴图不包括具有彩色色彩的三维云彩图像。本申请实施例通过二维纹理云贴图实现对云彩的渲染。该二维纹理云彩图是具有灰度的图像。通过天气***中预先存储的基准纹理云贴图获得,基准纹理云图是具有云彩轮廓,且在光照条件下的二维云彩图像,即在不受任何光照条件影响下的云彩图像,基准纹理云贴图是具有灰度的图像。
示意性的,通过光线步进(Raymarching)算法对基准纹理云贴图进行预处理,得到第一纹理云贴图。如图20所示,光线步进算法的过程是:首先,从太阳发射n条射线(n为正整数),射线有一个采样的步长。当射线位于云彩中时,每个步长采一次样,获取云彩中的纹理值,其中白色矩形表示该纹理值为白色值,带有阴影的矩形表示该纹理值为灰度值。
示意性的,第一纹理云贴图的获取方式如图21的上图所示,将基准纹理云贴图41中的像素点按照向左、向右、向上三个方向进行偏移,在三个方向上的偏移步长是相等的,分别得到左偏移的第一纹理云贴图42、右偏移的第一纹理云贴图43和上偏移的第一纹理云贴图44。
步骤1902,根据第一纹理云贴图的通道对应的权重,混合至少两张第一纹理云贴图,得到第二纹理云贴图。
示意性的,左偏移的第一纹理云贴图42的通道对应的权重为0.6,右偏移的第一纹理云贴图43的通道对应的权重为0.3,右偏移的第一纹理云贴图43和上偏移的第一纹理云贴图44的通道对应的权重为0.1。将三张第一纹理云贴图进行渲染,得到如图21的下图所示的渲染后第二纹理云贴图45。
步骤1903,根据第二纹理云贴图渲染在光线变化下的云彩。
利用上述得到第二纹理云贴图45的方式得到虚拟环境中,在光线46变化下的云彩,渲染后的云彩效果图如图22所示。
综上所述,本实施例提供的方法,通过对二维纹理云贴图进行预处理,并根据预处理后的第一纹理云贴图的通道对应的权重来混合第一纹理云贴图,从而得到混合后的第二纹理云贴图,使得终端只需要进行一次二维纹理云贴图采样的过程,减少终端的能耗。
下面对虚拟环境中时间段与时间段之间的过渡、场景与场景之间的过渡进行说明。
图23示出了本申请另一个示例性实施例提供的虚拟环境中的天气渲染方法的流程图。该方法可应用于如图1所示的终端100中,该终端100安装有支持虚拟场景运行的应用程序对 应的客户端。该客户端基于天气***110运行。该方法包括如下步骤:
步骤2301,获取第一参数、第二参数、第三参数和第四参数,第一参数用于表示第一天气场景对应的贴图参数,第二参数用于表示第二天气场景对应的贴图参数,第三参数用于表示时间段变换时对应的过渡系数,第四参数用于表示两种天气场景切换时对应的过渡系数。
时间段是指虚拟环境的一天中各个时间点内控制虚拟环境表现的所有参数的集合。
如图24所示,虚拟环境中天气场景与时间段相关联,即虚拟环境中的时间从时间段A切换至时间段B时,天气场景也从时间段A对应的天气场景切换至时间段B对应的天气场景中。比如虚拟环境的天气场景为晴天时,相邻时段做线性过渡。有天气情况下,相邻时段和天气一起过渡,天气切换首先是环境元素变化,然后再是天气场景的切换,例如下雨场景是天空先变暗,然后再开始下雨。时间段之间、时间段与天气场景之间、天气场景与天气场景之间都可以平滑过渡。
步骤2302,根据第一参数、第二参数、第三参数和第四参数得到天气场景切换时对应的渲染画面参数。
步骤2303,根据渲染画面参数渲染第一天气场景切换至第二天气场景时的过渡画面。
利用如下公式十一计算画面渲染参数:
公式十一:d=(d 1×θ+d 2×(1-θ)×β)+d 3×(1-β);
其中,d表示画面渲染参数,d 1表示第一天气场景(或第一时间段)对应的贴图参数,d 2表示第二天气场景(或第二时间段)对应的贴图参数,d 3表示第二天气场景对应的贴图参数,β表示两种天气场景切换时对应的过渡系数,β的取值范围为0至1,1表示天气场景没有变化(一个天气场景不变),0表示天气场景发生变化(两个天气场景进行切换),θ表示时间段变换时对应的过渡系数,θ的取值范围为0至1,0表示晚上时间段,1表示白天时间段。
以虚拟环境中上午六点至下午六点为白天时间段,下午六点至第二天上午六点为夜晚时间段为例。在一个示例中,白天时间段过渡至晚上时间段,当虚拟环境当前的时刻为上午六点时,θ=1,则d 2×(1-θ)×β)的值为0,若天气场景没有变化,则β=1,d 3×(1-β)的值为0,画面渲染参数d=d 1×θ,对应白天场景。随着虚拟环境画面的时间变化,θ的值由1变为0,在这期间,若天气场景发生变化,则β值变化;若天气场景不发生变化,则β值不发生变化。当虚拟环境的当前时刻为下午七点时,θ=0,则d 1×θ的值为0,若天气场景没有变化,则β=1,d 3×(1-β)的值为0,画面渲染参数d=d2×(1-θ),对应夜晚场景。
由于θ和β数值的变化,可计算出任意时间段对应的画面渲染参数,因此虚拟环境从白天时间段对应的场景平滑过渡至晚上时间段对应的场景。
在另一个示例中,晴天场景(第一天气场景)过渡至雨天场景(第二天气场景),虚拟环境当前的时间段为白天时间段,当前的天气场景为晴天场景,则θ=1,β=1,则画面渲染参数d=d 1×θ,若虚拟环境的时间为中午十二点开始下雨,则θ=1,β=0,则画面渲染参数d=d 1×θ+d 3,对应下雨场景。
由于θ和β数值的变化,可计算出任意时间段对应的画面渲染参数,也可计算出在某一时间段天气场景变化时对应的画面渲染参数,因此虚拟环境可以在时间段与时间段进行场景切换时画面平滑过渡,也可以在任意两种天气场景切换时画面平滑过渡。
步骤2303可替换为如下步骤:
步骤S1,每间隔预设时间更新计算渲染画面参数。
步骤S2,根据渲染画面参数渲染第一天气场景切换至第二天气场景时的过渡画面。
在保证自然过渡的前提下,对天气过渡进行隔帧优化,在每秒传输帧数(Frames Per  Second,FPS)为30的情况下,每秒只需要做15次更新计算,使天气***消耗较小的终端性能。示意性的,时间段切换过程中需要采样2张场景贴图,通过隔帧优化的方式减小天气***对终端性能的消耗。从而使得虚拟环境中的天气场景之间的切换能够平滑的过渡。
综上所述,本实施例提供的方法,通过第一参数、第二参数、第三参数和第四参数来计算天气场景切时的画面渲染参数,根据该画面渲染参数来渲染画面,使得虚拟环境的天气场景能够在任意天气或任意时段之间平滑过渡,保证显示画面的顺畅性。
通过每间隔预设时间即进行一次更新的方式更新画面渲染参数,避免频繁更新计算画面渲染参数导致终端在加载虚拟环境画面时的性能降低。
需要说明的是,上述关于雨滴效果、天空光影效果、云彩效果、过渡效果的方法实施例可分别单独实施,也可任意组合实施,本申请实施例对此不加以限定。
图25示出了本申请的一个示例性实施例提供的虚拟环境中的天气渲染装置的结构示意图。该装置可以通过软件、硬件或者两者的结合实现成为终端的全部或一部分,该装置包括:
获取模块2510,用于获取虚拟环境中天气场景的至少一张天气贴图;
处理模块2520,用于剔除至少一张天气贴图中的第一天气贴图,得到剩余的第二天气贴图;第一天气贴图包括虚拟环境中位于当前虚拟角色的视野之外的天气贴图;
渲染模块2530,用于根据天气贴图渲染虚拟环境中的天气场景。
在一个可选的实施例中,第一天气贴图包括如下贴图中的至少一种:
当天气场景包括雨滴效果时,第一天气贴图包括位于当前虚拟角色的视野前方、且位于当前虚拟角色的预设距离之外的第一雨滴贴图;
当天气场景包括天空光影效果时,第一天气贴图包括高于指定高度的天空高度上的第一散射贴图,第一散射贴图是用于表现天空上的光线散射效果的图像。
在一个可选的实施例中,获取模块2510,用于当天气场景包括雨滴效果时,获取当前虚拟角色的视线方向上划分的贴图层;
处理模块2520,用于在贴图层中添加第二雨滴贴图,第二雨滴贴图包括位于当前虚拟角色的视野前方、且位于当前虚拟角色的预设距离之内的雨滴贴图,第二雨滴贴图的通道中存储有雨滴深度信息,雨滴深度信息用于表示当前虚拟角色与第二雨滴贴图之间的距离,雨滴深度信息与雨滴尺寸呈正相关关系;
渲染模块2530,用于渲染出下雨场景中与雨滴深度信息匹配的雨滴尺寸的雨滴。
在一个可选的实施例中,获取模块2510,用于当当前虚拟角色位于室内环境时,获取遮蔽物元素的深度信息,深度信息是以俯视虚拟环境的视角得到的,遮蔽物元素用于为当前虚拟角色提供室内环境;
处理模块2520,用于根据遮蔽物元素的深度信息和当前虚拟角色对应的深度信息,删除部分第二雨滴贴图,部分第二雨滴贴图是指当前虚拟角色的视野前方被遮蔽物元素遮挡的第二雨滴贴图。
在一个可选的实施例中,当前虚拟角色对应有摄像机模型;
获取模块2510,用于获取初始位置和拍摄位置,初始位置是水花在虚拟环境的地面上初次出现时的位置,拍摄位置是摄像机模型所在的位置,水花用于表征雨滴降落在虚拟环境的地面上产生的水花;
处理模块2520,用于根据初始位置和拍摄位置计算第i次生成水花时的第i循环位置,i为正整数;
获取模块2510,用于获取位置偏移参数,位置偏移参数用于表示水花循环生成时的偏移位置;
处理模块2520,用于根据位置偏移参数和第i循环位置计算第i+1次生成水花时的第i+1循环位置;重复上述生成水花的循环位置的步骤,直到下雨场景被切换。
在一个可选的实施例中,处理模块2520,用于当天气场景包括天空光影效果时,对天空对应的第二散射贴图进行处理,得到处理后的渲染贴图,处理后的渲染贴图的分辨率小于第二散射贴图的分辨率,第二散射贴图包括低于指定高度的天空高度上的散射贴图;
渲染模块2530,用于根据处理后的渲染贴图渲染虚拟环境的天空对应的光照场景。
在一个可选的实施例中,获取模块2510,用于当天气场景包括云彩效果时,获取至少两张第一纹理云贴图,第一纹理云贴图包括具有灰度的二维云彩图像,至少两张第一纹理云贴图是基准纹理云贴图中的像素点沿不同方向偏移后得到的,基准纹理云贴图包括无光照条件下的二维云彩图像;
处理模块2520,用于根据第一纹理云贴图的通道对应的权重,混合至少两张第一纹理云贴图,得到第二纹理云贴图;
渲染模块2530,用于根据第二纹理云贴图渲染在光线变化下的云彩。
在一个可选的实施例中,当前虚拟角色对应有粒子分布盒,粒子分布盒被划分为m×n个子分布盒,当前虚拟角色位于粒子分布盒中,m和n均为正整数,每个子分布盒对应有粒子发射器,粒子分布盒用于通过粒子发射器模拟以粒子表现的天气场景;
获取模块2510,用于当当前虚拟角色移动至粒子分布盒对应的包围范围的边缘位置处时,获取当前虚拟角色的移动方向;
处理模块2520,用于将距离当前虚拟角色最远处的一层子分布盒沿移动方向移动至最近处的一层子分布盒的前方,与最近处的一层子分布盒相邻,最近处的一层子分布盒位于当前虚拟角色的视野前方且与当前虚拟角色之间的距离最短。
在一个可选的实施例中,获取模块2510,用于获取第一参数、第二参数、第三参数和第四参数,第一参数用于表示第一天气场景对应的贴图参数,第二参数用于表示第二天气场景对应的贴图参数,第三参数用于表示时间段变换时对应的过渡系数,第四参数用于表示两种天气场景切换时对应的过渡系数;
处理模块2520,用于根据第一参数、第二参数、第三参数和第四参数得到天气场景切换时对应的渲染画面参数;根据渲染画面参数渲染第一天气场景切换至第二天气场景时的过渡画面。
在一个可选的实施例中,处理模块2520,用于每间隔预设时间更新计算渲染画面参数;
渲染模块2520,用于根据渲染画面参数渲染第一天气场景切换至第二天气场景时的过渡画面。
图26示出了本申请一个示例性实施例提供的计算机设备2600的结构框图。该计算机设备2600可以是便携式移动终端,比如:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器。计算机设备2600还可能被称为用户设备、便携式终端等其他名称。
通常,计算机设备2600包括有:处理器2601和存储器2602。
处理器2601可以包括一个或多个处理核心,比如4核心处理器、8核心处理器等。处理器2601可以采用DSP(Digital Signal Processing,数字信号处理)、FPGA(Field-Programmable Gate Array,现场可编程门阵列)、PLA(Programmable Logic Array,可编程逻辑阵列)中的至少一种硬件形式来实现。处理器2601也可以包括主处理器和协处理器,主处理器是用于对在唤醒状态下的数据进行处理的处理器,也称CPU(Central Processing Unit,中央处理器);协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中,处理器2601可以在集成有GPU(Graphics Processing Unit,图像处理器),GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中,处理器2601还可以包括AI(Artificial Intelligence,人工智能)处理器,该AI处理器用于处理有关机器学习的计算操作。
存储器2602可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是有形的和非暂态的。存储器2602还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器2602中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器2601所执行以实现本申请实施例中提供的虚拟环境中的天气渲染方法。
本领域技术人员可以理解,图26中示出的结构并不构成对计算机设备2600的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
本申请实施例还提供一种计算机设备,该计算机设备包括处理器和存储器,该存储器中存储有至少一条指令、至少一段程序、代码集或指令集,该至少一条指令、该至少一段程序、该代码集或指令集由该处理器加载并执行以实现如上述各方法实施例提供的虚拟环境中的天气渲染方法。
本申请实施例还提供一种计算机可读存储介质,该存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,该至少一条指令、至少一段程序、代码集或指令集由处理器加载并执行以实现上述各方法实施例提供的虚拟环境中的天气渲染方法。
本申请实施例还提供一种计算机程序产品或计算机程序,所述计算机程序产品或计算机程序包括计算机指令,所述计算机指令存储在计算机可读存储介质中。计算机设备的处理器从所述计算机可读存储介质读取所述计算机指令,所述处理器执行所述计算机指令,使得所述计算机设备执行如上方面所述的虚拟环境中的天气渲染方法。

Claims (20)

  1. 一种虚拟环境中的天气渲染方法,应用于计算机设备中,所述方法包括:
    获取所述虚拟环境中天气场景的至少一张天气贴图;
    剔除所述至少一张天气贴图中的第一天气贴图,得到剩余的第二天气贴图;所述第一天气贴图包括所述虚拟环境中位于当前虚拟角色的视野之外的天气贴图;
    根据所述第二天气贴图渲染所述天气场景。
  2. 根据权利要求1所述的方法,所述第一天气贴图包括如下贴图中的至少一种:
    当所述天气场景包括雨滴效果时,所述第一天气贴图包括位于所述当前虚拟角色的视野前方、且位于所述当前虚拟角色的预设距离之外的第一雨滴贴图;
    当所述天气场景包括天空光影效果时,所述第一天气贴图包括高于指定高度的天空高度上的第一散射贴图,所述第一散射贴图是用于表现天空上的光线散射效果的图像。
  3. 根据权利要求2所述的方法,所述根据所述第二天气贴图渲染所述天气场景,包括:
    当所述天气场景包括所述雨滴效果时,获取所述当前虚拟角色的视线方向上划分的贴图层;
    在所述贴图层中添加第二雨滴贴图,所述第二雨滴贴图包括位于所述当前虚拟角色的视野前方、且位于所述当前虚拟角色的预设距离之内的雨滴贴图,所述第二雨滴贴图的通道中存储有雨滴深度信息,所述雨滴深度信息用于表示所述当前虚拟角色与所述第二雨滴贴图之间的距离,所述雨滴深度信息与雨滴尺寸呈正相关关系;
    渲染出下雨场景中与所述雨滴深度信息匹配的所述雨滴尺寸的雨滴。
  4. 根据权利要求3所述的方法,所述方法还包括:
    当所述当前虚拟角色位于室内环境时,获取遮蔽物元素的深度信息,所述深度信息是以俯视所述虚拟环境的视角得到的,所述遮蔽物元素用于为所述当前虚拟角色提供所述室内环境;
    根据所述遮蔽物元素的深度信息和所述当前虚拟角色对应的深度信息,删除部分第二雨滴贴图,所述部分第二雨滴贴图是指所述当前虚拟角色的视野前方被所述遮蔽物元素遮挡的第二雨滴贴图。
  5. 根据权利要求3所述的方法,所述当前虚拟角色对应有摄像机模型;
    所述方法还包括:
    获取初始位置和拍摄位置,所述初始位置是水花在所述虚拟环境的地面上初次出现时的位置,所述拍摄位置是所述摄像机模型所在的位置,所述水花用于表征雨滴降落在所述虚拟环境的地面上产生的水花;
    根据所述初始位置和所述拍摄位置计算第i次生成所述水花时的第i循环位置,i为正整数;
    获取位置偏移参数,所述位置偏移参数用于表示所述水花循环生成时的偏移位置;
    根据所述位置偏移参数和所述第i循环位置计算第i+1次生成所述水花时的第i+1循环位置;
    重复上述生成所述水花的循环位置的步骤,直到所述下雨场景被切换。
  6. 根据权利要求2所述的方法,所述根据所述第二天气贴图渲染所述天气场景,包括:
    当所述天气场景包括所述天空光影效果时,对天空对应的第二散射贴图进行处理,得到处理后的渲染贴图,所述处理后的渲染贴图的分辨率小于所述第二散射贴图的分辨率,所述第二散射贴图包括低于指定高度的天空高度上的散射贴图;
    根据所述处理后的渲染贴图渲染所述虚拟环境的天空对应的光照场景。
  7. 根据权利要求1所述的方法,所述方法还包括:
    当所述天气场景包括云彩效果时,获取至少两张第一纹理云贴图,所述第一纹理云贴图包括具有灰度的二维云彩图像,所述至少两张第一纹理云贴图是基准纹理云贴图中的像素点沿不同方向偏移后得到的,所述基准纹理云贴图包括无光照条件下的二维云彩图像;
    根据所述第一纹理云贴图的通道对应的权重,混合所述至少两张第一纹理云贴图,得到第二纹理云贴图;
    根据所述第二纹理云贴图渲染在光线变化下的云彩。
  8. 根据权利要求1至7任一所述的方法,所述当前虚拟角色对应有粒子分布盒,所述粒子分布盒被划分为m×n个子分布盒,所述当前虚拟角色位于所述粒子分布盒中,m和n均为正整数,每个子分布盒对应有粒子发射器,所述粒子分布盒用于通过所述粒子发射器模拟以粒子表现的天气场景;
    所述方法还包括:
    当所述当前虚拟角色移动至所述粒子分布盒对应的包围范围的边缘位置处时,获取所述当前虚拟角色的移动方向;
    将距离所述当前虚拟角色最远处的一层子分布盒沿所述移动方向移动至最近处的一层子分布盒的前方,与所述最近处的一层子分布盒相邻,所述最近处的一层子分布盒位于所述当前虚拟角色的视野前方且与所述当前虚拟角色之间的距离最短。
  9. 根据权利要求1至7任一所述的方法,所述方法还包括:
    获取第一参数、第二参数、第三参数和第四参数,所述第一参数用于表示第一天气场景对应的贴图参数,所述第二参数用于表示第二天气场景对应的贴图参数,所述第三参数用于表示时间段变换时对应的过渡系数,所述第四参数用于表示两种天气场景切换时对应的过渡系数;
    根据所述第一参数、所述第二参数、所述第三参数和所述第四参数得到所述天气场景切换时对应的渲染画面参数;
    根据所述渲染画面参数渲染所述第一天气场景切换至所述第二天气场景时的过渡画面。
  10. 根据权利要求9所述的方法,所述根据所述渲染画面参数渲染所述第一天气场景切换至所述第二天气场景时的过渡画面,包括:
    每间隔预设时间更新计算所述渲染画面参数;
    根据所述渲染画面参数渲染所述第一天气场景切换至所述第二天气场景时的所述过渡画面。
  11. 一种虚拟环境中的天气渲染装置,所述装置包括:
    获取模块,用于获取虚拟环境中天气场景的至少一张天气贴图;
    处理模块,用于剔除所述至少一张天气贴图中的第一天气贴图,得到剩余的第二天气贴图;所述第一天气贴图包括所述虚拟环境中位于当前虚拟角色的视野之外的天气贴图;
    渲染模块,用于根据所述第二天气贴图渲染所述天气场景。
  12. 根据权利要求11所述的装置,所述第一天气贴图包括如下贴图中的至少一种:
    当所述天气场景包括雨滴效果时,所述第一天气贴图包括位于所述当前虚拟角色的视野前方、且位于所述当前虚拟角色的预设距离之外的第一雨滴贴图;
    当所述天气场景包括天空光影效果时,所述第一天气贴图包括高于指定高度的天空高度上的第一散射贴图,所述第一散射贴图是用于表现天空上的光线散射效果的图像。
  13. 根据权利要求12所述的装置,
    所述获取模块,用于当所述天气场景包括所述雨滴效果时,获取所述当前虚拟角色的视线方向上划分的贴图层;
    所述处理模块,用于在所述贴图层中添加第二雨滴贴图,所述第二雨滴贴图包括位于所述当前虚拟角色的视野前方、且位于所述当前虚拟角色的预设距离之内的雨滴贴图,所述第二雨滴贴图的通道中存储有雨滴深度信息,所述雨滴深度信息用于表示所述当前虚拟角色与所述第二雨滴贴图之间的距离,所述雨滴深度信息与雨滴尺寸呈正相关关系;
    所述渲染模块,用于渲染出下雨场景中与所述雨滴深度信息匹配的所述雨滴尺寸的雨滴。
  14. 根据权利要求13所述的装置,
    所述获取模块,用于当所述当前虚拟角色位于室内环境时,获取遮蔽物元素的深度信息,所述深度信息是以俯视所述虚拟环境的视角得到的,所述遮蔽物元素用于为所述虚拟角色提供所述室内环境;
    所述处理模块,用于根据所述遮蔽物元素的深度信息和所述当前虚拟角色对应的深度信息,删除部分第二雨滴贴图,所述部分第二雨滴贴图是指所述当前虚拟角色的视野前方被所述遮蔽物元素遮挡的第二雨滴贴图。
  15. 根据权利要求13所述的装置,所述虚拟角色对应有摄像机模型;
    所述获取模块,用于获取初始位置和拍摄位置,所述初始位置是水花在所述虚拟环境的地面上初次出现时的位置,所述拍摄位置是所述摄像机模型所在的位置,所述水花用于表征雨滴降落在所述虚拟环境的地面上产生的水花;
    所述处理模块,用于根据所述初始位置和所述拍摄位置计算第i次生成所述水花时的第i循环位置,i为正整数;
    所述获取模块,用于获取位置偏移参数,所述位置偏移参数用于表示所述水花循环生成时的偏移位置;
    所述处理模块,用于根据所述位置偏移参数和所述第i循环位置计算第i+1次生成所述水花时的第i+1循环位置;重复上述生成所述水花的循环位置的步骤,直到所述下雨场景被切换。
  16. 根据权利要求12所述的装置,所述装置还包括:处理模块;
    所述处理模块,用于当所述天气场景包括所述天空光影效果时,对天空对应的第二散射贴图进行处理,得到处理后的渲染贴图,所述处理后的渲染贴图的分辨率小于所述第二散射贴图的分辨率,所述第二散射贴图包括低于指定高度的天空高度上的散射贴图;
    所述渲染模块,用于根据所述处理后的渲染贴图渲染所述虚拟环境的天空对应的光照场景。
  17. 根据权利要求11所述的装置,
    所述获取模块,用于当所述天气场景包括云彩效果时,获取至少两张第一纹理云贴图,所述第一纹理云贴图包括具有灰度的二维云彩图像,所述至少两张第一纹理云贴图是基准纹理云贴图中的像素点沿不同方向偏移后得到的,所述基准纹理云贴图包括无光照条件下的二维云彩图像;
    所述处理模块,用于根据所述第一纹理云贴图的通道对应的权重,混合所述至少两张第一纹理云贴图,得到第二纹理云贴图;
    所述渲染模块,用于根据所述第二纹理云贴图渲染在光线变化下的云彩。
  18. 一种计算机设备,所述计算机设备包括处理器和存储器,所述存储器中存储有至少一条指令、至少一段程序、代码集或指令集,所述指令、所述程序、所述代码集或所述指令集由所述处理器加载并执行以实现如权利要求1至10任一项所述的虚拟环境中的天气渲染方法。
  19. 一种计算机可读存储介质,所述存储介质中存储有计算机程序,所述计算机程序由处理器加载并执行,以实现如权利要求1至10任一项所述的虚拟环境中的天气渲染方法。
  20. 一种计算机程序,所述计算机程序包括计算机指令,所述计算机指令存储在计算机可读存储介质中;
    计算机设备的处理器从所述计算机可读存储介质读取所述计算机指令,所述处理器执行所述计算机指令,以实现如权利要求1至10任一项所述的虚拟环境中的天气渲染方法。
PCT/CN2021/126846 2020-11-16 2021-10-27 虚拟环境中的天气渲染方法、装置、设备、介质及程序 WO2022100437A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/965,323 US20230039131A1 (en) 2020-11-16 2022-10-13 Method and apparatus for rendering weather in virtual environment, device, medium and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011280349.7 2020-11-16
CN202011280349.7A CN112263837B (zh) 2020-11-16 2020-11-16 虚拟环境中的天气渲染方法、装置、设备及存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/965,323 Continuation US20230039131A1 (en) 2020-11-16 2022-10-13 Method and apparatus for rendering weather in virtual environment, device, medium and program

Publications (1)

Publication Number Publication Date
WO2022100437A1 true WO2022100437A1 (zh) 2022-05-19

Family

ID=74340052

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/126846 WO2022100437A1 (zh) 2020-11-16 2021-10-27 虚拟环境中的天气渲染方法、装置、设备、介质及程序

Country Status (3)

Country Link
US (1) US20230039131A1 (zh)
CN (1) CN112263837B (zh)
WO (1) WO2022100437A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117934784A (zh) * 2024-03-25 2024-04-26 山东捷瑞数字科技股份有限公司 一种基于图形学的自然视景构建及增强优化的方法及***
CN117994407A (zh) * 2024-04-07 2024-05-07 山东捷瑞信息技术产业研究院有限公司 一种基于图形学的自然现象构建及渲染优化的方法及***

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112263837B (zh) * 2020-11-16 2021-12-21 腾讯科技(深圳)有限公司 虚拟环境中的天气渲染方法、装置、设备及存储介质
CN112870707B (zh) * 2021-03-19 2022-11-18 腾讯科技(深圳)有限公司 虚拟场景中的虚拟物体展示方法、计算机设备及存储介质
CN113077541B (zh) * 2021-04-02 2022-01-18 广州益聚未来网络科技有限公司 一种虚拟天空画面的渲染方法及相关设备
CN113230651B (zh) * 2021-04-20 2024-02-02 网易(杭州)网络有限公司 游戏场景的显示方法、装置、电子设备及存储介质
CN113368496B (zh) * 2021-05-14 2023-08-01 广州三七互娱科技有限公司 游戏场景的天气渲染方法、装置及电子设备
CN113384887A (zh) * 2021-06-18 2021-09-14 网易(杭州)网络有限公司 游戏中模拟天气的方法、装置、电子设备和存储介质
CN113457134A (zh) * 2021-06-28 2021-10-01 网易(杭州)网络有限公司 游戏场景的处理方法、装置、设备及存储介质
CN113936097B (zh) * 2021-09-30 2023-10-20 完美世界(北京)软件科技发展有限公司 体积云渲染方法、设备及存储介质
CN116206041A (zh) * 2021-11-30 2023-06-02 华为技术有限公司 一种渲染方法及其相关设备
CN114768250B (zh) * 2022-04-06 2023-03-24 成都星奕网络科技有限公司 一种基于图像处理技术的虚拟场景渲染配色分析管理***
CN115953560B (zh) * 2023-03-15 2023-08-22 苏州飞蝶虚拟现实科技有限公司 基于元宇宙的虚拟天气模拟优化***
CN116030179B (zh) * 2023-03-29 2023-06-09 腾讯科技(深圳)有限公司 一种数据处理方法、装置、计算机设备及存储介质
CN116185206A (zh) * 2023-04-27 2023-05-30 碳丝路文化传播(成都)有限公司 一种元宇宙天气与真实天气同步的方法及***
CN116832434B (zh) * 2023-06-19 2024-04-12 广州怪力视效网络科技有限公司 一种游戏场景中虚拟天空的渲染方法及装置
CN116974417B (zh) * 2023-07-25 2024-03-29 江苏泽景汽车电子股份有限公司 显示控制方法及装置、电子设备、存储介质
CN117994181B (zh) * 2024-04-03 2024-06-11 成都力比科技有限公司 一种基于全屏幕的2d光照渲染方法及***

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855654A (zh) * 2012-09-07 2013-01-02 中国人民解放军理工大学 一种超大规模天气效果渲染方法
CN108132712A (zh) * 2018-01-12 2018-06-08 网易(杭州)网络有限公司 虚拟场景中天气状态的呈现方法、装置及存储介质和终端
CN110473279A (zh) * 2019-08-22 2019-11-19 网易(杭州)网络有限公司 一种天气粒子渲染方法、装置、计算机设备及存储介质
US20200327711A1 (en) * 2018-10-29 2020-10-15 August Camden Walker System and methods for generating augmented reality displays of weather data
CN112263837A (zh) * 2020-11-16 2021-01-26 腾讯科技(深圳)有限公司 虚拟环境中的天气渲染方法、装置、设备及存储介质

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100091018A1 (en) * 2008-07-11 2010-04-15 Advanced Micro Devices, Inc. Rendering Detailed Animated Three Dimensional Characters with Coarse Mesh Instancing and Determining Tesselation Levels for Varying Character Crowd Density
US10320893B2 (en) * 2014-07-31 2019-06-11 Corent Technology, Inc. Partitioning and mapping workloads for scalable SaaS applications on cloud
CN107481312B (zh) * 2016-06-08 2020-02-14 腾讯科技(深圳)有限公司 一种基于体绘制的图像渲染方法及装置
CN108492353B (zh) * 2018-02-02 2021-12-31 珠海金山网络游戏科技有限公司 一种全局环境球的绘制方法和***
US10535180B2 (en) * 2018-03-28 2020-01-14 Robert Bosch Gmbh Method and system for efficient rendering of cloud weather effect graphics in three-dimensional maps
CN110196746B (zh) * 2019-05-30 2022-09-30 网易(杭州)网络有限公司 交互界面渲染方法及装置、电子设备、存储介质
CN110264553B (zh) * 2019-07-19 2024-02-23 网易(杭州)网络有限公司 地表植被处理的方法和装置
CN110706329A (zh) * 2019-09-06 2020-01-17 深圳亚联发展科技股份有限公司 一种三维场景重构方法和装置
CN111009026B (zh) * 2019-12-24 2020-12-01 腾讯科技(深圳)有限公司 对象渲染方法和装置、存储介质及电子装置
CN111701238B (zh) * 2020-06-24 2022-04-26 腾讯科技(深圳)有限公司 虚拟画卷的显示方法、装置、设备及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855654A (zh) * 2012-09-07 2013-01-02 中国人民解放军理工大学 一种超大规模天气效果渲染方法
CN108132712A (zh) * 2018-01-12 2018-06-08 网易(杭州)网络有限公司 虚拟场景中天气状态的呈现方法、装置及存储介质和终端
US20200327711A1 (en) * 2018-10-29 2020-10-15 August Camden Walker System and methods for generating augmented reality displays of weather data
CN110473279A (zh) * 2019-08-22 2019-11-19 网易(杭州)网络有限公司 一种天气粒子渲染方法、装置、计算机设备及存储介质
CN112263837A (zh) * 2020-11-16 2021-01-26 腾讯科技(深圳)有限公司 虚拟环境中的天气渲染方法、装置、设备及存储介质

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117934784A (zh) * 2024-03-25 2024-04-26 山东捷瑞数字科技股份有限公司 一种基于图形学的自然视景构建及增强优化的方法及***
CN117934784B (zh) * 2024-03-25 2024-06-07 山东捷瑞数字科技股份有限公司 一种基于图形学的自然视景构建及增强优化的方法及***
CN117994407A (zh) * 2024-04-07 2024-05-07 山东捷瑞信息技术产业研究院有限公司 一种基于图形学的自然现象构建及渲染优化的方法及***
CN117994407B (zh) * 2024-04-07 2024-06-07 山东捷瑞信息技术产业研究院有限公司 一种基于图形学的自然现象构建及渲染优化的方法及***

Also Published As

Publication number Publication date
CN112263837A (zh) 2021-01-26
US20230039131A1 (en) 2023-02-09
CN112263837B (zh) 2021-12-21

Similar Documents

Publication Publication Date Title
WO2022100437A1 (zh) 虚拟环境中的天气渲染方法、装置、设备、介质及程序
CN110874812B (zh) 游戏中的场景图像绘制方法、装置及电子终端
US11704868B2 (en) Spatial partitioning for graphics rendering
US20100020080A1 (en) Image generation system, image generation method, and information storage medium
EP4394713A1 (en) Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product
CN107038745A (zh) 一种3d旅游景观漫游交互方法及装置
CN101527051B (zh) 基于大气散射原理对天空进行渲染的方法和装置
CN108043027B (zh) 存储介质、电子装置、游戏画面的显示方法和装置
CN114067042A (zh) 一种图像渲染方法、装置、设备、存储介质及程序产品
CN112233214A (zh) 一种大场景的雪景渲染方法、装置、设备及存储介质
CN114225386A (zh) 场景画面的编码方法、装置、电子设备及存储介质
CN115082607A (zh) 虚拟角色头发渲染方法、装置、电子设备和存储介质
CN110400366B (zh) 一种基于OpenGL的实时洪水灾害可视化仿真方法
CN114359458A (zh) 一种图像渲染方法、装置、设备、存储介质及程序产品
JP2008027064A (ja) プログラム、情報記録媒体および画像生成システム
WO2017174006A1 (zh) 图片处理的方法和装置
CN112396683B (zh) 虚拟场景的阴影渲染方法、装置、设备及存储介质
CN113368496B (zh) 游戏场景的天气渲染方法、装置及电子设备
CN112037292A (zh) 天气***的生成方法及装置、设备
US7710419B2 (en) Program, information storage medium, and image generation system
Catanese et al. Rendering of 3d dynamic virtual environments
CN117710557B (zh) 一种真实感体积云的构建方法、装置、设备及介质
Song et al. Development and implementation of pixel art game based on human-computer interaction
CN112791396B (zh) 一种拍摄画面的生成方法、装置、电子设备及存储介质
Lu et al. Design and implementation of three-dimensional game engine

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21890964

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 02-10-2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21890964

Country of ref document: EP

Kind code of ref document: A1