CN116934940A - Method for generating model map by using panorama based on ray tracing technology - Google Patents


Info

Publication number
CN116934940A
CN116934940A (application CN202310977152.6A)
Authority
CN
China
Prior art keywords
model
ray tracing
data
panoramic image
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310977152.6A
Other languages
Chinese (zh)
Inventor
陈卓仪
刘佳乐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunchuang Zhanhui Technology Shenzhen Co ltd
Original Assignee
Yunchuang Zhanhui Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunchuang Zhanhui Technology Shenzhen Co ltd filed Critical Yunchuang Zhanhui Technology Shenzhen Co ltd
Priority to CN202310977152.6A
Publication of CN116934940A
Legal status: Pending

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/06 Ray-tracing
    • G06T15/04 Texture mapping

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a method for generating a model map from a panorama based on ray tracing, which comprises the following steps: step one: obtain panoramic image data and model data; step two: input the panoramic image data and model data into a ray tracing algorithm; step three: during ray tracing, determine the paths of the rays and their intersection points; step four: from the material information at each intersection point, compute the map color attribute of the model at that point; step five: generate the model map from the ray attributes and the model's UV data. The beneficial effects of the invention are as follows: the steps are implemented as a computer program that generates the model map, and the program can run on hardware such as a graphics processing unit to improve computation speed and image-generation efficiency; the model map, generated from the ray attributes and model data, contains the detail and texture information of the model surface and can enhance the realism and visual effect of the generated image.

Description

Method for generating model map by using panorama based on ray tracing technology
Technical Field
The invention belongs to the technical fields of computer graphics and computer vision, and in particular relates to a method for generating a model map from a panorama based on ray tracing.
Background
Ray tracing is a technique that gives graphics a realistic appearance by simulating the way light behaves in the real world. Rather than relying on pre-designed lighting for a scene, ray tracing follows the paths of simulated light rays, often millions of them. As the rays travel, they are reflected by the objects they strike and interact with those objects' material properties. This close correspondence to real-world light transport makes ray tracing a highly realistic 3D rendering technique. Ray tracing can also be used for shadows, making objects look more dynamic and lifelike: in both dark and bright scenes it can produce shadows with softer edges and clearer definition.
In 3D computer graphics, ray tracing is a rendering technique that generates an image by tracing the path of light through each pixel of the image plane and simulating its interaction with virtual objects; it can achieve a higher degree of visual realism than typical scanline rendering. Compared with ray casting or scanline techniques, ray tracing produces more photorealistic images: it works by tracing a path from an imaginary eye through each pixel of a virtual screen and computing the color of the object visible through that pixel. Scenes in ray tracing are described mathematically by a programmer or visual artist, and may also incorporate data from images and models captured by means such as digital photography.
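The "imaginary eye to each pixel" step described above amounts to generating one camera ray per pixel. A minimal pinhole-camera sketch for illustration (the field of view, the camera looking down -z, and the pixel-to-NDC convention are all assumptions, not details taken from the patent):

```python
import numpy as np

def generate_ray(px, py, width, height, fov_deg=90.0):
    """Unit direction from a pinhole eye through pixel (px, py).
    Assumed convention: camera looks down -z, +y is up."""
    aspect = width / height
    scale = np.tan(np.radians(fov_deg) / 2.0)
    # map the pixel center to normalized device coordinates in [-1, 1]
    x = (2.0 * (px + 0.5) / width - 1.0) * aspect * scale
    y = (1.0 - 2.0 * (py + 0.5) / height) * scale
    d = np.array([x, y, -1.0])
    return d / np.linalg.norm(d)
```

Tracing the full image is then a double loop over pixels, calling `generate_ray` for each and shading whatever the ray hits.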
In computer graphics and computer vision, using ray tracing to generate realistic images has become increasingly common. However, current ray tracing techniques still face challenges when handling large-scale scenes. In addition, mapping is a common technique for enhancing visual effects, but existing mapping methods are rarely applied to models generated from panoramas.
Disclosure of Invention
An object of the present invention is to provide a method for generating model maps from a panorama based on ray tracing, improving image generation for large-scale scenes and enhancing the visual effect.
In order to achieve the above object, the present invention provides the following technical solution: a method for generating a model map from a panorama based on ray tracing, the method comprising:
step one: obtaining panoramic image data and model data;
step two: inputting panoramic image data and model data into a ray tracing algorithm;
step three: in the ray tracing process, determining the path and the intersection point of the rays;
step four: according to the material information at the intersection point, calculating the mapping color attribute of the model at the ray tracing intersection point;
step five: generating a model map according to the attribute of the ray and the model uv data;
step six: and combining the generated model map and panoramic image data to generate an image output for enhancing the visual effect.
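Steps one to five above can be sketched in miniature as a texture-bake loop over a single triangle. Everything below is an illustrative assumption rather than the patent's implementation: the panorama is equirectangular and indexed by a unit direction, the triangle's UVs are the unit right triangle, and shading is a flat lookup of the panorama along the surface normal.

```python
import numpy as np

def bake_model_map(panorama, tri, tex_size=8):
    """Steps one to five in miniature: for each texel covered by one triangle,
    shade the corresponding surface point from an equirectangular panorama."""
    h, w = panorama.shape[:2]
    tex = np.zeros((tex_size, tex_size, 3))
    v0, v1, v2 = tri
    normal = np.cross(v1 - v0, v2 - v0)
    normal = normal / np.linalg.norm(normal)        # flat shading: one normal
    for ty in range(tex_size):
        for tx in range(tex_size):
            # texel center in UV space; assume the triangle's UVs are the
            # unit right triangle (0,0), (1,0), (0,1)
            u = (tx + 0.5) / tex_size
            v = (ty + 0.5) / tex_size
            if u + v > 1.0:                         # outside the UV triangle
                continue
            # step three: the surface point a traced ray would start from
            # (unused in this flat-shaded sketch)
            pos = v0 + u * (v1 - v0) + v * (v2 - v0)
            # step four: sample the panorama along the surface normal
            x, y, z = normal
            pu = (np.arctan2(x, z) / (2 * np.pi) + 0.5) % 1.0
            pv = np.arccos(np.clip(y, -1.0, 1.0)) / np.pi
            tex[ty, tx] = panorama[min(int(pv * h), h - 1),
                                   min(int(pu * w), w - 1)]
    return tex                                      # step five: the model map
```

A full implementation would iterate over every triangle of the model and trace reflected or refracted rays instead of the bare normal lookup used here.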
As a preferred technical solution of the present invention, acquiring the panoramic image data and model data specifically comprises: acquiring panoramic image data from a panoramic camera or other panoramic image source, and at the same time acquiring the model data used for modeling.
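Once acquired, a panorama is most often stored in equirectangular form, so a color can be looked up for any viewing direction. A minimal sketch of that lookup (the axis convention, +y up with longitude measured from +z toward +x, and the nearest-neighbour sampling are assumptions for illustration):

```python
import numpy as np

def sample_panorama(pano, direction):
    """Sample an equirectangular panorama (H x W x 3) along a unit direction."""
    h, w = pano.shape[:2]
    x, y, z = direction
    u = (np.arctan2(x, z) / (2 * np.pi) + 0.5) % 1.0   # longitude -> [0, 1)
    v = np.arccos(np.clip(y, -1.0, 1.0)) / np.pi        # latitude  -> [0, 1]
    return pano[min(int(v * h), h - 1), min(int(u * w), w - 1)]
```

A production renderer would use bilinear filtering instead of the nearest-texel read shown here.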
As a preferred technical solution of the present invention, inputting the panoramic image data and model data into the ray tracing algorithm specifically comprises: taking the acquired panoramic image data and model data as input and passing them to the ray tracing algorithm for processing.
As a preferred technical solution of the present invention, determining the paths and intersection points of the rays specifically comprises: determining, from the panoramic image data and model data, the paths of rays cast from the camera or observation point, and finding the points at which those rays intersect objects in the model.
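Finding the point where a ray intersects an object in the model usually reduces to ray-triangle tests against the mesh. A common choice is the Möller-Trumbore algorithm; the sketch below is one standard formulation of it, not necessarily the intersection test used by the patent:

```python
import numpy as np

def ray_triangle(orig, dirn, v0, v1, v2, eps=1e-8):
    """Möller-Trumbore: return (t, u, v) for a hit, or None for a miss.
    t is the ray distance; (u, v) are barycentric coordinates."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(dirn, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:                  # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = orig - v0
    u = np.dot(s, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(dirn, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv
    return (t, u, v) if t > eps else None
```

The barycentric pair (u, v) returned on a hit is exactly what later interpolates the model's UV coordinates at the intersection point.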
As a preferred technical solution of the present invention, the model map is generated from the ray attributes and the model data, and contains the detail and texture information of the model surface.
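Writing a shaded result into the model map at the model's UV coordinates can be sketched as a nearest-texel write; the v-flip convention (v=0 at the bottom row of the texture) is an assumption for illustration:

```python
import numpy as np

def bake_texel(texture, uv, color):
    """Write a shaded color into the model's texture map at UV in [0,1]^2."""
    h, w = texture.shape[:2]
    tx = min(int(uv[0] * w), w - 1)
    ty = min(int((1.0 - uv[1]) * h), h - 1)   # flip v: v=0 is the bottom row
    texture[ty, tx] = color
```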
As a preferred technical solution of the present invention, the generated model map is fused with the panoramic image data: the pixel information and color values of the two are combined to produce an image output with enhanced visual effect, the fusion step using an image processing technique such as pixel-level fusion or blending.
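Pixel-level fusion of the rendered model layer with the panorama can be as simple as per-pixel alpha compositing. The sketch below assumes a per-pixel coverage mask produced by the renderer; the patent does not specify its blending rule:

```python
import numpy as np

def blend(model_layer, panorama, alpha):
    """Alpha-composite the rendered model layer over the panorama.
    alpha is a per-pixel coverage mask (H x W x 1) with values in [0, 1]."""
    return alpha * model_layer + (1.0 - alpha) * panorama
```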
Compared with the prior art, the beneficial effects of the invention are:
the steps are implemented as a computer program that generates the model map; the program can run on hardware such as a graphics processing unit to improve computation speed and image-generation efficiency;
the model map, generated from the ray attributes and model data, contains the detail and texture information of the model surface and can enhance the realism and visual effect of the generated image.
Drawings
FIG. 1 is a flow chart of a method of generating a model map according to the present invention.
Detailed Description
The following is a clear and complete description of the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
Example 1
Referring to fig. 1, a first embodiment of the present invention provides a method for generating a model map using panorama based on ray tracing technology, comprising the steps of:
step one: obtaining panoramic image data and model data;
step two: inputting panoramic image data and model data into a ray tracing algorithm;
step three: in the ray tracing process, determining the path and the intersection point of the rays;
step four: calculating the mapping color attribute of the model at the ray tracing intersection point;
step five: generating a model map according to the attribute of the light ray, the model uv and other data;
step six: and combining the generated model map and panoramic image data, and combining pixel information and color values of the model map and the panoramic image data to generate image output for enhancing visual effect.
In this embodiment, preferably, acquiring the panoramic image data and model data specifically comprises: acquiring panoramic image data from a panoramic camera or other panoramic image source, and at the same time acquiring the model data used for modeling.
In this embodiment, preferably, inputting the panoramic image data and model data into the ray tracing algorithm specifically comprises: taking the acquired panoramic image data and model data as input and passing them to the ray tracing algorithm for processing.
In this embodiment, preferably, determining the paths and intersection points of the rays specifically comprises: determining, from the panoramic image data and model data, the paths of rays cast from the camera or observation point, and finding the points at which those rays intersect objects in the model.
In this embodiment, preferably, the model map is generated from the ray attributes and the model data and contains the detail and texture information of the model surface, which can enhance the realism and visual effect of the generated image.
In this embodiment, preferably, the generated model map is fused with the panoramic image data: the pixel information and color values of the two are combined to produce an image output with enhanced visual effect, the fusion step using an image processing technique such as pixel-level fusion or blending.
Example 2
Referring to fig. 1, a second embodiment of the present invention provides a method for generating a model map using panorama based on ray tracing technology, comprising the steps of:
step one: obtaining panoramic image data and model data;
step two: inputting panoramic image data and model data into a ray tracing algorithm;
step three: in the ray tracing process, determining the path and the intersection point of the rays;
step four: calculating the mapping color attribute of the model at the ray tracing intersection point;
step five: generating a model map according to the attribute of the light ray, the model uv and other data;
step six: and combining the generated model map and panoramic image data, and combining pixel information and color values of the model map and the panoramic image data to generate image output for enhancing visual effect.
In this embodiment, preferably, acquiring the panoramic image data and model data specifically comprises: acquiring panoramic image data from a panoramic camera or other panoramic image source, and at the same time acquiring the model data used for modeling.
In this embodiment, preferably, inputting the panoramic image data and model data into the ray tracing algorithm specifically comprises: taking the acquired panoramic image data and model data as input and passing them to the ray tracing algorithm for processing.
In this embodiment, preferably, determining the paths and intersection points of the rays specifically comprises: determining, from the panoramic image data and model data, the paths of rays cast from the camera or observation point, and finding the points at which those rays intersect objects in the model.
In this embodiment, preferably, the model map is generated from the ray attributes and the model data and contains the detail and texture information of the model surface, which can enhance the realism and visual effect of the generated image.
In this embodiment, preferably, the generated model map is fused with the panoramic image data: the pixel information and color values of the two are combined to produce an image output with enhanced visual effect, the fusion step using an image processing technique such as pixel-level fusion or blending. Image fusion can extract complementary information from a natural-light image and an infrared image, yielding a more accurate, comprehensive and reliable description of the same scene. Pixel-level fusion is commonly used to fuse a grayscale image with a visible-light image. Colorization based on a source image is a fusion of the source image with a target image that combines the color of the source with the shape and texture features of the target, achieving a harmonious and realistic overall color tone. The main factors affecting image fusion are the huge volume of image data to be processed and the choice of fusion rules.
While embodiments of the present invention have been shown and described in detail with reference to the foregoing detailed description, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations may be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (7)

1. A method for generating a model map using a panorama based on ray tracing, characterized in that the method comprises the following steps:
step one: obtaining panoramic image data and model data;
step two: inputting panoramic image data and model data into a ray tracing algorithm;
step three: in the ray tracing process, determining the path and the intersection point of the rays;
step four: according to the material information at the intersection point, calculating the mapping color attribute of the model at the ray tracing intersection point;
step five: generating a model map according to the attribute of the ray and the model uv data;
step six: and combining the generated model map and panoramic image data to generate an image output for enhancing the visual effect.
2. The method for generating a model map using a panorama based on ray tracing according to claim 1, wherein acquiring the panoramic image data and model data specifically comprises: acquiring panoramic image data from a panoramic camera or other panoramic image source, and at the same time acquiring the model data used for modeling.
3. The method for generating a model map using a panorama based on ray tracing according to claim 1, wherein inputting the panoramic image data and model data into the ray tracing algorithm specifically comprises: taking the acquired panoramic image data and model data as input and passing them to the ray tracing algorithm for processing.
4. The method for generating a model map using a panorama based on ray tracing according to claim 1, wherein determining the paths and intersection points of the rays specifically comprises: determining, from the panoramic image data and model data, the paths of rays cast from the camera or observation point, and finding the points at which those rays intersect objects in the model.
5. The method for generating a model map using a panorama based on ray tracing according to claim 1, wherein the model map is generated from the ray attributes and the model data and contains the detail and texture information of the model surface.
6. The method for generating a model map using a panorama based on ray tracing according to claim 1, wherein the generated model map is fused with the panoramic image data, combining the pixel information and color values of the two to produce an image output with enhanced visual effect, the fusion being performed with an image processing technique.
7. The method for generating a model map using a panorama based on ray tracing according to claim 1, wherein blending of multiple scene panorama maps is also performed during the ray tracing process.
Application CN202310977152.6A, filed 2023-08-04: Method for generating model map by using panorama based on ray tracing technology (Pending, published as CN116934940A)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310977152.6A CN116934940A (en) 2023-08-04 2023-08-04 Method for generating model map by using panorama based on ray tracing technology


Publications (1)

Publication Number Publication Date
CN116934940A 2023-10-24

Family

ID=88389599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310977152.6A Pending CN116934940A (en) 2023-08-04 2023-08-04 Method for generating model map by using panorama based on ray tracing technology

Country Status (1)

Country Link
CN (1) CN116934940A (en)

Similar Documents

Publication Publication Date Title
CN111508052B (en) Rendering method and device of three-dimensional grid body
CN107341853B (en) Virtual-real fusion method and system for super-large virtual scene and dynamic screen shooting
US10062199B2 (en) Efficient rendering based on ray intersections with virtual objects
CN113674389B (en) Scene rendering method and device, electronic equipment and storage medium
Li et al. [Retracted] Multivisual Animation Character 3D Model Design Method Based on VR Technology
CN110070621A (en) Electronic device, the method and computer readable media for showing augmented reality scene
US9183654B2 (en) Live editing and integrated control of image-based lighting of 3D models
CN112184873B (en) Fractal graph creation method, fractal graph creation device, electronic equipment and storage medium
JP2012190428A (en) Stereoscopic image visual effect processing method
CN110634178A (en) Three-dimensional scene refinement reconstruction method for digital museum
Jiang et al. VR-GS: a physical dynamics-aware interactive gaussian splatting system in virtual reality
Ma et al. Neural compositing for real-time augmented reality rendering in low-frequency lighting environments
Sandnes Sketching 3D immersed experiences rapidly by hand through 2D cross sections
Astuti et al. Comparison of time, size and quality of 3d object rendering using render engine Eevee and cycles in blender
CN113648655B (en) Virtual model rendering method and device, storage medium and electronic equipment
Wei et al. Simulating shadow interactions for outdoor augmented reality with RGBD data
Wang et al. Bidirectional shadow rendering for interactive mixed 360° videos
Schwandt et al. Glossy reflections for mixed reality environments on mobile devices
Dai et al. PBR-Net: Imitating physically based rendering using deep neural network
Noh et al. Soft shadow rendering based on real light source estimation in augmented reality
Levene A framework for non-realistic projections
CN112002019B (en) Method for simulating character shadow based on MR mixed reality
CN116934940A (en) Method for generating model map by using panorama based on ray tracing technology
CN111862338B (en) Display method and device for simulated eyeglass wearing image
CN114170409A (en) Method for automatically judging display label of three-dimensional model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination