CN110276823A - Integral imaging generation method and system based on ray tracing with real-time interaction - Google Patents
Integral imaging generation method and system based on ray tracing with real-time interaction
- Publication number
- CN110276823A CN110276823A CN201910438381.4A CN201910438381A CN110276823A CN 110276823 A CN110276823 A CN 110276823A CN 201910438381 A CN201910438381 A CN 201910438381A CN 110276823 A CN110276823 A CN 110276823A
- Authority
- CN
- China
- Prior art keywords
- plane
- light field
- model
- viewing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/60—Rotation of whole images or parts thereof
- G06T3/604—Rotation of whole images or parts thereof using coordinate rotation digital computer [CORDIC] devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/06—Ray-tracing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- Software Systems (AREA)
- Architecture (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Geometry (AREA)
- Processing Or Creating Images (AREA)
- Image Generation (AREA)
Abstract
The invention discloses an integral imaging generation method and system based on ray tracing with real-time interaction, in the field of computational integral imaging generation. The method first reads a system parameter file and loads the three-dimensional models and texture files of a virtual scene; from these files and models it establishes a bounding volume hierarchy (BVH) acceleration structure and an integrated light field viewing model. It then judges whether an interactive instruction has been received: if so, the ray corresponding to each pixel is generated from the attribute values of the modified integrated light field viewing model; otherwise the rays are generated directly from the attribute values of the current integrated light field viewing model. Finally, all rays are rendered in parallel using the BVH acceleration structure and ray tracing technology, and the elemental image array is generated and displayed. The method and system dispense with the acquisition step of virtual camera models, simplify the computation pipeline, and improve the real-time performance of interaction.
Description
Technical field
The present invention relates to the field of computational integral imaging generation, and in particular to an integral imaging generation method and system based on ray tracing with real-time interaction.
Background art
Existing computational integral imaging methods work as follows: a virtual camera model is established for each lens or each viewpoint; each virtual camera then acquires light field information (a view image at a certain angle or orientation, or certain pixels within it); and the elemental image array (EIA) is computed through data-processing steps such as resampling and pixel mapping. Although these methods can produce an EIA, they are computationally time-consuming, which limits the viewing performance of integral imaging and hinders the realization of interactive functions.
Specifically, methods such as Point Retracing Rendering (PRR), Multiple Viewpoint Rendering (MVR), Parallel Group Rendering (PGR) and Viewpoint Vector Rendering (VVR) obtain multiple view images from virtual cameras in commercial three-dimensional software and fill their pixels into the EIA according to fixed mapping relations; however, the view-image acquisition contains much redundancy, and the pixel-mapping step is time-consuming. Researchers have therefore written dedicated virtual cameras and used parallel processing to improve computation speed. The image space parallel processing (ISPP) algorithm models each hexagonal lens as a virtual camera and assigns a GPU thread to each pixel; an optimized parallel framework built on the ISPP algorithm computes the EIA in real time. The multiple ray cluster rendering (MRCR) method establishes, for each viewpoint, a ray-cluster model similar to a virtual camera according to the observation position, computes an integrated light field suited to the current viewing distance, and achieves real-time interactive display through parallel computation. Beijing University of Posts and Telecommunications proposed a backward ray tracing CGII method that establishes a virtual camera for each viewpoint, emits a ray for each pixel through the virtual camera, renders the rays by backward ray tracing, and achieves real-time display on a parallel computing platform. Although these algorithms avoid redundant acquisition, they still compute the EIA indirectly from the acquisition results of virtual camera models rather than directly from an integrated light field model. When an interactive display function is added, the technical difficulty increases; meanwhile, repeatedly computing the space transformation matrices of the virtual cameras adds complexity to the interactive computation and degrades real-time responsiveness.
Summary of the invention
The object of the present invention is to provide an integral imaging generation method and system based on ray tracing with real-time interaction, which dispenses with the acquisition step of virtual camera models, simplifies the computation pipeline, and improves the real-time performance of interaction.
To achieve the above object, the present invention provides the following solutions:
An integral imaging generation method based on ray tracing with real-time interaction, comprising:
reading a system parameter file using an open-source function library, and loading the virtual scene three-dimensional models and virtual scene texture files using an open-source function library; wherein the system parameter file is stored in a parameter class ConfigXML, and the virtual scene three-dimensional models and texture files are stored in a model data structure MeshBuffer;
establishing a bounding volume hierarchy (BVH) acceleration structure from the data in the model data structure MeshBuffer;
establishing an integrated light field viewing model from the data in the parameter class ConfigXML; the integrated light field viewing model comprises, in order, the virtual elemental image array plane, the virtual lens array plane, the plane through the center of the three-dimensional object, and the ray emission plane; the integrated light field viewing model is implemented by an ILFR class whose attribute values comprise a data class EIABuffer, world coordinate data, plane distance data, the pixel size of the LCD display, and the position data of the lens optical centers; the data class EIABuffer is a two-dimensional structure of the same size as the elemental image array, used to store the color value of each pixel in the elemental image array; the world coordinate data consist of the world coordinates of the point Lookat, the point Or and the vector up, where the point Lookat is the origin of the integrated light field viewing model coordinate system, the point Or is the origin of the world coordinate system, and the vector up is the up vector of the viewing model coordinate system; the plane distance data comprise the distance between the virtual elemental image array plane and the virtual lens array plane, the distance between the virtual lens array plane and the plane through the center of the three-dimensional object, and the distance between the plane through the center of the three-dimensional object and the ray emission plane; the position data of a lens optical center refer to the position, within the elemental image array, of the pixel at the vertical projection of that optical center onto the elemental image array;
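The attribute values listed above form a plain data container. A minimal sketch in Python for illustration only; the class and field names here (`IntegratedLightFieldViewingModel`, `d_eia_to_lens`, etc.) are hypothetical stand-ins for the patent's ILFR class and EIABuffer, not its actual implementation:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical stand-in for EIABuffer: a 2D structure the same size as the
# elemental image array, holding one RGB color value per pixel.
def make_eia_buffer(width: int, height: int) -> List[List[Tuple[float, float, float]]]:
    return [[(0.0, 0.0, 0.0) for _ in range(width)] for _ in range(height)]

@dataclass
class IntegratedLightFieldViewingModel:
    """Sketch of the ILFR attribute values described in the text."""
    lookat: Tuple[float, float, float]         # origin of the viewing-model coordinate system
    origin_world: Tuple[float, float, float]   # point Or, origin of the world coordinate system
    up: Tuple[float, float, float]             # up vector of the viewing-model coordinate system
    d_eia_to_lens: float                       # EIA plane -> lens array plane
    d_lens_to_object: float                    # lens array plane -> object-center plane
    d_object_to_emission: float                # object-center plane -> ray emission plane
    pixel_size: float                          # pixel pitch of the LCD display
    lens_center_pixels: List[Tuple[int, int]]  # projected optical-center pixel positions
    eia_buffer: list = field(default_factory=list)

model = IntegratedLightFieldViewingModel(
    lookat=(0.0, 0.0, 100.0), origin_world=(0.0, 0.0, 0.0), up=(0.0, 1.0, 0.0),
    d_eia_to_lens=3.0, d_lens_to_object=50.0, d_object_to_emission=50.0,
    pixel_size=0.1, lens_center_pixels=[(16, 16)],
    eia_buffer=make_eia_buffer(32, 32),
)
```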
judging whether an interactive instruction is received to obtain a first judging result; the interactive instruction comprises keyboard interactive instructions and mouse interactive instructions;
if the first judging result indicates that an interactive instruction is received, modifying the attribute values of the integrated light field viewing model according to the interactive instruction, and generating the ray corresponding to each pixel from the attribute values of the modified integrated light field viewing model;
if the first judging result indicates that no interactive instruction is received, generating the ray corresponding to each pixel directly from the attribute values of the integrated light field viewing model;
rendering all the rays in parallel using the bounding volume hierarchy acceleration structure and ray tracing technology, and generating the elemental image array;
drawing and displaying the elemental image array on a display screen using a double-buffering mechanism.
Optionally, the system parameter file comprises an xml file and a csv file. The csv file contains the position data of all lens centers in the lens array; the xml file contains the pixel size of the LCD display, the distance between the elemental image array and the lens array, the focal length of the lenses, the width of the LCD display, the width of the elemental image array in virtual space, the horizontal resolution of the elemental image array, the vertical resolution of the elemental image array, the number of lenses in the lens array, the number of pixels per elemental image, and the filename of the lens-center position data file. The virtual scene three-dimensional model files are ply, obj and txt files, and the virtual scene texture files are ppm, hdr and jpg files.
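Reading such a parameter file into a ConfigXML-style object is conventional. A hedged sketch: the patent does not specify the actual xml schema, so every tag name below is invented purely for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical parameter file; the tag names are NOT from the patent.
XML_TEXT = """
<config>
  <pixel_size>0.1245</pixel_size>
  <gap>3.0</gap>
  <focal_length>3.3</focal_length>
  <eia_h_res>3840</eia_h_res>
  <eia_v_res>2160</eia_v_res>
  <lens_count>96</lens_count>
  <lens_csv>lens_centers.csv</lens_csv>
</config>
"""

def load_config(xml_text: str) -> dict:
    """Read scalar parameters into a dict, mimicking a ConfigXML-like class."""
    root = ET.fromstring(xml_text)
    cfg = {child.tag: child.text for child in root}
    for key in ("pixel_size", "gap", "focal_length"):
        cfg[key] = float(cfg[key])
    for key in ("eia_h_res", "eia_v_res", "lens_count"):
        cfg[key] = int(cfg[key])
    return cfg

cfg = load_config(XML_TEXT)
```

In the real system an open-source library (the patent only says "function library of open source") would fill the same role.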
Optionally, establishing the integrated light field viewing model from the data in the parameter class ConfigXML specifically comprises:
using a right-handed Cartesian convention, establishing a world coordinate system with axes x_w, y_w, z_w, wherein the point Or is the origin of the world coordinate system;
setting the point Lookat as the origin of the integrated light field viewing model coordinate system, setting the vector from the point Or to the point Lookat as the z_c axis of that coordinate system, and establishing the integrated light field viewing model coordinate system with axes x_c, y_c, z_c; wherein the point Lookat is the volume center of the three-dimensional virtual object, and the vector up is the up vector of the viewing model coordinate system, used to construct its orthonormal basis;
establishing the integrated light field viewing model from the virtual elemental image array plane, the virtual lens array plane, the plane through the center of the three-dimensional object, the ray emission plane, and the integrated light field viewing model coordinate system; wherein the virtual elemental image array plane, the virtual lens array plane, the plane through the object center and the ray emission plane are all perpendicular to the z_c axis of the viewing model coordinate system; the ray emission plane intersects the z_c axis at the point Or; the plane through the object center intersects the z_c axis at a point D, with which the center of the virtual lens array coincides; and the positional relation between the virtual elemental image array plane and the virtual lens array plane is consistent with the positional relation between the elemental image array plane and the lens array plane in the physical reproduction system.
Optionally, the interactive instructions comprise a rotation instruction, a translation instruction, a scaling instruction and a display fine-tuning instruction; wherein:
the rotation instruction is realized by monitoring left-mouse-button dragging and resetting the world coordinates of the point Or in the integrated light field viewing model;
the translation instruction is realized by monitoring right-mouse-button dragging and resetting the world coordinates of the points Lookat and Or in the integrated light field viewing model;
the scaling instruction is realized by monitoring mouse-wheel dragging and proportionally resetting the distance between the virtual elemental image array plane and the virtual lens array plane, the distance between the virtual lens array plane and the plane through the object center, and the pixel size of the LCD display;
the display fine-tuning instruction is realized by individually modifying, via the keyboard, the distance between the virtual elemental image array plane and the virtual lens array plane, the distance between the virtual lens array plane and the plane through the object center, the distance between the plane through the object center and the ray emission plane, and the pixel size of the LCD display.
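The scaling instruction only rescales a handful of scalar attributes, which is what makes the interaction cheap compared with recomputing camera matrices. A hedged sketch of that idea; the function and field names are invented, not the patent's:

```python
def apply_scale(attrs: dict, ratio: float) -> dict:
    """Proportionally rescale the attributes that the scaling instruction
    touches: the EIA-to-lens distance, the lens-to-object distance, and
    the LCD pixel size. Returns a new attribute dict, leaving the input
    unchanged so an unmodified model can still be used when no
    interactive instruction arrives."""
    scaled = dict(attrs)
    for key in ("d_eia_to_lens", "d_lens_to_object", "pixel_size"):
        scaled[key] = attrs[key] * ratio
    return scaled

attrs = {"d_eia_to_lens": 3.0, "d_lens_to_object": 50.0, "pixel_size": 0.1}
zoomed = apply_scale(attrs, 1.25)
```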
Optionally, generating the ray corresponding to each pixel specifically comprises:
assigning a thread to each pixel;
computing, from the thread corresponding to each pixel and the attribute values of the integrated light field viewing model, the launch-point coordinates and direction vector of the ray corresponding to the current pixel;
generating the ray corresponding to each pixel from its launch-point coordinates and direction vector.
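One plausible per-pixel construction, sketched in Python: the launch point is the pixel center on the EIA plane and the direction passes through the optical center of the lens covering that pixel. Note the exact geometry of the patent's Fig. 3 is not reproduced here; this is an illustrative pinhole-style construction under assumed conventions (EIA plane at z = 0, lens plane at z = gap):

```python
import math

def pixel_ray(px, py, lens_center_px, pixel_size, gap):
    """Launch point and unit direction for one EIA pixel in the
    viewing-model frame. `lens_center_px` is the pixel position of the
    lens's projected optical center (one of the positions stored in the
    viewing-model attributes)."""
    launch = (px * pixel_size, py * pixel_size, 0.0)
    lens = (lens_center_px[0] * pixel_size, lens_center_px[1] * pixel_size, gap)
    d = (lens[0] - launch[0], lens[1] - launch[1], lens[2] - launch[2])
    n = math.sqrt(d[0]**2 + d[1]**2 + d[2]**2)
    return launch, (d[0]/n, d[1]/n, d[2]/n)

origin, direction = pixel_ray(px=10, py=10, lens_center_px=(16, 16),
                              pixel_size=0.1, gap=3.0)
```

On the GPU each such computation would run in its own thread, one per pixel, as the text describes.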
Optionally, rendering all the rays in parallel using the bounding volume hierarchy acceleration structure and ray tracing technology to generate the elemental image array specifically comprises:
Step S1: using the BVH acceleration structure and the open-source ray tracing engine OptiX, computing the radiance value of every ray in parallel and storing it in the data structure EIABuffer of the integrated light field viewing model;
Step S2: once the radiance values of the rays corresponding to all pixels in the elemental image array have been computed, all data in the data structure EIABuffer constitute one frame of the elemental image array; copying all data in the EIABuffer into the free OpenGL buffer of the integrated light field viewing model, and refreshing the EIABuffer;
Step S3: repeating steps S1 and S2 until the elemental image array of every frame has been generated.
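Steps S1–S3 alternate between filling the working buffer and handing a completed frame to the display. A CPU stand-in for that loop, for illustration only; `shade` is a trivial placeholder for the parallel OptiX radiance computation, and the list copy stands in for the transfer into the free OpenGL buffer:

```python
def shade(x, y):
    """Placeholder for the per-ray radiance returned by the ray tracer."""
    return ((x % 256) / 255.0, (y % 256) / 255.0, 0.0)

def render_frames(width, height, n_frames):
    """Mimic steps S1-S3: fill the working EIA buffer, copy the finished
    frame out for display, then refresh the working buffer."""
    eia_buffer = [[(0.0, 0.0, 0.0)] * width for _ in range(height)]
    frames = []
    for _ in range(n_frames):
        for y in range(height):                          # S1: every pixel's ray
            eia_buffer[y] = [shade(x, y) for x in range(width)]
        frames.append([row[:] for row in eia_buffer])    # S2: copy frame out
        eia_buffer = [[(0.0, 0.0, 0.0)] * width          # S2: refresh buffer
                      for _ in range(height)]
    return frames

frames = render_frames(4, 4, 2)
```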
Optionally, drawing and displaying the elemental image array on a display screen using the double-buffering mechanism specifically comprises:
using the double-buffering mechanism of the open-source OpenGL graphics library to draw and display each frame of the elemental image array on the LCD display.
An integral imaging generation system based on ray tracing with real-time interaction, comprising:
an initialization module for reading the system parameter file using an open-source function library and loading the virtual scene three-dimensional models and texture files using an open-source function library; wherein the system parameter file is stored in the parameter class ConfigXML, and the virtual scene three-dimensional models and texture files are stored in the model data structure MeshBuffer;
a bounding volume hierarchy acceleration structure building module for establishing the BVH acceleration structure from the data in the model data structure MeshBuffer;
an integrated light field viewing model building module for establishing the integrated light field viewing model from the data in the parameter class ConfigXML; the integrated light field viewing model comprises, in order, the virtual elemental image array plane, the virtual lens array plane, the plane through the center of the three-dimensional object, and the ray emission plane; it is implemented by the ILFR class, whose attribute values comprise the data class EIABuffer, the world coordinate data, the plane distance data, the pixel size of the LCD display, and the position data of the lens optical centers; the data class EIABuffer is a two-dimensional structure of the same size as the elemental image array, used to store the color value of each pixel in the elemental image array; the world coordinate data consist of the world coordinates of the point Lookat, the point Or and the vector up, where the point Lookat is the origin of the integrated light field viewing model coordinate system, the point Or is the origin of the world coordinate system, and the vector up is the up vector of the viewing model coordinate system; the plane distance data comprise the distance between the virtual elemental image array plane and the virtual lens array plane, the distance between the virtual lens array plane and the plane through the object center, and the distance between the plane through the object center and the ray emission plane; the position data of a lens optical center refer to the position, within the elemental image array, of the pixel at the vertical projection of that optical center onto the elemental image array;
a first judging result obtaining module for judging whether an interactive instruction is received to obtain a first judging result; the interactive instruction comprises keyboard interactive instructions and mouse interactive instructions;
a first ray generation module for, when the first judging result indicates that an interactive instruction is received, modifying the attribute values of the integrated light field viewing model according to the interactive instruction and generating the ray corresponding to each pixel from the attribute values of the modified integrated light field viewing model;
a second ray generation module for, when the first judging result indicates that no interactive instruction is received, generating the ray corresponding to each pixel from the attribute values of the integrated light field viewing model;
an elemental image array generation module for rendering all the rays in parallel using the BVH acceleration structure and ray tracing technology to generate the elemental image array;
a display module for drawing and displaying the elemental image array on a display screen using the double-buffering mechanism.
Optionally, the integrated light field viewing model building module specifically comprises:
a world coordinate system establishing unit for establishing, with a right-handed Cartesian convention, the world coordinate system with axes x_w, y_w, z_w, wherein the point Or is the origin of the world coordinate system;
an integrated light field viewing model coordinate system establishing unit for setting the point Lookat as the origin of the viewing model coordinate system, setting the vector from the point Or to the point Lookat as the z_c axis of that coordinate system, and establishing the coordinate system with axes x_c, y_c, z_c; wherein the point Lookat is the volume center of the three-dimensional virtual object, and the vector up is the up vector of the viewing model coordinate system, used to construct its orthonormal basis;
an integrated light field viewing model establishing unit for establishing the viewing model from the virtual elemental image array plane, the virtual lens array plane, the plane through the object center, the ray emission plane, and the viewing model coordinate system; wherein these four planes are all perpendicular to the z_c axis of the viewing model coordinate system; the ray emission plane intersects the z_c axis at the point Or; the plane through the object center intersects the z_c axis at a point D, with which the center of the virtual lens array coincides; and the positional relation between the virtual elemental image array plane and the virtual lens array plane is consistent with the positional relation between the elemental image array plane and the lens array plane in the physical reproduction system.
Optionally, the display module specifically comprises:
a display unit for using the double-buffering mechanism of the open-source OpenGL graphics library to draw and display each frame of the elemental image array on the LCD display.
According to the specific embodiments provided by the present invention, the invention discloses the following technical effects:
The present invention constructs the acceleration structure of the virtual scene during preprocessing, which improves the efficiency of ray-traced rendering. At the same time it constructs the integrated light field viewing model in place of the virtual camera arrays of previous algorithms, eliminating the steps of generating virtual camera arrays and computing virtual camera transformation matrices found in the ISPP optimization algorithm, thereby simplifying the computation pipeline and improving the real-time performance of interaction.
Brief description of the drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from them without creative labor.
Fig. 1 is a flow diagram of the integral imaging generation method based on ray tracing with real-time interaction according to an embodiment of the present invention;
Fig. 2 is a diagram of the basic structure of the integrated light field viewing model according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the ray-equation calculation according to an embodiment of the present invention;
Fig. 4 is a structural diagram of the integral imaging generation system based on ray tracing with real-time interaction according to an embodiment of the present invention.
Specific embodiments
The technical solutions in the embodiments of the present invention will be described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
To make the above objects, features and advantages of the present invention clearer and easier to understand, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Embodiment one
As shown in Fig. 1, the integral imaging generation method based on ray tracing with real-time interaction provided in this embodiment comprises:
Step 101: reading the system parameter file using an open-source function library and loading the virtual scene three-dimensional models and texture files using an open-source function library; the system parameter file is stored in the parameter class ConfigXML, and the virtual scene three-dimensional models and texture files are stored in the model data structure MeshBuffer.
The system parameter file comprises an xml file and a csv file. The csv file contains the position data of all lens centers in the lens array; the xml file contains the pixel size of the LCD display, the distance between the elemental image array and the lens array, the focal length of the lenses, the width of the LCD display, the width of the elemental image array in virtual space, the horizontal and vertical resolutions of the elemental image array, the number of lenses in the lens array, the number of pixels per elemental image, and the filename of the lens-center position data file. The virtual scene three-dimensional model files are in ply, obj and txt formats, and the virtual scene texture files are in ppm, hdr and jpg formats.
Step 102: establishing the bounding volume hierarchy acceleration structure from the data in the model data structure MeshBuffer.
Step 103: establishing the integrated light field viewing model from the data in the parameter class ConfigXML. The integrated light field viewing model comprises, in order, the virtual elemental image array plane, the virtual lens array plane, the plane through the center of the three-dimensional object, and the ray emission plane. It is implemented by the ILFR class, whose attribute values comprise the data class EIABuffer, the world coordinate data, the plane distance data, the pixel size of the LCD display, and the position data of the lens optical centers. The data class EIABuffer is a two-dimensional structure of the same size as the elemental image array, used to store the color value of each pixel in the elemental image array. The world coordinate data consist of the world coordinates of the point Lookat, the point Or and the vector up: the point Lookat is the origin of the integrated light field viewing model coordinate system, the point Or is the origin of the world coordinate system, and the vector up is the up vector of the viewing model coordinate system. The plane distance data comprise the distance between the virtual elemental image array plane and the virtual lens array plane, the distance between the virtual lens array plane and the plane through the object center, and the distance between the plane through the object center and the ray emission plane. The position data of a lens optical center refer to the position, within the elemental image array, of the pixel at the vertical projection of that optical center onto the elemental image array.
Step 104: judge whether an interactive instruction has been received, obtaining a first judgment result; the interactive instructions include keyboard interactive instructions and mouse interactive instructions. If the first judgment result indicates that an interactive instruction has been received, execute step 105; if the first judgment result indicates that no interactive instruction has been received, execute step 106.
Step 105: according to the interactive instruction, modify the attribute values of the integral light field viewing model, and generate the ray corresponding to each pixel according to the modified attribute values.
Step 106: generate the ray corresponding to each pixel according to the attribute values of the integral light field viewing model.
Step 107: using the BVH acceleration structure and ray tracing, render all rays in parallel to generate the elemental image array.
Step 108: draw the elemental image array on the display screen using a double-buffering mechanism and display it.
Step 103 specifically includes:
Using a right-handed Cartesian convention, establish the world coordinate system with axes xw, yw, zw; point Or is its origin.
Set point Lookat as the origin of the viewing-model coordinate system, and take the vector from point Or to point Lookat as its zc axis, establishing the viewing-model coordinate system with axes xc, yc, zc. Point Lookat is the volume center of the three-dimensional virtual object; vector up is the up vector of the viewing-model coordinate system and is used to construct its orthonormal basis.
Establish the integral light field viewing model from the plane of the virtual elemental image array, the plane of the virtual lens array, the plane through the three-dimensional object center, the ray origin plane, and the viewing-model coordinate system. The plane of the virtual elemental image array, the plane of the virtual lens array, the plane through the three-dimensional object center, and the ray origin plane are all perpendicular to the zc axis of the viewing-model coordinate system. The ray origin plane intersects the zc axis at point Or; the plane through the three-dimensional object center intersects the zc axis at point D, and the center of the virtual lens array coincides with point D. The positional relationship between the plane of the virtual elemental image array and the plane of the virtual lens array is the same as that between the elemental image array plane and the lens array plane in the physical reproduction system.
The interactive instructions in step 104 include a rotation instruction, a move instruction, a zoom instruction, and a display fine-tuning instruction. The rotation instruction is realized by monitoring dragging of the left mouse button and resetting the world coordinates of point Or in the integral light field viewing model. The move instruction is realized by monitoring dragging of the right mouse button and resetting the world coordinates of point Lookat and point Or. The zoom instruction is realized by monitoring dragging of the mouse wheel and resetting, in a fixed proportion, the distance between the plane of the virtual elemental image array and the plane of the virtual lens array, the distance between the plane of the virtual lens array and the plane through the three-dimensional object center, and the pixel size of the LCD display. The display fine-tuning instruction is realized by individually modifying, via the keyboard, the distance between the plane of the virtual elemental image array and the plane of the virtual lens array, the distance between the plane of the virtual lens array and the plane through the three-dimensional object center, the distance between the plane through the three-dimensional object center and the ray origin plane, and the pixel size of the LCD display.
The ray-generation step shared by step 105 and step 106 specifically includes:
Allocate one thread to each pixel.
According to the thread of each pixel and the attribute values of the integral light field viewing model, compute the launch-point coordinates and direction vector of the ray corresponding to the current pixel.
Generate the ray corresponding to each pixel from its launch-point coordinates and direction vector.
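The launch-point computation can be sketched from the geometry given in embodiment two: the chief ray lies on the line through pixel Aij and its lens center Dmn, and the launch point is where that line meets the ray origin plane. The function below is a hypothetical single-pixel version (the patent runs one thread per pixel); its coordinate convention, placing the ray origin plane at z = 0 and the EIA plane at z = g + h + p, is an assumption.

```python
import numpy as np

def ray_for_pixel(a_ij_xy, d_mn_xy, g, h, p):
    """Launch point and direction (viewing-model coordinates) for one pixel.

    a_ij_xy : (x, y) of pixel A_ij on the EIA plane (assumed z = g + h + p)
    d_mn_xy : (x, y) of the matching lens center D_mn (assumed z = h + p)
    g, h, p : EIA<->lens, lens<->object-center, object-center<->ROP distances
    """
    a = np.array([a_ij_xy[0], a_ij_xy[1], g + h + p], dtype=float)
    d = np.array([d_mn_xy[0], d_mn_xy[1], h + p], dtype=float)
    t = a[2] / (a[2] - d[2])       # parameter where the line reaches z = 0
    origin = a + t * (d - a)       # launch point on the ray origin plane
    direction = a - origin         # pointing back through lens and pixel
    direction /= np.linalg.norm(direction)
    return origin, direction
```

For the pixel directly behind its lens center, the launch point is on the axis and the direction is along zc, which matches the intended initial display geometry.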
Step 107 specifically includes:
Step S1: using the BVH acceleration structure and the open-source ray-tracing engine OptiX, compute the radiance value of each ray in parallel and store it in the data structure EIABuffer of the integral light field viewing model.
Step S2: once the radiance values of the rays corresponding to all pixels of the elemental image array have been computed, the data in the data structure EIABuffer constitute one frame of the elemental image array; copy all data in the data structure EIABuffer into the free OpenGL buffer of the integral light field viewing model and refresh the data structure EIABuffer.
Step S3: repeat steps S1 to S2 until the elemental image arrays of all frames have been generated.
Step 108 specifically includes:
Using the double-buffering mechanism of the open-source OpenGL graphics library, draw each frame of the elemental image array on the LCD display and show it.
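The S1–S3 loop plus the display step amounts to a double-buffered frame loop. The sketch below models the swap with two NumPy arrays rather than real OpenGL buffers, and takes a `render_frame` callable as a stand-in for the parallel OptiX pass; both names are assumptions for illustration.

```python
import numpy as np

def render_loop(render_frame, shape, n_frames):
    """Double-buffered frame loop: render into the back buffer, swap, repeat.

    render_frame(buffer) fills `buffer` with one EIA frame in place
    (stand-in for the parallel ray-radiance pass of step S1).
    """
    front = np.zeros(shape, dtype=np.float32)
    back = np.zeros(shape, dtype=np.float32)
    displayed = []
    for _ in range(n_frames):
        render_frame(back)             # steps S1/S2: fill EIABuffer
        front, back = back, front      # hand the finished frame to the display
        displayed.append(front.copy()) # step 108: draw the front buffer
        back[:] = 0                    # refresh EIABuffer for the next frame
    return displayed
```

The point of the swap is that the display always reads a complete frame while the next one is being computed, which is what the patent's free-buffer copy in step S2 achieves.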
Embodiment two
The integral imaging generation method provided by this embodiment mainly comprises an input module, a preprocessing module, an interactive module, a rendering module, and a display module.
One, input module
(1) Read the system parameter files using an open-source function library. The system parameter files include an xml file and a csv file.
The csv file contains the position data of all lens centers in the lens array.
The xml file contains the following key data:
pixelSize: the pixel size of the LCD display (mm);
lens_EIA_dist: the distance between the EIA and the lens array (mm);
lenf: the focal length of the lenses (mm);
LCDw: the width of the LCD display (mm);
VLCDw: the width of the EIA in virtual space;
film_horRes: the horizontal resolution of the EIA;
film_verRes: the vertical resolution of the EIA;
lensCount: the number of lenses in the lens array;
parallax: the number of pixels of each elemental image;
lensCenter_csv: the filename of the lens-center position data file;
position data of the lens optical centers: the position, within the elemental image array, of the pixel onto which each lens optical center projects vertically.
After the two files are read, all data are stored in a parameter class ConfigXML.
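Reading the two files into a ConfigXML-like structure can be sketched with the standard library. The tag names follow the key list above; the csv layout (one "x,y" lens-center pair per line) and the use of a plain dict instead of the patent's class are assumptions.

```python
import csv
import io
import xml.etree.ElementTree as ET

def parse_config(xml_text, csv_text):
    """Parse the system parameter files into a dict (stand-in for ConfigXML)."""
    root = ET.fromstring(xml_text)
    cfg = {child.tag: child.text.strip() for child in root}
    # coerce numeric fields named in the xml key list
    for key in ("pixelSize", "lens_EIA_dist", "lenf", "LCDw", "VLCDw"):
        if key in cfg:
            cfg[key] = float(cfg[key])
    for key in ("film_horRes", "film_verRes", "lensCount", "parallax"):
        if key in cfg:
            cfg[key] = int(cfg[key])
    # lens centers: one "x,y" pair per csv line (assumed layout)
    cfg["lensCenters"] = [tuple(map(float, row))
                          for row in csv.reader(io.StringIO(csv_text)) if row]
    return cfg
```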
(2) Load the three-dimensional model of the virtual scene and the virtual-scene texture files using an open-source function library. The three-dimensional model file of the virtual scene may be a ply, obj, or txt file; the virtual-scene texture file may be a ppm, hdr, or jpg file. These data are stored in the model data structure MeshBuffer.
Two, preprocessing module
(1) According to the data in the model data structure MeshBuffer, build the BVH acceleration structure (GeometryGroup class) with an open-source function library, for use in the subsequent ray tracing.
(2) According to the data in the parameter class ConfigXML, establish the integral light field viewing model, whose basic structure is shown in Figure 2.
The integral light field viewing model comprises the plane EIAP (the plane of the virtual EIA), the plane LAP (the plane of the virtual lens array), the plane CDP (the plane through the three-dimensional object center), and the plane ROP (the ray origin plane). Using a right-handed Cartesian convention, establish the world coordinate system with axes xw, yw, zw, and the viewing-model coordinate system (the "inner space" for short) with axes xc, yc, zc. Point Lookat is the origin of the inner space; by default the volume center of the three-dimensional virtual object is taken as point Lookat, so that the generated EIA can provide a basic initial display effect for the integral imaging physical reproduction system. Point Or (the origin of the world coordinate system) is at a distance of p unit lengths from point Lookat (all values in the inner space use the same unit length); the vector from point Or to point Lookat is the zc axis of the inner space, and vector up is the up vector of the inner space, used to construct its orthonormal basis. The planes EIAP, LAP, CDP and ROP are all perpendicular to the zc axis of the inner space; the distances between them are denoted g, h and p respectively. Plane ROP intersects the zc axis at point Or, and plane CDP intersects it at point D. To guarantee a basic initial display effect and simplify calculation, the center of the lens nearest the middle of the virtual lens array (the center of the rectangle) is usually placed on point D. The positional relationship between the virtual EIA and the virtual lens array is the same as that between the EIA and the lens array in the integral imaging physical reproduction system. The inner-space coordinates of pixel Aij in the virtual EIA are determined by its pixel position (row i, column j) and the virtual pixel size pixelSize. Ray Rayij represents the ray corresponding to pixel Aij (in the present invention, "ray" by default denotes the chief ray emitted from plane ROP, as distinct from a shadow ray). The intersection of the line through the center of pixel Aij and the corresponding lens center Dmn with plane ROP is the launch point of ray Rayij; therefore the virtual scene is visible on the left side of plane ROP and invisible on its right side, so the initial value of p is generally larger than the size of the three-dimensional object. Point closestHit is the nearest intersection of ray Rayij with the three-dimensional model of the virtual scene; line segment Rays represents the shadow ray of the ray-tracing technique, and Light represents the point light source in the illumination model of the virtual three-dimensional scene (a point light source plus ambient light).
The integral light field viewing model is used to generate the ray Rayij of each pixel. Ray tracing then computes the nearest intersection closestHit of ray Rayij with the three-dimensional model of the virtual scene, and the radiance value of ray Rayij is computed from the illumination (point light source Light plus ambient light), material, texture and other virtual-scene information. That value is assigned to pixel Aij, and the EIA is finally obtained.
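The per-hit radiance computation can be illustrated by a minimal point-light shading step with a shadow-ray visibility test. This is a generic Lambertian sketch, not the patent's shading code: the function names and the `occluded` callback signature are assumptions, and only the "point light plus ambient light plus shadow ray" structure comes from the text.

```python
import numpy as np

def shade(hit_point, normal, light_pos, light_color, ambient, albedo, occluded):
    """Radiance at a hit point: ambient term plus a point-light diffuse term.

    `occluded(origin, direction, max_dist)` is the shadow-ray query: it returns
    True when geometry blocks the path from the hit point to the light.
    """
    to_light = light_pos - hit_point
    dist = np.linalg.norm(to_light)
    to_light = to_light / dist
    radiance = albedo * ambient                       # ambient contribution
    if not occluded(hit_point + 1e-4 * normal, to_light, dist):
        ndotl = max(float(np.dot(normal, to_light)), 0.0)  # Lambert cosine
        radiance = radiance + albedo * light_color * ndotl
    return radiance
```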
The integral light field viewing model is implemented by the ILFR class. Its basic attributes include the EIABuffer unit (which stores the color value of each EIA pixel), the world-coordinate data of point Lookat, point Or and vector up, and the values g, h, p and pixelSize; read-write interfaces are provided for these attributes. The EIABuffer unit is a two-dimensional structure of the same size as the EIA, whose elements are a user-defined data class RGBColor, each storing the color value of one pixel.
Three, interactive module
Keyboard and mouse events, such as left-button dragging, right-button dragging, wheel dragging and key presses, are monitored using the open-source library OpenGL. If no event is received, the EIA continues to be rendered according to the current attribute values of the integral light field viewing model; otherwise, the attributes of the ILFR class are modified through the relevant interface and the EIA is then rendered. There are four kinds of interaction: rotation, movement, zooming, and display fine-tuning.
(1) Rotation: realized by monitoring dragging of the left mouse button and resetting the world coordinates of point Or in the integral light field viewing model.
(2) Movement: realized by monitoring dragging of the right mouse button and resetting the world coordinates of point Lookat and point Or.
(3) Zooming: realized by monitoring dragging of the mouse wheel and resetting the values g, h and pixelSize in a fixed proportion (0.5 to 3 times).
(4) Display fine-tuning: the values g, h, p and pixelSize are individually modified via the keyboard to debug the display effect.
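The four handlers reduce to attribute updates on the viewing model. In this sketch, only the 0.5x–3x zoom range comes from the text; the class name, method names and signatures are assumptions, and rotation/movement are omitted since they only reset world coordinates.

```python
class InteractionState:
    """Sketch of the interactive module's attribute updates (g, h, p, pixelSize)."""

    def __init__(self, g, h, p, pixel_size):
        self.g, self.h, self.p, self.pixel_size = g, h, p, pixel_size

    def zoom(self, factor):
        # wheel drag: rescale g, h and pixelSize together, clamped to 0.5x-3x
        factor = min(max(factor, 0.5), 3.0)
        self.g *= factor
        self.h *= factor
        self.pixel_size *= factor

    def fine_tune(self, name, delta):
        # keyboard: adjust one of g, h, p, pixel_size individually
        setattr(self, name, getattr(self, name) + delta)
```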
Four, rendering module
(1) Ray generation. This process is completed by the integral light field viewing model, which allocates one thread to each pixel for parallel computation; each thread computes the ray equation of the ray corresponding to its pixel according to the attribute values of the model.
In the present invention, the coordinates of vectors and points are all expressed as column vectors; "*" denotes the vector inner product, "^" denotes the vector cross product, and "normalize(m)" denotes the unit vector of vector m (m must not be a null vector).
In the present invention, Rij is modeled as a ray whose parameters include the launch point Oij and the unit direction vector Directionij; a point P on the ray satisfies the following ray equation:
P = Oij + t · Directionij, t ≥ 0 (1)
The value of the real number t determines the position of point P. Ray generation in the integral light field viewing model is shown in Figure 3, where Oij and Directionij are the launch point and direction of ray Rij, and D1mn and D2mn are the vertical projections of lens center Dmn onto planes EIAP and ROP, sharing its x and y inner-space coordinates. The inner-space coordinates of points Oij, Aij, D1mn and D2mn satisfy the following relationship:
Oij = D2mn + ((h + p) / g) · (D1mn − Aij) (2)
The inner-space coordinates of the launch point Oij are computed by formula (2); its world coordinates originw and the unit direction vector directionw of the ray are then obtained by a change of coordinate system:
originw = Or + ox·U + oy·V + oz·W, directionw = normalize(dx·U + dy·V + dz·W) (3)
where (ox, oy, oz) and (dx, dy, dz) are the inner-space coordinates of the launch point and of the ray direction. The vector group {U, V, W} is an orthonormal basis of the inner-space coordinate system, computed from point Lookat, point Or and vector up:
W = normalize(Lookat − Or), U = normalize(up ^ W), V = W ^ U (4)
In the above formulas, the up vector up of the integral light field viewing model must not be parallel to the line through point Lookat and point Or (the inner-space z axis); otherwise the cross product U of vector up and vector W would be a null vector and could not serve as part of the orthonormal basis of the inner space.
Once the launch-point coordinates and direction vector of a ray have been obtained by the above calculation, its ray equation is available for the subsequent radiance computation.
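The basis construction and the change of coordinates can be sketched directly. This is a reconstruction from the surrounding text (W along the Or-to-Lookat axis, U from the cross product of up and W), so treat the exact sign conventions as assumptions.

```python
import numpy as np

def inner_basis(lookat, origin_w, up):
    """Orthonormal basis {U, V, W} of the inner space (cf. formula (4))."""
    w = lookat - origin_w
    w = w / np.linalg.norm(w)        # z_c axis: from point Or toward point Lookat
    u = np.cross(up, w)
    n = np.linalg.norm(u)
    if n == 0:
        raise ValueError("up must not be parallel to the inner-space z axis")
    u = u / n
    v = np.cross(w, u)               # completes the right-handed basis
    return u, v, w

def to_world(point_c, origin_w, basis):
    """Inner-space coordinates -> world coordinates (cf. formula (3))."""
    u, v, w = basis
    return origin_w + point_c[0] * u + point_c[1] * v + point_c[2] * w
```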
(2) Ray rendering with ray tracing. This is realized with the open-source ray-tracing engine OptiX and the virtual-scene acceleration structure: the radiance value of each ray (expressed as an RGB color value) is computed in parallel and stored in the data structure EIABuffer of the integral light field viewing model. Once every pixel of the EIA has been computed, one frame of EIA data is obtained, and the data structure EIABuffer is handed to the display module for display. The data structure EIABuffer is then refreshed, and the interactive module is cycled to compute the next EIA frame.
Five, display module
One frame of EIA data is drawn on the LCD display using the double-buffering mechanism of the open-source OpenGL graphics library.
The present invention comprises an input module, a preprocessing module, an interactive module, a rendering module, and a display module, similar to the five modules of the ISPP optimization algorithm (a fairly typical algorithm among prior algorithms). In contrast to dynamic models whose data change continuously (animation data), for static models the real-time performance of the display is not affected by the computational efficiency of the input and preprocessing modules, but depends on the computational efficiency of the other three modules (the interactive, rendering and display modules of the present invention, or the computing, rendering and display modules of the ISPP optimization algorithm). Therefore, the present invention constructs the virtual-scene acceleration structure in the preprocessing module to improve the efficiency of ray-traced rendering in the rendering module, and meanwhile constructs the integral light field viewing model (ILFVM) to replace the virtual camera array of prior algorithms. This eliminates the ISPP steps of generating the virtual camera array and computing the virtual-camera transition matrices, simplifies the algorithm, offers high flexibility, and improves the real-time performance of the interactive functions.
Embodiment three
As shown in Figure 4, this embodiment provides an integral imaging generation system based on ray tracing and capable of real-time interaction, comprising:
An initialization module 100, configured to read the system parameter files and to load the three-dimensional model of the virtual scene and the virtual-scene texture files using open-source function libraries; the system parameter files are stored in the parameter class ConfigXML, and the three-dimensional model and texture files of the virtual scene are stored in the model data structure MeshBuffer.
A BVH acceleration structure building module 200, configured to build the BVH acceleration structure according to the data in the model data structure MeshBuffer.
An integral light field viewing model building module 300, configured to establish the integral light field viewing model according to the data in the parameter class ConfigXML. The integral light field viewing model comprises, in order, the plane of the virtual elemental image array, the plane of the virtual lens array, the plane through the three-dimensional object center, and the ray origin plane. The model is implemented by the ILFR class; its attribute values include the data class EIABuffer, the world-coordinate data, the plane-distance data, the pixel size of the LCD display, and the position data of the lens optical centers. The data class EIABuffer is a two-dimensional structure of the same size as the elemental image array, used to store the color value of each pixel of the elemental image array. The world-coordinate data consist of the world coordinates of point Lookat, point Or and vector up: point Lookat is the origin of the viewing-model coordinate system, point Or is the origin of the world coordinate system, and vector up is the up vector of the viewing-model coordinate system. The plane-distance data comprise the distance between the plane of the virtual elemental image array and the plane of the virtual lens array, the distance between the plane of the virtual lens array and the plane through the three-dimensional object center, and the distance between the plane through the three-dimensional object center and the ray origin plane. The position data of a lens optical center refers to the position, within the elemental image array, of the pixel onto which the lens optical center projects vertically.
A first judgment result obtaining module 400, configured to judge whether an interactive instruction has been received and obtain a first judgment result; the interactive instructions include keyboard interactive instructions and mouse interactive instructions.
A first ray generation module 500, configured to, when the first judgment result indicates that an interactive instruction has been received, modify the attribute values of the integral light field viewing model according to the interactive instruction and generate the ray corresponding to each pixel according to the modified attribute values.
A second ray generation module 600, configured to, when the first judgment result indicates that no interactive instruction has been received, generate the ray corresponding to each pixel according to the attribute values of the integral light field viewing model.
An elemental image array generation module 700, configured to render all rays in parallel using the BVH acceleration structure and ray tracing, generating the elemental image array.
A display module 800, configured to draw the elemental image array on the display screen using a double-buffering mechanism and display it.
The system parameter files include an xml file and a csv file. The csv file contains the position data of all lens centers in the lens array; the xml file contains the pixel size of the LCD display, the distance between the elemental image array and the lens array, the focal length of the lenses, the width of the LCD display, the width of the elemental image array in virtual space, the horizontal resolution of the elemental image array, the vertical resolution of the elemental image array, the number of lenses in the lens array, the number of pixels of each elemental image, and the filename of the lens-center position data file.
The format of the three-dimensional model file of the virtual scene is ply, obj or txt; the format of the virtual-scene texture file is ppm, hdr or jpg.
The integral light field viewing model building module 300 specifically includes:
A world coordinate system establishing unit, configured to establish, using a right-handed Cartesian convention, the world coordinate system with axes xw, yw, zw; point Or is its origin.
A viewing-model coordinate system establishing unit, configured to set point Lookat as the origin of the viewing-model coordinate system and take the vector from point Or to point Lookat as its zc axis, establishing the viewing-model coordinate system with axes xc, yc, zc. Point Lookat is the volume center of the three-dimensional virtual object; vector up is the up vector of the viewing-model coordinate system and is used to construct its orthonormal basis.
An integral light field viewing model establishing unit, configured to establish the integral light field viewing model from the plane of the virtual elemental image array, the plane of the virtual lens array, the plane through the three-dimensional object center, the ray origin plane, and the viewing-model coordinate system. These four planes are all perpendicular to the zc axis of the viewing-model coordinate system; the ray origin plane intersects the zc axis at point Or, the plane through the three-dimensional object center intersects the zc axis at point D, and the center of the virtual lens array coincides with point D. The positional relationship between the plane of the virtual elemental image array and the plane of the virtual lens array is the same as that between the elemental image array plane and the lens array plane in the physical reproduction system.
The interactive instructions include a rotation instruction, a move instruction, a zoom instruction, and a display fine-tuning instruction.
The rotation instruction is realized by monitoring dragging of the left mouse button and resetting the world coordinates of point Or in the integral light field viewing model.
The move instruction is realized by monitoring dragging of the right mouse button and resetting the world coordinates of point Lookat and point Or.
The zoom instruction is realized by monitoring dragging of the mouse wheel and resetting, in a fixed proportion, the distance between the plane of the virtual elemental image array and the plane of the virtual lens array, the distance between the plane of the virtual lens array and the plane through the three-dimensional object center, and the pixel size of the LCD display.
The display fine-tuning instruction is realized by individually modifying, via the keyboard, the distance between the plane of the virtual elemental image array and the plane of the virtual lens array, the distance between the plane of the virtual lens array and the plane through the three-dimensional object center, the distance between the plane through the three-dimensional object center and the ray origin plane, and the pixel size of the LCD display.
The first ray generation module 500 specifically includes:
A first allocation unit, configured to allocate one thread to each pixel.
An attribute value modification unit, configured to modify the attribute values of the integral light field viewing model according to the interactive instruction.
A first launch-point coordinate and direction vector computing unit, configured to compute the launch-point coordinates and direction vector of the ray corresponding to the current pixel according to the thread of each pixel and the modified attribute values of the integral light field viewing model.
A first ray generation unit, configured to generate the ray corresponding to each pixel from its launch-point coordinates and direction vector.
The second ray generation module 600 specifically includes:
A second allocation unit, configured to allocate one thread to each pixel.
A second launch-point coordinate and direction vector computing unit, configured to compute the launch-point coordinates and direction vector of the ray corresponding to the current pixel according to the thread of each pixel and the attribute values of the integral light field viewing model.
A second ray generation unit, configured to generate the ray corresponding to each pixel from its launch-point coordinates and direction vector.
The elemental image array generation module 700 performs the following steps:
Step S1: using the BVH acceleration structure and the open-source ray-tracing engine OptiX, compute the radiance value of each ray in parallel and store it in the data structure EIABuffer of the integral light field viewing model.
Step S2: once the radiance values of the rays corresponding to all pixels of the elemental image array have been computed, the data in the data structure EIABuffer constitute one frame of the elemental image array; copy all data in the data structure EIABuffer into the free OpenGL buffer of the integral light field viewing model and refresh the data structure EIABuffer.
Step S3: repeat steps S1 to S2 until the elemental image arrays of all frames have been generated.
The display module 800 specifically includes:
Using the double-buffering mechanism of the open-source OpenGL graphics library, draw each frame of the elemental image array on the LCD display and show it.
The embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the others, and the identical or similar parts of the embodiments may be referred to each other. Since the system disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively brief; refer to the method description for the relevant details.
Specific examples are used herein to explain the principle and implementation of the present invention; the above embodiments are merely intended to help understand the method of the present invention and its core concept. Meanwhile, those skilled in the art may, according to the concept of the present invention, make changes to the specific implementation and application scope. In conclusion, the content of this specification shall not be construed as limiting the present invention.
Claims (10)
1. a kind of integration imaging generation method that can be interacted based on ray tracing and in real time, which is characterized in that the integration imaging
Generation method includes:
Systems parameters document is read using the function library of open source, loads virtual scene threedimensional model and void using the function library of open source
Quasi- scene texture file;Wherein, in the systems parameters document deposit parameter class ConfigXML, the virtual scene three-dimensional mould
In type and virtual scene texture file deposit model data structures MeshBuffer;
Establishing a bounding volume hierarchy (BVH) acceleration structure according to the data in the model data structure MeshBuffer;
Establishing an integrated light field viewing model according to the data in the parameter class ConfigXML; the integrated light field viewing model comprises, in order, the plane of the virtual elemental image array, the plane of the virtual lens array, the plane through the three-dimensional object centre and the ray emission plane; the integrated light field viewing model is implemented by an ILFR class, and its attribute values comprise a data class EIABuffer, world coordinate data, plane distance data, the pixel size of the LCD display and the position data of the lens optical centres; the data class EIABuffer is a two-dimensional structure of the same size as the elemental image array, used to store the colour value of each pixel of the elemental image array; the world coordinate data consist of the world coordinates of point Lookat, point Or and vector up, where point Lookat is the origin of the viewing-model coordinate system, point Or is the origin of the world coordinate system, and vector up is the top vector of the viewing-model coordinate system; the plane distance data comprise the distance between the virtual elemental image array plane and the virtual lens array plane, the distance between the virtual lens array plane and the object-centre plane, and the distance between the object-centre plane and the ray emission plane; the lens optical-centre position data refer to the position, within the elemental image array, of the pixel onto which each lens optical centre projects perpendicularly;
Judging whether an interactive instruction is received, to obtain a first judgment result; the interactive instruction comprises keyboard interactive instructions and mouse interactive instructions;
If the first judgment result indicates that an interactive instruction is received, modifying the attribute values of the integrated light field viewing model according to the interactive instruction, and generating the ray corresponding to each pixel according to the modified attribute values;
If the first judgment result indicates that no interactive instruction is received, generating the ray corresponding to each pixel according to the attribute values of the integrated light field viewing model;
Rendering all rays in parallel using the BVH acceleration structure and ray tracing technology, to generate the elemental image array;
Drawing and displaying the elemental image array on a display screen using a double-buffer mechanism.
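The attribute values that claim 1 assigns to the viewing model can be pictured as a plain data structure. The sketch below is only illustrative — the struct and field names are assumptions, not the patent's actual ILFR class — but it groups the same items the claim lists: the EIABuffer colour store, the three world-coordinate anchors, the three plane distances and the LCD pixel size.

```cpp
#include <cstddef>
#include <vector>

// Illustrative 3-component vector; not part of the patent text.
struct Vec3 { float x, y, z; };

// Claim 1: a two-dimensional structure of the same size as the elemental
// image array, storing one colour value per pixel.
struct EIABuffer {
    std::size_t width, height;          // resolution in pixels
    std::vector<Vec3> color;            // one colour value per pixel
    EIABuffer(std::size_t w, std::size_t h)
        : width(w), height(h), color(w * h) {}
};

// Hypothetical grouping of the viewing-model attribute values of claim 1.
struct ViewingModel {
    EIABuffer eia;          // elemental image array colours
    Vec3 lookAt;            // origin of the viewing-model coordinate system
    Vec3 origin;            // point Or, origin of the world coordinate system
    Vec3 up;                // top vector of the viewing-model coordinate system
    float distEiaToLens;    // EIA plane -> lens-array plane
    float distLensToCenter; // lens-array plane -> object-centre plane
    float distCenterToEmit; // object-centre plane -> ray emission plane
    float pixelSize;        // LCD pixel pitch
};
```

The per-lens optical-centre pixel positions (the last attribute the claim names) would sit alongside this struct as a separate table, one entry per lens.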
2. The ray-tracing-based real-time interactive integral imaging generation method according to claim 1, characterized in that the system parameter files comprise an xml file and a csv file; wherein the csv file contains the positions of all lens centres in the lens array, and the xml file contains the pixel size of the LCD display, the distance between the elemental image array and the lens array, the focal length of the lenses, the width of the LCD display, the width of the elemental image array in virtual space, the horizontal resolution of the elemental image array, the vertical resolution of the elemental image array, the number of lenses in the lens array, the number of pixels in each elemental image, and the filename of the lens-centre position data file;
the files of the virtual scene three-dimensional model are ply, obj and txt files, and the virtual scene texture files are ppm, hdr and jpg files.
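Claim 2 only names what the csv file carries (all lens-centre positions), not its exact layout. A minimal loader, under the assumption of one comma-separated `x,y,z` triple per line, could look like this; `LensCenter` and `parseLensCsv` are hypothetical names, not the patent's API.

```cpp
#include <sstream>
#include <string>
#include <vector>

// One lens optical-centre position from the csv file of claim 2.
struct LensCenter { float x, y, z; };

// Parse csv text of the assumed form "x,y,z" per line into lens centres.
std::vector<LensCenter> parseLensCsv(const std::string& text) {
    std::vector<LensCenter> centers;
    std::istringstream lines(text);
    std::string line;
    while (std::getline(lines, line)) {
        if (line.empty()) continue;         // skip blank lines
        std::istringstream fields(line);
        LensCenter c{};
        char comma;
        if (fields >> c.x >> comma >> c.y >> comma >> c.z)
            centers.push_back(c);           // keep only well-formed rows
    }
    return centers;
}
```

The xml parameters (pixel size, gap, focal length, resolutions, lens count, csv filename) would be read into the ConfigXML parameter class in the same spirit.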
3. The ray-tracing-based real-time interactive integral imaging generation method according to claim 2, characterized in that establishing the integrated light field viewing model according to the data in the parameter class ConfigXML specifically comprises:
establishing, with a right-handed Cartesian convention, a world coordinate system with axes xw, yw, zw, wherein point Or is the origin of the world coordinate system;
setting point Lookat as the origin of the viewing-model coordinate system, setting the vector from point Or to point Lookat as the zc axis of the viewing-model coordinate system, and establishing the viewing-model coordinate system with axes xc, yc, zc; wherein point Lookat is the volume centre of the three-dimensional virtual object, and vector up is the top vector of the viewing-model coordinate system, used to construct the orthonormal basis of the viewing-model coordinate system;
establishing the integrated light field viewing model according to the elemental image array plane, the virtual lens array plane, the object-centre plane, the ray emission plane and the viewing-model coordinate system; wherein the elemental image array plane, the virtual lens array plane, the object-centre plane and the ray emission plane are all perpendicular to the zc axis of the viewing-model coordinate system; the ray emission plane intersects the zc axis at point Or; the object-centre plane intersects the zc axis at point D, and the centre of the virtual lens array coincides with point D; the positional relationship between the elemental image array plane and the virtual lens array plane is the same as the positional relationship between the elemental image array plane and the lens array plane in the physical reproduction system.
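The basis construction in claim 3 (zc from Or toward Lookat, with up fixing the remaining two axes) can be sketched directly; the cross-product ordering below is one right-handed convention consistent with the claim, though the patent does not spell out the exact sign choices.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 cross(Vec3 a, Vec3 b) { return {a.y*b.z - a.z*b.y,
                                     a.z*b.x - a.x*b.z,
                                     a.x*b.y - a.y*b.x}; }
Vec3 normalize(Vec3 v) {
    float n = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return {v.x / n, v.y / n, v.z / n};
}

// Orthonormal basis of the viewing-model coordinate system of claim 3:
// zc points from Or toward Lookat; up pins down xc and yc.
struct Basis { Vec3 xc, yc, zc; };

Basis makeViewingBasis(Vec3 Or, Vec3 lookAt, Vec3 up) {
    Basis b;
    b.zc = normalize(sub(lookAt, Or));   // zc axis: Or -> Lookat
    b.xc = normalize(cross(up, b.zc));   // xc orthogonal to up and zc
    b.yc = cross(b.zc, b.xc);            // yc completes the right-handed basis
    return b;
}
```

With this basis in hand, the four planes of the viewing model are simply the planes of constant zc at the distances stored in the plane distance data.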
4. The ray-tracing-based real-time interactive integral imaging generation method according to claim 3, characterized in that the interactive instructions comprise a rotation instruction, a translation instruction, a zoom instruction and a display fine-tuning instruction; wherein
the rotation instruction is realized by monitoring left-mouse-button dragging and resetting the world coordinates of point Or in the integrated light field viewing model;
the translation instruction is realized by monitoring right-mouse-button dragging and resetting the world coordinates of point Lookat and point Or in the integrated light field viewing model;
the zoom instruction is realized by monitoring mouse-wheel dragging and rescaling, in a fixed proportion, the distance between the elemental image array plane and the virtual lens array plane, the distance between the virtual lens array plane and the object-centre plane, and the pixel size of the LCD display;
the display fine-tuning instruction is realized by individually modifying, via the keyboard, the distance between the elemental image array plane and the virtual lens array plane, the distance between the virtual lens array plane and the object-centre plane, the distance between the object-centre plane and the ray emission plane, and the pixel size of the LCD display.
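The zoom interaction of claim 4 rescales three quantities by one common factor. A minimal sketch, with illustrative names rather than the patent's actual handlers:

```cpp
// The viewing-model quantities that claim 4's zoom instruction rescales.
struct ViewParams {
    float distEiaToLens;    // EIA plane -> lens-array plane
    float distLensToCenter; // lens-array plane -> object-centre plane
    float pixelSize;        // LCD pixel pitch
};

// Apply the zoom of claim 4: all three quantities scale by the same factor,
// e.g. factor > 1 when the mouse wheel is dragged one way, < 1 the other.
void applyZoom(ViewParams& p, float factor) {
    p.distEiaToLens    *= factor;
    p.distLensToCenter *= factor;
    p.pixelSize        *= factor;
}
```

The display fine-tuning instruction differs only in that each quantity (plus the object-centre-to-emission-plane distance) is adjusted individually from the keyboard instead of jointly.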
5. The ray-tracing-based real-time interactive integral imaging generation method according to claim 1, characterized in that generating the ray corresponding to each pixel specifically comprises:
allocating a thread to each pixel;
calculating, in the thread corresponding to each pixel and from the attribute values of the integrated light field viewing model, the launch-point coordinates and direction vector of the ray corresponding to the current pixel;
generating the ray corresponding to each pixel according to its launch-point coordinates and direction vector.
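Claim 5 leaves the per-pixel launch point and direction to the viewing-model attributes. One plausible construction — an assumption, not the patent's exact formula — is that the ray starts at the pixel's position on the elemental image array plane and passes through the optical centre of the lens that the pixel belongs to (which is why claim 1 stores the optical-centre pixel positions):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, dir; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 normalize(Vec3 v) {
    float n = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return {v.x / n, v.y / n, v.z / n};
}

// Hypothetical per-pixel ray: launch point on the EIA plane, direction
// through the optical centre of the pixel's lens.
Ray makePixelRay(Vec3 pixelPos, Vec3 lensCenter) {
    return { pixelPos, normalize(sub(lensCenter, pixelPos)) };
}
```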
6. The ray-tracing-based real-time interactive integral imaging generation method according to claim 1, characterized in that rendering all rays in parallel using the BVH acceleration structure and ray tracing technology to generate the elemental image array specifically comprises:
step S1: computing the radiance of every ray in parallel with the open-source OptiX engine, using the BVH acceleration structure and ray tracing technology, and storing the results in the data structure EIABuffer of the integrated light field viewing model;
step S2: once the radiance of the ray corresponding to every pixel of the elemental image array has been computed, all the data in the data structure EIABuffer constitute one frame of the elemental image array; copying all the data in EIABuffer to the free OpenGL buffer of the integrated light field viewing model, and refreshing the data structure EIABuffer;
step S3: repeating steps S1 and S2 until the elemental image arrays of all frames have been generated.
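The shape of step S1 can be shown with a CPU stand-in: every pixel's radiance is computed independently and written into the frame buffer (OptiX launches these computations in parallel on the GPU; here they run sequentially for clarity, and `traceRadiance` is a dummy placeholder for the ray-tracing kernel, not the patent's shader).

```cpp
#include <cstddef>
#include <vector>

// Placeholder for the per-ray radiance computation of step S1.
float traceRadiance(std::size_t pixelIndex) {
    return float(pixelIndex) * 0.5f;    // dummy shading result
}

// One frame of step S1/S2: each pixel's ray is traced independently and
// its radiance stored at the pixel's slot in the EIA buffer.
std::vector<float> renderFrame(std::size_t pixelCount) {
    std::vector<float> eiaBuffer(pixelCount);
    for (std::size_t i = 0; i < pixelCount; ++i)  // one launch per pixel
        eiaBuffer[i] = traceRadiance(i);
    return eiaBuffer;                              // one complete frame
}
```

Step S2 then hands the finished frame to OpenGL's free buffer and resets the EIA buffer, and step S3 loops this per frame.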
7. The ray-tracing-based real-time interactive integral imaging generation method according to claim 1, characterized in that drawing and displaying the elemental image array on a display screen using a double-buffer mechanism specifically comprises:
drawing and displaying each frame of the elemental image array on the LCD display using the double-buffer mechanism of the open-source OpenGL graphics library.
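The double-buffer mechanism of claim 7 can be modelled without any graphics API: rendering writes into a back buffer while the front buffer is displayed, and a swap makes the finished frame visible without tearing. In the patent this is OpenGL's double-buffered mode; the plain C++ simulation below only illustrates the mechanism, with illustrative names.

```cpp
#include <array>
#include <vector>

// Minimal model of double buffering: draw into back(), then swap() to show.
class DoubleBuffer {
    std::array<std::vector<float>, 2> buf;
    int front = 0;                        // index of the buffer on screen
public:
    std::vector<float>& back() { return buf[1 - front]; }       // draw target
    const std::vector<float>& shown() const { return buf[front]; } // displayed
    void swap() { front = 1 - front; }    // present the finished frame
};
```

In the real system the swap is a single OpenGL buffer-swap call per frame, so the viewer never sees a half-rendered elemental image array.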
8. A ray-tracing-based real-time interactive integral imaging generation system, characterized in that the integral imaging generation system comprises:
an initialization module, configured to read the system parameter files using an open-source function library and to load the virtual scene three-dimensional model and the virtual scene texture files using an open-source function library; wherein the system parameter files are stored in the parameter class ConfigXML, and the virtual scene three-dimensional model and texture files are stored in the model data structure MeshBuffer;
a BVH acceleration structure module, configured to establish the bounding volume hierarchy (BVH) acceleration structure according to the data in the model data structure MeshBuffer;
an integrated light field viewing-model module, configured to establish the integrated light field viewing model according to the data in the parameter class ConfigXML; the integrated light field viewing model comprises, in order, the plane of the virtual elemental image array, the plane of the virtual lens array, the plane through the three-dimensional object centre and the ray emission plane; the integrated light field viewing model is implemented by an ILFR class, and its attribute values comprise a data class EIABuffer, world coordinate data, plane distance data, the pixel size of the LCD display and the position data of the lens optical centres; the data class EIABuffer is a two-dimensional structure of the same size as the elemental image array, used to store the colour value of each pixel of the elemental image array; the world coordinate data consist of the world coordinates of point Lookat, point Or and vector up, where point Lookat is the origin of the viewing-model coordinate system, point Or is the origin of the world coordinate system, and vector up is the top vector of the viewing-model coordinate system; the plane distance data comprise the distance between the virtual elemental image array plane and the virtual lens array plane, the distance between the virtual lens array plane and the object-centre plane, and the distance between the object-centre plane and the ray emission plane; the lens optical-centre position data refer to the position, within the elemental image array, of the pixel onto which each lens optical centre projects perpendicularly;
a first-judgment module, configured to judge whether an interactive instruction is received and obtain a first judgment result; the interactive instruction comprises keyboard interactive instructions and mouse interactive instructions;
a first ray-generation module, configured to, when the first judgment result indicates that an interactive instruction is received, modify the attribute values of the integrated light field viewing model according to the interactive instruction and generate the ray corresponding to each pixel according to the modified attribute values;
a second ray-generation module, configured to, when the first judgment result indicates that no interactive instruction is received, generate the ray corresponding to each pixel according to the attribute values of the integrated light field viewing model;
an elemental-image-array generation module, configured to render all rays in parallel using the BVH acceleration structure and ray tracing technology, generating the elemental image array;
a display module, configured to draw and display the elemental image array on a display screen using a double-buffer mechanism.
9. The ray-tracing-based real-time interactive integral imaging generation system according to claim 8, characterized in that the integrated light field viewing-model module specifically comprises:
a world-coordinate-system unit, configured to establish, with a right-handed Cartesian convention, a world coordinate system with axes xw, yw, zw, wherein point Or is the origin of the world coordinate system;
a viewing-model coordinate-system unit, configured to set point Lookat as the origin of the viewing-model coordinate system, set the vector from point Or to point Lookat as the zc axis of the viewing-model coordinate system, and establish the viewing-model coordinate system with axes xc, yc, zc; wherein point Lookat is the volume centre of the three-dimensional virtual object, and vector up is the top vector of the viewing-model coordinate system, used to construct the orthonormal basis of the viewing-model coordinate system;
a viewing-model unit, configured to establish the integrated light field viewing model according to the elemental image array plane, the virtual lens array plane, the object-centre plane, the ray emission plane and the viewing-model coordinate system; wherein the elemental image array plane, the virtual lens array plane, the object-centre plane and the ray emission plane are all perpendicular to the zc axis of the viewing-model coordinate system; the ray emission plane intersects the zc axis at point Or; the object-centre plane intersects the zc axis at point D, and the centre of the virtual lens array coincides with point D; the positional relationship between the elemental image array plane and the virtual lens array plane is the same as the positional relationship between the elemental image array plane and the lens array plane in the physical reproduction system.
10. The ray-tracing-based real-time interactive integral imaging generation system according to claim 8, characterized in that the display module specifically comprises:
a display unit, configured to draw and display each frame of the elemental image array on the LCD display using the double-buffer mechanism of the open-source OpenGL graphics library.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910438381.4A CN110276823B (en) | 2019-05-24 | 2019-05-24 | Ray tracing based real-time interactive integrated imaging generation method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110276823A true CN110276823A (en) | 2019-09-24 |
CN110276823B CN110276823B (en) | 2023-04-07 |
Family
ID=67960157
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910438381.4A Active CN110276823B (en) | 2019-05-24 | 2019-05-24 | Ray tracing based real-time interactive integrated imaging generation method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110276823B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100033493A1 (en) * | 2008-08-08 | 2010-02-11 | International Business Machines Corporation | System and Method for Iterative Interactive Ray Tracing in a Multiprocessor Environment |
CN103702099A (en) * | 2013-12-17 | 2014-04-02 | 四川大学 | Ultra-large visual-angle integrated-imaging 3D(Three-Dimensional)displaying method based on head tracking |
CN107563088A (en) * | 2017-09-14 | 2018-01-09 | 北京邮电大学 | A kind of light field display device emulation mode based on Ray Tracing Algorithm |
GB201721702D0 (en) * | 2017-07-13 | 2018-02-07 | Imagination Tech Ltd | Hybrid hierarchy for ray tracing |
CN107924580A (en) * | 2015-09-03 | 2018-04-17 | 西门子保健有限责任公司 | The visualization of surface volume mixing module in medical imaging |
Non-Patent Citations (4)
Title |
---|
JUNFU WANG et al.: "Elemental image array generation based on object front reference" * |
LI, Hua et al.: "Design of a parallelized ray-tracing acceleration structure for dynamic 3D virtual scenes", Journal of Changchun University of Science and Technology (Natural Science Edition) * |
WANG, Ruiling et al.: "Virtual roaming *** based on scene modeling", Computer Applications and Software * |
JIANG, Xiaoyu et al.: "Research progress and optimization methods of integral imaging three-dimensional display ***", Optics & Optoelectronic Technology * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021197370A1 (en) * | 2020-03-31 | 2021-10-07 | 京东方科技集团股份有限公司 | Light field display method and system, storage medium and display panel |
US11825064B2 (en) | 2020-03-31 | 2023-11-21 | Boe Technology Group Co., Ltd. | Light field display method and system, storage medium and display panel |
CN113654458A (en) * | 2021-01-21 | 2021-11-16 | 中国人民解放军陆军装甲兵学院 | Three-dimensional method and system for measuring transverse position error of lens array |
CN113654458B (en) * | 2021-01-21 | 2024-05-28 | 中国人民解放军陆军装甲兵学院 | Transverse position error three-dimensional measurement method and system for lens array |
CN113031262A (en) * | 2021-03-26 | 2021-06-25 | 中国人民解放军陆军装甲兵学院 | Integrated imaging system display end pixel value calculation method and system |
CN113031262B (en) * | 2021-03-26 | 2022-06-07 | 中国人民解放军陆军装甲兵学院 | Integrated imaging system display end pixel value calculation method and system |
CN113240785A (en) * | 2021-04-13 | 2021-08-10 | 西安电子科技大学 | Multi-camera combined rapid ray tracing method, system and application |
CN113240785B (en) * | 2021-04-13 | 2024-03-29 | 西安电子科技大学 | Multi-camera combined rapid ray tracing method, system and application |
Also Published As
Publication number | Publication date |
---|---|
CN110276823B (en) | 2023-04-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110276823A (en) | The integration imaging generation method and system that can be interacted based on ray tracing and in real time | |
US10937223B2 (en) | Multi-view processing unit systems and methods | |
US11704806B2 (en) | Scalable three-dimensional object recognition in a cross reality system | |
CN104427325B (en) | Fast integration image generating method and the naked eye three-dimensional display system with user mutual | |
Zhang et al. | Visibility culling using hierarchical occlusion maps | |
JP4555722B2 (en) | 3D image generator | |
CN110383337A (en) | Variable bit rate coloring | |
US10924727B2 (en) | High-performance light field display simulator | |
JP2009080578A (en) | Multiview-data generating apparatus, method, and program | |
CN110390711A (en) | Computer graphical based on layering ray cast | |
JP4856534B2 (en) | Image generating apparatus, program, and information storage medium | |
CN113593027B (en) | Three-dimensional avionics display control interface device | |
IL299465A (en) | Object recognition neural network for amodal center prediction | |
Chen et al. | Real-time lens based rendering algorithm for super-multiview integral photography without image resampling | |
US9401044B1 (en) | Method for conformal visualization | |
CN115861508A (en) | Image rendering method, device, equipment, storage medium and product | |
JPH09330423A (en) | Three-dimensional shape data transforming device | |
JP2006163547A (en) | Program, system and apparatus for solid image generation | |
CN115841539A (en) | Three-dimensional light field generation method and device based on visual shell | |
CN115830202A (en) | Three-dimensional model rendering method and device | |
Bettio et al. | Scalable rendering of massive triangle meshes on light field displays | |
JP2007081873A (en) | Apparatus and method for image generation, and program | |
WO2023109582A1 (en) | Light ray data processing method and apparatus, device and storage medium | |
US11861785B2 (en) | Generation of tight world space bounding regions | |
Hall et al. | Networked and multimodal 3d modeling of cities for collaborative virtual environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||