CN117315295A - BIM model similarity calculation method, system, equipment and storage medium - Google Patents


Info

Publication number
CN117315295A
CN117315295A (application number CN202311256471.4A)
Authority
CN
China
Prior art keywords
similarity
dimensional
bim model
dimensional texture
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311256471.4A
Other languages
Chinese (zh)
Inventor
陈彬彬
华来珍
程国军
张小军
于乃轩
梅鑫
刘大景
马俊鹏
鲁宸煜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Electronics System Engineering No2 Construction Co ltd
Original Assignee
China Electronics System Engineering No2 Construction Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Electronics System Engineering No2 Construction Co ltd filed Critical China Electronics System Engineering No2 Construction Co ltd
Priority to CN202311256471.4A priority Critical patent/CN117315295A/en
Publication of CN117315295A publication Critical patent/CN117315295A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/13Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/54Extraction of image or video features relating to texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2119/00Details relating to the type or aim of the analysis or the optimisation
    • G06F2119/02Reliability analysis or reliability optimisation; Failure analysis, e.g. worst case scenario performance, failure mode and effects analysis [FMEA]


Abstract

The invention discloses a BIM model similarity calculation method, system, equipment and storage medium. The method comprises the following steps: simulating human vision with a three-dimensional virtual camera, rendering images of the BIM model before and after lightweighting from different positions on a sphere centered on the BIM model, and generating multiple pairs of two-dimensional texture maps; analyzing each pair of two-dimensional texture maps with the structural similarity index to obtain a similarity index map, eliminating the interference of background pixels, and obtaining the similarity index value, i.e. the similarity value, of the BIM model portions before and after lightweighting in each pair of texture maps; and adding each combined object of position angles and similarity value to a result list, then sorting the result list by similarity in ascending order, finally obtaining an ordered list of the comparison similarity of the building structure at all angles together with the corresponding position angles. The method automates the before-and-after comparison of a lightweighted BIM model, reduces manual workload, quantifies the similarity, and improves model accuracy and project implementation efficiency.

Description

BIM model similarity calculation method, system, equipment and storage medium
Technical Field
The invention relates to techniques for comparing a BIM model before and after lightweighting, and in particular to a BIM model similarity calculation method, system, equipment and storage medium.
Background
After a BIM model (building information model) is exported from BIM software to a general three-dimensional format (such as FBX or OBJ), it contains a large number of vertices and triangular faces, many of them redundant, which degrades the model's rendering performance in a three-dimensional engine. The BIM model therefore needs to be lightweighted to remove the redundant points and faces. At present, after lightweighting, the models before and after are compared manually: an operator visually inspects them from multiple angles while orbiting around them in three dimensions to find the points of difference.
Because this comparison must be performed manually and visually from multiple surrounding angles, it is inefficient, and the judgment of similarity is subjective and cannot be quantified.
Disclosure of Invention
The invention aims to: provide a BIM model similarity calculation method for automatic comparison of a BIM model before and after lightweighting, which reduces manual workload, quantifies the similarity, and improves model accuracy and project implementation efficiency.
The technical scheme is as follows: the invention discloses a BIM model similarity calculation method, which comprises the following steps:
simulating human vision with a three-dimensional virtual camera, rendering images of the building information model before and after lightweighting from different positions on a sphere centered on the building information model (BIM), and generating multiple pairs of two-dimensional texture maps; analyzing each pair of two-dimensional texture maps with the structural similarity index SSIM to obtain a similarity index map; on that basis, removing the interference of the three-dimensional scene's background pixels in the texture maps, and obtaining the similarity index value, i.e. the similarity value, of the building information model portions before and after lightweighting in each pair of texture maps;
the multiple pairs of two-dimensional texture maps are generated as follows: with the BIM models before and after lightweighting placed at the center, the three-dimensional virtual camera is placed at positions with horizontal direction angle x and vertical direction angle y on a sphere whose radius is the diagonal length diagonalLength of the bounding cube of the original BIM model, facing the models; camera rendering is performed on the original BIM model and the lightweighted BIM model respectively, yielding an original two-dimensional texture map and a lightweighted two-dimensional texture map; the vertical direction angle y is incremented in a loop from -90 degrees to 90 degrees; for each y value, the horizontal direction angle x is incremented in a loop from -180 degrees to 180 degrees; the loop increment step of x and y is the configured comparison angle alpha; for each x and each y, one pair of two-dimensional texture maps is obtained;
each combined object of x, y and similarity value is added to a result list, and the result list is sorted by similarity in ascending order, finally yielding an ordered list of the comparison similarity at each angle together with its position angle.
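The result-list step above can be sketched in Python; the dictionary shape of the combined object is an illustrative assumption, as the patent does not fix a data structure:

```python
def build_sorted_results(entries):
    """entries: iterable of (x, y, similarity) triples, one per camera
    position on the sphere. Returns combined objects sorted by similarity
    ascending, so the angles where the lightweighted model deviates most
    from the original come first."""
    results = [{"x": x, "y": y, "similarity": s} for x, y, s in entries]
    results.sort(key=lambda r: r["similarity"])
    return results
```

Sorting ascending puts the worst-matching viewpoints at the head of the list, which is where a reviewer would want to look first.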
Further, before rendering the image, the following operations are required:
firstly, an original BIM model and a light BIM model are imported into a three-dimensional scene; setting a contrast angle alpha and setting contrast image resolution;
creating a rendering material which is used as an output object when the camera renders the original BIM model and the light BIM model respectively; two-dimensional texture maps are created and respectively used as carriers of images which are output to rendered materials by a camera;
the diagonal length diagonalLength of the bounding cube of the original BIM model is calculated as diagonalLength = √(x² + y² + z²), where x, y and z are the length, width and height of the original BIM model's bounding cube;
the far clipping plane distance of the camera is set to 2 times diagonalLength, to ensure that the depth of the camera's view frustum fully contains the model when the camera photographs it in the subsequent steps.
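The two quantities above reduce to a few lines of arithmetic; a minimal sketch (function names are illustrative):

```python
import math

def bounding_diagonal(x, y, z):
    """Diagonal of the model's bounding cube: sqrt(x^2 + y^2 + z^2),
    where x, y, z are the cube's length, width and height."""
    return math.sqrt(x * x + y * y + z * z)

def far_clip_plane(x, y, z):
    """Far clipping plane distance: twice the bounding diagonal, so the
    view frustum always contains the whole model from any sphere point."""
    return 2.0 * bounding_diagonal(x, y, z)
```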
Further, the comparison angle alpha is an integer that divides 360 evenly; the width and height of the rendering material match the configured comparison-image resolution, as do the width and height of the two-dimensional texture maps.
Further, for a given horizontal direction angle x and vertical direction angle y, the rotation of the three-dimensional virtual camera is set to Euler angles (y, x, 0), and the position of the camera is set to the camera's rotation multiplied by the vector (0, 0, -diagonalLength). The original BIM model is shown and the lightweighted BIM model hidden; camera rendering is performed with the rendering material as the output, and the pixel information in the rendering material is read into the original two-dimensional texture map. Then the original BIM model is hidden and the lightweighted BIM model shown; camera rendering is performed and output to the rendering material, and the pixel information is read into the lightweighted two-dimensional texture map. A pair of two-dimensional texture maps is thus obtained.
Further, each pair of two-dimensional texture images is analyzed by using a structural similarity index SSIM to obtain a similarity index map, which specifically comprises the following steps:
(1) A first two-dimensional array, the original two-dimensional array img1, is created with width width and height height; the value of each array element is the gray value converted from the RGB color of the pixel at the same position in the original two-dimensional texture map as 0.2989·R + 0.5870·G + 0.1140·B;
(2) A second two-dimensional array, the lightweighted two-dimensional array img2, is created with width width and height height; the value of each array element is the gray value converted from the RGB color of the pixel at the same position in the lightweighted two-dimensional texture map as 0.2989·R + 0.5870·G + 0.1140·B;
(3) Creating a normalized Gaussian window of set size and standard deviation;
(4) For the original two-dimensional array img1, filtering by using a Gaussian window to obtain mu1;
(5) Filtering the light two-dimensional array img2 by using a Gaussian window to obtain mu2;
(6) Multiplying mu1 by mu2 to obtain mu1mu2;
(7) Multiplying mu1 by mu1 to obtain mu1Sq;
(8) Multiplying mu2 by mu2 to obtain mu2Sq;
(9) The two-dimensional array obtained by multiplying img1 by img2 element-wise is filtered with the Gaussian window, and mu1mu2 is subtracted to obtain sigma12;
(10) The two-dimensional array obtained by multiplying img1 by img1 element-wise is filtered with the Gaussian window, and mu1Sq is subtracted to obtain sigma1Sq;
(11) The two-dimensional array obtained by multiplying img2 by img2 element-wise is filtered with the Gaussian window, and mu2Sq is subtracted to obtain sigma2Sq;
(12) A constant C1 with value 0.0001 and a constant C2 with value 0.0009 are defined, and (2·mu1mu2 + C1)·(2·sigma12 + C2) / ((mu1Sq + mu2Sq + C1)·(sigma1Sq + sigma2Sq + C2)) is calculated to obtain the structural similarity index result map SSIM Map.
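The twelve steps above can be sketched with numpy and scipy. The Gaussian window is realized here with scipy.ndimage.gaussian_filter, and the window's standard deviation (sigma=1.5) is an assumption, since the patent leaves the window size and standard deviation unspecified:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def to_gray(rgb):
    """Steps (1)-(2): convert an (H, W, 3) RGB array in [0, 1] to gray
    values as 0.2989*R + 0.5870*G + 0.1140*B."""
    return 0.2989 * rgb[..., 0] + 0.5870 * rgb[..., 1] + 0.1140 * rgb[..., 2]

def ssim_map(img1, img2, sigma=1.5, C1=0.0001, C2=0.0009):
    """Steps (3)-(12): per-pixel structural similarity index map of two
    grayscale images in [0, 1]."""
    img1 = img1.astype(np.float64)
    img2 = img2.astype(np.float64)
    mu1 = gaussian_filter(img1, sigma)                        # step (4)
    mu2 = gaussian_filter(img2, sigma)                        # step (5)
    mu1mu2 = mu1 * mu2                                        # step (6)
    mu1_sq = mu1 * mu1                                        # step (7)
    mu2_sq = mu2 * mu2                                        # step (8)
    sigma12 = gaussian_filter(img1 * img2, sigma) - mu1mu2    # step (9)
    sigma1_sq = gaussian_filter(img1 * img1, sigma) - mu1_sq  # step (10)
    sigma2_sq = gaussian_filter(img2 * img2, sigma) - mu2_sq  # step (11)
    # step (12): elementwise SSIM with stabilizing constants C1, C2
    return ((2 * mu1mu2 + C1) * (2 * sigma12 + C2)) / (
        (mu1_sq + mu2_sq + C1) * (sigma1_sq + sigma2_sq + C2)
    )
```

For identical inputs every factor in the numerator equals its counterpart in the denominator, so the map is 1 everywhere, which matches the intent that unchanged views score maximum similarity.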
Further, after eliminating the interference of background pixels of a three-dimensional space in the two-dimensional texture map, obtaining similarity index values, namely similarity values, of building information model parts before and after light weight in the two-dimensional texture map; the method comprises the following steps:
a double-precision variable total is defined to record the accumulation of valid values in the structural similarity index result map SSIM Map, and a long-integer variable count is defined to record the number of valid values. Each pixel of the original and lightweighted two-dimensional texture maps is traversed in a loop; for a given position (x, y) the two pixels are fetched. If the transparency value A of the RGBA color of both pixels is 0 (i.e. the pixel at this position is the three-dimensional scene background outside the silhouette of the three-dimensional BIM model photographed by the camera), the pixel is not counted as a valid pixel point and its value in the SSIM Map is ignored; otherwise the value at position (x, y) in the SSIM Map is accumulated into total and count is incremented by 1;
total is divided by count to obtain the structural similarity index with background interference removed, i.e. the similarity value, whose value range is [0,1].
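The background-rejection step can be sketched in vectorized form, assuming the alpha channels of the two renders are available as arrays (names are illustrative):

```python
import numpy as np

def masked_similarity(ssim_map_values, alpha1, alpha2):
    """Average the SSIM map only over pixels where at least one of the
    two rendered texture maps is non-transparent; pixels transparent in
    both renders are three-dimensional scene background and are ignored."""
    valid = (alpha1 > 0) | (alpha2 > 0)
    count = int(valid.sum())      # number of valid pixel points
    if count == 0:
        return 0.0                # no model pixels in view
    return float(ssim_map_values[valid].sum() / count)
```

Masking matters because the two renders share an identical background, which would otherwise score a perfect 1 at every background pixel and inflate the average toward similarity.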
Further, the increment range of the vertical direction angle y includes both -90 and 90 degrees; the increment range of the horizontal direction angle x includes -180 degrees but excludes 180 degrees.
Based on the same inventive concept, the BIM model similarity calculation system of the invention comprises:
the two-dimensional texture map generation module is used for simulating artificial vision by using a three-dimensional virtual camera, rendering images of the building information model before and after light weight at different positions on a spherical surface with the building information model BIM as the center, and generating a plurality of pairs of two-dimensional texture maps;
the similarity index calculation module is used for analyzing each pair of two-dimensional texture images by utilizing the structural similarity index SSIM to obtain a similarity index graph; removing interference of background pixels of a three-dimensional space in the two-dimensional texture map on the basis, and obtaining similarity index values, namely similarity values, of building information model parts before and after light weight in the two-dimensional texture map;
the method for generating the multi-pair two-dimensional texture map comprises the following steps: respectively placing the three-dimensional virtual cameras on positions with x horizontal direction angle and y vertical direction angle on the spherical surface of the diagonal length diagonalLength of the boundary cube of the original BIM model by taking the light-weight front and rear BIM models as the center, and respectively facing the positions of the light-weight front and rear BIM models; respectively carrying out camera rendering on the original BIM model and the light BIM model to obtain an original two-dimensional texture map and a light two-dimensional texture map; the vertical direction angle y is circularly increased from-90 degrees to 90 degrees; for each y value, the horizontal direction angle x is circularly increased to 180 degrees from-180 degrees, the circular increasing step length of x and y is set as a comparison angle alpha, and for each x and each y, a pair of two-dimensional texture maps are obtained;
and the list generation module is used for adding each pair of x, y and similarity value combination objects into a result list, sequencing the result list from small to large in similarity, and finally obtaining an ordered list of the comparison similarity of each angle and the position angle thereof.
Based on the same inventive concept, the BIM model similarity calculation device of the invention comprises:
a memory storing executable program code;
a processor coupled to the memory;
the processor invokes the executable program code stored in the memory to perform a BIM model similarity calculation method as described above.
Based on the same inventive concept, a computer readable storage medium of the present invention stores computer instructions for executing a BIM model similarity calculation method as described above when the computer instructions are called.
The beneficial effects are that: compared with the prior art, the invention has the advantages that:
according to the invention, a three-dimensional engine virtual camera is utilized to simulate artificial vision, images are rendered from two models before and after light weight by multiple angles on a spherical surface taking a BIM model as a center, images of the two models rendered at the same position are automatically analyzed and compared by utilizing a structural similarity index SSIM method to obtain a similarity index graph, the interference of background pixels of a three-dimensional space in the images is eliminated on the basis, the similarity index value of a model part in the images is obtained, and further a quantized similarity value is obtained;
compared with the existing manual comparison light-weight difference, the invention has the advantages of high comparison speed, high efficiency and capability of quantifying the similarity; the invention adopts computer graphics technology, uses computer virtual camera to simulate artificial vision, namely, replaces artificial technical means with computer; the method has the advantage that the similarity can be quantified, and is obtained by comparing images shot at the same angle of the models before and after light weight by adopting a structural similarity index SSIM method for optimizing and eliminating the background.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is an image rendering and contrast sub-flowchart;
FIG. 3 is a schematic view of camera position I, where (a) is the original BIM model and (b) is the lightweight BIM model;
FIG. 4 is a schematic view of camera position II, where (a) is the original BIM model and (b) is the lightweight BIM model;
FIG. 5 is a schematic view of camera position III, where (a) is the original BIM model and (b) is the light-weighted BIM model.
Detailed Description
The invention will now be described in detail with reference to the drawings and specific examples.
The invention is a method for comparing the similarity of a three-dimensional building information model before and after lightweighting, based on a three-dimensional virtual camera and the structural similarity index. The method uses computer graphics technology: a three-dimensional virtual camera simulates human vision and renders images of the building information model (BIM) before and after lightweighting from different positions on a sphere centered on the three-dimensional building information model of the building structure. At each position, the two BIM models before and after lightweighting are each rendered, producing one pair of two-dimensional texture maps of the building structure. Each pair is analyzed with the structural similarity index SSIM to obtain a similarity index map of the building structure; on that basis the interference of the three-dimensional scene's background pixels in the texture maps is eliminated, yielding the similarity index value, i.e. the similarity value, of the building information model portion of the texture maps. The list of combined objects of similarity value and angle for all positions of the building structure is sorted by similarity value, finally producing a quantified ordered list of similarity values in ascending order together with their position angles on the sphere centered on the BIM model.
As shown in FIG. 1, the BIM model similarity calculation method of the invention comprises the following steps:
(1) A three-dimensional scene is created in a three-dimensional engine, i.e. a three-dimensional space is created containing a three-dimensional coordinate system, a virtual camera, parallel light sources.
(2) And importing the original BIM model of the building structure and the light BIM model into the three-dimensional scene.
(3) The comparison angle alpha is set; alpha is a divisor of 360 and is the angular interval at which the two-dimensional texture maps of the building structure before and after lightweighting are compared.
(4) The comparison-image resolution is set, e.g. a width of 1024 pixels and a height of 1024 pixels; the comparison images are the pairs of two-dimensional texture maps of the building structure before and after lightweighting.
(5) Creating a rendering material, setting the rendering material as an output object of the camera, wherein the width and the height of the rendering material are consistent with the set contrast image resolution.
(6) Two-dimensional texture maps are created and respectively used as carriers of images after the cameras output to the rendered materials, and the width and the height of the two-dimensional texture maps are consistent with the set contrast image resolution.
(7) The diagonal length of the bounding cube of the original BIM model of the building structure is calculated as diagonalLength = √(x² + y² + z²), where x, y and z are the length, width and height of the original BIM model's bounding cube.
(8) The far clipping plane distance of the camera is set to 2 times diagonalLength, to ensure that the depth of the camera's view frustum can fully contain the original and lightweighted BIM models of the building structure when the camera photographs them in the subsequent steps.
(9) The camera vertical direction angle y is set, with an initial value of -90 degrees.
(10) Judging whether y is smaller than or equal to 90 degrees, if yes, continuing to step (11); if not, go to step (19).
(11) Judging whether y is equal to-90 degrees or 90 degrees, if so, continuing the step (12); if not, go to step (15).
(12) The camera horizontal direction angle x is set to 0 degrees.
(13) Image rendering and contrast sub-flows are performed as shown in fig. 2.
(14) y is incremented by α and the process jumps to step (10).
(15) The camera horizontal direction angle x is set, and the initial value is-180 degrees.
(16) Judging whether x is smaller than 180 degrees, if so, continuing to step (17); if not, go to step (14).
(17) Image rendering and contrast sub-flows are performed as shown in fig. 2.
(18) x is incremented by α and the process proceeds to step (16).
(19) The list of result objects containing the similarity and the angles x and y is sorted by similarity in ascending order, producing the ordered comparison result list of the original and lightweighted BIM models of the building structure.
The purpose of steps (9) to (19) is to select points in a loop, with the comparison angle alpha as step length, on a sphere centered on the origin (the position of the original and lightweighted BIM models of the building structure). The sphere's meridian direction is the vertical direction and its parallel (latitude) direction is the horizontal direction. The vertical direction angle y is incremented in a loop from -90 degrees (inclusive) to 90 degrees (inclusive), with the configured comparison angle alpha as step length; for each y value, the horizontal direction angle x is incremented from -180 degrees (inclusive) to 180 degrees (exclusive), with step length alpha. The first three camera positions during rendering are shown in fig. 3 (a) and (b), fig. 4 (a) and (b), and fig. 5 (a) and (b). For each x and each y, the camera renders images facing the centers of the original and lightweighted BIM models of the building structure respectively, and the structural similarity index is calculated.
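The angle loop of steps (9) to (19) can be sketched as follows; note the special case at the poles (y = ±90), where all horizontal angles coincide and only x = 0 is rendered:

```python
def enumerate_view_angles(alpha):
    """Camera angles per steps (9)-(19): y runs from -90 to 90 degrees in
    steps of alpha; at the poles (y = -90 or 90) only x = 0 is used,
    otherwise x runs from -180 (inclusive) to 180 (exclusive)."""
    angles = []
    y = -90
    while y <= 90:                      # steps (9)-(10)
        if y in (-90, 90):              # step (11)
            angles.append((0, y))       # steps (12)-(13)
        else:
            x = -180                    # step (15)
            while x < 180:              # steps (16)-(17)
                angles.append((x, y))
                x += alpha              # step (18)
        y += alpha                      # step (14)
    return angles
```

With alpha = 30 this yields 62 viewpoints: 12 per latitude band for the five non-polar y values, plus one at each pole.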
As shown in fig. 2, the image rendering and comparing sub-process comprises the following specific steps:
(101) For a given horizontal direction angle x and vertical direction angle y, the rotation of the camera is set to Euler angles (y, x, 0), and the position of the camera is set to the camera's rotation multiplied by the vector (0, 0, -diagonalLength); i.e. the camera is placed at the position with horizontal angle x and vertical angle y on the sphere of radius diagonalLength centered on the origin, facing the origin, which is also where the models are located.
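The placement math can be sketched without an engine, assuming Unity-style conventions (yaw about the world Y axis, then pitch about the local X axis); the function name and the explicit forward-vector formula are illustrative:

```python
import math

def camera_position(x_deg, y_deg, diagonal_length):
    """Position of a camera with Euler angles (y_deg, x_deg, 0) after the
    rotation is applied to the vector (0, 0, -diagonalLength): the camera
    ends up on the sphere of radius diagonalLength, looking at the origin."""
    yaw = math.radians(x_deg)
    pitch = math.radians(y_deg)
    # forward vector of a camera with pitch about X and yaw about Y
    fx = math.cos(pitch) * math.sin(yaw)
    fy = -math.sin(pitch)
    fz = math.cos(pitch) * math.cos(yaw)
    # the camera sits diagonalLength behind its own forward direction
    return (-diagonal_length * fx, -diagonal_length * fy, -diagonal_length * fz)
```

For x = y = 0 this places the camera at (0, 0, -diagonalLength) looking down the +Z axis at the model, and for any angles the camera stays exactly diagonalLength from the origin.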
(102) The original BIM model of the building structure is shown and the lightweighted BIM model hidden; camera rendering is performed with the configured rendering material as the output object, and the pixel information in the rendering material is then read into the original two-dimensional texture map.
(103) The original BIM model of the building structure is hidden and the lightweighted BIM model shown; camera rendering is performed with the configured rendering material as the output object, and the pixel information in the rendering material is then read into the lightweighted two-dimensional texture map.
(104) And calculating the SSIM Map of the structural similarity index result Map of the building structure by utilizing a structural similarity index method according to the obtained original two-dimensional texture Map and the light two-dimensional texture Map.
(105) A double-precision variable total is defined to record the accumulation of valid values in the structural similarity index result map SSIM Map of the building structure, and a long-integer variable count is defined to record the number of valid values. Each pixel of the original and lightweighted two-dimensional texture maps is traversed in a loop; for a given position (x, y) the two pixels are fetched. If the transparency value A of the RGBA color of both pixels is 0, i.e. the pixel at this position is the three-dimensional scene background outside the silhouette of the three-dimensional BIM model photographed by the camera, the pixel is not counted as a valid pixel point and its value in the SSIM Map of the building structure is ignored; otherwise the value at position (x, y) in the SSIM Map is accumulated into total and count is incremented by 1. The concrete steps are as follows:
1) And defining a double-precision type variable total for recording accumulation of effective values in the structural similarity index result graph SSIM Map of the building structure. A long integer variable count is defined for recording the number of significant values.
2) And defining a transverse count i, wherein the initial value of the transverse count i is 0, and the transverse count is used for transversely traversing pixels of the original two-dimensional texture map and the two-dimensional texture map after light weight and is also used for transversely traversing pixels in the structural similarity index result map.
3) Judging whether i is smaller than the pixel width value of the structural similarity index result graph, if so, continuing to step 4); if not, then step (106) is continued.
4) And defining a longitudinal count j, wherein the initial value of the longitudinal count j is 0, and the longitudinal count is used for longitudinally traversing pixels of the original two-dimensional texture map and the lightweight two-dimensional texture map and is also used for longitudinally traversing pixels in the structural similarity index result map.
5) Judging whether j is smaller than the pixel height value of the structural similarity index result graph, if so, continuing to the step 6); if not, go to step 10).
6) The pixels at the (i, j) positions in the two-dimensional texture maps are acquired respectively and are denoted as p1 and p2.
7) It is determined whether at least one of the transparency channel values of the colors of the pixels p1 and p2 is greater than 0. If yes, continuing step 8); if not, go to step 9).
8) And accumulating the value at the (i, j) position in the structural similarity index graph to the total index number total, and adding 1 to the pixel point number count.
9) The vertical count j is incremented by 1 and the process jumps to step 5).
10) The lateral count i is incremented by 1, and the process jumps to step 3).
(106) The total index sum total is divided by the pixel point count count, obtaining the structural similarity index value with transparent background pixels filtered out, i.e. the similarity.
(107) The combined object of the similarity and the current spherical position angles x, y is added to the result list.
Examples:
the present invention may be implemented using three-dimensional Engine tools such as Unity or Unreal Engine. The present invention will be described in detail using Unity as an example.
(1) A three-dimensional scene is set up, containing a camera and parallel light. The parallel light uses default parameters. The near clipping plane (nearClipPlane) distance of the camera is set to 0.001, so that the model image can still be fully rendered when the BIM model of the building structure comes very close to the camera. The camera projection mode is set to perspective and the field of view is set to 60 degrees.
(2) The original BIM model and the light BIM model of the building structure are imported into a three-dimensional scene, and the positions of the two models are set as original points (0, 0 and 0).
(3) The comparison angle alpha is set, such as 30 degrees, namely, the horizontal direction and the vertical direction, and is compared every interval alpha angle.
(4) Calculating the length of the diagonal of the boundary lengths x, y, z of the original BIM model of the building structure,recorded as diagonal length diagonalllength.
(5) The contrast image resolution is set, e.g., 1024 pixels by 1024 pixels, denoted width and height, respectively.
(6) Create a render texture (RenderTexture in Unity) and set it as the output target of the camera; its width and height are set to those of the chosen contrast image resolution, and its depth is set to 24.
(7) Set the camera's far clipping plane (farClipPlane) distance to 2 times diagonalLength.
(8) Create two Texture2D two-dimensional texture maps with identical size and parameters: width and height equal to the contrast image resolution, format RGBA32, i.e. four 8-bit channels R, G, B, A. One is named the original two-dimensional texture map and stores the image data of the original BIM model of the building structure; the other is named the lightweight two-dimensional texture map and stores the image data of the lightweight BIM model.
(9) The purpose of the current step is to cyclically select points, with the comparison angle α as the step, on a sphere centered on the origin (i.e. the model position). The meridian (longitude) direction of the sphere corresponds to the vertical direction, and the parallel (latitude) direction corresponds to the horizontal direction. A vertical direction angle y cycles from -90 degrees (inclusive) to 90 degrees (inclusive) with step α; for each y value, a horizontal direction angle x cycles from -180 degrees (inclusive) to 180 degrees (exclusive) with step α. For each x and y, a camera rendering and a structural similarity index calculation are performed, as follows:
(a) For a given horizontal direction angle x and vertical direction angle y, the camera rotation is set to Euler angle (y, x, 0) and the camera position is set to the camera rotation multiplied by the vector (0, 0, -diagonalLength); i.e., the camera is placed at the position with horizontal direction angle x and vertical direction angle y on the sphere of radius diagonalLength centered on the origin, facing the origin. With these parameter settings the camera fully contains the current model in depth, width and height.
(b) Display the original BIM model of the building structure and hide the lightweight BIM model; perform camera rendering with the output target being the set render texture, then read the pixel information in the render texture into the original two-dimensional texture map.
(c) Hide the original BIM model of the building structure and display the lightweight BIM model; perform camera rendering with the output target being the set render texture, then read the pixel information in the render texture into the lightweight two-dimensional texture map.
(d) Calculate the structural similarity index result map SSIM Map of the building structure from the obtained original and lightweight two-dimensional texture maps, as follows:
1) Create a first two-dimensional array, named the original two-dimensional array img1, with width width and height height; the value of each element is the gray value converted from the RGB color of the pixel at the same position in the original two-dimensional texture map as 0.2989·R + 0.5870·G + 0.1140·B.
2) Create a second two-dimensional array, named the lightweight two-dimensional array img2, with width width and height height; the value of each element is the gray value converted from the RGB color of the pixel at the same position in the lightweight two-dimensional texture map as 0.2989·R + 0.5870·G + 0.1140·B.
3) Create a normalized Gaussian window of size 11 and standard deviation 1.5.
4) Filter the original two-dimensional array with the Gaussian window to obtain mu1.
5) Filter the lightweight two-dimensional array with the Gaussian window to obtain mu2.
6) Mu1 is multiplied by mu2 to obtain mu1mu2.
7) Mu1 is multiplied by mu1 to obtain mu1Sq.
8) Mu2 is multiplied by mu2 to obtain mu2Sq.
9) Filter the two-dimensional array obtained by multiplying img1 by img2 with the Gaussian window, then subtract mu1mu2 to obtain sigma12.
10) Filter the two-dimensional array obtained by multiplying img1 by img1 with the Gaussian window, then subtract mu1Sq to obtain sigma1Sq.
11) Filter the two-dimensional array obtained by multiplying img2 by img2 with the Gaussian window, then subtract mu2Sq to obtain sigma2Sq.
12) Define a constant C1 with value 0.0001 and a constant C2 with value 0.0009.
13) Calculate (2·mu1mu2 + C1)·(2·sigma12 + C2) / ((mu1Sq + mu2Sq + C1)·(sigma1Sq + sigma2Sq + C2)) element-wise to obtain the structural similarity index result map SSIM Map.
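Steps 1)–13) amount to the standard Gaussian-windowed SSIM map. A dependency-free Python sketch (slow, for illustration only): it assumes "valid"-mode filtering, so the result map is smaller than the input by the window size minus one per axis, and it assumes gray values already scaled to [0, 1], since C1 = 0.0001 and C2 = 0.0009 match the usual (0.01)² and (0.03)² choices for a unit dynamic range.

```python
import math

def gaussian_window(size=11, sigma=1.5):
    """Normalized 2D Gaussian window (all weights sum to 1)."""
    half = size // 2
    w = [[math.exp(-((r - half) ** 2 + (c - half) ** 2) / (2.0 * sigma * sigma))
          for c in range(size)] for r in range(size)]
    s = sum(v for row in w for v in row)
    return [[v / s for v in row] for row in w]

def gauss_filter(img, win):
    """'Valid'-mode 2D filtering: output shrinks by len(win) - 1 per axis."""
    h, w, k = len(img), len(img[0]), len(win)
    return [[sum(win[dr][dc] * img[r + dr][c + dc]
                 for dr in range(k) for dc in range(k))
             for c in range(w - k + 1)] for r in range(h - k + 1)]

def _mul(a, b):  # element-wise product of two 2D arrays
    return [[x * y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def _sub(a, b):  # element-wise difference of two 2D arrays
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def ssim_map(img1, img2, c1=0.0001, c2=0.0009):
    """Steps 3)-13): SSIM map of two grayscale images with values in [0, 1]."""
    win = gaussian_window()
    mu1, mu2 = gauss_filter(img1, win), gauss_filter(img2, win)
    mu1mu2, mu1sq, mu2sq = _mul(mu1, mu2), _mul(mu1, mu1), _mul(mu2, mu2)
    sigma12 = _sub(gauss_filter(_mul(img1, img2), win), mu1mu2)
    sigma1sq = _sub(gauss_filter(_mul(img1, img1), win), mu1sq)
    sigma2sq = _sub(gauss_filter(_mul(img2, img2), win), mu2sq)
    return [[(2 * mu1mu2[r][c] + c1) * (2 * sigma12[r][c] + c2) /
             ((mu1sq[r][c] + mu2sq[r][c] + c1) *
              (sigma1sq[r][c] + sigma2sq[r][c] + c2))
             for c in range(len(mu1[0]))] for r in range(len(mu1))]
```

A production implementation would typically delegate the filtering to an image-processing library rather than loop in pure Python; the formula itself is unchanged.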
(e) Define a double-precision variable total for recording the accumulation of effective values in the structural similarity index result map SSIM Map of the building structure, and a long integer variable count for recording the number of effective values.
(f) Define a lateral count i with an initial value of 0, used to traverse the pixels of the original two-dimensional texture map, the lightweight two-dimensional texture map, and the structural similarity index result map in the horizontal direction.
(g) Judge whether i is smaller than the pixel width of the structural similarity index result map; if so, continue with step (h); if not, continue with step (o).
(h) Define a longitudinal count j with an initial value of 0, used to traverse the pixels of the original two-dimensional texture map, the lightweight two-dimensional texture map, and the structural similarity index result map in the vertical direction.
(i) Judge whether j is smaller than the pixel height of the structural similarity index result map; if so, continue with step (j); if not, jump to step (n).
(j) Acquire the pixels at position (i, j) in the original and lightweight two-dimensional texture maps, denoted p1 and p2 respectively.
(k) Judge whether at least one of the transparency (alpha) channel values of pixels p1 and p2 is greater than 0; if so, continue with step (l); if not, go to step (m).
(l) Accumulate the value at position (i, j) in the structural similarity index result map into the index total "total", and add 1 to the pixel count "count".
(m) Increment the longitudinal count j by 1 and jump to step (i).
(n) Increment the lateral count i by 1 and jump to step (g).
(o) Divide the index total "total" by the pixel count "count" to obtain the structural similarity index value with the transparent background pixels filtered out, namely the similarity.
(p) Add the combined object of the similarity and the current spherical position angles x, y to the result list.
(10) Sort the result list by similarity from small to large, finally obtaining an ordered list of the comparison similarity at every angle of the building structure together with the corresponding position angles.
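The sampling loop of step (9) and the sort of step (10) can be sketched as follows. Here `compare_at(x, y)` is a hypothetical stand-in for the render-and-score procedure of steps (a)–(p), not an API from the source:

```python
def viewpoint_angles(alpha):
    """Sphere sampling of step (9): vertical angle y from -90 to 90 inclusive,
    horizontal angle x from -180 (inclusive) to 180 (exclusive), step alpha.
    alpha is assumed to be an integer divisor of 360."""
    return [(x, y)
            for y in range(-90, 91, alpha)
            for x in range(-180, 180, alpha)]

def rank_similarities(alpha, compare_at):
    """compare_at(x, y) -> similarity in [0, 1] for the camera placed at sphere
    angles (x, y). Returns (similarity, x, y) triples sorted ascending, so the
    worst-matching viewing angles come first."""
    results = [(compare_at(x, y), x, y) for x, y in viewpoint_angles(alpha)]
    results.sort(key=lambda t: t[0])
    return results
```

Sorting ascending puts the viewpoints where the lightweight model deviates most from the original at the head of the list, which is where a reviewer would look first.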
In summary, the invention is applicable to the digital-twin industry based on BIM models. It mainly solves the problems that, after a BIM model is lightweighted, manually comparing its similarity with the original model is inefficient and the similarity cannot be quantified. The method automates the comparison of a BIM model before and after lightweighting, reduces manual workload, quantifies the similarity, and improves model accuracy and project implementation efficiency.
Based on the same inventive concept, the BIM model similarity calculation system of the invention comprises:
the two-dimensional texture map generation module is used for simulating artificial vision by using a three-dimensional virtual camera, rendering images of the building information model before and after light weight at different positions on a spherical surface with the building information model BIM as the center, and generating a plurality of pairs of two-dimensional texture maps;
the similarity index calculation module is used for analyzing each pair of two-dimensional texture maps with the structural similarity index SSIM to obtain a similarity index map; on this basis, the interference of three-dimensional-scene background pixels in the two-dimensional texture maps is eliminated, and the similarity index value, namely the similarity value, of the building-information-model parts before and after lightweighting is obtained;
the method for generating the plurality of pairs of two-dimensional texture maps comprises: placing the three-dimensional virtual camera at the position with horizontal direction angle x and vertical direction angle y on the sphere whose radius is the diagonal length diagonalLength of the original BIM model's boundary cube, centered on the BIM models before and after lightweighting, with the camera facing the model position; rendering the original BIM model and the lightweight BIM model with the camera respectively to obtain an original two-dimensional texture map and a lightweight two-dimensional texture map; the vertical direction angle y increases cyclically from -90 degrees to 90 degrees; for each y value, the horizontal direction angle x increases cyclically from -180 degrees to 180 degrees; the cyclic increment step of x and y is the set comparison angle α; a pair of two-dimensional texture maps is obtained for each x and y;
and the list generation module is used for adding each pair of x, y and similarity value combination objects into a result list, sequencing the result list from small to large in similarity, and finally obtaining an ordered list of the comparison similarity of each angle and the position angle thereof.
Based on the same inventive concept, the BIM model similarity calculation device of the present invention may include: a memory storing executable program code; a processor coupled to the memory; the processor invokes executable program code stored in the memory for performing the steps in a BIM model similarity calculation method as described in embodiment one.
The memory may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) and/or cache memory. The device may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, memory may be used to read from or write to non-removable, non-volatile magnetic media (commonly referred to as a "hard disk drive"). A program/utility having a set (at least one) of program modules may be stored, for example, in a memory, such program modules including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules typically carry out the functions and/or methods of the embodiments described herein.
The processor executes various functional applications and data processing by running programs stored in the memory, for example, to implement the method provided by the first embodiment of the present invention.
Based on the same inventive concept, a computer readable storage medium of the present invention stores computer instructions for executing a BIM model similarity calculation method as described above when the computer instructions are called.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or C#, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Of course, the storage medium containing the computer executable instructions provided in the embodiments of the present invention is not limited to the above method operations, but may also perform the related operations in the method provided in any embodiment of the present invention.

Claims (10)

1. A BIM model similarity calculation method, characterized by comprising the following steps:
simulating human vision with a three-dimensional virtual camera, rendering images of the building information model (BIM) of the building structure before and after lightweighting at different position angles on a sphere centered on the model, and generating a plurality of pairs of two-dimensional texture maps of the building structure; analyzing each pair of two-dimensional texture maps of the building structure with the structural similarity index SSIM to obtain a similarity index map; on this basis, eliminating the interference of three-dimensional-scene background pixels in the two-dimensional texture maps of the building structure to obtain the similarity index value, namely the similarity value, of the building-information-model parts before and after lightweighting; combining each pair of position angles with its similarity value and adding the combined object to a result list; finally sorting the result list by similarity from small to large, obtaining an ordered list of the comparison similarity at every angle of the building structure together with the corresponding position angles;
wherein the method for rendering the images comprises: placing the three-dimensional virtual camera at the position with horizontal direction angle x and vertical direction angle y on the sphere whose radius is the diagonal length diagonalLength of the original BIM model's boundary cube, centered on the BIM models before and after lightweighting, with the camera facing the model position; rendering the original BIM model and the lightweight BIM model with the camera respectively to obtain an original two-dimensional texture map and a lightweight two-dimensional texture map; the vertical direction angle y increases cyclically from -90 degrees to 90 degrees; for each y value, the horizontal direction angle x increases cyclically from -180 degrees to 180 degrees; the cyclic increment step of x and y is the set comparison angle α; a pair of two-dimensional texture maps is obtained for each x and y.
2. The BIM model similarity calculation method of claim 1, further comprising the following operations prior to rendering the image:
firstly, an original BIM model and a light BIM model are imported into a three-dimensional scene; setting a contrast angle alpha and contrast image resolution;
creating a rendering material which is used as an output object when the camera renders the original BIM model and the light BIM model respectively; two-dimensional texture maps are created and respectively used as carriers of images which are output to rendered materials by a camera;
the diagonal length of the boundary cube of the original BIM model is calculated, with the calculation formula diagonalLength = √(x² + y² + z²), wherein x, y and z are the length, width and height of the original BIM model boundary cube;
the far clipping plane distance of the camera is set to 2 times the diagonal length diagonalLength, to ensure that the depth of the camera's view frustum fully contains the model when the camera photographs the model in the subsequent steps.
3. The BIM model similarity calculation method according to claim 2, wherein the comparison angle α is an integer divisor of 360; the width and height of the render texture are consistent with the set contrast image resolution; and the width and height of the two-dimensional texture maps are consistent with the set contrast image resolution.
4. The BIM model similarity calculation method according to claim 1, wherein for a given horizontal direction angle x and vertical direction angle y, the rotation of the three-dimensional virtual camera is set to Euler angle (y, x, 0), and the camera position is set to the camera rotation multiplied by the vector (0, 0, -diagonalLength); the original BIM model is displayed and the lightweight BIM model is hidden, camera rendering is performed with the output target being the set render texture, and the pixel information in the render texture is then read into the original two-dimensional texture map; the original BIM model is hidden and the lightweight BIM model is displayed, camera rendering is performed with the output target being the set render texture, and the pixel information in the render texture is then read into the lightweight two-dimensional texture map; a pair of two-dimensional texture maps is thereby obtained.
5. The BIM model similarity calculation method according to claim 1, wherein each pair of two-dimensional texture images is analyzed by using a structural similarity index SSIM to obtain a similarity index map, specifically:
(1) Creating a first two-dimensional array, named the original two-dimensional array img1, with width width and height height, wherein the value of each array element is the gray value converted from the RGB color of the pixel at the same position in the original two-dimensional texture map as 0.2989·R + 0.5870·G + 0.1140·B;
(2) Creating a second two-dimensional array, named the lightweight two-dimensional array img2, with width width and height height, wherein the value of each array element is the gray value converted from the RGB color of the pixel at the same position in the lightweight two-dimensional texture map as 0.2989·R + 0.5870·G + 0.1140·B;
(3) Creating a normalized Gaussian window of set size and standard deviation;
(4) For the original two-dimensional array img1, filtering by using a Gaussian window to obtain mu1;
(5) Filtering the light two-dimensional array img2 by using a Gaussian window to obtain mu2;
(6) Multiplying mu1 by mu2 to obtain mu1mu2;
(7) Multiplying mu1 by mu1 to obtain mu1Sq;
(8) Multiplying mu2 by mu2 to obtain mu2Sq;
(9) Filtering a two-dimensional array obtained by multiplying img1 by img2 by using a Gaussian window, and subtracting mu1mu2 to obtain sigma12;
(10) Filtering a two-dimensional array obtained by multiplying img1 by using a Gaussian window, and subtracting mu1Sq to obtain sigma1Sq;
(11) Filtering a two-dimensional array obtained by multiplying img2 by using a Gaussian window, and subtracting mu2Sq to obtain sigma2Sq;
(12) Defining a constant C1 with a value of 0.0001 and a constant C2 with a value of 0.0009, and calculating (2·mu1mu2 + C1)·(2·sigma12 + C2) / ((mu1Sq + mu2Sq + C1)·(sigma1Sq + sigma2Sq + C2)) to obtain the structural similarity index result map SSIM Map.
6. The method for calculating the similarity of a BIM model according to claim 1, wherein the similarity index value, namely the similarity value, of the building information model parts before and after the light weight in the two-dimensional texture map is obtained after the interference of background pixels of the three-dimensional space in the two-dimensional texture map of the building structure is eliminated; the method comprises the following steps:
defining a double-precision variable total for recording the accumulation of effective values in the structural similarity index result map SSIM Map; defining a long integer variable count for recording the number of effective values; cyclically traversing each pixel of the original two-dimensional texture map and the lightweight two-dimensional texture map, obtaining two pixels for each given position x, y; if the transparency (A) values of the RGBA colors of both pixels are 0, i.e. the pixel at that position is three-dimensional-scene background outside the outline of the three-dimensional BIM model photographed by the camera, the pixel is not counted as an effective pixel point and its value in the structural similarity index result map SSIM Map is ignored; otherwise, the value at position x, y in the structural similarity index result map SSIM Map is accumulated into total, and count is increased by 1;
dividing total by count to obtain the structural similarity index value with background interference removed, namely the similarity value, whose value range is [0, 1].
7. The BIM model similarity calculation method according to claim 1, wherein the incremental range of the vertical direction angle y includes both -90 degrees and 90 degrees; and the incremental range of the horizontal direction angle x includes -180 degrees but does not include 180 degrees.
8. A BIM model similarity calculation system, comprising:
the two-dimensional texture map generation module is used for simulating artificial vision by using a three-dimensional virtual camera, rendering images of the building information model before and after light weight at different positions on a spherical surface with the building information model BIM of the building structure as the center, and generating a plurality of pairs of two-dimensional texture maps;
the similarity index calculation module is used for analyzing each pair of two-dimensional texture maps of the building structure with the structural similarity index SSIM to obtain a similarity index map; on this basis, the interference of three-dimensional-scene background pixels in the two-dimensional texture maps is eliminated, and the similarity index value, namely the similarity value, of the building-information-model parts before and after lightweighting is obtained;
the method for generating the plurality of pairs of two-dimensional texture maps comprises: placing the three-dimensional virtual camera at the position with horizontal direction angle x and vertical direction angle y on the sphere whose radius is the diagonal length diagonalLength of the original BIM model's boundary cube, centered on the BIM models before and after lightweighting, with the camera facing the model position; rendering the original BIM model and the lightweight BIM model with the camera respectively to obtain an original two-dimensional texture map and a lightweight two-dimensional texture map; the vertical direction angle y increases cyclically from -90 degrees to 90 degrees; for each y value, the horizontal direction angle x increases cyclically from -180 degrees to 180 degrees; the cyclic increment step of x and y is the set comparison angle α; a pair of two-dimensional texture maps is obtained for each x and y;
and the list generation module is used for combining each pair of x, y and similarity values and adding the combined objects into a result list, sequencing the result list from small to large in similarity, and finally obtaining an ordered list of the comparison similarity and the position angle of the building structure at each angle.
9. A BIM model similarity calculation device, the device comprising:
a memory storing executable program code;
a processor coupled to the memory;
the processor invokes the executable program code stored in the memory to perform a BIM model similarity calculation method as claimed in any one of claims 1 to 7.
10. A computer readable storage medium storing computer instructions which, when invoked, are adapted to perform a BIM model similarity calculation method according to any one of claims 1 to 7.
CN202311256471.4A 2023-09-27 2023-09-27 BIM model similarity calculation method, system, equipment and storage medium Pending CN117315295A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311256471.4A CN117315295A (en) 2023-09-27 2023-09-27 BIM model similarity calculation method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311256471.4A CN117315295A (en) 2023-09-27 2023-09-27 BIM model similarity calculation method, system, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117315295A true CN117315295A (en) 2023-12-29

Family

ID=89286068

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311256471.4A Pending CN117315295A (en) 2023-09-27 2023-09-27 BIM model similarity calculation method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117315295A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117952479A (en) * 2024-03-26 2024-04-30 广州珠江装修工程有限公司 BIM-based interior decoration engineering supervision method and system


Similar Documents

Publication Publication Date Title
US11721071B2 (en) Methods and systems for producing content in multiple reality environments
CN108648269B (en) Method and system for singulating three-dimensional building models
US7450758B2 (en) Stylization of video
US8633939B2 (en) System and method for painting 3D models with 2D painting tools
CN111968216A (en) Volume cloud shadow rendering method and device, electronic equipment and storage medium
WO2021249091A1 (en) Image processing method and apparatus, computer storage medium, and electronic device
CN110163831B (en) Method and device for dynamically displaying object of three-dimensional virtual sand table and terminal equipment
CN117315295A (en) BIM model similarity calculation method, system, equipment and storage medium
CN110570496B (en) RGBD image environment light editing method and system based on spherical harmonic illumination
CN111476877A (en) Shadow rendering method and device, electronic equipment and storage medium
US20220375152A1 (en) Method for Efficiently Computing and Specifying Level Sets for Use in Computer Simulations, Computer Graphics and Other Purposes
CN115546027B (en) Image suture line determination method, device and storage medium
CN113920036A (en) Interactive relighting editing method based on RGB-D image
US20070190502A1 (en) System and method for creating a simulation of a terrain that includes simulated illumination effects
CN117197323A (en) Large scene free viewpoint interpolation method and device based on neural network
KR20230149093A (en) Image processing method, training method for image processing, and image processing apparatus
US11941782B2 (en) GPU-based lens blur rendering using depth maps
CN111739074A (en) Scene multipoint light source rendering method and device
CN110889889A (en) Oblique photography modeling data generation method applied to immersive display equipment
CN113781618B (en) Three-dimensional model light weight method, device, electronic equipment and storage medium
CN116012532A (en) Live-action three-dimensional model light-weight method and system
CN116109758B (en) Method and device for positioning projection position of light source and rendering scene
CN117197300B (en) Rendering synthesis method of three-dimensional wire frame perspective view map based on transparent channel
CN116777940B (en) Data processing method, device, equipment and storage medium
CN116883575B (en) Building group rendering method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination