CN111489416A - Tunnel axis fitting method and application in calculation of over-under excavation square measure - Google Patents

Tunnel axis fitting method and application in calculation of over-under excavation square measure

Info

Publication number
CN111489416A
CN111489416A (application number CN202010289026.8A)
Authority
CN
China
Prior art keywords
data
point cloud
tunnel
fitting
cloud data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010289026.8A
Other languages
Chinese (zh)
Inventor
马坤
刘静海
阳俊
何敏
蒙扬露
梁晓燕
岳翠萍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan Yaoluanxi Highway Construction Co ltd
Sichuan safety science and technology research institute
Sichuan Road and Bridge Group Co Ltd
Original Assignee
Henan Yaoluanxi Highway Construction Co ltd
Sichuan safety science and technology research institute
Sichuan Road and Bridge Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan Yaoluanxi Highway Construction Co Ltd, Sichuan Safety Science and Technology Research Institute, and Sichuan Road and Bridge Group Co Ltd
Priority to CN202010289026.8A
Publication of CN111489416A
Legal status: Pending


Classifications

    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06F: ELECTRIC DIGITAL DATA PROCESSING
                • G06F 18/00: Pattern recognition
                    • G06F 18/20: Analysing
                        • G06F 18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
                            • G06F 18/213: Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
                                • G06F 18/2135: Feature extraction based on approximation criteria, e.g. principal component analysis
            • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 3/00: Geometric image transformations in the plane of the image
                    • G06T 3/06: Topological mapping of higher dimensional structures onto lower dimensional surfaces
                    • G06T 3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
                        • G06T 3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
                • G06T 5/00: Image enhancement or restoration
                    • G06T 5/40: Image enhancement or restoration using histogram techniques
                    • G06T 5/70: Denoising; Smoothing
                • G06T 11/00: 2D [Two Dimensional] image generation
                    • G06T 11/20: Drawing from basic elements, e.g. lines or circles
                        • G06T 11/203: Drawing of straight lines or curves
                • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
                    • G06T 17/005: Tree description, e.g. octree, quadtree
                • G06T 2207/00: Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10: Image acquisition modality
                        • G06T 2207/10028: Range image; Depth image; 3D point clouds
                    • G06T 2207/20: Special algorithmic details
                        • G06T 2207/20024: Filtering details
                            • G06T 2207/20028: Bilateral filtering
                            • G06T 2207/20032: Median filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a tunnel axis fitting method and its application in calculating the over/under-excavation volume of a tunnel. Fitting the tunnel axis based on the preprocessing result comprises, in order, evaluating the surface normal vectors of the primary support surface portion in the preprocessing result and then fitting the tunnel axis from the evaluated normal vectors. The method establishes a topological structure for the unordered, noisy raw data, filters the noise, and compresses the data, which reduces the amount and difficulty of subsequent computation; it then splices (registers) the data and fits its features, which highlights the characteristics of the data set, improves data validity, and further improves computational efficiency.

Description

Tunnel axis fitting method and application in calculation of over-under excavation square measure
Technical Field
The invention relates to the field of tunnel construction, and in particular to a tunnel axis fitting method based on laser point cloud data and to the application of the method in calculating the over/under-excavation volume.
Background
In mountainous areas, large numbers of tunnels must be excavated for both railway and highway projects. The main method used in current tunnel construction is the drill-and-blast method, with which over-excavation and under-excavation cannot be avoided. Over/under-excavation of a tunnel directly affects the cost and schedule of construction, and serious production-safety accidents may occur if the overbreak and underbreak conditions are not monitored.
Typically, before over/under-excavation analysis, the tunnel construction axis must first be determined. A common approach is to use traditional surveying methods, for example a total station. Such methods can only analyze single points, the selection of measurement points depends on the operator's experience, data acquisition is inefficient, and the construction schedule is affected.
With the popularization of three-dimensional laser scanners and the development of scanning technology, the cost of measuring real space with point clouds keeps falling while efficiency and precision gradually improve. Some studies have applied three-dimensional laser scanning to tunnel engineering. One study applied the technology to subway tunnel deformation detection and proposed a method of continuously intercepting tunnel cross-sections, but its algorithm cannot accurately fit the axis position of a tunnel under construction. Another proposed a laser-point-cloud-based method for detecting tunnel overbreak and underbreak, but it requires a separate tunnel centerline survey, which is time-consuming and labor-intensive in practice.
Given the current state of research, there is no tunnel axis measurement and calculation method that is simple to operate, fast to compute, and highly accurate.
Disclosure of Invention
The invention aims to address the above problems by providing a tunnel axis fitting method based on laser point cloud data. The method uses three-dimensional scanning for data acquisition and subsequent processing, completes the acquisition of raw data with simple operations, and fits the tunnel axis quickly and accurately through correspondingly designed algorithms.
The technical scheme adopted by the invention is as follows:
a tunnel axis fitting method based on laser point cloud data comprises the following steps:
A. a tunnel point cloud data preprocessing step, comprising in order: establishing a data topological structure, data filtering, data compression, data splicing, and feature fitting, wherein the tunnel point cloud data is three-dimensional point cloud data obtained by scanning the excavated tunnel with a three-dimensional laser scanning system;
B. a step of fitting the tunnel axis based on the preprocessing result, comprising in order: evaluating the surface normal vectors of the primary support surface portion in the preprocessing result, and then fitting the tunnel axis from the evaluated surface normal vectors.
Using three-dimensional laser scanning to collect the raw excavation data is simple to operate and has low labor cost. Establishing a topological structure for the tunnel point cloud data reduces the computational difficulty of the subsequent processing steps. The data filtering step removes the unavoidable dust noise and improves the accuracy of the raw data. The data compression step reduces the amount of computation in subsequent steps. The data splicing step merges the several acquired viewing angles into a whole, so that the site is restored as a whole for subsequent processing; computing on the whole rather than on separate viewing angles avoids the large overall error that can arise when separately computed results, each with its own error, are combined. The feature fitting step automatically filters out abnormal values and extracts the features of the sample data set for computation, improving the accuracy of the sample data set used in subsequent calculation. In addition, evaluating the surface normal vectors of the primary support surface smooths the point cloud surface of that portion, thereby correcting its surface normal vectors and making the subsequent axis fitting more accurate.
Further, in step B, the surface normal vectors of the primary support surface portion are evaluated using the moving least squares method.
Further, in step B, the tunnel axis is fitted after the surface normal vector evaluation by using principal component analysis to fit the axial direction of the tunnel.
Further, in step A, the topological structure established for the tunnel point cloud data is a tree topology.
Further, in step A, the tunnel point cloud data is denoised using a statistics-based filtering method.
Further, in step A, the tunnel point cloud data is compressed using a voxel downsampling algorithm.
Further, in step A, the data splicing step comprises a rough splicing step and a fine splicing step.
Further, the rough splicing step uses a center-of-gravity coincidence method, and the fine splicing step uses the iterative closest point method.
Further, in step A, the feature fitting step uses the random sample consensus method to fit a mathematical model in the data set and thereby extract the features of the data set.
The invention also discloses the application of the above tunnel axis fitting method based on laser point cloud data in the calculation of the over/under-excavation volume.
In summary, owing to the adoption of the above technical scheme, the beneficial effects of the invention are as follows:
The method is realized on the basis of three-dimensional laser scanning technology, and the raw data can be acquired with simple operations. The preprocessing steps fix the topological structure of the unordered raw data, filter out its abnormal points and noise, compress it, and extract its main features, improving data validity and reducing the amount and difficulty of the subsequent computation. The method corrects the surface normal vectors of the primary support surface, which improves the accuracy of the tunnel axis fitting. The method is simple to operate, low in cost, small in computational load, fast, and accurate.
Drawings
The invention will now be described, by way of example, with reference to the accompanying drawings, in which:
Fig. 1 shows point cloud data of an excavated tunnel obtained from three scans.
Fig. 2 shows point cloud data in the PCD format.
Fig. 3 compares the filtering effects of three smoothing filter methods.
Fig. 4 illustrates the filtering effect on outlier noise points.
Fig. 5 shows the effect of line fitting with the random sample consensus method.
Fig. 6 shows the imaging effect of the original tunnel point cloud data in a specific scene.
Fig. 7 shows the error analysis of the axis fitting.
Fig. 8 compares the axis fitted with corrected versus uncorrected surface normal vectors of the primary support surface.
Fig. 9 shows the effect of fitting the circle center of the projected primary support point cloud with RANSAC.
Fig. 10 shows the fitting of the feature radius within an interval.
Fig. 11 is the fitting-radius feature histogram of the primary support surface.
Fig. 12 is the fitting-radius feature histogram of a slice of the blasted excavation surface.
Fig. 13 illustrates the area elements of tunnel over/under-excavation.
Fig. 14 is a flowchart of the over/under-excavation volume calculation.
Fig. 15 is a color map of overbreak and underbreak on the excavation surface point cloud.
Fig. 16 is an overall comparison histogram of the excavation surface and the primary support surface in the analysis interval.
Fig. 17 is an expanded (unrolled) view of the over/under-excavation distribution.
Detailed Description
All of the features disclosed in this specification, or all of the steps in any method or process so disclosed, may be combined in any combination, except combinations of features and/or steps that are mutually exclusive.
Any feature disclosed in this specification (including any accompanying claims, abstract) may be replaced by alternative features serving equivalent or similar purposes, unless expressly stated otherwise. That is, unless expressly stated otherwise, each feature is only an example of a generic series of equivalent or similar features.
The invention is based on three-dimensional scanning technology; the relevant background knowledge is briefly introduced below.
Scanning a target with three-dimensional laser scanning technology yields the corresponding three-dimensional point cloud data. There are many three-dimensional point cloud data formats, and each instrument manufacturer generally has its own file format, which complicates data post-processing. The more common formats currently include the following:
(a) PLY is a polygon file format designed and developed by Turk et al. at Stanford University;
(b) STL is a model file format created by 3D Systems, mainly used in the CAD and CAM fields;
(c) OBJ is a geometry definition file format first developed by Wavefront Technologies;
(d) X3D is an ISO-standard, XML-based file format for representing 3D computer graphics data;
(e) LAS is the standard LiDAR data format published by the LiDAR committee of the American Society for Photogrammetry and Remote Sensing (ASPRS);
(f) PCD is the point cloud data storage format proposed and used by the open-source Point Cloud Library (PCL) for point cloud data processing.
The PCD format is described below as an example; other point cloud data formats are similar.
Fig. 2 shows a PCD-format file of a piece of point cloud data. The first 11 lines of the file are called the file header; the first line is a comment line, which indicates the file format and is generally not read by programs. The main fields are as follows:
(a) VERSION specifies the PCD file version; the 0.7 in the figure indicates that the PCD file version is 0.7.
(b) FIELDS specifies the dimensions and field names of each point. In the figure, curvature denotes curvature, intensity denotes reflectivity, normal_x, normal_y, and normal_z denote the components of the surface normal vector in the x, y, and z directions, and x, y, and z denote the three-dimensional coordinates of a point. Data with RGB information is also recorded in FIELDS.
(c) SIZE indicates the number of bytes occupied by each dimension; the 4 in the figure indicates that a dimension occupies 4 bytes.
(d) TYPE represents the data type of each dimension with one character: F for the float data type, I for the int data type, and U for the unsigned int data type.
(e) COUNT specifies the number of elements each dimension contains. For example, each dimension in the figure contains only one number, whereas a feature such as RGB contains 3. If there is no COUNT field, the count of every dimension defaults to 1.
(f) WIDTH is the width of the point cloud data, i.e., how many points there are per row. For ordered point clouds it is the number of points per row; for unordered point clouds it is the total number of data points. The figure shows 215812 data points.
(g) HEIGHT is the height of the point cloud data, i.e., how many points there are per column. For ordered point clouds it is the number of points per column; for unordered point clouds it is 1. The value 1 in the figure indicates that the point cloud data is unordered.
(h) VIEWPOINT specifies the viewpoint from which the point cloud data set was acquired. The viewpoint information is a translation vector and a quaternion; in the figure it takes the default value, i.e., the coordinate origin.
(i) POINTS specifies the total number of points in the point cloud. Its value equals WIDTH multiplied by HEIGHT and is therefore redundant; this field may be removed in future versions.
(j) DATA specifies the storage type of the point cloud data, either ascii or binary, indicating whether the data is stored as ASCII text or in binary form. The storage type shown in the figure is ascii.
From line 12 to the end of the file the data itself is stored, one data point per line, with the values of each data point arranged in the order given by FIELDS.
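For readers who want to inspect such a file programmatically, the following is a minimal Python sketch of reading the header fields described above; the file name and the simple parser are illustrative assumptions, not part of the patent.

```python
# Minimal sketch: parse the header of an ASCII PCD file (the file name is hypothetical).
# Only the header fields described above (VERSION, FIELDS, SIZE, TYPE, COUNT, WIDTH,
# HEIGHT, VIEWPOINT, POINTS, DATA) are read; the remaining lines hold the data points.

def read_pcd_header(path):
    header = {}
    data_start = 0
    with open(path, "r") as f:
        for i, line in enumerate(f):
            if line.startswith("#"):          # first line: comment, usually skipped
                continue
            key, _, value = line.strip().partition(" ")
            header[key] = value.split()
            if key == "DATA":                 # last header line; data points follow
                data_start = i + 1
                break
    return header, data_start

header, start = read_pcd_header("tunnel_scan.pcd")
print(header["FIELDS"], header["POINTS"])     # e.g. ['x', 'y', 'z'] ['215812']
```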
Example one
The embodiment discloses a tunnel axis fitting method based on laser point cloud data, which comprises the following steps:
1. A tunnel point cloud data preprocessing step, wherein the tunnel point cloud data is three-dimensional point cloud data obtained by scanning the excavated tunnel with a terrestrial three-dimensional laser scanning system. The preprocessing comprises, in order: establishing a data topological structure, data filtering, data compression, data splicing, and feature fitting.
2. A tunnel axis fitting step, comprising evaluating the surface normal vectors of the primary support surface portion in the preprocessing result and then fitting the tunnel axis, wherein the surface normal vector evaluation uses moving least squares (MLS) on the primary support surface so as to correct the surface normal vectors.
Example two
The embodiment discloses a tunnel axis fitting method based on laser point cloud data, which comprises the following steps:
1. preprocessing step of tunnel point cloud data
The tunnel point cloud data is three-dimensional point cloud data obtained by scanning excavated tunnels through a ground three-dimensional laser scanning system. Fig. 1 shows a set of tunnel three-dimensional point cloud data models. Before axis fitting, tunnel point cloud data needs to be preprocessed, and the preprocessing process comprises the following steps.
A. Data topology establishment
Before the point cloud data is calculated, a topological structure needs to be established for the point cloud data.
The neighboring points of any data point in a point cloud are closely related to the properties of that point; the K points nearest to a data point are called its K nearest neighbors. Point cloud data obtained from a three-dimensional laser scanner is generally an unordered point cloud, which differs from two-dimensional image data mainly in that it has no inherent adjacency relation between data elements. Many point cloud algorithms need to compute K nearest neighbors, and if this is done by the direct (brute-force) method on an unordered point cloud, the computation time grows rapidly with the number of points. Constructing a suitable topological structure for the point cloud data therefore effectively reduces the complexity of subsequent computation. In the invention, a tree topology is constructed for the tunnel point cloud data; the tree topology includes two topological relations: the KD-tree and the octree.
The KD-tree is a binary search tree proposed by Bentley in 1975. It cyclically divides the high-dimensional space into two parts with hyperplanes, splitting along a chosen dimension at a dividing point, and finally forms a binary search tree structure. When searching for neighboring points, only the parent and child nodes need to be examined, which greatly reduces the search range and improves speed. The KD-tree is fast and well suited to laser scanning data.
The octree is analogous to the quadtree, which manages a two-dimensional space by dividing it into four quadrants; the octree cyclically divides the space occupied by the point cloud into eight octants, forming an octree structure. When neighboring points need to be searched, only the adjacent octants are examined, which narrows the search range. The octree is easier to understand than the KD-tree and can be used for point cloud models of specific objects, such as the point cloud model of a part.
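As an illustration of the speed-up that a tree topology provides for K-nearest-neighbor queries, the sketch below builds a KD-tree with SciPy over a synthetic point cloud; the point count and K value are arbitrary assumptions, not values from the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical unordered point cloud: N x 3 array of x, y, z coordinates.
points = np.random.rand(100_000, 3)

tree = cKDTree(points)            # build the tree topology once

# K nearest neighbors of one query point (k=8 neighbors plus the point itself).
dists, idx = tree.query(points[0], k=9)
neighbours = points[idx[1:]]      # drop the query point itself

# Brute force for comparison: a full distance computation per query.
brute_idx = np.argsort(np.linalg.norm(points - points[0], axis=1))[1:9]
```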
B. Data filtering
Tunnel point cloud data (raw point cloud data) acquired with a three-dimensional laser scanning system often contains a large number of noise points caused by environmental and system noise during acquisition, and these noise points interfere with subsequent data processing; the tunnel point cloud data therefore needs to be preprocessed before further processing.
Because the volume of the acquired raw tunnel point cloud is not large, denoising can be performed directly on it.
There are many options for denoising; for example, smoothing filters and statistics-based filters can both be used to remove noisy data.
Smoothing filters are mainly used to smooth the surface of the point cloud; smoothing does not remove outlier noise points. Common methods are mean filtering, median filtering, Gaussian filtering, and bilateral filtering. These methods first establish a topological relation to accelerate the neighbor search, then analyze the neighbor set and estimate the query point from the relation of its neighbors. Gaussian filtering smooths with a Gaussian function, weighting each neighbor by its distance to the query point, with closer points receiving higher weights. Mean filtering directly computes the center of gravity of the neighbor set and uses it to replace the original query point as the filtering result. Median filtering sorts the distances and uses the median to replace the original query point. The filtering effects are compared in Fig. 3.
Bilateral filtering is adapted from the bilateral filtering algorithm used in two-dimensional image filtering; it takes the normal vector relation as the first (feature) weight and the distance as the second weight and filters with both weights simultaneously, which avoids excessively flattening features such as corners.
For outlier noise points, a grid algorithm is usually used: the point cloud data is first rasterized and each grid cell is then analyzed. If a cell contains data points but the surrounding cells contain none, the points in that cell are marked as discrete points and removed. This method is sensitive to the grid division, because an outlier may be a single isolated point in space or may form a small cluster, making the grid parameters difficult to set. Most outlier noise points can instead be filtered with a statistical analysis method: the algorithm first finds the K nearest neighbors of each query point and computes the distance from the query point to each neighbor; the distribution of these distances is assumed to be Gaussian, with its shape determined by the mean distance and the standard deviation, and points whose average distance lies outside the standard range are removed as outlier noise points.
In this embodiment, a statistics-based filtering method is used to denoise the tunnel point cloud data, mainly to remove the outlier noise points produced by dust floating in the air; the filtering effect is shown in Fig. 4.
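A minimal sketch of a statistics-based outlier filter of the kind described above, assuming K neighbors per point and a one-standard-deviation cutoff (both parameter values are illustrative, not prescribed by the patent):

```python
import numpy as np
from scipy.spatial import cKDTree

def statistical_outlier_removal(points, k=16, std_ratio=1.0):
    """Remove points whose mean distance to their k neighbors is abnormally large."""
    tree = cKDTree(points)
    # Distances to the k nearest neighbors (column 0 is the point itself).
    dists, _ = tree.query(points, k=k + 1)
    mean_dist = dists[:, 1:].mean(axis=1)
    mu, sigma = mean_dist.mean(), mean_dist.std()
    keep = mean_dist <= mu + std_ratio * sigma   # Gaussian assumption on the distances
    return points[keep], keep

# denoised, mask = statistical_outlier_removal(tunnel_points)
```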
C. Data compression
Because the experiment has modest requirements on point cloud precision, the denoised data can be compressed to increase the speed of subsequent data processing.
A common compression method in point cloud data processing is lossy compression, also called downsampling. Different downsampling methods exist for ordered and unordered point clouds; the methods for unordered point clouds also apply to ordered point clouds, so the unordered case is introduced here. Unordered point clouds are commonly compressed by random sampling, voxel (body) sampling, curvature-weighted sampling, and similar methods.
Random sampling is easy to understand and implement: a random-number generator produces indices smaller than the total number of points, and the corresponding data points are removed until the sampling requirement is met. Because the randomness is strong, the sampling effect is not very stable and the resulting point cloud may be non-uniform; features are not protected, so detail is easily lost, but it is the fastest method.
Voxel (body) sampling covers the whole point cloud with a rectangular bounding box and divides the box into voxels of a set size. All points inside a voxel are replaced by the voxel center or by the center of gravity of the points it contains, achieving downsampling. The method is fast, the sampled point cloud is spatially uniform, and the loss of detail features is limited.
Curvature is an intrinsic property describing the relation between a three-dimensional point and its surrounding data points, and weighting the sampling by curvature features preserves accuracy in regions of high curvature. The point cloud is first partitioned with the voxel division method, the number of points to keep in each voxel is decided from the curvature values of the points inside it, and points are finally drawn at random. The resulting sample is uniform, more data points are retained in high-curvature regions, and feature attenuation is reduced.
Other compression methods include clustering methods, iterative methods, particle simulation methods, and so on.
In this embodiment, a voxel downsampling algorithm is used to compress the experimental data; reducing the amount of point cloud data speeds up the subsequent computation and analysis.
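A minimal sketch of voxel (body) downsampling in which all points inside a voxel are replaced by their centroid; the voxel size is an assumed parameter.

```python
import numpy as np

def voxel_downsample(points, voxel_size=0.05):
    """Keep one point (the centroid) per occupied voxel of edge length voxel_size."""
    # Integer voxel index of every point.
    ijk = np.floor((points - points.min(axis=0)) / voxel_size).astype(np.int64)
    # Group points that share a voxel index.
    _, inverse, counts = np.unique(ijk, axis=0, return_inverse=True, return_counts=True)
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)        # sum the points in each voxel
    return centroids / counts[:, None]           # centroid = sum / count

# compressed = voxel_downsample(denoised, voxel_size=0.05)
```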
D. Data stitching
In point cloud data acquisition, a single scan station sometimes cannot fully cover the target, so multi-station acquisition is required, after which the scans are spliced (also called registration in some documents) to obtain complete target information. There are many splicing methods, but in general splicing is divided into two steps: rough splicing and fine splicing. Rough splicing greatly reduces the translation and rotation errors between the two point clouds and improves the efficiency of fine splicing; fine splicing reduces the splicing error as much as possible and is usually computation-intensive and time-consuming.
Common rough splicing methods are: center-of-gravity coincidence, in which the centers of gravity of the two point clouds are brought together by translation, which reduces the translation error but leaves the rotation error unchanged; and the use of pasted markers or placed targets, i.e., manually set feature points that are then located and matched, which requires at least four non-coplanar feature points to obtain the coordinate transformation matrix and whose splicing quality clearly depends on the accuracy of the extracted feature points.
For fine splicing, the iterative closest point (ICP) algorithm is the mainstream choice; it is simple and efficient and has good precision and stability. However, its computation load is large, and a poor initial position may cause the iteration to fall into local convergence, for which many improved algorithms have been proposed. Establishing the correct set of corresponding points during the iteration, so as to avoid falling into a local extremum, is the key to improving the algorithm and determines its convergence speed and final splicing precision.
The essence of the ICP algorithm is an optimal registration method based on least squares. For two three-dimensional data sets P1 and P2 to be registered, the main steps of the algorithm are as follows:
(1) For each point in P2, compute its corresponding (closest) point in the point set P1.
(2) From this correspondence, obtain the rigid transformation that minimizes the average distance between corresponding points, yielding the rotation parameter R and translation parameter T.
(3) Apply the transformation parameters obtained in the previous step to P2 to obtain a new three-dimensional data set P3.
(4) Evaluate the correspondence between P3 and P1 with an error function; if the error is below a threshold, registration is finished, otherwise return to step (2), until the exit condition is met or the maximum number of iterations is exceeded.
The error function is shown below.
$$E=\frac{1}{n}\sum_{k=1}^{n}\left\| q_k-(R\,p_k+t)\right\|^{2} \quad (1)$$
In the formula, n is the number of corresponding point pairs, q_k and p_k are corresponding points in the two point clouds to be spliced, R is the rotation matrix, t is the translation vector, and E is the error value.
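A compact sketch of a basic ICP loop following steps (1) to (4) and the error function above; the SVD (Kabsch) solution is used for step (2), and the thresholds and iteration limit are assumed values rather than ones given in the patent.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(P2, P1, max_iter=50, tol=1e-6):
    """Register P2 onto P1 (both N x 3 arrays); returns rotation R and translation t."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(P1)
    prev_err = np.inf
    src = P2.copy()
    for _ in range(max_iter):
        # (1) closest point in P1 for every point of the current source cloud
        _, idx = tree.query(src)
        q = P1[idx]
        # (2) rigid transform minimizing the mean squared distance (SVD / Kabsch)
        src_c, q_c = src - src.mean(0), q - q.mean(0)
        U, _, Vt = np.linalg.svd(src_c.T @ q_c)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:            # avoid reflections
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        t_step = q.mean(0) - R_step @ src.mean(0)
        # (3) apply the transform
        src = src @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
        # (4) error function: mean of ||q_k - (R p_k + t)||^2
        err = np.mean(np.sum((q - src) ** 2, axis=1))
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R, t
```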
E. Feature fitting
Scanned point cloud data contains a large amount of abnormal data; a mathematical model in a sample data set can be fitted quickly and accurately with the random sample consensus (RANSAC) algorithm.
The RANSAC algorithm iteratively estimates the parameters of a mathematical model from a data set that contains inliers and outliers. It is a non-deterministic algorithm, because it only produces a reasonable result with a certain probability, and the number of iterations must be increased to improve the fitting accuracy. Unlike an ordinary iteration, there is no numerical dependence between any two iterations of the algorithm, so it can be parallelized.
RANSAC randomly draws a group of data points from the data set, computes the model parameters from them, and then evaluates the result. This is repeated until the evaluation score reaches a threshold or the maximum number of iterations is exceeded.
Assume the number of inliers is $n_{in}$ and the total number of data points is $n_{all}$; the probability that a single draw from the data set yields an inlier is denoted by $w$:
$$w=\frac{n_{in}}{n_{all}} \quad (2)$$
Assume that determining the model parameters requires $t$ points. Then $w^{t}$ is the probability that all $t$ drawn points are inliers, in which case the model computed from the drawn data should be optimal, and $1-w^{t}$ is the probability that at least one drawn point is an outlier, in which case the estimated model is not optimal. After $k$ iterations, $(1-w^{t})^{k}$ is the probability that no iteration succeeds in drawing $t$ inliers. Let $p$ be the probability that at least one iteration draws $t$ inliers; then:
$$1-p=(1-w^{t})^{k} \quad (3)$$
$$p=1-(1-w^{t})^{k} \quad (4)$$
Therefore, when $k$ is large enough, $p$ approaches 1, i.e., a sample consisting entirely of inliers is almost certainly drawn at least once.
As shown in Fig. 5, the RANSAC algorithm can fit a straight-line equation in a data set containing a large amount of interfering data. The robustness of the line fitting is good, the algorithm is simple to design and can be parallelized; these advantages are not matched by, for example, the least squares method. Using RANSAC to fit the sample data set therefore offers clear advantages in the invention.
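A minimal RANSAC sketch for the two-dimensional line-fitting case of Fig. 5, with sample size t = 2; the inlier threshold and iteration count are assumed values.

```python
import numpy as np

def ransac_line(points, n_iter=200, threshold=0.05, rng=np.random.default_rng(0)):
    """Fit a 2D line to points (N x 2) despite heavy outlier contamination."""
    best_inliers, best_model = None, None
    for _ in range(n_iter):
        # draw t = 2 points and build the candidate line through them
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        n = np.array([-d[1], d[0]])                # normal of the candidate line
        norm = np.linalg.norm(n)
        if norm == 0:
            continue
        n = n / norm
        dist = np.abs((points - p) @ n)            # point-to-line distances
        inliers = dist < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
            best_model = (p, d / np.linalg.norm(d))
    return best_model, best_inliers                # (point on line, direction), inlier mask
```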
Next, the point cloud data of the portion of the primary lining surface that needs to be calculated is intercepted (extracted). The interception can be done manually or automatically with a trained machine model. To achieve automatic segmentation of the data, the point cloud data of the primary lining surface portion can be separated with a clustering algorithm (e.g., the K-Means algorithm).
2. Tunnel axis fitting procedure
A. Surface normal vector estimation
After preprocessing is complete, the surface normal vectors of the primary support surface portion are estimated using moving least squares (MLS) (or another method with the same effect).
B. Tunnel axis fitting
After the surface normal vector estimation is complete, principal component analysis (PCA) is used to fit the prepared data.
In step 2, MLS is first used for the normal vector calculation; it smooths the point cloud surface of the primary support portion so that the normal vectors are computed more accurately. PCA is then used to find the eigenvector corresponding to the smallest eigenvalue, which is the axis direction of the tunnel. Fig. 8 shows the difference between the axis fitted after MLS smoothing and the axis fitted without MLS; as can be seen from Fig. 8, the axis fitted after MLS smoothing is more accurate.
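One possible reading of this step, sketched below, is that PCA is applied to the MLS-smoothed surface normal vectors: for a roughly cylindrical surface the normals lie close to the plane perpendicular to the axis, so the eigenvector with the smallest eigenvalue approximates the axis direction. This interpretation and the helper name are assumptions made for illustration.

```python
import numpy as np

def fit_axis_from_normals(normals):
    """Estimate the tunnel axis as the direction least represented by the surface normals.

    normals: N x 3 array of (MLS-smoothed) unit normal vectors of the primary
    support surface. For an ideal cylinder every normal is perpendicular to the
    axis, so the scatter-matrix eigenvector with the smallest eigenvalue is the axis.
    """
    n = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    cov = n.T @ n / len(n)                    # 3 x 3 scatter matrix of the normals
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues returned in ascending order
    axis = eigvecs[:, 0]                      # smallest-eigenvalue eigenvector
    return axis / np.linalg.norm(axis)

# axis = fit_axis_from_normals(mls_normals)
```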
EXAMPLE III
This embodiment discloses a method for analyzing overbreak and underbreak in tunnel construction on the basis of the fitted primary support axis: after the tunnel axis vector has been fitted, the point cloud data is rotated with a rotation matrix so that the vector coincides with the positive direction of the z axis. The three rotation matrices about the x, y, and z axes are given in equations (5.1), (5.2), and (5.3):
$$R_{x}(\psi)=\begin{bmatrix}1&0&0\\0&\cos\psi&\sin\psi\\0&-\sin\psi&\cos\psi\end{bmatrix} \quad (5.1)$$
$$R_{y}(\varphi)=\begin{bmatrix}\cos\varphi&0&-\sin\varphi\\0&1&0\\\sin\varphi&0&\cos\varphi\end{bmatrix} \quad (5.2)$$
$$R_{z}(\theta)=\begin{bmatrix}\cos\theta&\sin\theta&0\\-\sin\theta&\cos\theta&0\\0&0&1\end{bmatrix} \quad (5.3)$$
In the formulas, $R_{x}(\psi)$, $R_{y}(\varphi)$, and $R_{z}(\theta)$ are the rotation matrices for clockwise rotations by the angles $\psi$, $\varphi$, and $\theta$ about the x, y, and z axes, respectively.
In order to observe the deviation of the excavation condition and the design scheme more intuitively, after the axial direction is fitted, the primary surface part is projected to a two-dimensional plane from the section direction for observation. And fitting the center of the projected point cloud data of the primary surface part on the two-dimensional plane by using a RANSAC algorithm. And (4) taking the central point as a tunnel central point, and carrying out cloud translation on the excavation surface part and the initial surface part to ensure that the center of the tunnel is coincided with the origin of the coordinate axes. As shown in fig. 9.
On the basis, the overbreak and underbreak condition of the excavation surface can be analyzed:
and extracting the point cloud of the excavation surface besides the point cloud of the primary surface by the mathematical model in the sample data set. After the two-dimensional projection of the point cloud is finished, the feature histograms of the point clouds of the primary lining surface and the excavation surface are calculated respectively, and then the data of the corresponding intervals of the point cloud of the excavation surface and the primary lining feature histogram are compared and calculated to obtain the super-under excavation value of a single point; and comparing and calculating the same interval values of the two histograms by using the concept of the micro elements to obtain the ultra-under-cut square element, and finally summing the micro elements to obtain the total ultra-under-cut square element.
First, the feature histogram of the primary support portion is computed. After the point cloud has been rotated, the circle center fitted by the RANSAC algorithm is used to divide the two-dimensional plane into a number of angular intervals around the circumference; the distances from the points in each interval to the center are averaged (or their median is taken) as the fitting feature radius of that interval. In one embodiment the mean is used as the fitting feature radius, as shown in Fig. 10.
The feature radius of each interval is then recorded. In one embodiment the statistics are collected in the counter-clockwise direction to form a histogram, which facilitates the subsequent data comparison; the fitting-radius feature histogram of the primary support surface is shown in Fig. 11.
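A minimal sketch of this feature histogram, assuming the cross-section points have already been centered on the fitted circle center; the number of circumferential bins is an assumed parameter.

```python
import numpy as np

def radius_histogram(section_2d, n_bins=30):
    """Mean fitting radius per angular interval (counter-clockwise bins)."""
    r = np.linalg.norm(section_2d, axis=1)                 # distance to the circle center
    theta = np.mod(np.arctan2(section_2d[:, 1], section_2d[:, 0]), 2 * np.pi)
    bins = (theta / (2 * np.pi / n_bins)).astype(int).clip(0, n_bins - 1)
    radii = np.array([r[bins == b].mean() if np.any(bins == b) else np.nan
                      for b in range(n_bins)])
    return radii                                           # one fitting radius per sector

# primary_hist = radius_histogram(primary_section_2d)
# excav_hist   = radius_histogram(excavation_section_2d)
```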
The excavation surface point cloud is compared with the primary support histogram to compute the over/under-excavation value of each point, and the values are then mapped to different color or gray values to form a color map, so that the over/under-excavation distribution can be analyzed intuitively, as shown in Fig. 15. The histogram of the blasted excavation surface is computed in the same way; the histogram of one slice (cross-section) is shown in Fig. 12.
By comparing the corresponding excavation surface histogram with the primary support histogram, fairly accurate over/under-excavation area elements of the tunnel in each cross-section can be obtained quickly, as shown in Fig. 13.
The over/under-excavation volume elements of the tunnel can be calculated from the histograms, and the total over/under-excavation volume is then obtained by summation; the specific calculation is given in equations (6.1) to (6.3):
$$\Delta S=\frac{\alpha}{2}\left(R_{2}^{2}-R_{1}^{2}\right) \quad (6.1)$$
$$\Delta V=\Delta S\times d \quad (6.2)$$
$$V=\sum\Delta V \quad (6.3)$$
In the formulas, $\alpha$ is the unit interval angle into which the point cloud is divided along the circumferential direction; $R_{1}$ is the fitting radius of the corresponding primary lining point cloud interval; $R_{2}$ is the fitting radius of the corresponding excavation surface interval; $d$ is the unit length of the axial division; $\Delta S$ is the computed area element of a unit interval; $\Delta V$ is the computed over/under-excavation volume of a unit interval; and $V$ is the computed total over/under-excavation volume of the tunnel excavation surface.
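A minimal sketch applying equations (6.1) to (6.3) to the two radius histograms of one axial slice; the sign convention (positive for overbreak, negative for underbreak) and the slice length are assumptions of this sketch, not values stated in the text.

```python
import numpy as np

def overbreak_volume(primary_radii, excav_radii, slice_length, n_bins=30):
    """Equations (6.1)-(6.3): per-sector area elements, volume elements, and total."""
    alpha = 2 * np.pi / n_bins                                    # unit interval angle
    dS = 0.5 * alpha * (excav_radii ** 2 - primary_radii ** 2)    # (6.1)
    dV = dS * slice_length                                        # (6.2)
    return dV, np.nansum(dV)                                      # (6.3) total volume

# dV, V = overbreak_volume(primary_hist, excav_hist, slice_length=0.5)
# print(f"over/under-excavation of this slice: {V:.3f} m^3")
```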
To sum up, the calculation flow of the over/under-excavation volume is shown in Fig. 14. First, TLS (terrestrial laser scanning) is used to acquire the tunnel point cloud data; point clouds of both the primary support and the blasted profile must be obtained during acquisition. Basic preprocessing, including filtering and compression, is then carried out, and the point clouds of the primary support and the blasted profile are segmented. Using the primary support point cloud, the tunnel axis vector is fitted with the PCA algorithm, and from this vector the rotation matrix for projecting the point clouds onto a two-dimensional plane is computed. Both point clouds are rotated with this matrix, the rotated point clouds are sliced along the axis, and two sets of feature data are obtained and arranged into histograms. By comparing the histogram of the support portion with that of a point cloud slice of the blasted area, the over/under-excavation of the blasted area can be quickly estimated.
The above analyzes only the over/under-excavation of one slice or a short tunnel segment; in practice, the analysis must cover the whole tunnel or a tunnel interval. To calculate the over/under-excavation volume of the blasted portion, the excavation surface is divided along the tunnel axis and the comparison histograms are generated in sequence, i.e., each is subtracted from the corresponding primary support histogram to obtain an over/under-excavation histogram. In one embodiment, the analysis interval is divided into 20 axial segments and 30 circumferential segments, the over/under-excavation data of each segment is computed, and a comparison histogram is generated, as shown in Fig. 16. Each small block of the histogram represents a volume element, and the total over/under-excavation volume of the excavation surface is obtained by summing all volume elements. More accurate data can be obtained by increasing the number of divisions in the axial and circumferential directions. The histogram can further be visualized as a two-dimensional image, as shown in Fig. 17, to facilitate an overall analysis of the blasting result.
Example four
This embodiment discloses a practical application of the tunnel axis fitting method based on laser point cloud data. The application scenario is as follows: the designed radius of the tunnel primary lining is about 6.27 m, the advance (footage) per blast is about 2.80 m, and the thickness of the primary lining is 0.27 m.
First, a Z+F 5010C three-dimensional laser scanner is used to survey the tunnel environment and obtain the tunnel point cloud data; the visualization of the scan result is shown in Fig. 6. After blasting and mucking, a large amount of dust still floats in the air, so the imaging quality is poor; the floating dust produces a large number of noise points in the three-dimensional point cloud data. The lighting conditions in the scanned tunnel are poor, and the scanner's built-in camera was not used for photographic texture mapping during scanning. The display colors of the tunnel are generated from the reflectivity; the blue parts have lower reflectivity and correspond to locations where water seepage occurred in the measured tunnel environment.
The tunnel axis could be obtained with a total station. In actual operation, however, a total station requires additional measuring equipment and professional surveyors, and the added measuring time affects the construction schedule. The experimental process should therefore be simplified as much as possible while accuracy is preserved, so as to reduce the impact on construction progress. In this embodiment, the tunnel axis is fitted from the scanned point cloud data with the method of the first embodiment. In brief, the complete blasted surface and part of the primary support point cloud data are obtained from the scanned tunnel point cloud, and the tunnel axis direction can be fitted from the primary support portion, so the scheme is improved in terms of labor cost, simplicity of operation, and experimental precision.
The curve radius design for speed-limited sections of railway tunnels is generally given in the following table:
TABLE 2.1 Curve radius of speed-limited sections (m)
(The table contents are provided as an image in the original filing and are not reproduced here.)
The tunnel axis fitted by the above method is a straight line. As shown in Fig. 7, taking a minimum curve radius of 600 m and a single blasting advance of 3 m as an example, the radial error is within 8 mm, so treating the axis as a straight line in the calculation model generally does not introduce excessive error.
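As a rough order-of-magnitude check (an added illustration, not part of the original filing), the maximum deviation of a chord of length $L$ from a circular arc of radius $R$ can be approximated by the sagitta
$$ s \approx \frac{L^{2}}{8R} = \frac{3^{2}}{8 \times 600}\,\mathrm{m} \approx 1.9\ \mathrm{mm}, $$
which is comfortably within the 8 mm bound quoted above.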
The invention is not limited to the foregoing embodiments. The invention extends to any novel feature or any novel combination of features disclosed in this specification and any novel method or process steps or any novel combination of features disclosed.

Claims (10)

1. A tunnel axis fitting method based on laser point cloud data, characterized by comprising the following steps:
A. a tunnel point cloud data preprocessing step, comprising in order: establishing a data topological structure, data filtering, data compression, data splicing, and feature fitting, wherein the tunnel point cloud data is three-dimensional point cloud data obtained by scanning the excavated tunnel with a three-dimensional laser scanning system;
B. a step of fitting the tunnel axis based on the preprocessing result, comprising in order: evaluating the surface normal vectors of the primary support surface portion in the preprocessing result, and then fitting the tunnel axis from the evaluated surface normal vectors.
2. The tunnel axis fitting method based on laser point cloud data of claim 1, wherein in step B the surface normal vectors of the primary support surface portion are evaluated using the moving least squares method.
3. The tunnel axis fitting method based on laser point cloud data of claim 1, wherein in step B the tunnel axis is fitted after the surface normal vector evaluation by using principal component analysis to fit the axial direction of the tunnel.
4. The tunnel axis fitting method based on laser point cloud data of claim 1, wherein in step A the topological structure established for the tunnel point cloud data is a tree topology.
5. The tunnel axis fitting method based on laser point cloud data of claim 1, wherein in step A the tunnel point cloud data is denoised using a statistics-based filtering method.
6. The tunnel axis fitting method based on laser point cloud data of claim 1, wherein in step A the tunnel point cloud data is compressed using a voxel downsampling algorithm.
7. The tunnel axis fitting method based on laser point cloud data of claim 1, wherein in step A the data splicing step comprises a rough splicing step and a fine splicing step.
8. The tunnel axis fitting method based on laser point cloud data of claim 7, wherein the rough splicing step uses a center-of-gravity coincidence method and the fine splicing step uses the iterative closest point method.
9. The tunnel axis fitting method based on laser point cloud data of claim 1, wherein in step A the feature fitting step uses a random sample consensus method to fit a mathematical model in the data set so as to extract the features of the data set.
10. Use of the tunnel axis fitting method based on laser point cloud data of any one of claims 1 to 9 in the calculation of over/under-excavation volume.
CN202010289026.8A 2020-04-14 2020-04-14 Tunnel axis fitting method and application in calculation of over-under excavation square measure Pending CN111489416A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010289026.8A CN111489416A (en) 2020-04-14 2020-04-14 Tunnel axis fitting method and application in calculation of over-under excavation square measure

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010289026.8A CN111489416A (en) 2020-04-14 2020-04-14 Tunnel axis fitting method and application in calculation of over-under excavation square measure

Publications (1)

Publication Number Publication Date
CN111489416A true CN111489416A (en) 2020-08-04

Family

ID=71798268

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010289026.8A Pending CN111489416A (en) 2020-04-14 2020-04-14 Tunnel axis fitting method and application in calculation of over-under excavation square measure

Country Status (1)

Country Link
CN (1) CN111489416A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112085759A (en) * 2020-09-07 2020-12-15 凌云光技术股份有限公司 Straight line fitting method and device based on big data
CN113093217A (en) * 2021-02-19 2021-07-09 中铁第一勘察设计院集团有限公司 Three-dimensional reconstruction method for multi-line laser scanning tunnel
CN113970291A (en) * 2021-09-23 2022-01-25 中国电建集团华东勘测设计研究院有限公司 Method for rapidly measuring super-underexcavation amount of surrounding rock of underground cavern based on three-dimensional laser scanning
EP4224114A1 (en) * 2022-01-27 2023-08-09 Topcon Corporation Laser scanning data processing device, laser scanning method, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109919955A (en) * 2019-03-11 2019-06-21 南京林业大学 The tunnel axis of ground formula laser radar point cloud extracts and dividing method
CN109993697A (en) * 2019-03-19 2019-07-09 中交第二航务工程局有限公司 A kind of method of tunnel three-dimensional laser data prediction
CN110298795A (en) * 2019-05-22 2019-10-01 中交第二航务工程局有限公司 Tunnel three-dimensional laser aggregation of data denoising method
CN111462017A (en) * 2020-04-14 2020-07-28 四川省安全科学技术研究院 Denoising method for tunnel laser point cloud data

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109919955A (en) * 2019-03-11 2019-06-21 南京林业大学 The tunnel axis of ground formula laser radar point cloud extracts and dividing method
CN109993697A (en) * 2019-03-19 2019-07-09 中交第二航务工程局有限公司 A kind of method of tunnel three-dimensional laser data prediction
CN110298795A (en) * 2019-05-22 2019-10-01 中交第二航务工程局有限公司 Tunnel three-dimensional laser aggregation of data denoising method
CN111462017A (en) * 2020-04-14 2020-07-28 四川省安全科学技术研究院 Denoising method for tunnel laser point cloud data

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
李徐然: "Quality analysis and evaluation of 3D laser point cloud digitization ***" *
李徐然: "Quality analysis and evaluation of 3D laser point cloud digitization ***" *
李徐然; 施富强; 廖学燕: "Research on an automatic calculation method for tunnel overbreak and underbreak based on laser point clouds" *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112085759A (en) * 2020-09-07 2020-12-15 凌云光技术股份有限公司 Straight line fitting method and device based on big data
CN112085759B (en) * 2020-09-07 2023-11-10 凌云光技术股份有限公司 Linear fitting method and device based on big data
CN113093217A (en) * 2021-02-19 2021-07-09 中铁第一勘察设计院集团有限公司 Three-dimensional reconstruction method for multi-line laser scanning tunnel
CN113970291A (en) * 2021-09-23 2022-01-25 中国电建集团华东勘测设计研究院有限公司 Method for rapidly measuring super-underexcavation amount of surrounding rock of underground cavern based on three-dimensional laser scanning
CN113970291B (en) * 2021-09-23 2024-03-15 中国电建集团华东勘测设计研究院有限公司 Quick measuring method for underground cavern surrounding rock super-underexcavation amount based on three-dimensional laser scanning
EP4224114A1 (en) * 2022-01-27 2023-08-09 Topcon Corporation Laser scanning data processing device, laser scanning method, and program

Similar Documents

Publication Publication Date Title
Xu et al. Reconstruction of scaffolds from a photogrammetric point cloud of construction sites using a novel 3D local feature descriptor
CN111489416A (en) Tunnel axis fitting method and application in calculation of over-under excavation square measure
Che et al. Multi-scan segmentation of terrestrial laser scanning data based on normal variation analysis
Gross et al. Extraction of lines from laser point clouds
CN114998338B (en) Mining quantity calculation method based on laser radar point cloud
CN111652241B (en) Building contour extraction method integrating image features and densely matched point cloud features
CN110111375B (en) Image matching gross error elimination method and device under Delaunay triangulation network constraint
CN102708385A (en) Method and system for comparison and recognition of three-dimensional vehicle types in video monitoring scenes
Melo et al. 3D correspondence and point projection method for structures deformation analysis
Tarsha Kurdi et al. Automatic filtering and 2D modeling of airborne laser scanning building point cloud
CN111932669A (en) Deformation monitoring method based on slope rock mass characteristic object
CN111462017A (en) Denoising method for tunnel laser point cloud data
JP2023530449A (en) Systems and methods for air and ground alignment
CN114494385A (en) Visual early warning method for water delivery tunnel diseases
CN115690184A (en) Tunnel face displacement measurement method based on three-dimensional laser scanning
CN116518864A (en) Engineering structure full-field deformation detection method based on three-dimensional point cloud comparison analysis
Jiang et al. Learned local features for structure from motion of uav images: A comparative evaluation
CN115222884A (en) Space object analysis and modeling optimization method based on artificial intelligence
Demir Automated detection of 3D roof planes from Lidar data
Lee et al. Determination of building model key points using multidirectional shaded relief images generated from airborne LiDAR data
Jung et al. Progressive modeling of 3D building rooftops from airborne Lidar and imagery
Kang et al. 3D urban reconstruction from wide area aerial surveillance video
Lari Adaptive Processing of Laser Scanning Data and Texturing of the Segmentation Outcome
Guldur et al. Laser-based automatic cross-sectional change detection for steel frames
CN117854060B (en) Tunnel rock body planar crack identification method and system based on deep learning

Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20200804)