CN109345620B - Improved ICP (Iterative Closest Point) point cloud splicing method for an object to be measured, fused with a fast point feature histogram - Google Patents


Info

Publication number
CN109345620B
CN109345620B (application CN201810917197.3A)
Authority
CN
China
Prior art keywords
point
point cloud
points
interpolation
cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810917197.3A
Other languages
Chinese (zh)
Other versions
CN109345620A (en)
Inventor
赵昕玥
连巧龙
何再兴
张树有
谭建荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201810917197.3A
Publication of CN109345620A
Application granted
Publication of CN109345620B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/005: Tree description, e.g. octree, quadtree
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an improved ICP (Iterative Closest Point) point cloud splicing method for an object to be measured that fuses the fast point feature histogram. A standard sinusoidal digital grating is projected onto the surface of the object to be measured, and a CCD camera shoots fringe images of the illuminated surface from different visual angles to obtain multi-view shot point clouds. For the two shot point clouds to be spliced, a k-d tree is constructed for each cloud and interpolation is performed to obtain interpolated point clouds. For the two interpolated point clouds, the fast point feature histogram is calculated and a random sampling consensus transformation is applied to obtain a coarsely registered point cloud. The improved iterative closest point procedure then yields the finely registered first interpolated point cloud. Finally the point clouds are superposed and divided into grids, realizing the splicing of the shot point clouds from the two visual angles. The method has a low requirement on the initial position of the point clouds to be spliced, obviously improves robustness, is not easily trapped in a local optimum, improves splicing precision, realizes accurate splicing of multi-view point clouds, and can meet actual industrial application requirements.

Description

Improved ICP (Iterative Closest Point) point cloud splicing method for an object to be measured, fused with a fast point feature histogram
Technical Field
The invention relates to a point cloud processing method and system, and in particular to an improved Iterative Closest Point (ICP) point cloud splicing method fusing the fast point feature histogram.
Background
Multi-view point cloud splicing is an important link in reconstructing the surface data of an object, and the technology has long been a research hotspot and difficulty in reverse engineering, computer vision, curved-surface quality detection, photogrammetry and other fields. The core of point cloud splicing is solving the coordinate transformation parameters R (rotation matrix) and T (translation matrix) so that the distance between the three-dimensional data measured at the two visual angles is minimized after coordinate transformation.
At present, point cloud registration methods mainly fall into two types. The first is automatic splicing based on manual assistance: either a motion positioning device is used to compute the rotation and translation of the point cloud and the shot point cloud is transformed correspondingly, with the drawback that registration accuracy is easily affected by the motion device; or marker points are pasted on the object for matching, which is accurate but troublesome to operate, leaves holes on the object surface, seriously reduces splicing efficiency, and damages the integrity of the object. For these reasons of both registration efficiency and accuracy, unassisted automatic splicing has received wide attention in industry and academia in recent years. Unassisted automatic splicing methods are mainly of three types: feature-based corresponding-point solution, statistics-based rigid-body transformation solution, and iterative closest point solution.
Feature-based corresponding-point methods usually use the point feature histogram to describe the geometric characteristics of the point cloud, such as normal vectors, density distribution, and the positional relations among neighborhood points, and sample and match based on sampling consensus; the algorithm is stable, but the splicing precision is low, and increasing the number of sample points markedly reduces registration efficiency. Statistics-based rigid transformation discretizes the transformation space and establishes a registration error evaluation function; a typical algorithm is the Normal Distributions Transform (NDT), which does not use corresponding-point features and runs fast but with low precision. The Iterative Closest Point (ICP) algorithm achieves automatic splicing based on the least-squares optimization idea, computing the registration matrix between the point clouds; its speed and accuracy are mainly determined by the given initial transformation estimate, and without a good initial relative position and pose of the two point clouds, ICP not only slows down markedly but also loses accuracy severely.
Since none of the three methods alone achieves effective and fast point cloud splicing, the invention integrates the feature-based corresponding-point search with the iterative closest point method: it adopts the fast point feature histogram (FPFH) and improves the iterative closest point method by removing erroneous point pairs using thresholds on the point cloud Euclidean distance and the direction-vector included angle, finally realizing adaptive splicing of the point clouds.
Disclosure of Invention
To solve the problems of low splicing precision, insufficient stability, and low splicing efficiency in the point cloud splicing process, the invention provides an improved Iterative Closest Point (ICP) point cloud splicing method fusing the fast point feature histogram.
As shown in fig. 2, the technical solution of the present invention includes the following steps:
step 1), projecting a standard sinusoidal digital grating onto the surface of the object to be measured, shooting fringe images of the illuminated surface from different visual angles with a CCD camera, solving each fringe image with the arctangent function to obtain phase values, unwrapping the phase values to obtain surface point clouds of the object, and preprocessing the surface point clouds into shot point clouds, thereby obtaining multi-view shot point clouds;
step 2), iterative interpolation of the shot point clouds: for the two shot point clouds to be spliced, constructing a k-d tree for each cloud, traversing it to build triangular meshes and interpolate, and iterating until the point cloud density requirement is met, obtaining the interpolated point clouds;
step 3), coarse registration stage: for the two interpolated point clouds to be spliced, calculating the fast point feature histogram of the first interpolated point cloud, finding points with similar fast point feature histograms in the second interpolated point cloud, and applying a random sampling consensus transformation to obtain the coarsely registered first interpolated point cloud;
step 4), fine registration stage: with the coarsely registered first point cloud roughly coinciding with the second interpolated point cloud, processing the coarsely registered first point cloud with the improved iterative closest point procedure to obtain the finely registered first point cloud;
specifically, neighboring points are searched, the Euclidean distances between corresponding points and the direction-vector included angles are compared with preset thresholds to judge the point-set correspondence and remove erroneous point pairs, and the optimal coordinate transformation of the point cloud is solved iteratively by least squares.
step 5), superposition stage: superposing the finely registered first point cloud and the second interpolated point cloud, simplifying them with a uniform grid, dividing the space into grids and replacing all points covered by each grid with their central point, thereby realizing the splicing of the shot point clouds from the two visual angles.
The method performs iterative interpolation on the input point clouds, takes the fast point feature histogram as the feature, and performs coarse registration through sampling consensus; on the basis of removing erroneous point pairs by comparing the combined Euclidean distance and direction normal-vector included angle with preset thresholds, it performs iterative-closest-point fine registration, registers the two point sets into the same three-dimensional coordinate system, and finishes the splicing by superposition and simplification.
In step 1), the preprocessing that produces a shot point cloud consists of applying, in order, background point cloud segmentation based on random sample consensus (fitting a plane model and extracting the background plane points), voxel-unit downsampling filtering to reduce the density, and statistical filtering that removes outlier noise points via the number of search neighbor points and the outlier threshold. This removes invalid background points and outlier noise points from the shot surface point cloud of the object to be measured and reduces its density:
1.2) for the surface point cloud of the object to be measured, the desktop background is treated as a planar primitive: with the plane shape and a plane error threshold preset, a random sampling method fits the desktop plane repeatedly to obtain different plane models, the number of points in each plane model is counted, the plane model containing the most points is selected as the optimal plane, and the points in the optimal plane are removed from the surface point cloud as invalid background points, which improves the efficiency and precision of the subsequent point cloud processing;
1.3) voxel downsampling is applied to the point cloud with the invalid background points removed: voxel units are generated according to the size of the object to be measured, and all points covered by each voxel unit are replaced by their barycenter, reducing the density of the point cloud while keeping the surface shape characteristics;
1.4) statistical filtering is applied to the voxel-downsampled point cloud: the number of search neighbor points and an outlier threshold are set, all points in the point cloud are traversed, and each point is processed as follows. A sphere is built with the point as center and the outlier threshold as radius, and the number of points inside the sphere is compared with the number of search neighbor points: if it is smaller, the point is regarded as an outlier noise point and removed; if it is greater than or equal, the point is not an outlier and is kept. The points finally retained form the shot point cloud.
The above steps are applied to the original point cloud: removing the background points eliminates the interference of the desktop point cloud; statistical filtering removes the outliers, improving the subsequent registration precision and defect detection effect; and voxel downsampling filtering reduces the point cloud density and improves the processing efficiency.
The outlier threshold is the allowable error caused by equipment precision and environmental factors.
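The statistical filtering of step 1.4) can be sketched as follows with numpy and scipy; `k_neighbors` and `radius` are illustrative stand-ins for the preset number of search neighbor points and the outlier threshold:

```python
import numpy as np
from scipy.spatial import cKDTree

def statistical_filter(points, k_neighbors=8, radius=0.01):
    """Keep only points that have at least k_neighbors other points
    inside the sphere of the given radius (the outlier threshold)."""
    points = np.asarray(points, dtype=float)
    tree = cKDTree(points)
    # number of points inside each sphere, the query point itself included
    counts = tree.query_ball_point(points, r=radius, return_length=True)
    return points[counts - 1 >= k_neighbors]
```

In practice the radius would be chosen from the equipment precision, as the text notes.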
The step 2) is specifically as follows:
2.1) taking points in the shot point cloud as nodes of a k-d tree, and constructing the k-d tree according to the shot point cloud;
2.2) for each point p_i, search the k-d tree for the two points q_i adjacent to p_i, build a triangular mesh from p_i and these two points, create an interpolation point at the center of the triangular mesh, and add the interpolation point as a node of the k-d tree;
2.3) repeat step 2.2) iteratively until the shot point cloud corresponding to the k-d tree meets the point cloud density requirement, obtaining the interpolated point cloud. This effectively compensates for the insufficient point cloud density caused by the limited resolution of the camera and markedly improves the point cloud splicing precision.
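A minimal sketch of this interpolation loop using scipy's `cKDTree`; `target_count` is an illustrative stand-in for the point cloud density requirement, and a real implementation would also deduplicate coincident interpolation points:

```python
import numpy as np
from scipy.spatial import cKDTree

def densify(points, target_count):
    """Iteratively add the centroid of each point and its two nearest
    neighbors (the triangular mesh of step 2.2) until the cloud holds
    at least target_count points."""
    pts = np.asarray(points, dtype=float)
    while len(pts) < target_count:
        tree = cKDTree(pts)
        # k=3 returns the point itself plus its two nearest neighbors,
        # i.e. the three vertices of the triangular mesh
        _, idx = tree.query(pts, k=3)
        centroids = pts[idx].mean(axis=1)
        pts = np.vstack([pts, centroids])
    return pts
```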
In step 1), under the Point Cloud Library configuration environment, the point cloud data processing library program interface is used to convert the point cloud data into the pcd file format for reading in. Because of the limited resolution of the camera, the points on the physical target are not all present in the point clouds: a given point on the target may appear in only one of the two shots, or in neither. The error introduced by the scanning position of the found corresponding point is therefore an important source of registration error; but as the point cloud density increases, the dislocation distance between corresponding points decreases obviously, the registration accuracy improves markedly, and the splicing accuracy requirement is met.
The step 3) is specifically as follows:
3.1) randomly select s points from the first interpolated point cloud P as sample points, with the distance between any two of the s sample points larger than a preset minimum threshold d_min;
3.2) calculate the Simplified Point Feature Histogram (SPFH) from the relationship between each sample point p and its neighboring points in the first interpolated point cloud P, then combine the SPFHs with distance weights to obtain the Fast Point Feature Histogram (FPFH) of each sample point;
3.3) randomly matching the points in the second point cloud after interpolation with the sample points:
3.3.1) for each sample point, find one point with a similar Fast Point Feature Histogram (FPFH) in the second interpolated point cloud as its matching point; all the sample points and their matching points form one group of corresponding points. Calculate the rigid transformation matrix between the sample points and their matching points in the group, transform the interpolated point cloud with it, calculate the distance differences between the transformed point cloud and the second interpolated point cloud, and obtain the measurement error by the following formula (reconstructed here as the standard Huber penalty of sampling-consensus initial alignment):

H(l_i) = (1/2) l_i^2,                  if ||l_i|| < m_l
H(l_i) = (1/2) m_l (2 ||l_i|| - m_l),  otherwise

where H(l_i) is the measurement error of the i-th group of corresponding points, ||l_i|| is the distance difference, m_l is the preset comparison threshold, and l_i is the distance between the point cloud result of the i-th group of corresponding points and the reference point cloud;
the distance difference is the distance between a point in the point cloud obtained by transforming the interpolated point cloud with the rigid transformation matrix and the nearest point in the second interpolated point cloud.
3.3.2) repeat step 3.3.1) several times, i.e., perform random matching several times to obtain several groups of corresponding points and their measurement errors, then compute the measurement error sum function

E = sum_{i=1}^{n} H(l_i)

where n is the total number of corresponding-point groups and i is the group ordinal number; the rigid transformation matrix of the group with the minimum measurement error sum is used to transform the interpolated point cloud, obtaining the coarsely registered point cloud.
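The measurement error above can be evaluated for a whole candidate transform in a few lines; this sketch assumes the Huber form with its conventional 1/2 factors, and `m_l` names the preset comparison threshold:

```python
import numpy as np

def sac_ia_error(distances, m_l):
    """Sum of Huber-style per-pair errors H(l_i): quadratic below the
    comparison threshold m_l, linear above it, so a few bad matches
    cannot dominate the score of a candidate rigid transform."""
    d = np.abs(np.asarray(distances, dtype=float))
    h = np.where(d < m_l, 0.5 * d ** 2, 0.5 * m_l * (2.0 * d - m_l))
    return h.sum()
```

The candidate transform with the smallest error sum over its corresponding-point distances is the one kept for coarse registration.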
When computing the FPFH features, the neighbor point set should not be too small, otherwise the feature value of the point cannot be estimated accurately and the coarse registration precision is too low; nor should it be too large, which would increase the computation and make the FPFH meaningless, since an oversized neighbor set cannot reflect local features.
In step 3), after the fast point feature histogram is obtained, the sampling consensus initial alignment algorithm (SAC-IA) is introduced in the initial registration stage, using the rotation invariance of the point cloud data; a large number of samples are drawn under the same correspondence, and SAC-IA obtains a transformation matrix for coarse matching. This improves the uniformity of sampling and makes the algorithm more stable.
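The real SPFH/FPFH descriptor uses three Darboux-frame angles and distance weighting; as a much-simplified illustration of the neighborhood-histogram idea only (not the patent's descriptor), one can histogram the angle between a point's normal and the directions to its k nearest neighbors:

```python
import numpy as np
from scipy.spatial import cKDTree

def simple_point_feature(points, normals, k=10, bins=11):
    """Toy per-point feature: a histogram of cosines between the point's
    (unit) normal and the unit directions toward its k nearest neighbors.
    Illustrates only the neighborhood-histogram idea behind SPFH/FPFH."""
    points = np.asarray(points, dtype=float)
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k + 1)      # k+1: nearest hit is the point itself
    feats = np.empty((len(points), bins))
    for i, nbrs in enumerate(idx[:, 1:]):     # skip self
        v = points[nbrs] - points[i]
        v /= np.linalg.norm(v, axis=1, keepdims=True)
        cos_t = np.clip(v @ normals[i], -1.0, 1.0)
        feats[i], _ = np.histogram(cos_t, bins=bins, range=(-1.0, 1.0))
    return feats
```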
The step 4) is specifically as follows:
4.1) take the coarsely registered first interpolated point cloud as the source point cloud and the second interpolated point cloud as the target point cloud, and process them by k-d tree search: the points in the source are the sample points; for each sample point, the point with the smallest Euclidean distance in the target point cloud is found as its matching point, and a sample point together with its matching point forms a pair of corresponding points. For each pair, the Euclidean distance and the direction-vector included angle are calculated, the included angle being the angle between the direction vectors of the sample point and its matching point. The Euclidean distance and the included angle are compared with the preset distance threshold and angle threshold respectively; a sample point whose Euclidean distance and included angle are both within the thresholds is kept as a normal point, the other pairs are removed as erroneous, and the normal points are moved toward their matching points by the iteration step;
therefore, the situation that corresponding points do not exist due to view angle shielding, camera noise and the like can be eliminated, the possibility that iteration cannot be carried out to the global optimum is reduced, and the registration precision is improved.
4.2) compute the mean square error e from the normal points to the target point cloud by the following formula, so as to avoid wasting hardware resources on invalid iterations and improve iteration efficiency:

e = (1/N) sum_{i=1}^{N} || q_i - (R p_i + T) ||^2

where q_i and p_i are the coordinates of the i-th pair of corresponding points in the target and source point clouds respectively, e is the mean square error of the target distance, i is the ordinal of the normal point, N is the total number of normal points, R is the rotation matrix, and T is the translation matrix;
the formula rotation matrix R and the translation matrix T are obtained by solving coordinate transformation by adopting a Singular Value Decomposition (SVD) method of the following formula, and the source point cloud is registered to the coordinate system of the target point cloud through the rotation matrix R and the translation matrix T:
Q=RP+T
wherein R represents the 3 × 3 rotation matrix, T represents the translation (a 3 × 1 column vector), P represents the point cloud before transformation, and Q represents the point cloud after transformation.
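The SVD solution of Q = RP + T is the standard Kabsch/Umeyama closed form; a minimal sketch for (N, 3) arrays of already-matched corresponding points:

```python
import numpy as np

def solve_rigid_transform(P, Q):
    """Least-squares R, T with Q ≈ R P + T, solved in closed form via SVD
    of the cross-covariance of the centered point sets."""
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = cq - R @ cp
    return R, T
```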
Based on the coarse registration, fine registration is performed using the iterative closest point method with erroneous point pairs removed. First, a point-to-point corresponding-point search is adopted using the k-d tree, which markedly improves the search efficiency of corresponding point pairs: the traditional ICP algorithm searches corresponding point pairs by brute force with complexity O(N^2), where N is the number of points in the cloud; the k-d tree method organizes the point cloud into a tree structure by coordinates, reducing the complexity to O(N log_2 N) and improving the search efficiency remarkably.
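The complexity difference is easy to see in code: both searches below return identical nearest-neighbor indices, but only the k-d tree version scales to large clouds (function names are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def nearest_bruteforce(src, dst):
    """O(N*M) search via the full pairwise squared-distance matrix."""
    d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
    return d2.argmin(axis=1)

def nearest_kdtree(src, dst):
    """O(N log M) search: build one k-d tree over dst, then query N times."""
    return cKDTree(dst).query(src)[1]
```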
In the neighbor search, thresholds are set for comparison: when the distance between two corresponding points is greater than the set threshold, the match is considered wrong and removed. For the corresponding point sets P' and Q' retained after the Euclidean distance judgment, the normal vectors of each corresponding pair are computed by neighborhood covariance analysis, the included angle alpha of the two vectors is obtained and compared with the preset threshold theta, and the pair is removed if alpha is larger than theta. Erroneous point pairs arise mainly because the shape of the object occludes itself, so that part of the point cloud appears in only one data set, and because of optical devices, environmental factors, and the like.
The thresholds must be chosen according to the point cloud density and the splicing precision requirement: if set too large, erroneous point pairs cannot be removed; if set too small, correct points are deleted by mistake. Introducing the threshold judgment of erroneous point pairs greatly reduces the requirement on the overlap of the registered point clouds and improves the registration precision.
Step 5) is specifically: superpose the finely registered first interpolated point cloud and the second interpolated point cloud, then apply voxel downsampling with the voxel unit as the grid, generating voxel units according to the size of the object to be measured. If the voxel unit is set too large, the point cloud becomes too sparse, feature points are lost, and the subsequent defect detection rate drops; if it is set too small, the point cloud is not filtered and its density is not reduced, leaving the density of the overlapped region too high and that of the occluded region insufficient. All points in each voxel unit are replaced by their centroid, which uniformizes the point cloud density and reduces it while keeping the surface shape characteristics.
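The voxel-centroid simplification can be sketched with numpy alone; `voxel_size` stands in for the edge length of the voxel unit:

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Replace all points falling into the same voxel by their centroid,
    uniformizing density while keeping the surface shape."""
    pts = np.asarray(points, dtype=float)
    keys = np.floor(pts / voxel_size).astype(np.int64)   # integer voxel index per point
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    n = inverse.max() + 1
    sums = np.zeros((n, pts.shape[1]))
    np.add.at(sums, inverse, pts)                        # per-voxel coordinate sums
    counts = np.bincount(inverse, minlength=n).astype(float)
    return sums / counts[:, None]
```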
The technical scheme of the invention has the beneficial effects that:
1) the method uses the fast point feature histogram as the matching feature; the coarse registration based on sampling consensus improves the stability of the algorithm, reduces the requirement on the initial position of the point clouds, and provides a better initial position and pose for iterative closest point registration;
2) on the basis of the traditional iterative closest point, the invention adopts k-d tree closest-point search, reducing the complexity of corresponding-point search from O(N^2) to O(N log_2 N) and improving the point cloud splicing efficiency;
3) on the basis of the traditional iterative closest point, the method judges and removes erroneous point pairs with combined Euclidean-distance and direction-vector thresholds, eliminating the erroneous pairs caused by noise points and shooting occlusion; this keeps the iteration from falling into a local optimum, improves the registration precision, and reduces the requirement on the overlap of the point clouds to be spliced.
In summary, the invention has low requirement on the initial position of the point cloud for splicing, obviously improves the robustness, is not easy to fall into local optimum, improves the splicing precision, realizes the accurate splicing of the point cloud under multiple visual angles, and can meet the actual industrial application requirement.
Drawings
FIG. 1 is a block diagram of a structured light detection system in accordance with an embodiment of the present invention;
FIG. 2 is a flow chart of the method of the present invention;
FIG. 3 is a diagram of relative initial positions of point clouds to be stitched;
FIG. 4 is a graph of the results of conventional nearest point iterative stitching;
FIG. 5 is a graph showing the result of the splicing process of the present invention.
Detailed Description
The invention is further illustrated by the following examples in conjunction with the drawings.
As shown in fig. 2, an embodiment of the method and its complete implementation process comprise the following steps:
Specifically, a structured light detection system is adopted. As shown in fig. 1, the system comprises a projector, a computer, a CCD camera, and a platform; the object to be measured is placed on the platform, the projector is connected to the computer, the projector and the camera are placed on the two sides above the object, and the lenses of both face the object. The object rests on the desktop; the computer sends the input grating-pattern signal to the projector, which generates a fringe grating used as the light source illuminating the object and the desktop, with the projection direction not perpendicular to the object; the CCD camera collects images of the fringe grating on the object and the desktop. The projector projects the fringe grating onto the object, the object distorts the grating, the camera shoots the distorted fringes, and the computer resolves the phase and reconstructs the point cloud of the object surface.
First step: interpolation iteration of the input point cloud.
Under the Point Cloud Library configuration environment, the point cloud data processing library program interface converts the point cloud data into the pcd file format, and the input point cloud receives appropriate preprocessing such as background point segmentation and outlier filtering. The data structure of the input point set is analyzed, a k-d tree is constructed, and an M × K lookup table matrix is built storing the index values of the K neighbor points of each point found by the k-d tree search (M being the number of points in the cloud); each point p_i and its neighbor points q_i are traversed and searched, interpolation points are established, triangular meshes are constructed and their center interpolation computed, and the process iterates until the point cloud density meets the requirement.
Second step: coarse registration stage.
Point clouds of the surface of the object to be measured are shot from different viewing angles and stitched pairwise.
One point cloud is selected as the source and the other as the target. For each query point p of the source point cloud, a simplified point feature histogram (SPFH) is computed, estimating the relationships among the three-dimensional coordinates, normal vectors and surrounding points of the feature point; the fast point feature histogram (FPFH) is then computed from the SPFHs of the neighboring points using distance weights. Exploiting the rotation invariance of the point cloud data, the sampling consensus initial alignment algorithm (SAC-IA) is introduced in the initial registration stage: a large number of samplings are performed over the same correspondence relation and ranked by the following steps: 1) select s sample points from the point cloud P to be registered, ensuring that the pairwise distance between the s sample points is greater than the preset minimum value of 0.005 m; 2) for each sample point, find the points in the target point cloud Q whose histograms are similar to that of the sample point, store them in a list, and randomly select one of them as the correspondence of the sample point; 3) compute the rigid transformation defined by the sample points and their correspondences, and compute the metric error of the transformed point cloud to evaluate the registration quality. Here the metric error and its sum function are calculated as
E = Σ_{i=1}^{n} H(l_i)
Wherein:
H(l_i) = { (1/2) l_i^2,                      ||l_i|| < m_l
         { (1/2) m_l (2||l_i|| − m_l),       ||l_i|| ≥ m_l
in the formula: m_l is a preset comparison value, and l_i is the distance between the point cloud result obtained for the i-th group of corresponding points and the standard point cloud.
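The piecewise metric error above is a Huber-style penalty used by SAC-IA to score candidate transforms; a small numpy sketch (function names are illustrative):

```python
import numpy as np

def huber_penalty(l, m_l=0.005):
    """Piecewise SAC-IA penalty H(l_i): quadratic below the preset
    comparison value m_l, linear above it (robust to bad correspondences)."""
    l = np.abs(np.asarray(l, dtype=float))
    return np.where(l < m_l, 0.5 * l ** 2, 0.5 * m_l * (2.0 * l - m_l))

def sac_ia_error(distances, m_l=0.005):
    """Summed penalty over all correspondences; the candidate rigid
    transform with the smallest value is kept."""
    return float(huber_penalty(distances, m_l).sum())
```

Because the linear branch grows slowly, a few grossly wrong correspondences cannot dominate the score, which is why the minimum-error transform is a reliable coarse alignment.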
Third step: fine registration. After the coarse registration based on the fast point feature histogram, the two point clouds to be registered are roughly aligned, but deviation still exists and the registration precision is low.
First, a k-d tree neighbor search is introduced, reducing the complexity of the search algorithm from O(N^2) to O(N log_2 N), which noticeably accelerates the ICP algorithm and shortens the search time. Thresholds for erroneous point pairs are set: based on the shooting resolution and actual shooting effect of the experimental camera, the Euclidean distance threshold is set to 0.003 m and the normal vector angle threshold to 0.05, which effectively retains correct corresponding points while rejecting erroneous point pairs. An iteration evaluation criterion is introduced to judge the precision of the fine registration and simultaneously serve as the iteration stopping condition. The mean square error is taken as the registration evaluation index: for each pair of corresponding points p_i and q_i, the distance is compared with the preset threshold; if it is greater than the threshold, the pair is not counted in the registration evaluation; if it is less than the threshold, the corresponding point pair count is incremented by one and the error is accumulated; the registration point set is traversed and the average value is computed.
The closest point iteration termination conditions are set as: maximum number of iterations 100, difference between two successive transformation matrices 1e-10, and mean square error 0.2; the iteration stops when any one of these criteria is reached.
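One matching-and-filtering step of the improved ICP described above might be sketched as follows with a k-d tree; only the Euclidean distance rejection is shown here, the normal-vector angle test is omitted for brevity:

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_correspondences(source, target, dist_thresh=0.003):
    """Nearest-neighbour matching via a k-d tree, then rejection of pairs
    whose Euclidean distance exceeds the threshold (0.003 m in the text)."""
    tree = cKDTree(target)
    d, idx = tree.query(source)            # closest target point per source point
    keep = d < dist_thresh                 # drop erroneous point pairs
    return source[keep], target[idx[keep]]
```

The surviving pairs feed the rigid-transform estimate of each iteration; the k-d tree is what brings the per-query cost down from linear scan to logarithmic search.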
The coordinate transformation is solved by the singular value decomposition (SVD) method, and the two point sets to be stitched are registered into the same three-dimensional coordinate system.
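The SVD solution of the rigid transform can be sketched as the classical Kabsch procedure, returning R and T such that each target point is approximated by R·p + T (the function name is ours):

```python
import numpy as np

def best_fit_transform(P, Q):
    """Rotation R and translation T minimising sum ||q_i - (R p_i + T)||^2,
    via SVD of the 3x3 cross-covariance matrix (Kabsch procedure)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)              # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against a reflection
        Vt[-1] *= -1.0
        R = Vt.T @ U.T
    T = cq - R @ cp
    return R, T
```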
Fourth step: point cloud data superposition and simplification stage. After fine registration, the two point sets are superposed and voxel downsampling is applied again, with the voxel cell side length set to 0.001 m; the centroid of all points within each voxel cell represents all the points in that grid cell, which ensures the uniformity of the point cloud: the point density in the overlapping region does not grow exponentially because of the superposed point cloud data, while the retention of feature points remains high and most features are preserved.
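The centroid-per-voxel simplification can be sketched in a few lines of numpy, with the 0.001 m cell size following the text:

```python
import numpy as np

def voxel_downsample(points, voxel=0.001):
    """Replace all points falling in the same voxel cell by their centroid
    (voxel side length 0.001 m as in the text)."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.reshape(-1)                  # one cell index per input point
    sums = np.zeros((inv.max() + 1, points.shape[1]))
    counts = np.zeros(inv.max() + 1)
    np.add.at(sums, inv, points)           # accumulate per-voxel coordinate sums
    np.add.at(counts, inv, 1.0)
    return sums / counts[:, None]
```

Because each cell emits exactly one point, the output density is bounded by the grid regardless of how many overlapping points fell into a cell.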
Experimental verification:
The pagoda point cloud is used as an example for verification. The point set is subjected to a rotation-translation transformation of arbitrary angle, and the improved iterative closest point adaptive point cloud stitching method fusing the fast point feature histogram is applied; the effect is shown in fig. 2. The mean square error of the registration result is 4e-7 and the registration time is 4.359 s; both registration precision and efficiency are higher than those of the traditional iterative closest point algorithm.
The complete pagoda point cloud (38504 points) is randomly cut to obtain two different point sets with an overlapping region, containing 23263 and 33706 points respectively. The two incomplete pagoda point clouds with mutually offset initial positions are shown in fig. 3. The improved iterative closest point stitching method fusing fast point feature histograms is applied to stitch them; the point cloud effect comparison is shown in figs. 4 and 5, and the stitching result comparison is shown in table 1.
TABLE 1
[Table 1, comparing the stitching results, is reproduced as an image in the original document.]
As can be seen from the table above, the stitching method of the present invention is significantly superior to the traditional ICP stitching method in both stitching efficiency and precision; meanwhile, the present invention places lower requirements on the overlap ratio of the point clouds to be stitched from different viewing angles, and the algorithm is more stable.
Although the present invention has been described with reference to the preferred embodiments, it is not intended to be limited thereto. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention should be determined by the appended claims.

Claims (4)

1. An improved iterative closest point (ICP) object point cloud stitching method fusing a fast point feature histogram, characterized by comprising the following steps:
step 1), projecting a standard sinusoidal digital grating onto the surface of an object to be measured, shooting fringe images of the object surface onto which the grating is projected from different viewing angles with a CCD camera, solving each fringe image with the arctangent function to obtain phase values, unwrapping the obtained phase values to obtain surface point clouds of the object to be measured, and then preprocessing the surface point clouds to obtain a plurality of shot point clouds, thereby obtaining multi-view shot point clouds;
step 2), shooting point cloud iterative interpolation: aiming at two shooting point clouds needing to be spliced, constructing a k-d tree by each shooting point cloud, traversing and constructing a triangular grid and interpolating, iterating for multiple times until the density requirement of the point clouds is met, and obtaining the point clouds after interpolation;
step 3), coarse registration stage: aiming at two point clouds after interpolation needing to be spliced, calculating a fast point feature histogram of the point cloud after the first interpolation, finding out points similar to the fast point feature histogram in the point cloud after the second interpolation, and carrying out random sampling consistent transformation to obtain a point cloud result after the first point cloud after interpolation is subjected to rough registration;
step 4), fine registration: processing the point cloud after the first interpolation and subjected to coarse registration by adopting an improved iteration closest point mode to obtain a point cloud after the first interpolation and subjected to fine registration;
step 5), superposition stage: superposing the finely registered first interpolated point cloud and the second interpolated point cloud, carrying out grid division, computing the center point of all points within each grid cell to replace all the points covered by that cell, and realizing the stitching of the shot point clouds from the two different viewing angles;
in the step 1), the shot point cloud is obtained by preprocessing, and the method specifically comprises the following steps:
1.2) for the surface point cloud of the object to be measured, taking the desktop background as a primitive whose plane shape is preset, fitting the plane of the desktop background by a random sampling method, performing multiple trials to obtain different plane models, counting the number of points in each plane model, selecting the plane model with the largest number of points as the optimal plane, and removing the points in the optimal plane from the surface point cloud of the object to be measured as invalid background points;
1.3) carrying out voxel down-sampling treatment on the point cloud without the invalid background points, and replacing all points in the covered voxel units with the barycenter of all points of each voxel unit in the voxel down-sampling treatment;
1.4) carrying out statistical filtering on the point cloud after voxel downsampling: setting the number of search neighbor points and an outlier threshold, traversing all points in the point cloud, and processing each point as follows: establishing a sphere with the point as the center and the outlier threshold as the radius, and comparing the number of points inside the sphere with the set number of search neighbor points: if it is less than the number of search neighbor points, the point is regarded as an outlier noise point and removed; if it is greater than or equal to the number of search neighbor points, the point is not regarded as an outlier noise point and is retained; the retained points finally form the shot point cloud;
the step 4) is specifically as follows:
4.1) taking the coarsely registered first interpolated point cloud as the source point cloud and the second interpolated point cloud as the target point cloud, and processing them by k-d tree search: the points in the source point cloud are taken as sample points; for each sample point, the point with the smallest Euclidean distance to it is found in the target point cloud as its matching point, and a sample point together with its matching point forms a pair of corresponding points; the Euclidean distance and the direction vector angle of each pair of corresponding points are computed, where the direction vector angle refers to the angle between the direction vectors of the sample point and its matching point; the Euclidean distance and the direction vector angle are compared with a preset distance threshold and angle threshold respectively; pairs whose Euclidean distance exceeds the preset distance threshold or whose direction vector angle exceeds the preset angle threshold are rejected as erroneous point pairs, the remaining sample points are taken as normal points, and each normal point is moved one step toward its matching point;
4.2) calculating and obtaining the mean square error e from the normal point to the target point cloud by adopting the following formula;
e = (1/N) Σ_{i=1}^{N} || q_i − (R p_i + T) ||^2
wherein p_i and q_i respectively represent the coordinates of a pair of corresponding points in the source point cloud and the target point cloud, e represents the mean square error of the target distance, i represents the ordinal number of the normal points, N represents the total number of the normal points, R represents the rotation matrix, and T represents the translation matrix;
the rotation matrix R and the translation matrix T in the above formula are obtained by solving the coordinate transformation with the singular value decomposition (SVD) method applied to the following formula:
Q=RP+T
wherein R represents a rotation matrix, T represents a translation matrix, P represents the point cloud before transformation, and Q represents the point cloud after transformation.
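The preprocessing of steps 1.2) and 1.4) above, random-sampling plane segmentation of the desktop background and sphere-neighborhood outlier removal, might be sketched as follows; the iteration count and thresholds here are illustrative, not the patent's values:

```python
import numpy as np
from scipy.spatial import cKDTree

def ransac_plane(points, n_iter=200, dist_thresh=0.005, seed=0):
    """Fit the dominant plane (the desktop background) by random sampling:
    the plane model with the most inliers wins; returns the inlier mask."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-12:                   # degenerate (collinear) sample
            continue
        inliers = np.abs((points - p0) @ (n / norm)) < dist_thresh
        if inliers.sum() > best.sum():
            best = inliers
    return best

def radius_outlier_removal(points, radius, min_neighbors):
    """Drop points with fewer than min_neighbors other points inside a
    sphere of the given radius (the statistical filtering of step 1.4)."""
    tree = cKDTree(points)
    counts = np.array([len(tree.query_ball_point(p, radius)) - 1  # minus self
                       for p in points])
    return points[counts >= min_neighbors]
```

Removing the points flagged by `ransac_plane` and then filtering with `radius_outlier_removal` reproduces the segmentation-then-filtering order of step 1).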
2. The improved iterative closest point (ICP) object point cloud stitching method fusing a fast point feature histogram according to claim 1, characterized in that the step 2) is specifically as follows:
2.1) taking points in the shot point cloud as nodes of a k-d tree, and constructing the k-d tree according to the shot point cloud;
2.2) for each point p_i found through the k-d tree search, establishing a triangular mesh from the point p_i and its two neighboring points q_i, creating an interpolation point at the center of the triangular mesh by interpolation, and adding the interpolation point as a node of the k-d tree;
and 2.3) repeating the step 2.2) and continuously iterating until the shot point cloud corresponding to the k-d tree meets the point cloud density requirement, and obtaining the point cloud after interpolation.
3. The improved iterative closest point (ICP) object point cloud stitching method fusing a fast point feature histogram according to claim 1, characterized in that the step 3) is specifically as follows:
3.1) randomly selecting s points from the first interpolated point cloud P as sample points, while ensuring that the distance between any two of the s sample points is greater than a preset minimum threshold d_min;
3.2) calculating the fast point feature histogram (FPFH) of each sample point p from the relationship between p and its neighboring points in the first interpolated point cloud P;
3.3) randomly matching the points in the second point cloud after interpolation with the sample points:
3.3.1) for each sample point, finding one point whose fast point feature histogram (FPFH) is similar to it in the second interpolated point cloud as its matching point; all the sample points together with their corresponding matching points form a group of corresponding points; calculating the rigid body transformation matrix between all the sample points in this group and their matching points, calculating the distance difference between the point cloud result obtained by transforming the first interpolated point cloud with this rigid body transformation matrix and the second interpolated point cloud, and then calculating the measurement error by the following formula:
H(l_i) = { (1/2) l_i^2,                      ||l_i|| < m_l
         { (1/2) m_l (2||l_i|| − m_l),       ||l_i|| ≥ m_l
in the formula, H(l_i) represents the measurement error of the i-th group of corresponding points, ||l_i|| represents the distance difference, m_l is the preset comparison threshold, and l_i is the distance between the point cloud result obtained for the i-th group of corresponding points and the standard point cloud;
3.3.2) repeating step 3.3.1) several times to obtain multiple groups of corresponding points and their measurement errors, and then calculating the measurement error sum function
E = Σ_{i=1}^{n} H(l_i)
wherein n represents the total number of corresponding point groups and i represents the group ordinal number; the rigid transformation matrix corresponding to the group with the minimum measurement error sum is taken to transform the first interpolated point cloud, yielding the coarsely registered first interpolated point cloud.
4. The improved iterative closest point (ICP) object point cloud stitching method fusing a fast point feature histogram according to claim 1, characterized in that the step 5) is specifically as follows: superposing the finely registered first interpolated point cloud and the second interpolated point cloud, performing voxel downsampling after superposition with a voxel cell as the grid unit, and in the voxel downsampling replacing all points covered by each voxel cell with the centroid of all points of that cell.
CN201810917197.3A 2018-08-13 2018-08-13 Improved object point cloud splicing method for ICP (inductively coupled plasma) to-be-measured object by fusing fast point feature histogram Active CN109345620B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810917197.3A CN109345620B (en) 2018-08-13 2018-08-13 Improved object point cloud splicing method for ICP (inductively coupled plasma) to-be-measured object by fusing fast point feature histogram


Publications (2)

Publication Number Publication Date
CN109345620A CN109345620A (en) 2019-02-15
CN109345620B true CN109345620B (en) 2022-06-24

Family

ID=65296878

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810917197.3A Active CN109345620B (en) 2018-08-13 2018-08-13 Improved object point cloud splicing method for ICP (inductively coupled plasma) to-be-measured object by fusing fast point feature histogram

Country Status (1)

Country Link
CN (1) CN109345620B (en)

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110097598B (en) * 2019-04-11 2021-09-07 暨南大学 Three-dimensional object pose estimation method based on PVFH (geometric spatial gradient frequency) features
CN109919984A (en) * 2019-04-15 2019-06-21 武汉惟景三维科技有限公司 A kind of point cloud autoegistration method based on local feature description's
CN110033409B (en) * 2019-04-18 2021-04-23 中国科学技术大学 Iteration closest point rigid registration method and system
CN110097581B (en) * 2019-04-28 2021-02-19 西安交通大学 Method for constructing K-D tree based on point cloud registration ICP algorithm
CN110111374B (en) * 2019-04-29 2020-11-17 上海电机学院 Laser point cloud matching method based on grouped stepped threshold judgment
CN110160528B (en) * 2019-05-30 2021-06-11 华中科技大学 Mobile device pose positioning method based on angle feature recognition
CN110363800B (en) * 2019-06-19 2021-08-13 西安交通大学 Accurate rigid body registration method based on fusion of point set data and characteristic information
CN110363801B (en) * 2019-07-04 2023-04-18 陕西丝路机器人智能制造研究院有限公司 Method for matching corresponding points of workpiece real object and three-dimensional CAD (computer-aided design) model of workpiece
CN111009002B (en) * 2019-10-16 2020-11-06 贝壳找房(北京)科技有限公司 Point cloud registration detection method and device, electronic equipment and storage medium
CN110766733B (en) * 2019-10-28 2022-08-12 广东三维家信息科技有限公司 Single-space point cloud registration method and device
CN111178138B (en) * 2019-12-04 2021-01-12 国电南瑞科技股份有限公司 Distribution network wire operating point detection method and device based on laser point cloud and binocular vision
CN110749895B (en) * 2019-12-23 2020-05-05 广州赛特智能科技有限公司 Laser radar point cloud data-based positioning method
CN111275810B (en) * 2020-01-17 2022-06-24 五邑大学 K nearest neighbor point cloud filtering method and device based on image processing and storage medium
CN111353985B (en) * 2020-03-02 2022-05-03 电子科技大学 Airport self-service consignment luggage detection method based on depth camera
CN111429492B (en) * 2020-03-20 2020-12-15 南京航空航天大学 Airplane C-shaped beam registration method based on local non-deformation
CN111461981B (en) * 2020-03-30 2023-09-01 北京百度网讯科技有限公司 Error estimation method and device for point cloud stitching algorithm
CN111369607B (en) * 2020-05-26 2020-09-04 上海建工集团股份有限公司 Prefabricated part assembling and matching method based on picture analysis
CN111738945B (en) * 2020-06-15 2023-11-10 鞍钢集团矿业有限公司 Point cloud data preprocessing method based on mine
CN111563861B (en) * 2020-07-14 2020-11-20 武汉数字化设计与制造创新中心有限公司 Workpiece allowance fast solving method based on three-dimensional measurement point cloud data
CN112037178A (en) * 2020-08-10 2020-12-04 泉州市澳莱格电子有限责任公司 Cylinder two-dimensional image generation method based on multi-view camera
CN112019747B (en) * 2020-09-01 2022-06-17 北京德火科技有限责任公司 Foreground tracking method based on holder sensor
CN112115954B (en) * 2020-09-30 2022-03-29 广州云从人工智能技术有限公司 Feature extraction method and device, machine readable medium and equipment
CN112013792B (en) * 2020-10-19 2021-02-02 南京知谱光电科技有限公司 Surface scanning three-dimensional reconstruction method for complex large-component robot
CN112529945B (en) * 2020-11-17 2023-02-21 西安电子科技大学 Multi-view three-dimensional ISAR scattering point set registration method
CN112446114B (en) * 2020-12-08 2023-09-05 国网江苏省电力工程咨询有限公司 Three-dimensional model comparison-based power transmission line engineering construction progress monitoring method
CN112232319A (en) * 2020-12-14 2021-01-15 成都飞机工业(集团)有限责任公司 Scanning splicing method based on monocular vision positioning
CN112581457B (en) * 2020-12-23 2023-12-12 武汉理工大学 Pipeline inner surface detection method and device based on three-dimensional point cloud
CN113706587B (en) * 2021-07-14 2022-12-09 西安交通大学 Rapid point cloud registration method, device and equipment based on space grid division
CN113658194B (en) * 2021-07-23 2024-06-07 佛山缔乐视觉科技有限公司 Point cloud splicing method and device based on reference object and storage medium
CN113587816A (en) * 2021-08-04 2021-11-02 天津微深联创科技有限公司 Array type large-scene structured light three-dimensional scanning measurement method and device
CN113674425B (en) * 2021-10-25 2022-02-15 深圳市信润富联数字科技有限公司 Point cloud sampling method, device, equipment and computer readable storage medium
CN114049256B (en) * 2021-11-09 2024-05-14 苏州中科全象智能科技有限公司 Uniform downsampling method based on online splice point cloud
CN113945217B (en) * 2021-12-15 2022-04-12 天津云圣智能科技有限责任公司 Air route planning method, device, server and computer readable storage medium
CN114387319B (en) * 2022-01-13 2023-11-14 北京百度网讯科技有限公司 Point cloud registration method, device, equipment and storage medium
CN114459378A (en) * 2022-02-16 2022-05-10 河南城建学院 Tunnel engineering three-dimensional laser scanning sectional measurement method and measurement system
CN115423854B (en) * 2022-08-31 2024-02-06 哈尔滨岛田大鹏工业股份有限公司 Multi-view point cloud registration and point cloud fusion method based on multi-scale feature extraction
CN116663408B (en) * 2023-05-30 2023-12-22 昆明理工大学 Establishment method of optimal digging pose of pseudo-ginseng
CN116563561B (en) * 2023-07-06 2023-11-14 北京优脑银河科技有限公司 Point cloud feature extraction method, point cloud registration method and readable storage medium
CN117456612B (en) * 2023-12-26 2024-03-12 西安龙南铭科技有限公司 Cloud computing-based body posture automatic assessment method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104299260A (en) * 2014-09-10 2015-01-21 西南交通大学 Contact network three-dimensional reconstruction method based on SIFT and LBP point cloud registration
CN105976312A (en) * 2016-05-30 2016-09-28 北京建筑大学 Point cloud automatic registering method based on point characteristic histogram
CN106296693A (en) * 2016-08-12 2017-01-04 浙江工业大学 Based on 3D point cloud FPFH feature real-time three-dimensional space-location method
WO2017096299A1 (en) * 2015-12-04 2017-06-08 Autodesk, Inc. Keypoint-based point-pair-feature for scalable automatic global registration of large rgb-d scans


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Improved ICP algorithm for multi-view point cloud stitching"; Chen Jinguang et al.; Computer Systems & Applications; 2017-12-22; pp. 180-184 *

Also Published As

Publication number Publication date
CN109345620A (en) 2019-02-15

Similar Documents

Publication Publication Date Title
CN109345620B (en) Improved object point cloud splicing method for ICP (inductively coupled plasma) to-be-measured object by fusing fast point feature histogram
CN111815757B (en) Large member three-dimensional reconstruction method based on image sequence
EP3382644B1 (en) Method for 3d modelling based on structure from motion processing of sparse 2d images
CN109816703B (en) Point cloud registration method based on camera calibration and ICP algorithm
US8447099B2 (en) Forming 3D models using two images
US8452081B2 (en) Forming 3D models using multiple images
US8218858B2 (en) Enhanced object reconstruction
Micusik et al. Descriptor free visual indoor localization with line segments
CN111524168B (en) Point cloud data registration method, system and device and computer storage medium
CN107610219B (en) Pixel-level point cloud densification method for sensing geometric clues in three-dimensional scene reconstruction
CN113393524B (en) Target pose estimation method combining deep learning and contour point cloud reconstruction
O'Byrne et al. A stereo‐matching technique for recovering 3D information from underwater inspection imagery
Gadasin et al. Reconstruction of a Three-Dimensional Scene from its Projections in Computer Vision Systems
CN114612412A (en) Processing method of three-dimensional point cloud data, application of processing method, electronic device and storage medium
CN116935013B (en) Circuit board point cloud large-scale splicing method and system based on three-dimensional reconstruction
CN113838069A (en) Point cloud segmentation method and system based on flatness constraint
Loaiza et al. Matching segments in stereoscopic vision
CN116843829A (en) Concrete structure crack three-dimensional reconstruction and length quantization method based on binocular video
Tseng et al. Computing location and orientation of polyhedral surfaces using a laser-based vision system
Hlubik et al. Advanced point cloud estimation based on multiple view geometry
Kang et al. 3D urban reconstruction from wide area aerial surveillance video
CN117541537B (en) Space-time difference detection method and system based on all-scenic-spot cloud fusion technology
CN117115242B (en) Identification method of mark point, computer storage medium and terminal equipment
Kabbour et al. Human ear surface reconstruction through morphable model deformation
Calderón et al. An approach for estimating the fundamental matrix

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant