CN112132875A - Multi-platform point cloud matching method based on surface features


Info

Publication number
CN112132875A
CN112132875A
Authority
CN
China
Prior art keywords
point cloud
plane
parameters
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010891648.8A
Other languages
Chinese (zh)
Other versions
CN112132875B
Inventor
刘如飞
卢秀山
刘以旭
马新江
柴永宁
Current Assignee
Qingdao Xiushan Mobile Surveying Co ltd
Original Assignee
Qingdao Xiushan Mobile Surveying Co ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Xiushan Mobile Surveying Co ltd filed Critical Qingdao Xiushan Mobile Surveying Co ltd
Priority to CN202010891648.8A
Publication of CN112132875A
Application granted
Publication of CN112132875B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T 7/344 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a multi-platform point cloud matching method based on surface features, belonging to the technical field of mobile-measurement point cloud matching. The method is implemented as follows: acquire point cloud data of the target object, denoise it, and convert the preprocessed original point cloud coordinate system into a local coordinate system that takes the center of gravity as the coordinate origin and keeps the coordinate axis directions unchanged; uniformly select three or more pairs of matching planes from the acquired point cloud data, perform plane fitting with a robust RANSAC algorithm to extract the parameters (A, B, C, D) of the corresponding common planes, and evaluate the self-flatness of each plane point cloud; based on the plane parameters of the three or more pairs of corresponding planes, express the rotation parameters as a nonlinear model, linearize it while keeping the translation parameters unchanged, and solve the parameters by total least squares; evaluate the correlation of the planes to be registered from the corrections of the adjustment result, and reselect any planes that do not meet the requirement; finally, complete the point cloud matching with the coordinate conversion model.

Description

Multi-platform point cloud matching method based on surface features
Technical Field
The invention discloses a multi-platform point cloud matching method based on surface features, and belongs to the technical field of mobile measurement point cloud matching.
Background
At present, vehicle-mounted, airborne and single-station LiDAR scanners are affected by the attitude of the carrying platform, the precision of control points, GPS precision, inertial-navigation precision and the like, so small deviations exist between the coordinate systems of the collected multi-source data. Each acquisition platform has its own application conditions, and in actual production the point cloud data of multiple platforms must be matched and fused to obtain multi-dimensional, multi-spatiotemporal point cloud data of the target ground objects. To make full use of the information in the multi-source data, the coordinate systems of the multi-source point clouds need to be matched and the clouds registered according to homonymous features (points, lines and surfaces). The precision of this registration determines the precision of all subsequent data processing, which makes it a key step.
In these measurement modes, two scans of the same surface may share no common points, so a model-based or feature-based coordinate transformation method must be studied. The plane is a regular model that is widespread in nature; if several planes can be fitted from the point cloud data and the coordinate conversion performed with common planes, the prior-art drawback of having to search for common points in massive point cloud data is overcome.
Disclosure of Invention
The invention discloses a multi-platform point cloud matching method based on surface features, which aims to solve the problem that, in the prior art, common points must be searched for in massive point cloud data.
A multi-platform point cloud matching method based on surface features comprises the following steps:
S1, scanning the ground objects of a target area with data acquisition equipment mounted on different platforms, and acquiring and preprocessing the original point cloud data of the different platforms;
S2, uniformly selecting three or more pairs of non-coplanar matching planes from the preprocessed point cloud data by man-machine interaction, performing plane fitting with a robust RANSAC algorithm, extracting the parameters of the corresponding common planes, and performing self-flatness evaluation on the plane point clouds;
S3, based on the plane parameters of the three or more pairs of corresponding planes, expressing the rotation parameters as a nonlinear model and linearizing it while keeping the translation parameters unchanged, and solving the parameters by total least squares (integral least squares); evaluating the correlation of the planes to be registered from the corrections of the adjustment result, and reselecting any planes that do not meet the requirement;
S4, completing the point cloud matching with the coordinate conversion model.
In step S1, during the data processing stage, the original point cloud data of the different platforms are filtered and denoised, and the preprocessed point cloud coordinate system is converted into a local coordinate system that takes the center of gravity as the coordinate origin and keeps the coordinate axis directions unchanged; the local coordinate origin is calculated as:
[x0, y0, z0] = (1/m) * sum_{i=1..m} [xi, yi, zi]
where [x0, y0, z0] denotes the origin of the local coordinate system, [xi, yi, zi] denotes the coordinates of the i-th point, and m denotes the number of points; the transformed point cloud coordinates are calculated as: [x', y', z'] = [xi, yi, zi] - [x0, y0, z0], where [x', y', z'] denotes the transformed point cloud coordinates.
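The step-S1 centering can be sketched in a few lines; a minimal illustration in which the function name and the NumPy usage are mine, not the patent's:

```python
import numpy as np

def to_local_frame(points):
    """Shift a point cloud so its centroid (center of gravity) becomes the
    origin; axis directions are left unchanged, as in step S1."""
    points = np.asarray(points, dtype=float)
    origin = points.mean(axis=0)   # [x0, y0, z0] = (1/m) * sum of the m points
    return points - origin, origin
```

The inverse shift of step S4.2 is just adding `origin` back after the rotation and translation have been applied.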
Step S2 includes the following sub-steps:
S2.1, setting the relevant parameters: number of iterations n, valid-point ratio threshold p, and distance threshold t;
S2.2, randomly selecting 3 non-collinear points from the point cloud set and computing the corresponding plane equation: a1*x + b1*y + c1*z = d1, where (a1, b1, c1) is the unit normal vector of the plane and d1 is the distance from the origin to the plane;
S2.3, computing the distance from every observed point to the plane: di = |a1*xi + b1*yi + c1*zi - d1|; if di <= t, the point is counted as a valid point (inlier), otherwise it is an invalid point;
S2.4, computing the ratio f of valid points to total points for the candidate plane M1; if f >= p, M1 is accepted as a valid plane, otherwise it is rejected as invalid; this completes the self-flatness evaluation of the plane point cloud;
S2.5, repeating the sampling n times according to the set number of iterations and selecting the plane with the highest valid-point ratio as the final fitted plane;
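Steps S2.1 to S2.5 amount to a standard RANSAC loop. A minimal sketch, in which the parameters `n_iter`, `t`, `p` mirror n, t, p above and everything else (function name, return convention) is an assumed illustration:

```python
import numpy as np

def ransac_plane(points, n_iter=200, t=0.05, p=0.8, rng=None):
    """Minimal RANSAC plane fit following S2.1-S2.5: sample 3 points, build a
    plane, count points within distance t, keep the plane with the best inlier
    ratio. Returns (unit normal n, offset d, ratio f) with n.x = d, or None if
    the best plane fails the valid-point ratio threshold p."""
    rng = np.random.default_rng(rng)
    pts = np.asarray(points, dtype=float)
    best = (None, None, 0.0)
    for _ in range(n_iter):
        sample = pts[rng.choice(len(pts), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-12:                 # the 3 points were (nearly) collinear
            continue
        n = n / norm
        d = n @ sample[0]
        dist = np.abs(pts @ n - d)       # point-to-plane distances
        f = np.mean(dist <= t)           # valid-point (inlier) ratio
        if f > best[2]:
            best = (n, d, f)
    n, d, f = best
    if n is None or f < p:               # self-flatness check of S2.4 fails
        return None
    return n, d, f
```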
S2.6, refining the plane obtained in S2.5 with the eigenvalue method to obtain the plane parameter equation: Ax + By + Cz + D = 0, where (A, B, C, D) are the plane parameters;
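The eigenvalue method of S2.6 is commonly realized as a PCA-style fit: the plane normal is the eigenvector of the 3x3 covariance matrix of the points with the smallest eigenvalue. A sketch under that reading (the function name and return convention are mine):

```python
import numpy as np

def eigen_plane(points):
    """Plane fit by the eigenvalue method: the normal is the eigenvector of
    the point covariance matrix belonging to the smallest eigenvalue.
    Returns (A, B, C, D) with A*x + B*y + C*z + D = 0 and a unit normal."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    cov = np.cov((pts - centroid).T)     # 3x3 covariance matrix
    w, v = np.linalg.eigh(cov)           # eigenvalues in ascending order
    normal = v[:, 0]                     # smallest-eigenvalue eigenvector
    D = -normal @ centroid               # plane passes through the centroid
    return normal[0], normal[1], normal[2], D
```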
S2.7, uniformly selecting 3 or more pairs of mutually non-parallel corresponding planes on the point cloud data of the different platforms, defining the positive direction of each normal vector, and computing the plane parameters (A, B, C, D) so that each plane is uniquely determined.
Step S3 includes the following sub-steps:
S3.1, constructing the coordinate conversion model:
Let the representations of the same plane in the two coordinate systems be (a1, b1, c1, d1) and (a2, b2, c2, d2); the transformation model is then:
[a2, b2, c2]' = R * [a1, b1, c1]',  d2 = tx*a1 + ty*b1 + tz*c1 + d1,
where
R = | r11 r12 r13 |
    | r21 r22 r23 |
    | r31 r32 r33 |
represents the rotation-matrix parameters and t = [tx, ty, tz]' represents the translation parameters;
S3.2, linearizing the rotation matrix by reducing its 9 parameters to 3:
Replacing the rotation-matrix entries with the coordinate conversion angles gives:
R = |  cosγcosβ   cosγsinβsinα + sinγcosα   sinγsinα - cosγsinβcosα |
    | -sinγcosβ   cosγcosα - sinγsinβsinα   sinγsinβcosα + cosγsinα |
    |  sinβ      -cosβsinα                  cosβcosα                |
where α is the rotation angle about the x axis, β the rotation angle about the y axis, and γ the rotation angle about the z axis;
The transformed normal vector (a2, b2, c2) is computed as:
a2 = cosγcosβ*a1 + (cosγsinβsinα + sinγcosα)*b1 + (sinγsinα - cosγsinβcosα)*c1
b2 = -sinγcosβ*a1 + (cosγcosα - sinγsinβsinα)*b1 + (sinγsinβcosα + cosγsinα)*c1
c2 = sinβ*a1 - cosβsinα*b1 + cosβcosα*c1
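The three expansions above are exactly the rows of the rotation matrix; a quick way to sanity-check them is to build R from α, β, γ and confirm it is orthogonal. This is a hypothetical helper for verification, not part of the patent:

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Rotation matrix whose rows reproduce the a2, b2, c2 expansions above
    (alpha, beta, gamma are the rotations about the x, y, z axes)."""
    sa, ca = np.sin(alpha), np.cos(alpha)
    sb, cb = np.sin(beta), np.cos(beta)
    sg, cg = np.sin(gamma), np.cos(gamma)
    return np.array([
        [ cg * cb,  cg * sb * sa + sg * ca,  sg * sa - cg * sb * ca],
        [-sg * cb,  cg * ca - sg * sb * sa,  sg * sb * ca + cg * sa],
        [ sb,      -cb * sa,                 cb * ca               ],
    ])
```

A proper rotation preserves vector lengths, so R @ R' must be the identity; this also confirms the transformed normal stays a unit normal.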
S3.3, constructing the indirect adjustment model:
The partial derivatives of a2, b2 and c2 with respect to α, β and γ are obtained by differentiating the three formulas above; they form the coefficient matrix of the error equations;
d2 requires no linearization and is computed as: d2 = tx*a1 + ty*b1 + tz*c1 + d1;
The indirect adjustment model is rewritten in least-squares error-equation form as:
V = AX - L;
where V = (v_a2 v_b2 v_c2 v_d2)', X = (dα dβ dγ dtx dty dtz)', A is the matrix of partial derivatives evaluated at the approximate values, and L is the vector of differences between the observed values and (â2, b̂2, ĉ2, d̂2), the estimated values of (a2, b2, c2, d2);
S3.4, solving the parameters by total least squares:
Let the observation error of the observation vector L be e and the error of the coefficient matrix A be E_A; the error equation then becomes:
(A + E_A) X = L + e,
that is,
([A L] + [E_A e]) * [X' -1]' = 0, where E = [E_A e];
The optimization constraint is to minimize the Frobenius norm ||E||_F;
The problem is solved by singular value decomposition of the augmented matrix [A L]:
[A L] = U Σ V',
where V = (v_1, ..., v_{m+1}) is the orthogonal matrix of the m+1 right singular vectors, m being the number of parameters;
The estimate of the parameter vector X is then taken from the right singular vector belonging to the smallest singular value, scaled so that its last component equals -1:
X̂ = -v_{m+1}(1:m) / v_{m+1}(m+1).
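The SVD recipe of S3.4 can be sketched as follows; a minimal total-least-squares solver under the convention just stated, with names of my own choosing:

```python
import numpy as np

def total_least_squares(A, L):
    """Total (integral) least squares via SVD of the augmented matrix [A L]:
    errors are allowed in both A and L, and the solution is read off the right
    singular vector belonging to the smallest singular value."""
    A = np.asarray(A, dtype=float)
    L = np.asarray(L, dtype=float).reshape(-1, 1)
    m = A.shape[1]                       # number of parameters
    _, _, Vt = np.linalg.svd(np.hstack([A, L]))
    v = Vt[-1]                           # right singular vector, smallest sigma
    return -v[:m] / v[m]                 # X_hat = -v(1:m) / v(m+1)
```

On noise-free data the augmented matrix is rank-deficient and the solver reproduces the exact parameters; with noise it minimizes the Frobenius norm of the combined error matrix E.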
S3.5, evaluating the correlation of the planes to be registered from the corrections of the adjustment result: the larger the correction, the weaker the correlation and the worse the plane pair, so plane pairs that do not meet the requirement are reselected.
Step S4 includes the following sub-steps:
S4.1, converting the point cloud coordinate systems of the different platforms with the solved rotation and translation parameters; the conversion model is:
[x2, y2, z2]' = R * [x1, y1, z1]' + [tx, ty, tz]';
S4.2, converting the transformed point cloud from the local coordinate system back to the original coordinate system, which completes the surface-feature-based multi-platform point cloud matching: [x, y, z] = [x2, y2, z2] + [x0, y0, z0], where [x, y, z] denotes the final coordinates after the matching transformation.
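Putting S1 and S4 together, the end-to-end transform might look like the sketch below; `origin_src` and `origin_dst` stand for the local origins [x0, y0, z0] of the source and target clouds, and all names are illustrative assumptions:

```python
import numpy as np

def match_points(points, R, t, origin_src, origin_dst):
    """End-to-end matching: center the cloud on its local origin (S1), apply
    the solved rotation R and translation t (S4.1), then shift back into the
    target platform's original frame (S4.2)."""
    local = np.asarray(points, dtype=float) - origin_src   # S1: to local frame
    moved = local @ R.T + t                                # S4.1: rotate + translate
    return moved + origin_dst                              # S4.2: back to original
```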
Compared with the prior art, the invention has the following beneficial effects:
(1) by establishing local coordinate systems, the point cloud coordinates of the different platforms are shifted into their respective local frames, which avoids matching errors caused by large numerical differences between the platforms' coordinates and improves matching precision;
(2) the 9-parameter rotation model of the plane conversion model is reduced to a 3-parameter model, and, since errors may be present in both the observation vector and the coefficient matrix, the parameters are solved by total least squares; this effectively improves the precision of the parameter solution, increases the robustness of the algorithm, and provides strong support for point cloud matching;
(3) because the method matches on surface features, the matching precision is determined by how well the planes are selected and fitted; two plane evaluation indexes are therefore defined to select better planes and reduce the influence of plane errors on the matching precision.
Drawings
Fig. 1 is a technical flowchart of a multi-platform point cloud matching method based on surface features.
Detailed Description
The present invention will be described in further detail with reference to specific embodiments below:
A multi-platform point cloud matching method based on surface features is provided; its technical flow chart is shown in Fig. 1. The method comprises steps S1 to S4, together with sub-steps S2.1 to S2.7, S3.1 to S3.5 and S4.1 to S4.2, exactly as set forth in the Disclosure of Invention above.
It is to be understood that the above description is not intended to limit the present invention, and the present invention is not limited to the above examples, and those skilled in the art may make modifications, alterations, additions or substitutions within the spirit and scope of the present invention.

Claims (5)

1. A multi-platform point cloud matching method based on surface features, comprising the following steps:
S1, scanning the ground objects of a target area with data acquisition equipment mounted on different platforms, and acquiring and preprocessing the original point cloud data of the different platforms;
S2, uniformly selecting three or more pairs of non-coplanar matching planes from the preprocessed point cloud data by man-machine interaction, performing plane fitting with a robust RANSAC algorithm, extracting the parameters of the corresponding common planes, and performing self-flatness evaluation on the plane point clouds;
S3, based on the plane parameters of the three or more pairs of corresponding planes, expressing the rotation parameters as a nonlinear model and linearizing it while keeping the translation parameters unchanged, and solving the parameters by total least squares; evaluating the correlation of the planes to be registered from the corrections of the adjustment result, and reselecting any planes that do not meet the requirement;
S4, completing the point cloud matching with the coordinate conversion model.
2. The method of claim 1, wherein in step S1, during the data processing stage, the original point cloud data of the different platforms are filtered and denoised, and the preprocessed point cloud coordinate system is converted into a local coordinate system that takes the center of gravity as the coordinate origin and keeps the coordinate axis directions unchanged; the local coordinate origin is calculated as:
[x0, y0, z0] = (1/m) * sum_{i=1..m} [xi, yi, zi]
where [x0, y0, z0] denotes the origin of the local coordinate system, [xi, yi, zi] denotes the coordinates of the i-th point, and m denotes the number of points; the transformed point cloud coordinates are calculated as: [x', y', z'] = [xi, yi, zi] - [x0, y0, z0], where [x', y', z'] denotes the transformed point cloud coordinates.
3. The method for matching a multi-platform point cloud based on surface features as claimed in claim 1, wherein step S2 comprises the following sub-steps:
S2.1, setting the relevant parameters: number of iterations n, valid-point ratio threshold p, and distance threshold t;
S2.2, randomly selecting 3 non-collinear points from the point cloud set and computing the corresponding plane equation: a1*x + b1*y + c1*z = d1, where (a1, b1, c1) is the unit normal vector of the plane and d1 is the distance from the origin to the plane;
S2.3, computing the distance from every observed point to the plane: di = |a1*xi + b1*yi + c1*zi - d1|; if di <= t, the point is counted as a valid point (inlier), otherwise it is an invalid point;
S2.4, computing the ratio f of valid points to total points for the candidate plane M1; if f >= p, M1 is accepted as a valid plane, otherwise it is rejected as invalid; this completes the self-flatness evaluation of the plane point cloud;
S2.5, repeating the sampling n times according to the set number of iterations and selecting the plane with the highest valid-point ratio as the final fitted plane;
S2.6, refining the plane obtained in S2.5 with the eigenvalue method to obtain the plane parameter equation: Ax + By + Cz + D = 0, where (A, B, C, D) are the plane parameters;
S2.7, uniformly selecting 3 or more pairs of mutually non-parallel corresponding planes on the point cloud data of the different platforms, defining the positive direction of each normal vector, and computing the plane parameters (A, B, C, D) so that each plane is uniquely determined.
4. The method for matching a multi-platform point cloud based on surface features as claimed in claim 1, wherein step S3 comprises the following sub-steps:
S3.1, constructing the coordinate conversion model:
Let the representations of the same plane in the two coordinate systems be (a1, b1, c1, d1) and (a2, b2, c2, d2); the transformation model is then:
[a2, b2, c2]' = R * [a1, b1, c1]',  d2 = tx*a1 + ty*b1 + tz*c1 + d1,
where
R = | r11 r12 r13 |
    | r21 r22 r23 |
    | r31 r32 r33 |
represents the rotation-matrix parameters and t = [tx, ty, tz]' represents the translation parameters;
S3.2, linearizing the rotation matrix by reducing its 9 parameters to 3:
Replacing the rotation-matrix entries with the coordinate conversion angles gives:
R = |  cosγcosβ   cosγsinβsinα + sinγcosα   sinγsinα - cosγsinβcosα |
    | -sinγcosβ   cosγcosα - sinγsinβsinα   sinγsinβcosα + cosγsinα |
    |  sinβ      -cosβsinα                  cosβcosα                |
where α is the rotation angle about the x axis, β the rotation angle about the y axis, and γ the rotation angle about the z axis;
the transformed normal vector (a2, b2, c2) is computed as:
a2 = cosγcosβ*a1 + (cosγsinβsinα + sinγcosα)*b1 + (sinγsinα - cosγsinβcosα)*c1
b2 = -sinγcosβ*a1 + (cosγcosα - sinγsinβsinα)*b1 + (sinγsinβcosα + cosγsinα)*c1
c2 = sinβ*a1 - cosβsinα*b1 + cosβcosα*c1
S3.3, constructing the indirect adjustment model:
the partial derivatives of a2, b2 and c2 with respect to α, β and γ are obtained by differentiating the three formulas above and form the coefficient matrix of the error equations;
d2 requires no linearization and is computed as: d2 = tx*a1 + ty*b1 + tz*c1 + d1;
the indirect adjustment model is rewritten in least-squares error-equation form as:
V = AX - L;
where V = (v_a2 v_b2 v_c2 v_d2)', X = (dα dβ dγ dtx dty dtz)', A is the matrix of partial derivatives evaluated at the approximate values, and L is the vector of differences between the observed values and (â2, b̂2, ĉ2, d̂2), the estimated values of (a2, b2, c2, d2);
S3.4, solving the parameters by total least squares:
let the observation error of the observation vector L be e and the error of the coefficient matrix A be E_A; the error equation then becomes: (A + E_A) X = L + e,
that is, ([A L] + [E_A e]) * [X' -1]' = 0, where E = [E_A e];
the optimization constraint is to minimize the Frobenius norm ||E||_F;
the problem is solved by singular value decomposition of the augmented matrix [A L]: [A L] = U Σ V', where V = (v_1, ..., v_{m+1}) is the orthogonal matrix of the m+1 right singular vectors, m being the number of parameters;
the estimate of the parameter vector X is then taken from the right singular vector belonging to the smallest singular value: X̂ = -v_{m+1}(1:m) / v_{m+1}(m+1);
S3.5, evaluating the correlation of the planes to be registered from the corrections of the adjustment result: the larger the correction, the weaker the correlation and the worse the plane pair, so plane pairs that do not meet the requirement are reselected.
5. The method for matching a multi-platform point cloud based on surface features as claimed in claim 1, wherein step S4 comprises the following sub-steps:
S4.1, converting the point cloud coordinate systems of the different platforms with the solved rotation and translation parameters; the conversion model is:
[x2, y2, z2]' = R * [x1, y1, z1]' + [tx, ty, tz]';
S4.2, converting the transformed point cloud from the local coordinate system back to the original coordinate system, completing the surface-feature-based multi-platform point cloud matching: [x, y, z] = [x2, y2, z2] + [x0, y0, z0], where [x, y, z] denotes the final coordinates after the matching transformation.
CN202010891648.8A (priority and filing date 2020-08-31): Multi-platform point cloud matching method based on surface features. Granted as CN112132875B; status: Active.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010891648.8A CN112132875B (en) 2020-08-31 2020-08-31 Multi-platform point cloud matching method based on surface features


Publications (2)

Publication Number Publication Date
CN112132875A 2020-12-25
CN112132875B 2023-07-28

Family

ID=73847647

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010891648.8A Active CN112132875B (en) 2020-08-31 2020-08-31 Multi-platform point cloud matching method based on surface features

Country Status (1)

Country Link
CN (1) CN112132875B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113050103A (en) * 2021-02-05 2021-06-29 上海擎朗智能科技有限公司 Ground detection method, device, electronic equipment, system and medium
CN114755666A (en) * 2022-06-01 2022-07-15 苏州一径科技有限公司 Point cloud expansion evaluation method, device and equipment
CN115830080A (en) * 2022-10-27 2023-03-21 上海神玑医疗科技有限公司 Point cloud registration method and device, electronic equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012141235A1 (en) * 2011-04-13 2012-10-18 Topcon Corporation Three-dimensional point group position data processing device, three-dimensional point group position data processing system, three-dimensional point group position data processing method and program
US20160076880A1 (en) * 2014-09-11 2016-03-17 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Computing device and method for processing point clouds
CN106570823A (en) * 2016-10-11 2017-04-19 山东科技大学 Point cloud coarse registration method based on planar feature matching
CN108090960A (en) * 2017-12-25 2018-05-29 北京航空航天大学 Object reconstruction method based on geometric constraints
CN109377521A (en) * 2018-09-11 2019-02-22 武汉大学 Point-to-best-fit-plane point cloud registration method for terrestrial laser scanner data acquisition
US20190128670A1 (en) * 2017-10-30 2019-05-02 Xyzprinting, Inc. Apparatus for producing 3d point-cloud model of physical object and producing method thereof
CN109946701A (en) * 2019-03-26 2019-06-28 新石器慧通(北京)科技有限公司 Point cloud coordinate transformation method and device
CN110443836A (en) * 2019-06-24 2019-11-12 中国人民解放军战略支援部队信息工程大学 Automatic point cloud data registration method and device based on planar features
CN110910454A (en) * 2019-10-11 2020-03-24 华南农业大学 Automatic calibration and registration method for mobile livestock three-dimensional reconstruction equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
LU Xiushan et al., "Fusion method of vehicle-borne laser point cloud and sequential panoramic images", Chinese Journal of Lasers, vol. 45, no. 5, page 1 *
ZHANG Dong; HUANG Teng, "Registration algorithm for terrestrial radar point clouds based on planar features", Science of Surveying and Mapping, vol. 40, no. 11, page 146 *
YANG Lei et al., "An automatic extraction method for road potholes from vehicle-borne laser scanning point clouds", Engineering of Surveying and Mapping, vol. 29, no. 1, pages 66-71 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113050103A (en) * 2021-02-05 2021-06-29 上海擎朗智能科技有限公司 Ground detection method, device, electronic equipment, system and medium
CN114755666A (en) * 2022-06-01 2022-07-15 苏州一径科技有限公司 Point cloud expansion evaluation method, device and equipment
CN115830080A (en) * 2022-10-27 2023-03-21 上海神玑医疗科技有限公司 Point cloud registration method and device, electronic equipment and storage medium
CN115830080B (en) * 2022-10-27 2024-05-03 上海神玑医疗科技有限公司 Point cloud registration method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN112132875B (en) 2023-07-28

Similar Documents

Publication Publication Date Title
CN111563442B (en) SLAM method and system for fusing point cloud and camera image data based on laser radar
CN112132875A (en) Multi-platform point cloud matching method based on surface features
CN112017220B (en) Point cloud accurate registration method based on robust constraint least square algorithm
CN108230375B (en) Registration method of visible light image and SAR image based on structural similarity rapid robustness
CN110487286B (en) Robot pose judgment method based on point feature projection and laser point cloud fusion
CN107871327A (en) The monocular camera pose estimation of feature based dotted line and optimization method and system
CN113139453A (en) Orthoimage high-rise building base vector extraction method based on deep learning
CN114526745A (en) Mapping method and system for a tightly coupled laser radar and inertial odometer
CN113313047B (en) Lane line detection method and system based on lane structure prior
CN111242000A (en) Road edge detection method combining laser point cloud steering
CN113920198B (en) Coarse-to-fine multi-sensor fusion positioning method based on semantic edge alignment
CN112197773B (en) Visual and laser positioning mapping method based on plane information
CN112652020A (en) Visual SLAM method based on AdaLAM algorithm
CN113759364A (en) Millimeter wave radar continuous positioning method and device based on laser map
CN117333846A (en) Detection method and system based on sensor fusion and incremental learning in severe weather
WO2021063756A1 (en) Improved trajectory estimation based on ground truth
CN115761528A (en) Push-broom type remote sensing satellite image high-precision wave band alignment method based on integral graph
CN114399547B (en) Monocular SLAM robust initialization method based on multiframe
CN113393507B (en) Unmanned aerial vehicle point cloud and ground three-dimensional laser scanner point cloud registration method
CN112258391B (en) Fragmented map splicing method based on road traffic marking
CN114004949A (en) Airborne point cloud assisted mobile measurement system arrangement parameter calibration method and system
CN113138395A (en) Point cloud map construction method based on laser radar data fusion of total station
Guo et al. 3D Lidar SLAM Based on Ground Segmentation and Scan Context Loop Detection
CN112767458B (en) Method and system for registering laser point cloud and image
Yu et al. High Precision Positioning and Rotation Angle Estimation of Flatbed Truck Based on Beidou and Vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant