CN112132875B - Multi-platform point cloud matching method based on surface features


Info

Publication number
CN112132875B
Authority
CN
China
Prior art keywords: plane, point cloud, parameters, point
Prior art date: 2020-08-31
Legal status: Active
Application number
CN202010891648.8A
Other languages
Chinese (zh)
Other versions
CN112132875A (en)
Inventor
刘如飞
卢秀山
刘以旭
马新江
柴永宁
Current Assignee
Qingdao Xiushan Mobile Surveying Co ltd
Original Assignee
Qingdao Xiushan Mobile Surveying Co ltd
Priority date: 2020-08-31
Filing date: 2020-08-31
Application filed by Qingdao Xiushan Mobile Surveying Co ltd
Priority to CN202010891648.8A
Publication of CN112132875A: 2020-12-25
Application granted
Publication of CN112132875B: 2023-07-28

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T 7/344: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a multi-platform point cloud matching method based on surface features, belonging to the technical field of mobile-measurement point cloud matching. The method is implemented as follows: acquire point cloud data of the target objects, denoise the data, and convert the preprocessed original point cloud coordinate system into a local coordinate system whose origin is the center of gravity of the point cloud and whose axis directions are unchanged; uniformly select three or more pairs of matching planes from the acquired point cloud data, fit each plane with a robust RANSAC algorithm to extract the plane parameters (A, B, C, D) of the common planes, and evaluate the self-flatness of the planar point clouds; based on the plane parameters of the three or more pairs of corresponding planes, express the rotation parameters as a nonlinear model and linearize it while keeping the translation parameters unchanged, then solve the parameters by the total least squares method; evaluate the planes to be registered from the corrections of the adjustment result and re-select any planes that do not meet the requirement; finally, complete the point cloud matching with the coordinate transformation model.

Description

Multi-platform point cloud matching method based on surface features
Technical Field
The invention discloses a multi-platform point cloud matching method based on surface features, and belongs to the technical field of mobile measurement point cloud matching.
Background
At present, vehicle-mounted, airborne and single-station LiDAR scanners are affected by the attitude of the carrying platform, the accuracy of the control points, the accuracy of GPS (Global Positioning System) positioning, the accuracy of inertial navigation, and so on, so small offsets exist between the coordinate systems of the acquired multi-source data. Each platform's acquisition mode has its own applicable conditions, and in practical production the multi-platform point cloud data must be matched and fused to obtain multi-dimensional, multi-temporal point cloud data of the target ground objects. To make full use of the information in the multi-source data, the coordinate systems of the multi-source point clouds must be matched and the point clouds stitched according to homonymous (corresponding) features (points, lines and planes). The accuracy of the point cloud stitching affects the accuracy of all subsequent data processing, so point cloud stitching is a key step of data processing.
In these measurement modes, two measurements of the same surface may share no common points, so a model-based or feature-based coordinate transformation method needs to be studied. The plane is a regular model that exists widely in real scenes; if several planes can be fitted from the point cloud data and the coordinate transformation is computed from the common planes, the prior-art drawback of having to search for common points in massive point cloud data can be avoided.
Disclosure of Invention
The invention discloses a multi-platform point cloud matching method based on surface features, which aims to solve the problem that in the prior art common points must be searched for in massive point cloud data.
A multi-platform point cloud matching method based on surface features comprises the following steps:
S1, scanning the ground objects of a target area with data acquisition equipment mounted on different platforms, obtaining the original point cloud data of each platform and preprocessing it;
S2, uniformly selecting three or more pairs of non-coplanar matching planes from the preprocessed point cloud data by man-machine interaction, fitting each plane with a robust RANSAC algorithm, extracting the parameters of the common planes, and evaluating the self-flatness of the planar point clouds;
S3, based on the plane parameters of the three or more pairs of corresponding planes, expressing the rotation parameters as a nonlinear model and linearizing it while keeping the translation parameters unchanged, then solving the parameters by the total least squares method; evaluating the correlation of the planes to be registered based on the corrections of the adjustment result and re-selecting the planes that do not meet the requirement;
S4, completing the matching of the point clouds based on the coordinate transformation model.
In step S1, during the data processing stage, the original point cloud data of the different platforms are filtered and denoised, and the preprocessed point cloud coordinate system is converted into a local coordinate system that takes the center of gravity as the coordinate origin and keeps the coordinate axis directions unchanged. The local coordinate origin is computed as [x0, y0, z0] = (1/m)·Σ[xi, yi, zi] (summed over i = 1..m), where [x0, y0, z0] is the origin of the local coordinate system, [xi, yi, zi] is the coordinate of the i-th point and m is the number of points. The transformed point coordinates are computed as [x', y', z'] = [xi, yi, zi] - [x0, y0, z0], where [x', y', z'] are the transformed point coordinates.
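For illustration only, the following Python/NumPy sketch performs the centering described in step S1; the function name and the (m, 3) array layout are assumptions of this example, not part of the patent:

```python
import numpy as np

def to_local_frame(points: np.ndarray):
    """Shift an (m, 3) point cloud so its center of gravity becomes the origin.

    Returns the shifted points [x', y', z'] and the local origin [x0, y0, z0],
    which is needed later to move the matched cloud back (step S4.2).
    """
    origin = points.mean(axis=0)          # [x0, y0, z0] = (1/m) * sum of all points
    local_points = points - origin        # [x', y', z'] = [xi, yi, zi] - [x0, y0, z0]
    return local_points, origin
```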
S2 comprises the following substeps:
S2.1, set the relevant parameters: the number of iterations n, the valid-point proportion threshold p and the distance threshold t;
S2.2, randomly select 3 non-collinear points from the point cloud set and compute the corresponding plane equation a1·x + b1·y + c1·z + d1 = 0, where (a1, b1, c1) is the unit normal vector of the plane and |d1| is the distance from the origin to the plane;
S2.3, compute the distance from every observed point to the plane, dj = |a1·xj + b1·yj + c1·zj + d1|; if dj ≤ t, the point is considered a valid point (inlier), otherwise it is an invalid point;
S2.4, compute the proportion f of valid points in plane M1 relative to the total number of points; if f ≥ p, plane M1 is considered a valid plane, otherwise an invalid plane; this completes the self-flatness evaluation of the planar point cloud;
S2.5, repeat the sampling for the set number of iterations n and select the plane with the highest valid-point proportion as the final fitted plane;
S2.6, compute the parameters of the plane fitted in S2.5 with the eigenvalue method, obtaining the plane parameter equation Ax + By + Cz + D = 0, where (A, B, C, D) are the plane parameters;
S2.7, uniformly select k ≥ 3 pairs of mutually non-parallel planes in the point cloud data of the different platforms, define the positive direction of the normal vectors so that the plane parameters are uniquely determined, and compute the plane parameters (A, B, C, D) of each plane (a code sketch of sub-steps S2.1 to S2.6 is given below).
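As a concrete illustration of sub-steps S2.1 to S2.6, the sketch below (Python/NumPy) runs the RANSAC plane search and then refines the plane parameters with the eigenvalue (PCA) method; the function name and the default values of n, p and t are assumptions of this example, not values prescribed by the patent:

```python
import numpy as np

def ransac_plane(points: np.ndarray, n_iter: int = 500, p: float = 0.8, t: float = 0.02):
    """Fit a plane Ax + By + Cz + D = 0 to an (m, 3) point cloud with RANSAC.

    n_iter: number of random samples, p: required valid-point proportion,
    t: point-to-plane distance threshold. Returns (A, B, C, D), the valid-point
    ratio f and the self-flatness verdict (f >= p).
    """
    best_inliers, best_ratio = None, 0.0
    m = len(points)
    rng = np.random.default_rng()
    for _ in range(n_iter):
        sample = points[rng.choice(m, 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                      # the 3 points are (nearly) collinear
            continue
        normal /= norm                        # unit normal (a1, b1, c1)
        d1 = -(normal @ sample[0])            # plane: a1*x + b1*y + c1*z + d1 = 0
        dist = np.abs(points @ normal + d1)   # distance of every point to the plane
        inliers = dist <= t
        ratio = inliers.mean()
        if ratio > best_ratio:
            best_ratio, best_inliers = ratio, inliers

    # S2.6: eigenvalue method on the inliers -> refined plane parameters
    inlier_pts = points[best_inliers]
    centroid = inlier_pts.mean(axis=0)
    cov = np.cov((inlier_pts - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    n = eigvecs[:, 0]                         # eigenvector of the smallest eigenvalue
    A, B, C = n
    D = -n @ centroid
    valid = best_ratio >= p                   # S2.4: self-flatness check
    return (A, B, C, D), best_ratio, valid
```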
Step S3 comprises the following sub-steps:
S3.1, construct the coordinate transformation model:
Let the representations of the same plane in the two coordinate systems be (a1, b1, c1, d1) and (a2, b2, c2, d2). The plane transformation model is (a2, b2, c2)′ = R·(a1, b1, c1)′ and d2 = tx·a1 + ty·b1 + tz·c1 + d1, where R is the 3×3 matrix of rotation parameters (r11, r12, ..., r33) and (tx, ty, tz) are the translation parameters;
S3.2, linearize the rotation model and reduce the 9 rotation parameters to 3 parameters:
The rotation matrix is expressed through the three rotation angles as
R = [ cosγcosβ,  cosγsinβsinα + sinγcosα,  sinγsinα - cosγsinβcosα;
      -sinγcosβ,  cosγcosα - sinγsinβsinα,  sinγsinβcosα + cosγsinα;
      sinβ,  -cosβsinα,  cosβcosα ],
where α is the rotation angle around the x-axis, β is the rotation angle around the y-axis and γ is the rotation angle around the z-axis;
The transformed normal vector (a2, b2, c2) is then computed as:
a2 = cosγcosβ·a1 + (cosγsinβsinα + sinγcosα)·b1 + (sinγsinα - cosγsinβcosα)·c1
b2 = -sinγcosβ·a1 + (cosγcosα - sinγsinβsinα)·b1 + (sinγsinβcosα + cosγsinα)·c1
c2 = sinβ·a1 - cosβsinα·b1 + cosβcosα·c1
S3.3, construct the indirect adjustment model:
Compute the partial derivatives of a2, b2 and c2 with respect to α, β and γ by differentiating the expressions given in S3.2;
d2 needs no linearization and is computed directly as d2 = tx·a1 + ty·b1 + tz·c1 + d1;
The linearized observation equations of the indirect adjustment model are rewritten in the form of the least squares error equation
V = AX - L;
where V = (Va2 Vb2 Vc2 Vd2)′ is the vector of corrections, X = (dα dβ dγ dtx dty dtz)′ is the vector of parameter corrections, A is the coefficient matrix of partial derivatives, and L is the misclosure vector, i.e. the difference between the observed plane parameters (a2, b2, c2, d2) and their values computed from the current parameter estimates;
S3.4, solve the parameters by the total least squares method:
Assume that the observation error of the observation vector L is e and that the error of the coefficient matrix A is EA; the error equation can then be written as
(A + EA)X = L + e,
that is, ([A L] + E)·(X′, -1)′ = 0, where E = [EA e];
The optimization criterion is min‖E‖F = min‖[EA e]‖F, with ‖·‖F denoting the Frobenius norm;
The problem is solved by singular value decomposition of the augmented matrix [A L]:
[A L] = UΣV′, where V is the (m+1)×(m+1) orthogonal matrix of right singular vectors (m+1 being the number of columns of [A L], i.e. the number of unknown parameters plus one);
The estimate of the parameter vector X is obtained from the last column v of V as X = -v(1:m)/v(m+1), i.e. the first m entries of the last column divided by the negative of its last entry (a code sketch of sub-steps S3.2 to S3.4 is given after this list);
S3.5, evaluate the correlation of the planes to be registered according to the corrections of the adjustment result: the larger the correction, the weaker the correlation and the worse the plane pair; plane pairs that do not meet the requirement must be re-selected.
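The following Python/NumPy sketch makes sub-steps S3.2 to S3.4 concrete. It builds the rotation matrix from the three angles, forms the design matrix A and misclosure vector L for a set of plane pairs by numerical differentiation (a stand-in for the analytic partial derivatives used in the patent), and solves the correction vector by total least squares via the SVD of [A L]; the function names, the finite-difference step and the iteration strategy are assumptions of this example:

```python
import numpy as np

def rotation_matrix(alpha, beta, gamma):
    """Rotation matrix R(alpha, beta, gamma) matching the a2/b2/c2 expansion in S3.2."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    return np.array([
        [cg * cb,  cg * sb * sa + sg * ca,  sg * sa - cg * sb * ca],
        [-sg * cb, cg * ca - sg * sb * sa,  sg * sb * ca + cg * sa],
        [sb,       -cb * sa,                cb * ca],
    ])

def plane_observation(params, plane1):
    """Predicted (a2, b2, c2, d2) of a plane (a1, b1, c1, d1) under params = (alpha, beta, gamma, tx, ty, tz)."""
    alpha, beta, gamma, tx, ty, tz = params
    a1, b1, c1, d1 = plane1
    n2 = rotation_matrix(alpha, beta, gamma) @ np.array([a1, b1, c1])
    d2 = tx * a1 + ty * b1 + tz * c1 + d1                 # d2 needs no linearization (S3.3)
    return np.append(n2, d2)

def build_design(params, planes1, planes2, eps=1e-6):
    """Design matrix A (numerical Jacobian) and misclosure vector L for all plane pairs."""
    rows_A, rows_L = [], []
    for p1, p2 in zip(planes1, planes2):
        f0 = plane_observation(params, p1)
        J = np.empty((4, 6))
        for k in range(6):                                # derivative w.r.t. each parameter
            dp = np.array(params, dtype=float)
            dp[k] += eps
            J[:, k] = (plane_observation(dp, p1) - f0) / eps
        rows_A.append(J)
        rows_L.append(np.asarray(p2) - f0)                # observed minus computed
    return np.vstack(rows_A), np.concatenate(rows_L)

def tls_solve(A, L):
    """Total least squares solution of (A + EA) X = L + e via SVD of the augmented matrix [A L]."""
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.hstack([A, L.reshape(-1, 1)]))
    v = Vt.T[:, n]                                        # last column of V (n + 1 entries)
    return -v[:n] / v[n]                                  # first n entries over minus the last one
```

Starting from approximate values (for example all zeros when the rotations and translations are small), one would solve for X = (dα dβ dγ dtx dty dtz)′, add the corrections to the parameters, rebuild A and L, and iterate until the corrections become negligible.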
Step S4 comprises the following sub-steps:
S4.1, transform the point clouds of the different platforms with the solved rotation and translation parameters; the transformation model is [x2, y2, z2]′ = R(α, β, γ)·[x′, y′, z′]′ + [tx, ty, tz]′, where [x′, y′, z′] are the local coordinates from step S1;
S4.2, transform the converted point cloud from the local coordinate system back to the original coordinate system to complete the multi-platform point cloud matching based on surface features, using [x, y, z] = [x2, y2, z2] + [x0, y0, z0], where [x, y, z] are the final coordinates after the matching transformation (a short code sketch of this step follows).
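A minimal sketch of step S4 under the same assumptions as the earlier examples (Python/NumPy, hypothetical function names), applying the solved parameters to a local-frame point cloud and shifting the result back into the original coordinate system:

```python
import numpy as np

def apply_matching(points_local: np.ndarray, params, origin_ref: np.ndarray):
    """Transform an (m, 3) local-frame point cloud with the solved parameters.

    params = (alpha, beta, gamma, tx, ty, tz) from step S3; origin_ref is the
    center of gravity [x0, y0, z0] used to move the matched points back from
    the local frame to the original coordinate system (step S4.2).
    """
    alpha, beta, gamma, tx, ty, tz = params
    R = rotation_matrix(alpha, beta, gamma)                      # helper from the step S3 sketch
    transformed = points_local @ R.T + np.array([tx, ty, tz])    # S4.1
    return transformed + origin_ref                              # S4.2
```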
Compared with the prior art, the invention has the beneficial effects that:
(1) By establishing local coordinate systems, the point cloud coordinates of the different platforms are converted into their respective local coordinate systems, which avoids matching errors caused by excessively large numerical differences between the point cloud coordinates of the different platforms and improves the matching accuracy;
(2) The 9-parameter rotation model of the plane transformation model is reduced to a 3-parameter model, and, taking into account the possible errors of both the observation vector and the coefficient matrix, the parameters are solved by the total least squares method, which effectively improves the accuracy of the parameter solution, increases the robustness of the algorithm and provides strong support for the point cloud matching;
(3) Because the point cloud matching method is based on surface features, the matching accuracy is determined by the plane selection and the fitting accuracy; therefore, the two plane evaluation indexes are used to select better planes and to reduce the influence of plane errors on the matching accuracy.
Drawings
Fig. 1 is a technical flowchart of the multi-platform point cloud matching method based on surface features.
Detailed Description
The invention is described in further detail below with reference to the specific embodiments:
A multi-platform point cloud matching method based on surface features, whose technical flow chart is shown in fig. 1, comprises the following steps:
S1, scanning the ground objects of a target area with data acquisition equipment mounted on different platforms, obtaining the original point cloud data of each platform and preprocessing it;
S2, uniformly selecting three or more pairs of non-coplanar matching planes from the preprocessed point cloud data by man-machine interaction, fitting each plane with a robust RANSAC algorithm, extracting the parameters of the common planes, and evaluating the self-flatness of the planar point clouds;
S3, based on the plane parameters of the three or more pairs of corresponding planes, expressing the rotation parameters as a nonlinear model and linearizing it while keeping the translation parameters unchanged, then solving the parameters by the total least squares method; evaluating the correlation of the planes to be registered based on the corrections of the adjustment result and re-selecting the planes that do not meet the requirement;
S4, completing the matching of the point clouds based on the coordinate transformation model.
In step S1, during the data processing stage, the original point cloud data of the different platforms are filtered and denoised, and the preprocessed point cloud coordinate system is converted into a local coordinate system that takes the center of gravity as the coordinate origin and keeps the coordinate axis directions unchanged. The local coordinate origin is computed as [x0, y0, z0] = (1/m)·Σ[xi, yi, zi] (summed over i = 1..m), where [x0, y0, z0] is the origin of the local coordinate system, [xi, yi, zi] is the coordinate of the i-th point and m is the number of points. The transformed point coordinates are computed as [x', y', z'] = [xi, yi, zi] - [x0, y0, z0], where [x', y', z'] are the transformed point coordinates.
S2 comprises the following substeps:
S2.1, set the relevant parameters: the number of iterations n, the valid-point proportion threshold p and the distance threshold t;
S2.2, randomly select 3 non-collinear points from the point cloud set and compute the corresponding plane equation a1·x + b1·y + c1·z + d1 = 0, where (a1, b1, c1) is the unit normal vector of the plane and |d1| is the distance from the origin to the plane;
S2.3, compute the distance from every observed point to the plane, dj = |a1·xj + b1·yj + c1·zj + d1|; if dj ≤ t, the point is considered a valid point (inlier), otherwise it is an invalid point;
S2.4, compute the proportion f of valid points in plane M1 relative to the total number of points; if f ≥ p, plane M1 is considered a valid plane, otherwise an invalid plane; this completes the self-flatness evaluation of the planar point cloud;
S2.5, repeat the sampling for the set number of iterations n and select the plane with the highest valid-point proportion as the final fitted plane;
S2.6, compute the parameters of the plane fitted in S2.5 with the eigenvalue method, obtaining the plane parameter equation Ax + By + Cz + D = 0, where (A, B, C, D) are the plane parameters;
S2.7, uniformly select k ≥ 3 pairs of mutually non-parallel planes in the point cloud data of the different platforms, define the positive direction of the normal vectors so that the plane parameters are uniquely determined, and compute the plane parameters (A, B, C, D) of each plane.
Step S3 comprises the following sub-steps:
S3.1, construct the coordinate transformation model:
Let the representations of the same plane in the two coordinate systems be (a1, b1, c1, d1) and (a2, b2, c2, d2). The plane transformation model is (a2, b2, c2)′ = R·(a1, b1, c1)′ and d2 = tx·a1 + ty·b1 + tz·c1 + d1, where R is the 3×3 matrix of rotation parameters (r11, r12, ..., r33) and (tx, ty, tz) are the translation parameters;
S3.2, linearize the rotation model and reduce the 9 rotation parameters to 3 parameters:
The rotation matrix is expressed through the three rotation angles as
R = [ cosγcosβ,  cosγsinβsinα + sinγcosα,  sinγsinα - cosγsinβcosα;
      -sinγcosβ,  cosγcosα - sinγsinβsinα,  sinγsinβcosα + cosγsinα;
      sinβ,  -cosβsinα,  cosβcosα ],
where α is the rotation angle around the x-axis, β is the rotation angle around the y-axis and γ is the rotation angle around the z-axis;
The transformed normal vector (a2, b2, c2) is then computed as:
a2 = cosγcosβ·a1 + (cosγsinβsinα + sinγcosα)·b1 + (sinγsinα - cosγsinβcosα)·c1
b2 = -sinγcosβ·a1 + (cosγcosα - sinγsinβsinα)·b1 + (sinγsinβcosα + cosγsinα)·c1
c2 = sinβ·a1 - cosβsinα·b1 + cosβcosα·c1
S3.3, construct the indirect adjustment model:
Compute the partial derivatives of a2, b2 and c2 with respect to α, β and γ by differentiating the expressions given in S3.2;
d2 needs no linearization and is computed directly as d2 = tx·a1 + ty·b1 + tz·c1 + d1;
The linearized observation equations of the indirect adjustment model are rewritten in the form of the least squares error equation
V = AX - L;
where V = (Va2 Vb2 Vc2 Vd2)′ is the vector of corrections, X = (dα dβ dγ dtx dty dtz)′ is the vector of parameter corrections, A is the coefficient matrix of partial derivatives, and L is the misclosure vector, i.e. the difference between the observed plane parameters (a2, b2, c2, d2) and their values computed from the current parameter estimates;
S3.4, solve the parameters by the total least squares method:
Assume that the observation error of the observation vector L is e and that the error of the coefficient matrix A is EA; the error equation can then be written as
(A + EA)X = L + e,
that is, ([A L] + E)·(X′, -1)′ = 0, where E = [EA e];
The optimization criterion is min‖E‖F = min‖[EA e]‖F, with ‖·‖F denoting the Frobenius norm;
The problem is solved by singular value decomposition of the augmented matrix [A L]:
[A L] = UΣV′, where V is the (m+1)×(m+1) orthogonal matrix of right singular vectors (m+1 being the number of columns of [A L], i.e. the number of unknown parameters plus one);
The estimate of the parameter vector X is obtained from the last column v of V as X = -v(1:m)/v(m+1), i.e. the first m entries of the last column divided by the negative of its last entry;
S3.5, evaluate the correlation of the planes to be registered according to the corrections of the adjustment result: the larger the correction, the weaker the correlation and the worse the plane pair; plane pairs that do not meet the requirement must be re-selected.
Step S4 comprises the following sub-steps:
S4.1, transform the point clouds of the different platforms with the solved rotation and translation parameters; the transformation model is [x2, y2, z2]′ = R(α, β, γ)·[x′, y′, z′]′ + [tx, ty, tz]′, where [x′, y′, z′] are the local coordinates from step S1;
S4.2, transform the converted point cloud from the local coordinate system back to the original coordinate system to complete the multi-platform point cloud matching based on surface features, using [x, y, z] = [x2, y2, z2] + [x0, y0, z0], where [x, y, z] are the final coordinates after the matching transformation.
It should be understood that the above description is not intended to limit the invention to the particular embodiments disclosed; modifications, adaptations, additions and alternatives falling within the spirit and scope of the invention are intended to be covered.

Claims (4)

1. A multi-platform point cloud matching method based on surface features, characterized by comprising the following steps:
S1, scanning the ground objects of a target area with data acquisition equipment mounted on different platforms, obtaining the original point cloud data of each platform and preprocessing it;
S2, uniformly selecting three or more pairs of non-coplanar matching planes from the preprocessed point cloud data by man-machine interaction, fitting each plane with a robust RANSAC algorithm, extracting the parameters of the common planes, and evaluating the self-flatness of the planar point clouds;
S3, based on the plane parameters of the three or more pairs of corresponding planes, expressing the rotation parameters as a nonlinear model and linearizing it while keeping the translation parameters unchanged, then solving the parameters by the total least squares method; evaluating the correlation of the planes to be registered based on the corrections of the adjustment result and re-selecting the planes that do not meet the requirement;
S3.1, construct the coordinate transformation model:
Let the representations of the same plane in the two coordinate systems be (a1, b1, c1, d1) and (a2, b2, c2, d2). The plane transformation model is (a2, b2, c2)′ = R·(a1, b1, c1)′ and d2 = tx·a1 + ty·b1 + tz·c1 + d1, where R is the 3×3 matrix of rotation parameters (r11, r12, ..., r33) and (tx, ty, tz) are the translation parameters;
S3.2, linearize the rotation model and reduce the 9 rotation parameters to 3 parameters:
The rotation matrix is expressed through the three rotation angles as
R = [ cosγcosβ,  cosγsinβsinα + sinγcosα,  sinγsinα - cosγsinβcosα;
      -sinγcosβ,  cosγcosα - sinγsinβsinα,  sinγsinβcosα + cosγsinα;
      sinβ,  -cosβsinα,  cosβcosα ],
where α is the rotation angle around the x-axis, β is the rotation angle around the y-axis and γ is the rotation angle around the z-axis;
The transformed normal vector (a2, b2, c2) is then computed as:
a2 = cosγcosβ·a1 + (cosγsinβsinα + sinγcosα)·b1 + (sinγsinα - cosγsinβcosα)·c1
b2 = -sinγcosβ·a1 + (cosγcosα - sinγsinβsinα)·b1 + (sinγsinβcosα + cosγsinα)·c1
c2 = sinβ·a1 - cosβsinα·b1 + cosβcosα·c1
S3.3, construct the indirect adjustment model:
Compute the partial derivatives of a2, b2 and c2 with respect to α, β and γ by differentiating the expressions given in S3.2;
d2 needs no linearization and is computed directly as d2 = tx·a1 + ty·b1 + tz·c1 + d1;
The linearized observation equations of the indirect adjustment model are rewritten in the form of the least squares error equation
V = AX - L;
where V = (Va2 Vb2 Vc2 Vd2)′ is the vector of corrections, X = (dα dβ dγ dtx dty dtz)′ is the vector of parameter corrections, A is the coefficient matrix of partial derivatives, and L is the misclosure vector, i.e. the difference between the observed plane parameters (a2, b2, c2, d2) and their values computed from the current parameter estimates;
S3.4, solve the parameters by the total least squares method:
Assume that the observation error of the observation vector L is e and that the error of the coefficient matrix A is EA; the error equation is then written as
(A + EA)X = L + e,
that is, ([A L] + E)·(X′, -1)′ = 0, where E = [EA e];
The optimization criterion is min‖E‖F = min‖[EA e]‖F, with ‖·‖F denoting the Frobenius norm;
The problem is solved by singular value decomposition of the augmented matrix [A L]:
[A L] = UΣV′, where V is the (m+1)×(m+1) orthogonal matrix of right singular vectors (m+1 being the number of columns of [A L], i.e. the number of unknown parameters plus one);
The estimate of the parameter vector X is obtained from the last column v of V as X = -v(1:m)/v(m+1), i.e. the first m entries of the last column divided by the negative of its last entry;
S3.5, evaluate the correlation of the planes to be registered according to the corrections of the adjustment result: the larger the correction, the weaker the correlation and the worse the plane pair; plane pairs that do not meet the requirement must be re-selected;
S4, completing the matching of the point clouds based on the coordinate transformation model.
2. The multi-platform point cloud matching method based on surface features according to claim 1, characterized in that in step S1, during the data processing stage, the original point cloud data of the different platforms are filtered and denoised, and the preprocessed point cloud coordinate system is converted into a local coordinate system that takes the center of gravity as the coordinate origin and keeps the coordinate axis directions unchanged; the local coordinate origin is computed as [x0, y0, z0] = (1/m)·Σ[xi, yi, zi] (summed over i = 1..m), where [x0, y0, z0] is the origin of the local coordinate system, [xi, yi, zi] is the coordinate of the i-th point and m is the number of points; the transformed point coordinates are computed as [x', y', z'] = [xi, yi, zi] - [x0, y0, z0], where [x', y', z'] are the transformed point coordinates.
3. The multi-platform point cloud matching method based on surface features according to claim 1, characterized in that step S2 comprises the following sub-steps:
S2.1, set the relevant parameters: the number of iterations n, the valid-point proportion threshold p and the distance threshold t;
S2.2, randomly select 3 non-collinear points from the point cloud set and compute the corresponding plane equation a1·x + b1·y + c1·z + d1 = 0, where (a1, b1, c1) is the unit normal vector of the plane and |d1| is the distance from the origin to the plane;
S2.3, compute the distance from every observed point to the plane, dj = |a1·xj + b1·yj + c1·zj + d1|; if dj ≤ t, the point is considered a valid point (inlier), otherwise it is an invalid point;
S2.4, compute the proportion f of valid points in plane M1 relative to the total number of points; if f ≥ p, plane M1 is considered a valid plane, otherwise an invalid plane; this completes the self-flatness evaluation of the planar point cloud;
S2.5, repeat the sampling for the set number of iterations n and select the plane with the highest valid-point proportion as the final fitted plane;
S2.6, compute the parameters of the plane fitted in S2.5 with the eigenvalue method, obtaining the plane parameter equation Ax + By + Cz + D = 0, where (A, B, C, D) are the plane parameters;
S2.7, uniformly select k ≥ 3 pairs of mutually non-parallel planes in the point cloud data of the different platforms, define the positive direction of the normal vectors so that the plane parameters are uniquely determined, and compute the plane parameters (A, B, C, D) of each plane.
4. The multi-platform point cloud matching method based on surface features according to claim 1, characterized in that step S4 comprises the following sub-steps:
S4.1, transform the point clouds of the different platforms with the solved rotation and translation parameters; the transformation model is [x2, y2, z2]′ = R(α, β, γ)·[x′, y′, z′]′ + [tx, ty, tz]′, where [x′, y′, z′] are the local coordinates from step S1;
S4.2, transform the converted point cloud from the local coordinate system back to the original coordinate system to complete the multi-platform point cloud matching based on surface features, using [x, y, z] = [x2, y2, z2] + [x0, y0, z0], where [x, y, z] are the final coordinates after the matching transformation.
Application CN202010891648.8A (priority date 2020-08-31, filing date 2020-08-31): Multi-platform point cloud matching method based on surface features. Status: Active. Granted publication: CN112132875B (en).

Priority Applications (1)

Application Number: CN202010891648.8A (granted as CN112132875B); Priority Date: 2020-08-31; Filing Date: 2020-08-31; Title: Multi-platform point cloud matching method based on surface features

Applications Claiming Priority (1)

Application Number: CN202010891648.8A (granted as CN112132875B); Priority Date: 2020-08-31; Filing Date: 2020-08-31; Title: Multi-platform point cloud matching method based on surface features

Publications (2)

Publication Number Publication Date
CN112132875A (en) 2020-12-25
CN112132875B (en) 2023-07-28

Family

ID=73847647

Family Applications (1)

Application Number: CN202010891648.8A (Active, granted as CN112132875B); Priority Date: 2020-08-31; Filing Date: 2020-08-31; Title: Multi-platform point cloud matching method based on surface features

Country Status (1)

Country Link
CN (1) CN112132875B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113050103A (en) * 2021-02-05 2021-06-29 上海擎朗智能科技有限公司 Ground detection method, device, electronic equipment, system and medium
CN114755666B (en) * 2022-06-01 2022-09-13 苏州一径科技有限公司 Point cloud expansion evaluation method, device and equipment
CN115830080B (en) * 2022-10-27 2024-05-03 上海神玑医疗科技有限公司 Point cloud registration method and device, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105427248A (en) * 2014-09-11 2016-03-23 富泰华工业(深圳)有限公司 Point cloud registration processing system and method
CN109727308A (en) * 2017-10-30 2019-05-07 三纬国际立体列印科技股份有限公司 The three-dimensional point cloud model generating device and generation method of entity article

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012141235A1 (en) * 2011-04-13 2012-10-18 株式会社トプコン Three-dimensional point group position data processing device, three-dimensional point group position data processing system, three-dimensional point group position data processing method and program
CN106570823A (en) * 2016-10-11 2017-04-19 山东科技大学 Planar feature matching-based point cloud crude splicing method
CN108090960A (en) * 2017-12-25 2018-05-29 北京航空航天大学 A kind of Object reconstruction method based on geometrical constraint
CN109377521A (en) * 2018-09-11 2019-02-22 武汉大学 Terrestrial Laser scanner data acquire midpoint to the point cloud registration method of best fit plane
CN109946701A (en) * 2019-03-26 2019-06-28 新石器慧通(北京)科技有限公司 A kind of cloud coordinate transformation method and device
CN110443836A (en) * 2019-06-24 2019-11-12 中国人民解放军战略支援部队信息工程大学 A kind of point cloud data autoegistration method and device based on plane characteristic
CN110910454A (en) * 2019-10-11 2020-03-24 华南农业大学 Automatic calibration registration method of mobile livestock three-dimensional reconstruction equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
An automatic extraction method for road surface potholes in vehicle-borne laser scanning point clouds; Yang Lei et al.; Engineering of Surveying and Mapping; Vol. 29, No. 1; pp. 66-71 *
A registration algorithm for terrestrial LiDAR point clouds based on planar features; Zhang Dong; Huang Teng; Science of Surveying and Mapping; Vol. 40, No. 11; pp. 146-149 *
A fusion method for vehicle-borne laser point clouds and sequential panoramic images; Lu Xiushan et al.; Chinese Journal of Lasers; Vol. 45, No. 5; pp. 0510004-1 to 0510004-8 *

Also Published As

Publication number Publication date
CN112132875A (en) 2020-12-25

Similar Documents

Publication Publication Date Title
CN112132875B (en) Multi-platform point cloud matching method based on surface features
CN111563442B (en) Slam method and system for fusing point cloud and camera image data based on laser radar
CN111136660B (en) Robot pose positioning method and system
CN112017220B (en) Point cloud accurate registration method based on robust constraint least square algorithm
CN114862932B (en) BIM global positioning-based pose correction method and motion distortion correction method
CN112257722B (en) Point cloud fitting method based on robust nonlinear Gaussian-Hermer model
CN110487286B (en) Robot pose judgment method based on point feature projection and laser point cloud fusion
CN112907735B (en) Flexible cable identification and three-dimensional reconstruction method based on point cloud
CN107330927B (en) Airborne visible light image positioning method
CN114140761A (en) Point cloud registration method and device, computer equipment and storage medium
CN115761303A (en) Ground object classification method based on airborne laser radar point cloud and remote sensing image data
CN116758234A (en) Mountain terrain modeling method based on multipoint cloud data fusion
CN116189006A (en) Remote sensing image building extraction method supporting three-dimensional data
CN114111791A (en) Indoor autonomous navigation method and system for intelligent robot and storage medium
CN111553954B (en) Online luminosity calibration method based on direct method monocular SLAM
CN117333846A (en) Detection method and system based on sensor fusion and incremental learning in severe weather
WO2021063756A1 (en) Improved trajectory estimation based on ground truth
CN116823929A (en) Cross-modal matching positioning method and system based on visual image and point cloud map
CN115542362A (en) High-precision space positioning method, system, equipment and medium for electric power operation site
Chang et al. Robust accurate LiDAR-GNSS/IMU self-calibration based on iterative refinement
CN116342621A (en) Geometric parameter identification method and system based on three-dimensional reconstruction of space moving target
JP6761388B2 (en) Estimator and program
CN114882119A (en) Camera external parameter calibration method and device, computer equipment and storage medium
CN111811501B (en) Trunk feature-based unmanned aerial vehicle positioning method, unmanned aerial vehicle and storage medium
CN114004949A (en) Airborne point cloud assisted mobile measurement system arrangement parameter calibration method and system

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant