CN108921895A - Sensor relative pose estimation method

Sensor relative pose estimation method

Info

Publication number
CN108921895A
CN108921895A (application number CN201810600704.0A)
Authority
CN
China
Prior art keywords
point
matching
characteristic
point cloud
screening
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810600704.0A
Other languages
Chinese (zh)
Other versions
CN108921895B (en)
Inventor
朱效洲
曹璐
姚雯
陈小前
赵勇
白玉铸
王祎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Defense Technology Innovation Institute PLA Academy of Military Science
Original Assignee
National Defense Technology Innovation Institute PLA Academy of Military Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Defense Technology Innovation Institute PLA Academy of Military Science filed Critical National Defense Technology Innovation Institute PLA Academy of Military Science
Priority to CN201810600704.0A priority Critical patent/CN108921895B/en
Publication of CN108921895A publication Critical patent/CN108921895A/en
Application granted granted Critical
Publication of CN108921895B publication Critical patent/CN108921895B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a sensor relative pose estimation method, comprising the following steps: a preprocessing step, in which feature points are extracted and described in the previous-frame and current-frame color images using a 2D feature detector, the feature points are matched and screened to establish correspondences, a 2D-3D information association set is formed using the depth information of the feature points in the previous-frame depth map, the previous-frame and current-frame depth maps are converted into a source point cloud and a target point cloud respectively, and the point clouds are down-sampled; a point cloud correspondence construction step, in which correspondences between the source point cloud and the target point cloud are constructed to form a corresponding point set; and an error function construction and solution step, in which an error function is constructed from the 2D-3D information association set and the corresponding point set and solved by optimization, until a termination condition is met; if the termination condition is not met, the point cloud correspondence construction is carried out again. Based on the above method, for the sensor relative pose estimation problem in close-range perception of space targets, the computational accuracy is significantly improved and the robustness is significantly improved.

Description

Sensor relative pose estimation method
Technical field
The invention belongs to the field of close-range perception of space targets, and in particular relates to a sensor relative pose estimation method.
Background technique
The Iterative Closest Point (ICP) algorithm was first proposed by Besl et al. in the early 1990s for estimating the transformation between 3D point clouds. After more than two decades of development, it is now widely used in three-dimensional reconstruction, target recognition, robot navigation and other fields. Given a source point cloud \mathcal{P} = \{P_i\}, a target point cloud \mathcal{Q} = \{Q_j\}, the transformation T to be solved with rotational component R \in SO(3) and translational component t \in \mathbb{R}^3, and an initial value, the ICP algorithm iterates the following two steps until convergence or until a given number of iterations is reached:
(1) For every point P_i in the source point cloud, search the target point cloud for its nearest neighbor, i.e. the point with the smallest distance to it, to establish the corresponding point set C:

C = \{ (P_i, Q_{j_i}) \mid j_i = \arg\min_j \| R P_i + t - Q_j \|_2 \}   (1)

where \| \cdot \|_2 denotes the Frobenius norm of a vector or matrix without distinction, abbreviated as \| \cdot \| when no ambiguity is caused.
(2) Estimate the transformation according to the current correspondences.
Besl et al. use the point-to-point distance, i.e. the Euclidean distance between each transformed source point and its nearest neighbor, to build the error function

E(R, t) = \sum_{(P_i, Q_{j_i}) \in C} \| R P_i + t - Q_{j_i} \|_2^2   (2)

and take the R and t that minimize it as the current estimate.
Equation (2) has a closed-form solution and can also be solved by iterative optimization. The closed-form solution is usually obtained with methods such as singular value decomposition, unit quaternions, orthogonal matrices or dual quaternions. Eggert et al. compared the above four methods and found that the differences between their results under real-world noise conditions are not significant.
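As background illustration only, one iteration of this classical ICP can be sketched as follows; this is a minimal sketch assuming NumPy and SciPy are available, with the SVD-based closed-form solution standing in for the methods listed above (the function name and array layout are illustrative, not taken from the patent):

import numpy as np
from scipy.spatial import cKDTree

def icp_step(source, target, R, t):
    """One classical ICP iteration: nearest-neighbor association, then closed-form SVD update."""
    transformed = source @ R.T + t                      # apply the current estimate
    tree = cKDTree(target)
    _, idx = tree.query(transformed)                    # nearest neighbor of each source point
    matched = target[idx]
    # Closed-form solution of Eq. (2) via singular value decomposition
    mu_s, mu_t = transformed.mean(axis=0), matched.mean(axis=0)
    H = (transformed - mu_s).T @ (matched - mu_t)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R_delta = Vt.T @ D @ U.T                            # incremental rotation
    t_delta = mu_t - R_delta @ mu_s
    return R_delta @ R, R_delta @ t + t_delta           # composed updated pose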
One problem of the ICP algorithm is that the large number of points in the point clouds makes the computation expensive; some methods reduce the number of points through preprocessing such as random sampling or consistent sampling to accelerate the computation. Another problem of the ICP algorithm is its strong dependence on the initial value: when the transformation between the two point clouds is small, simply using the identity matrix I as the initial value already yields satisfactory results, but when the transformation between the two point clouds is large, the identity matrix is no longer suitable as the initial value and the algorithm easily falls into a local minimum. A common way to solve this problem is a coarse-to-fine strategy, in which the preprocessing stage provides a rough but reasonable initial value by different means. Zhang and Jost et al. down-sample the point clouds at different compression levels and then estimate hierarchically, using the result estimated at a higher compression level as the initial value for the estimation at a lower compression level. Some methods first use the 3D Scale Invariant Feature Transform (SIFT), the Normal Aligned Radial Feature (NARF) and similar detectors to detect 3D feature points, then describe the detected points with Point Feature Histograms (PFH), the Signature of Histograms of Orientations (SHOT), the Binary Robust Appearance and Normals Descriptor (BRAND) and the like, and finally complete feature point matching and use the matched points to estimate a reasonable initial value.
In the stage of establishing correspondences between the point clouds, some methods use spatial data structures such as k-d trees and octrees to accelerate the nearest-neighbor search; common open-source implementations include the Approximate Nearest Neighbors (ANN) library, the Fast Library for Approximate Nearest Neighbors (FLANN), libnabo and others. Unlike the traditional ICP algorithm, which relies only on geometric information, some methods also introduce additional information to search for nearest neighbors, which is especially common in methods using RGBD data. For example, Johnson and Ji et al. introduce YIQ color information, Men et al. introduce hue information, and Korn et al. introduce LAB color information.
In terms of error function construction, researchers have also explored metrics other than the point-to-point distance. Chen et al., contemporaries of Besl et al., proposed using the point-to-plane distance, i.e. the Euclidean distance between the transformed source point and the local tangent plane of the target point, which exploits the normal information of the target point. Segal et al. then proposed using the plane-to-plane distance, which exploits the normal information of both the source point and the target point. However, existing methods still suffer from problems such as low estimation accuracy and weak robustness, and new methods are needed to solve these problems.
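For reference, the two distance metrics mentioned above are commonly written as follows (standard formulations from the ICP literature, summarized here for clarity; n_i denotes the normal at the target point Q_i):

E_{point-to-point}(R, t) = \sum_i \| R P_i + t - Q_i \|_2^2

E_{point-to-plane}(R, t) = \sum_i \left( n_i^{\top} (R P_i + t - Q_i) \right)^2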
Summary of the invention
The object of the present invention is achieved through the following technical solutions.
The present invention provides a sensor relative pose estimation method, comprising the following steps:
A preprocessing step: feature points are extracted and described in the previous-frame color image and the current-frame color image using a 2D feature detector; the feature points are matched and screened to establish correspondences between them, forming a 2D-3D information association set; the previous-frame depth map and the current-frame depth map are converted into a source point cloud and a target point cloud respectively, and the source point cloud and the target point cloud are down-sampled.
A point cloud correspondence construction step: correspondences between the source point cloud and the target point cloud are constructed to form a corresponding point set.
An error function construction and solution step: an error function is constructed from the 2D-3D information association set and the corresponding point set and solved by optimization, until a termination condition is met; if the termination condition is not met, the point cloud correspondence construction is carried out again.
Preferably, the preprocessing includes the following steps:
Feature points are extracted and described in the color images using SIFT, which consists of a feature detector and a descriptor; the SIFT detection result contains three pieces of information: the feature point position, the principal direction, and the scale at which the feature was detected.
Bidirectional nearest-neighbor matching, symmetry screening, pixel-distance screening and feature-distance screening are applied to the feature points in sequence to obtain matched point pairs.
For any matched point pair, its image pixel coordinates are determined; if a valid depth value exists at those pixel coordinates in the previous-frame depth map, the 2D-3D information association succeeds and an association set is formed.
The previous-frame depth map and the current-frame depth map are converted into the source point cloud and the target point cloud respectively, and the source point cloud and the target point cloud are down-sampled.
Preferably, the bidirectional nearest-neighbor matching uses the feature distance between feature points as the metric for nearest-neighbor matching, and is divided into forward nearest-neighbor matching and reverse nearest-neighbor matching.
In the forward nearest-neighbor matching, for each p_i^prev in F^prev, the feature point with the smallest feature distance and the feature point with the second-smallest feature distance are found in F^curr as candidates; if the ratio of the smallest to the second-smallest feature distance is less than a given threshold, the match between p_i^prev and the nearest candidate succeeds, otherwise the match does not hold.
The reverse nearest-neighbor matching is analogous to the forward nearest-neighbor matching but in the opposite direction: each p_j^curr establishes matching relationships by reverse matching.
Here F^prev denotes the set of feature points of the previous-frame color image, p_i^prev denotes a feature point in F^prev, F^curr denotes the set of feature points of the current-frame color image, p_j^curr denotes a feature point in F^curr, d_f(p_i^prev, p_j^curr) denotes the feature distance between feature points p_i^prev and p_j^curr, and i, j are positive integers.
Preferably, the symmetry screening includes the following steps:
If p_i^prev establishes a matching relationship with p_j^curr through forward matching, and p_j^curr also establishes a matching relationship with p_i^prev through reverse matching, the symmetry screening passes and the screened matching point set is obtained;
otherwise the symmetry screening fails and the corresponding matching relationship is cancelled.
Preferably, the pixel-distance screening includes the following steps:
For any matched point pair in the screened matching point set, if the pixel distance between the two points is greater than a given threshold, the pixel-distance screening fails and the matching relationship is rejected.
Preferably, the feature-distance screening includes the following steps:
Among all matched points, the matched points with the smallest feature distances are retained in a proportion given by the threshold α, constituting the matching point set.
Preferably, the error function construction includes the following steps:
Using the 2D-3D information association set obtained in the preprocessing and the corresponding point set C obtained in the point cloud correspondence construction, the error function E is constructed as

E(R, t) = \sum_{(P_i, Q_i) \in C} \| R P_i + t - Q_i \|_2^2 + w \sum_k \| p_{proj,k} - p_k^{curr} \|_2^2,   with   \tilde{p}_{proj,k} \propto K (R P_k + t),

where the error function is composed of a 2D term E_2D and a 3D term E_3D, the second sum being taken over the 2D-3D information association set; \tilde{p} = [p^T, 1]^T denotes the homogeneous form of a variable p; K is the calibrated camera intrinsic matrix; w is the 2D weight; \mathcal{P} denotes the source point cloud and \mathcal{Q} the target point cloud; p_proj is the projection of the 3D space point P corresponding to the 2D feature point p_prev after it is transformed into the current-frame image; t \in \mathbb{R}^3 is the translational component; and d is the depth value corresponding to the feature point.
Preferably, the error function solution includes the following steps:
The pose is parameterized using the Lie algebra, and the problem is expressed in the following unconstrained least-squares form:

\min_{\omega, t} E(\exp([\omega]_\times), t)

where ω is the parameterization of the rotation matrix R, the two being in one-to-one correspondence; \hat{\omega} stands for [\omega]_\times, and [\cdot]_\times denotes the skew-symmetric (antisymmetric) matrix generated by a vector; \tilde{p} = [p^T, 1]^T denotes the homogeneous form of a variable p, and \tilde{p}_{proj} is accordingly the homogeneous form of p_{proj}.
Preferably, optimizing the least-squares problem requires computing the Jacobian matrix of the error function; the 2D Jacobian matrix is obtained by the chain rule as the product of the Jacobian of the image projection with respect to the transformed 3D point and the Jacobian of the transformed point with respect to the pose,
where R is the rotation matrix.
Preferably, optimizing the least-squares problem requires computing the Jacobian matrix of the error function; the 3D Jacobian matrix is:
[ I  -[R·P + t]_\times ]
where I is the 3×3 identity matrix.
The advantages of the invention are as follows:
The present invention provides a sensor relative pose estimation method that improves the iterative closest point algorithm. The algorithm includes preprocessing, point cloud correspondence construction, and error function construction and solution, and uses the 2D-3D information association set and the corresponding point set to construct the error function and solve it by optimization. Based on the above algorithm, for the sensor relative pose estimation problem in close-range perception of space targets, the computational accuracy is significantly improved and the robustness is significantly improved.
Description of the drawings
By reading the following detailed description of the preferred embodiments, various other advantages and benefits will become clear to those of ordinary skill in the art. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be considered a limitation of the invention. Throughout the drawings, the same reference numerals denote the same parts. In the drawings:
Figure 1 shows the framework of the iterative closest point algorithm fusing 2D-3D information according to an embodiment of the present invention;
Figure 2 shows feature extraction and description in a color image according to an embodiment of the present invention: (a) color image; (b) feature extraction result using SIFT;
Figure 3 shows the feature matching scheme according to an embodiment of the present invention: (a) forward nearest-neighbor matching; (b) reverse nearest-neighbor matching; (c) symmetry screening;
Figure 4 shows the feature matching and screening results according to an embodiment of the present invention: (a) forward nearest-neighbor matching result; (b) symmetry screening result; (c) pixel-distance screening result; (d) feature-distance screening result;
Figure 5 shows the 2D-3D information association according to an embodiment of the present invention;
Figure 6 shows voxel-grid down-sampling of a point cloud according to an embodiment of the present invention;
Figure 7 shows the error function construction according to an embodiment of the present invention.
Specific embodiments
Illustrative embodiments of the disclosure are described more fully below with reference to the accompanying drawings. Although illustrative embodiments of the disclosure are shown in the drawings, it should be understood that the disclosure may be implemented in various forms and should not be limited by the embodiments set forth here; rather, these embodiments are provided so that the disclosure will be thoroughly understood and its scope fully conveyed to those skilled in the art.
According to an embodiment of the present invention, a sensor relative pose estimation method is proposed. Using the 2D images and 3D point clouds acquired by an RGBD camera, a modified ICP that fuses 2D-3D information, referred to as 2D-3D ICP, is proposed for sensor relative pose estimation. As shown in Figure 1, the algorithm framework is divided into three parts: preprocessing, point cloud correspondence construction, and error function construction and solution. First, feature points are extracted in the previous-frame and current-frame color images using a 2D feature detector and described with a feature descriptor. On this basis, feature matching and screening are carried out to establish correspondences between the feature points, forming the matched point pair set containing the 2D information. The 3D information from the depth map is then associated with it to form the 2D-3D information association set. At the same time, according to the relationship between the camera coordinate system and the image pixel coordinate system, the previous-frame and current-frame depth maps are converted into the source point cloud and the target point cloud respectively; after the point clouds are down-sampled, the correspondences between the two point clouds are constructed to form the corresponding point set C. Finally, the 2D-3D information association set and the corresponding point set C are used to construct the error function E, and the relative pose is solved by optimization until the termination condition is met. The point cloud correspondence construction is completed with FLANN: for every point P_i in the source point cloud, its nearest neighbor is searched using a randomized k-d tree index. Compared with the brute-force approach of traversing every point of the target point cloud to find the nearest neighbor, this method effectively improves the search efficiency.
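As an illustration of the point cloud correspondence construction, the following sketch builds the corresponding point set with a k-d tree nearest-neighbor query; SciPy's cKDTree is used here as a stand-in for the FLANN randomized k-d tree named in the text, and the optional distance gate max_dist is an illustrative addition:

import numpy as np
from scipy.spatial import cKDTree

def build_correspondences(source, target, R, t, max_dist=None):
    # Index the target cloud once, then query the nearest neighbor of every
    # transformed source point (stand-in for the FLANN randomized k-d tree).
    tree = cKDTree(target)
    dist, idx = tree.query(source @ R.T + t)
    return [(source[i], target[idx[i]]) for i in range(len(source))
            if max_dist is None or dist[i] <= max_dist]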
The contribution of the invention lies mainly in the preprocessing and in the error function construction and solution, which are described in detail below.
1. Preprocessing
1.1 Feature extraction and description
The present invention uses SIFT to extract and describe feature points in the color images. SIFT was first proposed by David G. Lowe and consists of a feature detector and a descriptor. The core ideas of the detector are, first, to construct an image pyramid and detect feature points at different scales, so that it is invariant to scale changes of the image, and second, to compute a principal direction from the local structure of the feature point neighborhood, so that it is invariant to rotations of the image.
The SIFT detection result contains three pieces of information: the feature point position, the principal direction, and the scale at which the feature was detected. As shown in Figure 2, each circle represents a detected feature point: the circle center indicates the feature point position, the circle size is proportional to the scale at detection, and the line inside the circle indicates the principal direction of the feature point. SIFT detection remains stable under changes of image brightness and viewpoint, and it also performs well in repeatability, robustness and feature point localization accuracy. The SIFT descriptor divides the feature point neighborhood into 16 sub-regions and counts the gradient distribution of the pixels in each sub-region over 8 directions, thereby describing the feature point as a 128-dimensional vector.
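A minimal sketch of this step, assuming OpenCV's SIFT implementation (the patent specifies SIFT itself, not a particular library; the function name is illustrative):

import cv2

def extract_sift(color_image):
    # Detect keypoints (position, principal orientation, scale) and compute
    # 128-dimensional SIFT descriptors on the grayscale version of the color image.
    gray = cv2.cvtColor(color_image, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray, None)
    return keypoints, descriptors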
1.2 Feature matching and screening
After feature detection and description are completed in the two frames, feature matching and screening are carried out to establish correspondences between the feature points, divided into four parts performed in order: bidirectional nearest-neighbor matching, symmetry screening, pixel-distance screening and feature-distance screening. Suppose the sets of feature points detected in the previous-frame and current-frame color images are denoted F^prev = {p_i^prev} and F^curr = {p_j^curr} respectively, where N_prev and N_curr denote the numbers of feature points in the two sets. The feature distance between feature points p_i^prev and p_j^curr is denoted d_f(p_i^prev, p_j^curr) and is their Euclidean distance in the 128-dimensional feature vector space; the pixel distance is denoted d_p(p_i^prev, p_j^curr) and is their Euclidean distance in the image pixel coordinate system.
(1) Bidirectional nearest-neighbor matching
The feature distance between feature points is used as the metric for nearest-neighbor matching, which is carried out in two directions, forward and reverse, producing a forward and a reverse matching point set. In the forward nearest-neighbor matching, for each feature point p_i^prev in F^prev, all feature points in F^curr are traversed to find the one with the smallest feature distance and the one with the second-smallest feature distance as candidates. If the ratio of the smallest to the second-smallest feature distance is less than a given threshold, the match between p_i^prev and the nearest candidate holds; otherwise the match does not hold. As shown in Figure 3(a), for one feature point in F^prev the two candidates with the smallest feature distances fail the threshold test, so the match does not hold, while for another feature point the test is satisfied and the match holds; the established forward matching relationship is indicated by a red arrow. As shown in Figure 3(b), the reverse nearest-neighbor matching is analogous to the forward nearest-neighbor matching but in the opposite direction, i.e. for each feature point p_j^curr in F^curr, every feature point in F^prev is traversed. The established reverse matching relationships are indicated by green arrows in the figure.
(2) Symmetry screening
After the bidirectional nearest-neighbor matching is completed, symmetry screening is applied to reduce the number of mismatches, producing a screened matching point set. If a feature point in F^prev establishes a matching relationship with a feature point in F^curr through forward matching, and that feature point in F^curr also establishes a matching relationship with the same feature point in F^prev through reverse matching, the symmetry screening passes; otherwise the symmetry screening fails and the corresponding matching relationship is cancelled. As shown in Figure 3(c), one pair of feature points is matched both by forward matching and by reverse matching, so the symmetry screening passes; another feature point in F^prev is matched forward to a feature point in F^curr, but that feature point is matched in reverse to a different feature point in F^prev, so the symmetry screening fails.
(3) Pixel-distance screening
After the symmetry screening is completed, pixel-distance screening is applied to further reduce the number of mismatches, producing a further screened matching point set. Considering that the pose change of the camera between the two imaging positions is limited, the change of the image pixel coordinates of the feature points corresponding to the same target in the two frames should also be limited. For any matched point pair, if the pixel distance between the two points is greater than a given threshold, the pixel-distance screening fails and the matching relationship is rejected.
(4) Feature-distance screening
After the pixel-distance screening is completed, feature-distance screening is still required: among all matched points, those with the smaller feature distances are retained in proportion, producing the final matching point set. The feature matching and screening results are shown in Figure 4, where feature points with matching relationships between the two frames are connected by straight lines for display. From (a) it can be seen that using only forward nearest-neighbor matching leaves many mismatched pairs; these mismatched pairs do not correspond to the same target but are matched by mistake because of their similarity in feature space. From (b) it can be seen that symmetry screening of the bidirectional matching results effectively alleviates the mismatches. From (c) it can be seen that pixel-distance screening effectively removes mismatched pairs whose distance in the image pixel coordinate system is too large. From (d) it can be seen that after feature-distance screening the matching result is further improved.
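The four matching and screening steps can be sketched together as follows. This is a hedged, NumPy-based brute-force sketch: the ratio, pixel-distance and retention thresholds are illustrative values rather than the patent's, and the keypoints are assumed to be OpenCV-style objects with a .pt attribute.

import numpy as np

def match_and_screen(kp_prev, des_prev, kp_cur, des_cur,
                     ratio_thresh=0.8, pixel_thresh=100.0, keep_ratio=0.7):
    # Pairwise feature distances in the 128-dimensional descriptor space.
    d = np.linalg.norm(des_prev[:, None, :] - des_cur[None, :, :], axis=2)

    def nearest(dist):
        # For each row: index of the nearest column and ratio of smallest to second-smallest distance.
        order = np.argsort(dist, axis=1)
        rows = np.arange(dist.shape[0])
        ratio = dist[rows, order[:, 0]] / np.maximum(dist[rows, order[:, 1]], 1e-12)
        return order[:, 0], ratio

    fwd, fwd_ratio = nearest(d)      # forward: previous -> current
    bwd, bwd_ratio = nearest(d.T)    # reverse: current -> previous

    matches = []
    for i, j in enumerate(fwd):
        if fwd_ratio[i] >= ratio_thresh or bwd_ratio[j] >= ratio_thresh:
            continue                 # bidirectional nearest-neighbor threshold test failed
        if bwd[j] != i:
            continue                 # symmetry screening failed
        pix = np.linalg.norm(np.array(kp_prev[i].pt) - np.array(kp_cur[j].pt))
        if pix > pixel_thresh:
            continue                 # pixel-distance screening failed
        matches.append((i, j, d[i, j]))

    # Feature-distance screening: keep only the fraction with the smallest feature distances.
    matches.sort(key=lambda m: m[2])
    return matches[: int(keep_ratio * len(matches))]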
1.3 2D-3D information association
After the feature matching and screening are completed, the correspondences between the 2D feature points have been established; they still need to be associated with the 3D information through the depth map. The 2D-3D information association process is shown in Figure 5, where the depth map is displayed in pseudo-color: the redder the color, the larger the depth value; the bluer the color, the smaller the depth value; and regions with invalid depth values are displayed in white. Take any matched point pair and suppose the image pixel coordinates of the feature points in the previous-frame and current-frame color images are (u_prev, v_prev) and (u_curr, v_curr) respectively. If a valid depth value d exists at the image pixel coordinates ([u_prev], [v_prev]) in the previous-frame depth map, where [·] denotes rounding the coordinate value, then the 2D-3D information association succeeds. The finally formed association set can be expressed as the set of associated pairs {(p_k^prev, d_k, p_k^curr)}, k = 1, ..., N_match, where N_match denotes the number of successfully associated matched point pairs.
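A sketch of this association step under the usual pinhole back-projection model; the intrinsic matrix K and the rounding convention follow the description above, while the depth_scale parameter and the variable names are illustrative assumptions:

import numpy as np

def associate_2d_3d(matches, kp_prev, kp_cur, depth_prev, K, depth_scale=1.0):
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    associations = []
    for i, j, _ in matches:
        u, v = kp_prev[i].pt
        d = depth_prev[int(round(v)), int(round(u))] * depth_scale
        if d <= 0:
            continue                    # no valid depth value: association fails for this pair
        # Back-project the previous-frame feature to a 3D point in the previous camera frame.
        P = np.array([(u - cx) * d / fx, (v - cy) * d / fy, d])
        associations.append((P, np.array(kp_cur[j].pt)))
    return associations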
1.4 Point cloud generation and down-sampling
The previous-frame and current-frame depth images are converted into the source point cloud and the target point cloud respectively. Then, according to the requirements, the point clouds are down-sampled using a voxel grid to reduce the number of points they contain and thereby increase the computation speed of the algorithm. The down-sampling process is shown in Figure 6: the 3D space is divided by a voxel grid, each point in the point cloud is regarded as a particle, and the points within each grid cell are approximated by their centroid.
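A minimal voxel-grid down-sampling sketch in NumPy (the voxel size is an illustrative parameter; libraries such as PCL or Open3D provide equivalent filters):

import numpy as np

def voxel_downsample(points, voxel_size=0.05):
    # Assign each point to a voxel, then replace the points of each occupied voxel by their centroid.
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0, return_inverse=True, return_counts=True)
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)
    return centroids / counts[:, None]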
2. Error function construction and solution
2.1 Error function construction
The error function E is constructed using the 2D-3D information association set obtained in the preprocessing and the corresponding point set C obtained in the point cloud correspondence construction. The preprocessing is carried out only once, while the correspondence construction and the error function construction and solution are carried out iteratively: first, an initial value of the pose to be solved is given and the point cloud correspondence construction is performed; the error function is then constructed from the obtained corresponding point set and solved by optimization to update the pose to be solved; a judgement is made afterwards, and if the termination condition is met the iteration ends, otherwise the point cloud correspondence construction is performed again with the updated pose.
As shown in the following equation, the error function is composed of a 2D term E_2D and a 3D term E_3D:

E(R, t) = \sum_{(P_i, Q_i) \in C} \| R P_i + t - Q_i \|_2^2 + w \sum_k \| p_{proj,k} - p_k^{curr} \|_2^2,   with   \tilde{p}_{proj,k} \propto K (R P_k + t),

where the second sum is taken over the 2D-3D information association set; \tilde{p} = [p^T, 1]^T denotes the homogeneous form of a variable p; K is the calibrated camera intrinsic matrix; and w is the 2D weight. As shown in Figure 7, the 3D term E_3D is consistent with the error function of the traditional ICP algorithm and measures, in 3D space, the Euclidean distance between each transformed source point and its nearest neighbor; the 2D term E_2D measures, in the 2D image, the Euclidean distance between p_proj and the current-frame feature point, where p_proj is the projection of the 3D space point P corresponding to the 2D feature point p_prev after it is transformed into the current-frame image, t \in \mathbb{R}^3 is the translational component, and d is the depth value corresponding to the feature point.
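A hedged sketch of evaluating this combined error for a candidate pose (R, t); here C is the corresponding point set (as produced by the correspondence sketch above), A stands for the 2D-3D information association set (as produced by the association sketch), and w is the 2D weight; these parameter names are illustrative.

import numpy as np

def residuals(R, t, C, A, K, w):
    res = []
    for P_src, Q_tgt in C:                  # 3D point-to-point residuals
        res.append(R @ P_src + t - Q_tgt)
    for P, p_cur in A:                      # 2D reprojection residuals, weighted by sqrt(w)
        X = K @ (R @ P + t)
        res.append(np.sqrt(w) * (X[:2] / X[2] - p_cur))
    return np.concatenate(res)

# The error function value is the squared norm of this stacked residual vector,
# i.e. E = E_3D + w * E_2D.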
2.2 Error function solution
To obtain the optimal R and t, the following constrained least-squares problem needs to be solved: minimize E(R, t) subject to R being a rotation matrix, i.e. R^T R = I and det(R) = 1.
On the one hand, applying the constraint through Lagrange multipliers would increase the complexity of the optimization problem and lead to more computation; on the other hand, representing R by a 3×3 orthogonal matrix requires 9 variables, which is an over-parameterized representation and adversely affects the optimization process. According to Lie group theory, Lie groups such as the special orthogonal group SO(3) and the special Euclidean group SE(3) are also smooth manifolds, and their corresponding Lie algebras can be used for local approximation. The Lie algebra constitutes the tangent space of the Lie group at the identity element; derivatives can be computed on it, and it is also convenient for interpolation during optimization. Furthermore, the Lie group and the Lie algebra can be mapped to each other through the exponential and logarithm operations.
As a minimal parameterization, the Lie algebra needs only 3 parameters to represent a rotation R ∈ SO(3): if ω ∈ so(3), identified with a vector in R^3, then

R = \exp([\omega]_\times)

where [\cdot]_\times denotes the skew-symmetric (antisymmetric) matrix generated by a vector. Similarly, only 6 parameters are needed to represent a transformation matrix T ∈ SE(3) with the Lie algebra: if ξ ∈ se(3), identified with a vector in R^6, then

T = \exp(\hat{\xi}).
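For concreteness, these exponential maps have the standard closed forms below (general Lie group results quoted for reference rather than taken from the patent), assuming ξ = [ρ^T, ω^T]^T and θ = ||ω||:

\exp([\omega]_\times) = I + \frac{\sin\theta}{\theta} [\omega]_\times + \frac{1 - \cos\theta}{\theta^2} [\omega]_\times^2

\exp(\hat{\xi}) = \begin{bmatrix} \exp([\omega]_\times) & J \rho \\ 0^{\top} & 1 \end{bmatrix},
\qquad J = I + \frac{1 - \cos\theta}{\theta^2} [\omega]_\times + \frac{\theta - \sin\theta}{\theta^3} [\omega]_\times^2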
Therefore the present invention parameterizes the pose with the Lie algebra, and the problem can then be rearranged into the following unconstrained least-squares form, which is easy to solve:

\min_{\omega, t} E(\exp([\omega]_\times), t)

where ω is the parameterization of the rotation matrix R, the two being in one-to-one correspondence; \hat{\omega} stands for [\omega]_\times, and [\cdot]_\times denotes the skew-symmetric matrix generated by a vector; \tilde{p} = [p^T, 1]^T denotes the homogeneous form of a variable p, and \tilde{p}_{proj} is accordingly the homogeneous form of p_{proj}.
In the process of solving the least-squares problem by optimization, the key is to compute the Jacobian matrix of the error function. According to the differentiation properties of Lie groups and Lie algebras, the 2D Jacobian matrix is obtained by the chain rule as the product of the Jacobian of the image projection with respect to the transformed 3D point and the Jacobian of the transformed point with respect to the pose,
where R is the rotation matrix.
The 3D Jacobian matrix is
[ I  -[R·P + t]_\times ]   (9)
where I is the 3×3 identity matrix.
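The following sketch puts the pieces together in one Gauss-Newton update. The pinhole-projection Jacobian for the 2D term, the left-perturbation parameterization and the (translation, rotation) ordering of the 6-vector are standard assumptions made for illustration and are not asserted to be the patent's exact derivation; C and A are the same structures used in the residual sketch above.

import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def expm_so3(w):
    # Rodrigues formula for the SO(3) exponential map.
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    W = skew(w)
    return np.eye(3) + np.sin(theta) / theta * W + (1 - np.cos(theta)) / theta**2 * W @ W

def gauss_newton_step(R, t, C, A, K, w):
    fx, fy = K[0, 0], K[1, 1]
    H, b = np.zeros((6, 6)), np.zeros(6)
    for P_src, Q_tgt in C:                         # 3D terms: residual Pc - Q, Jacobian [I, -[Pc]x]
        Pc = R @ P_src + t
        J = np.hstack([np.eye(3), -skew(Pc)])
        r = Pc - Q_tgt
        H += J.T @ J
        b += J.T @ r
    for P, p_cur in A:                             # 2D terms: reprojection residual, chain-rule Jacobian
        Pc = R @ P + t
        X, Y, Z = Pc
        Jproj = np.array([[fx / Z, 0.0, -fx * X / Z**2],
                          [0.0, fy / Z, -fy * Y / Z**2]])
        J = Jproj @ np.hstack([np.eye(3), -skew(Pc)])
        u = K @ Pc
        r = u[:2] / u[2] - p_cur
        H += w * (J.T @ J)
        b += w * (J.T @ r)
    dx = np.linalg.solve(H, -b)                    # solve the normal equations H dx = -b
    dt, dw = dx[:3], dx[3:]
    R_new = expm_so3(dw) @ R                       # left-perturbation update, first order in dx
    t_new = expm_so3(dw) @ t + dt
    return R_new, t_new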
The present invention provides a sensor relative pose estimation method fusing 2D-3D information based on the iterative closest point algorithm. The algorithm includes preprocessing, point cloud correspondence construction, and error function construction and solution, and uses the 2D-3D information association set and the corresponding point set to construct the error function and solve it by optimization. Based on the above algorithm, for the sensor relative pose estimation problem in close-range perception of space targets, the computational accuracy is significantly improved and the robustness is significantly improved.
The above are only illustrative specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A sensor relative pose estimation method, characterized in that it comprises the following steps:
a preprocessing step, in which feature points are extracted and described in the previous-frame color image and the current-frame color image using a 2D feature detector, the feature points are matched and screened to establish correspondences between the feature points, a 2D-3D information association set is formed, the previous-frame depth map and the current-frame depth map are converted into a source point cloud and a target point cloud respectively, and the source point cloud and the target point cloud are down-sampled;
a point cloud correspondence construction step, in which correspondences between the source point cloud and the target point cloud are constructed to form a corresponding point set;
an error function construction and solution step, in which an error function is constructed from the 2D-3D information association set and the corresponding point set and solved by optimization, until a termination condition is met; if the termination condition is not met, the point cloud correspondence construction is carried out again.
2. The method according to claim 1, characterized in that the preprocessing comprises the following steps:
feature points are extracted and described in the color images using SIFT, the SIFT comprising a feature detector and a descriptor, and the SIFT detection result containing three pieces of information: the feature point position, the principal direction, and the scale at which the feature was detected;
bidirectional nearest-neighbor matching, symmetry screening, pixel-distance screening and feature-distance screening are applied to the feature points in sequence to obtain matched point pairs;
for any matched point pair, its image pixel coordinates are determined; if a valid depth value exists at the image pixel coordinates in the previous-frame depth map, the 2D-3D information association succeeds and an association set is formed;
the previous-frame depth map and the current-frame depth map are converted into the source point cloud and the target point cloud respectively, and the source point cloud and the target point cloud are down-sampled.
3. The method according to claim 2, characterized in that the bidirectional nearest-neighbor matching uses the feature distance between the feature points as the metric for nearest-neighbor matching and is divided into forward nearest-neighbor matching and reverse nearest-neighbor matching;
in the forward nearest-neighbor matching, for each p_i^prev in F^prev, the feature point with the smallest feature distance and the feature point with the second-smallest feature distance are found in F^curr as candidates; if the ratio of the smallest to the second-smallest feature distance is less than a given threshold, the match between p_i^prev and the nearest candidate succeeds, otherwise the match does not hold;
the reverse nearest-neighbor matching is analogous to the forward nearest-neighbor matching but in the opposite direction, with each p_j^curr establishing matching relationships by reverse matching;
wherein F^prev denotes the set of feature points of the previous-frame color image, p_i^prev denotes a feature point in F^prev, F^curr denotes the set of feature points of the current-frame color image, p_j^curr denotes a feature point in F^curr, d_f(p_i^prev, p_j^curr) denotes the feature distance between feature points p_i^prev and p_j^curr, and i, j are positive integers.
4. The method according to claim 3, characterized in that the symmetry screening comprises the following steps:
if p_i^prev establishes a matching relationship with p_j^curr through forward matching, and p_j^curr also establishes a matching relationship with p_i^prev through reverse matching, the symmetry screening passes and the screened matching point set is obtained;
otherwise the symmetry screening fails and the corresponding matching relationship is cancelled.
5. The method according to claim 4, characterized in that the pixel-distance screening comprises the following steps:
for any matched point pair in the screened matching point set, if the pixel distance between the two points is greater than a given threshold, the pixel-distance screening fails and the matching relationship is rejected.
6. The method according to claim 2, characterized in that the feature-distance screening comprises the following steps:
among all matched points, the matched points with the smallest feature distances are retained in a proportion given by the threshold α, constituting the matching point set.
7. The method according to claim 3, characterized in that the error function construction comprises the following steps:
the error function E is constructed using the 2D-3D information association set obtained in the preprocessing and the corresponding point set C obtained in the point cloud correspondence construction:

E(R, t) = \sum_{(P_i, Q_i) \in C} \| R P_i + t - Q_i \|_2^2 + w \sum_k \| p_{proj,k} - p_k^{curr} \|_2^2,   with   \tilde{p}_{proj,k} \propto K (R P_k + t),

wherein the error function is composed of a 2D term E_2D and a 3D term E_3D, the second sum being taken over the 2D-3D information association set; \tilde{p} = [p^T, 1]^T denotes the homogeneous form of a variable p; K is the calibrated camera intrinsic matrix; w is the 2D weight; p_proj is the projection of the 3D space point P corresponding to the 2D feature point p_prev after it is transformed into the current-frame image; t \in \mathbb{R}^3 is the translational component; d is the depth value corresponding to the feature point; and R is the rotation matrix.
8. The method according to claim 7, characterized in that the error function solution comprises the following steps:
the pose is parameterized using the Lie algebra and expressed in the following unconstrained least-squares form:

\min_{\omega, t} E(\exp([\omega]_\times), t)

wherein ω is the parameterization of the rotation matrix R, the two being in one-to-one correspondence; \hat{\omega} stands for [\omega]_\times, and [\cdot]_\times denotes the skew-symmetric matrix generated by a vector; \tilde{p} = [p^T, 1]^T denotes the homogeneous form of a variable p, and \tilde{p}_{proj} is accordingly the homogeneous form of p_{proj}.
9. The method according to claim 8, characterized in that, when the least-squares problem is solved by optimization, the Jacobian matrix of the error function is computed, the 2D Jacobian matrix being the derivative of the 2D reprojection residual with respect to the pose parameters,
wherein R is the rotation matrix.
10. The method according to claim 8, characterized in that, when the least-squares problem is solved by optimization, the Jacobian matrix of the error function is computed, the 3D Jacobian matrix being:
[ I  -[R·P + t]_\times ]
wherein I is the 3×3 identity matrix.
CN201810600704.0A 2018-06-12 2018-06-12 Sensor relative pose estimation method Active CN108921895B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810600704.0A CN108921895B (en) 2018-06-12 2018-06-12 Sensor relative pose estimation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810600704.0A CN108921895B (en) 2018-06-12 2018-06-12 Sensor relative pose estimation method

Publications (2)

Publication Number Publication Date
CN108921895A true CN108921895A (en) 2018-11-30
CN108921895B CN108921895B (en) 2021-03-02

Family

ID=64419193

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810600704.0A Active CN108921895B (en) 2018-06-12 2018-06-12 Sensor relative pose estimation method

Country Status (1)

Country Link
CN (1) CN108921895B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109827578A (en) * 2019-02-25 2019-05-31 中国人民解放军军事科学院国防科技创新研究院 Satellite relative attitude estimation method based on profile similitude
CN110097581A (en) * 2019-04-28 2019-08-06 西安交通大学 Method based on point cloud registering ICP algorithm building K-D tree
CN110188604A (en) * 2019-04-18 2019-08-30 盎锐(上海)信息科技有限公司 Face identification method and device based on 2D and 3D image
CN110458177A (en) * 2019-07-12 2019-11-15 中国科学院深圳先进技术研究院 Acquisition methods, image processing apparatus and the storage medium of image depth information
CN110490932A (en) * 2019-08-21 2019-11-22 东南大学 The infrared coplanar cursor iteration optimization measurement driving boom spatial pose method of monocular
CN110766716A (en) * 2019-09-10 2020-02-07 中国科学院深圳先进技术研究院 Method and system for acquiring information of space unknown moving target
CN111681279A (en) * 2020-04-17 2020-09-18 东南大学 Driving suspension arm space pose measurement method based on improved lie group nonlinear optimization
CN112212889A (en) * 2020-09-16 2021-01-12 北京工业大学 SINS strapdown inertial navigation system shaking base rough alignment method based on special orthogonal group optimal estimation
CN112285734A (en) * 2020-10-30 2021-01-29 北京斯年智驾科技有限公司 Spike-based high-precision alignment method and system for unmanned port container truck
WO2022262160A1 (en) * 2021-06-18 2022-12-22 上海商汤临港智能科技有限公司 Sensor calibration method and apparatus, electronic device, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101645170A (en) * 2009-09-03 2010-02-10 北京信息科技大学 Precise registration method of multilook point cloud
US20160035124A1 (en) * 2013-02-18 2016-02-04 Tata Consultancy Services Limited Segmenting objects in multimedia data
CN105938619A (en) * 2016-04-11 2016-09-14 中国矿业大学 Visual odometer realization method based on fusion of RGB and depth information
CN107220995A (en) * 2017-04-21 2017-09-29 西安交通大学 A kind of improved method of the quick point cloud registration algorithms of ICP based on ORB characteristics of image
CN107292949A (en) * 2017-05-25 2017-10-24 深圳先进技术研究院 Three-dimensional rebuilding method, device and the terminal device of scene
CN107590827A (en) * 2017-09-15 2018-01-16 重庆邮电大学 A kind of indoor mobile robot vision SLAM methods based on Kinect
US20180158235A1 (en) * 2016-12-01 2018-06-07 Here Global B.V. Method and apparatus for generating a cleaned object model for an object in a mapping database

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101645170A (en) * 2009-09-03 2010-02-10 北京信息科技大学 Precise registration method of multilook point cloud
US20160035124A1 (en) * 2013-02-18 2016-02-04 Tata Consultancy Services Limited Segmenting objects in multimedia data
CN105938619A (en) * 2016-04-11 2016-09-14 中国矿业大学 Visual odometer realization method based on fusion of RGB and depth information
US20180158235A1 (en) * 2016-12-01 2018-06-07 Here Global B.V. Method and apparatus for generating a cleaned object model for an object in a mapping database
CN107220995A (en) * 2017-04-21 2017-09-29 西安交通大学 A kind of improved method of the quick point cloud registration algorithms of ICP based on ORB characteristics of image
CN107292949A (en) * 2017-05-25 2017-10-24 深圳先进技术研究院 Three-dimensional rebuilding method, device and the terminal device of scene
CN107590827A (en) * 2017-09-15 2018-01-16 重庆邮电大学 A kind of indoor mobile robot vision SLAM methods based on Kinect

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
XIA SHEN et al.: "Fast RGBD-ICP with bionic vision depth perception model", 2015 IEEE International Conference on Cyber Technology in Automation, Control, and Intelligent Systems *
YONGHUAI LIU et al.: "Evaluating 3D-2D correspondences for accurate camera pose estimation from a single image", SMC'03 Conference Proceedings *
谌夏: "Research on a fast RGB-D ICP algorithm with a bionic vision depth perception model based on depth information", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109827578A (en) * 2019-02-25 2019-05-31 中国人民解放军军事科学院国防科技创新研究院 Satellite relative attitude estimation method based on profile similitude
CN109827578B (en) * 2019-02-25 2019-11-22 中国人民解放军军事科学院国防科技创新研究院 Satellite relative attitude estimation method based on profile similitude
CN110188604A (en) * 2019-04-18 2019-08-30 盎锐(上海)信息科技有限公司 Face identification method and device based on 2D and 3D image
CN110097581A (en) * 2019-04-28 2019-08-06 西安交通大学 Method based on point cloud registering ICP algorithm building K-D tree
CN110458177B (en) * 2019-07-12 2023-04-07 中国科学院深圳先进技术研究院 Method for acquiring image depth information, image processing device and storage medium
CN110458177A (en) * 2019-07-12 2019-11-15 中国科学院深圳先进技术研究院 Acquisition methods, image processing apparatus and the storage medium of image depth information
CN110490932A (en) * 2019-08-21 2019-11-22 东南大学 The infrared coplanar cursor iteration optimization measurement driving boom spatial pose method of monocular
CN110490932B (en) * 2019-08-21 2023-05-09 东南大学 Method for measuring space pose of crane boom through monocular infrared coplanar cursor iteration optimization
CN110766716B (en) * 2019-09-10 2022-03-29 中国科学院深圳先进技术研究院 Method and system for acquiring information of space unknown moving target
CN110766716A (en) * 2019-09-10 2020-02-07 中国科学院深圳先进技术研究院 Method and system for acquiring information of space unknown moving target
CN111681279A (en) * 2020-04-17 2020-09-18 东南大学 Driving suspension arm space pose measurement method based on improved lie group nonlinear optimization
CN111681279B (en) * 2020-04-17 2023-10-31 东南大学 Driving suspension arm space pose measurement method based on improved Liqun nonlinear optimization
CN112212889A (en) * 2020-09-16 2021-01-12 北京工业大学 SINS strapdown inertial navigation system shaking base rough alignment method based on special orthogonal group optimal estimation
CN112285734A (en) * 2020-10-30 2021-01-29 北京斯年智驾科技有限公司 Spike-based high-precision alignment method and system for unmanned port container truck
CN112285734B (en) * 2020-10-30 2023-06-23 北京斯年智驾科技有限公司 Port unmanned set card high-precision alignment method and system based on spike
WO2022262160A1 (en) * 2021-06-18 2022-12-22 上海商汤临港智能科技有限公司 Sensor calibration method and apparatus, electronic device, and storage medium

Also Published As

Publication number Publication date
CN108921895B (en) 2021-03-02

Similar Documents

Publication Publication Date Title
CN108921895A (en) A kind of sensor relative pose estimation method
CN106296693B (en) Based on 3D point cloud FPFH feature real-time three-dimensional space-location method
CN111815757B (en) Large member three-dimensional reconstruction method based on image sequence
CN108369741B (en) Method and system for registration data
Furukawa et al. Manhattan-world stereo
CN104063702B (en) Three-dimensional gait recognition based on shielding recovery and partial similarity matching
CN102236794B (en) Recognition and pose determination of 3D objects in 3D scenes
CN109544599B (en) Three-dimensional point cloud registration method based on camera pose estimation
CN106485690A (en) Cloud data based on a feature and the autoregistration fusion method of optical image
CN103729643A (en) Recognition and pose determination of 3d objects in multimodal scenes
CN106919944A (en) A kind of wide-angle image method for quickly identifying based on ORB algorithms
CN102982334B (en) The sparse disparities acquisition methods of based target edge feature and grey similarity
CN106155299B (en) A kind of pair of smart machine carries out the method and device of gesture control
CN107492107B (en) Object identification and reconstruction method based on plane and space information fusion
CN110310331A (en) A kind of position and orientation estimation method based on linear feature in conjunction with point cloud feature
Lin et al. Research on 3D reconstruction in binocular stereo vision based on feature point matching method
Tamas et al. Relative pose estimation and fusion of omnidirectional and lidar cameras
CN109443319A (en) Barrier range-measurement system and its distance measuring method based on monocular vision
CN117870659A (en) Visual inertial integrated navigation algorithm based on dotted line characteristics
Labatut et al. Hierarchical shape-based surface reconstruction for dense multi-view stereo
Morelli et al. Deep-image-matching: a toolbox for multiview image matching of complex scenarios
CN109785367A (en) Exterior point filtering method and device in threedimensional model tracking
Haala et al. Combining Laser Scanning and Photogrammetry-A Hybrid Approach for Heritage Documentation.
Netz et al. Using specular highlights as pose invariant features for 2d-3d pose estimation
Lyra et al. Development of an efficient 3D reconstruction solution from permissive open-source code

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant