CN112700501B - Underwater monocular subpixel relative pose estimation method - Google Patents


Info

Publication number
CN112700501B
CN112700501B (application CN202011464862.1A)
Authority
CN
China
Prior art keywords
coordinate system
light source
underwater
coordinates
characteristic
Prior art date
Legal status
Active
Application number
CN202011464862.1A
Other languages
Chinese (zh)
Other versions
CN112700501A (en
Inventor
高剑 (Gao Jian)
张元旭 (Zhang Yuanxu)
张福斌 (Zhang Fubin)
张立川 (Zhang Lichuan)
陈依民 (Chen Yimin)
张飞虎 (Zhang Feihu)
Current Assignee
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date
Filing date
Publication date
Application filed by Northwestern Polytechnical University
Priority claimed from application CN202011464862.1A
Publication of CN112700501A
Application granted
Publication of CN112700501B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T7/85: Stereo camera calibration (under G06T7/00 image analysis; G06T7/80 camera calibration)
    • G06T7/13: Edge detection (under G06T7/10 segmentation; edge detection)
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. edges, contours, corners; connectivity analysis (under G06V10/40 extraction of image or video features)
    • G06T2207/10004: Still image; photographic image
    • G06T2207/10012: Stereo images
    • G06T2207/10024: Color image
    • G06T2207/30244: Camera pose (under G06T2207/30 subject of image)

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an underwater monocular subpixel relative pose estimation method. First, a light source target carrying at least 5 characteristic light sources is set on the target docking device, and the three-dimensional coordinates of the characteristic light sources in a world coordinate system are acquired in advance. Second, an underwater monocular vision system acquires an image of the light source target, and the coordinates of each feature point in the image are extracted in the pixel coordinate system. Third, taking the world-coordinate and pixel-coordinate positions of the feature points as inputs, a coarse value of the relative pose matrix cMw of the world coordinate system relative to the camera coordinate system is obtained from the camera intrinsic matrix of the underwater monocular vision system using a coplanar PnP algorithm. Finally, with the coarse value of cMw as the initial value, an RLS optimization algorithm is iterated to obtain the accurate value of cMw. The invention accurately obtains the relative pose information needed to complete autonomous docking of the underwater robot.

Description

Underwater monocular subpixel relative pose estimation method
Technical Field
The invention relates to the technical field of underwater robot vision, in particular to an underwater monocular subpixel relative pose estimation method.
Background
When an underwater robot operates autonomously for a long time, its limited payload and the constraints of underwater communication force it to dock with a recovery device for data transmission and energy replenishment. Scholars at home and abroad have long achieved good results with underwater acoustic-signal docking, but acoustic signals perform well only at long range; at short range they are unstable, low in accuracy and poor in robustness, and are therefore unsuitable for close-range docking operations. To meet the requirements of autonomous docking, close-range pose estimation data are needed to steer the underwater robot into the docking device. With its low cost, high resolution, high frame rate, low noise and low latency, the vision sensor has unique advantages for relative pose estimation in autonomous docking and suits the close-range, high-precision demands of the operation.
Conventional close-range visual docking methods rely on the simplest geometric guidance: they either take the center of the light source target as the guidance point and steer the underwater robot toward that center, or take the image area and perimeter enclosed by the target light source points as the reference and steer the robot according to how that area and perimeter change. These methods have clear limitations and defects. First, simple geometry offers few usable features: it carries only position information and ignores the robot's attitude during docking, so in practice an attitude error can ultimately cause the autonomous docking operation to fail. Second, because of the complexity of the underwater environment, feature extraction from images acquired underwater can suffer large errors or lose features altogether; robustness is extremely poor, and in practice the robot may lose the target or accumulate excessive data error, again causing the autonomous docking operation to fail.
Disclosure of Invention
To solve the problems in the prior art, the invention provides an underwater monocular subpixel relative pose estimation method. For the underwater docking task, a PnP pose estimation algorithm is designed around light source target images acquired by an underwater monocular vision system (a monocular RGB camera) mounted on the underwater robot, and iterative optimization outputs the pose of the underwater robot relative to the target docking device in real time.
Technical proposal
The technical principle of the invention is as follows:
The underwater monocular vision system extracts the two-dimensional pixel coordinates of the center point of each characteristic point light source in the light source target image; taking these center-point pixel coordinates together with the corresponding world coordinates as inputs, the relative pose between the monocular vision system and the light source target is solved with a coplanar PnP algorithm followed by an RLS optimization algorithm.
The method comprises the following specific steps:
Step 1: set a light source target on the target docking device; the light source target carries at least 5 characteristic light sources, the centers of 3 of which lie on the same straight line, and the three-dimensional coordinates of the characteristic light sources in a chosen world coordinate system are obtained in advance;
Step 2: acquire a light source target image with the underwater monocular vision system, and extract the coordinates of each feature point in the image in the pixel coordinate system;
Step 3: taking the world-coordinate and pixel-coordinate positions of the feature points as inputs, obtain a coarse value of the relative pose matrix cMw of the world coordinate system relative to the camera coordinate system, using the camera intrinsic matrix of the underwater monocular vision system and the coplanar PnP algorithm;
Step 4: with the obtained coarse value of cMw as the initial value, iterate an RLS optimization algorithm to obtain the accurate value of cMw.
In step 1, a three-dimensional positioning system is used to measure the three-dimensional coordinates of the center of each characteristic light source in the world coordinate system; the world coordinate system takes the center of the characteristic light source at the upper-left corner of the light source target as its origin.
Further, the underwater monocular vision system uses a monocular RGB camera mounted below the underwater robot and kept parallel to it; each image frame has a resolution above 8 million pixels.
Further, after the underwater monocular vision system collects the light source target image, the image is preprocessed by filtering, erosion, and extraction of the spot edge contour of each characteristic light source, so that the coordinates of the characteristic light source centers in the pixel coordinate system can be extracted in the next step.
Further, for the spot edge contour of a given characteristic light source in the light source target image, the coordinates (a, b) of the center point of that light source in the pixel coordinate system are determined by solving the linear system

2(S_00·S_20 − S_10²)·a + 2(S_00·S_11 − S_10·S_01)·b = S_00·(S_30 + S_12) − S_10·(S_20 + S_02)
2(S_00·S_11 − S_10·S_01)·a + 2(S_00·S_02 − S_01²)·b = S_00·(S_03 + S_21) − S_01·(S_20 + S_02)

whose moment coefficients are computed from the coordinates (u_i, v_i) of the points on the spot edge contour in the pixel coordinate system as

S_αβ = Σ_{i∈E} u_i^α·v_i^β

where E is the set of spot edge contour points of the characteristic light source, and the superscripts α and β each take the values 0, 1, 2, 3.
Further, in step 3, the coplanar PnP algorithm yields the coarse value of the relative pose matrix cMw of the world coordinate system relative to the camera coordinate system as follows:
Step 3-1: establish the PnP algorithm coordinate systems:
The geometric topology of all feature points in the light source target is known; let the coordinates of the i-th feature point in the world coordinate system be P_i = (x_wi, y_wi, z_wi) and its coordinates in the pixel coordinate system be (u_i, v_i), with i = 1, 2, …, n. Select three feature points P_1, P_2 and P_3 lying on the same straight line in the light source target, and construct a reference coordinate system from these 3 feature points.
Each feature point P_i then has coordinates P_ri = (x_ri, y_ri, 0) in the reference coordinate system, and the pose transformation matrix of the reference coordinate system relative to the world coordinate system is wMr.
Step 3-2: solve the coplanar PnP problem:
Take n feature points P_1 to P_n, n ≥ 4, with known coordinates P_i = (x_wi, y_wi, z_wi) in the world coordinate system and (u_i, v_i) in the pixel coordinate system; from the camera intrinsic matrix of the underwater monocular vision system, obtain the coordinates (x_1ci, y_1ci) of each feature point P_i on the normalized image plane.
Writing the pose transformation matrix of the reference coordinate system relative to the camera coordinate system column-wise as cMr = [cη_r  co_r  ca_r  cp_r], and noting that z_ri = 0, the projection of each feature point gives the equations

x_1ci·(cη_rz·x_ri + co_rz·y_ri + cp_rz) = cη_rx·x_ri + co_rx·y_ri + cp_rx
y_1ci·(cη_rz·x_ri + co_rz·y_ri + cp_rz) = cη_ry·x_ri + co_ry·y_ri + cp_ry

which are collected into the matrix form A1H1 + A2H2 = 0, where

H1 = [cη_rx, cη_ry, cη_rz]ᵀ
H2 = [co_rx, co_ry, co_rz, cp_rx, cp_ry, cp_rz]ᵀ

A1 is a 2n×3 matrix and A2 is a 2n×6 matrix.
Construct the index function

F = ||A1H1 + A2H2||² + λ(1 − ||H1||²)

and obtain the H1 and H2 that minimize F by solving ∂F/∂H1 = 0 and ∂F/∂H2 = 0, which gives H2 = −(A2ᵀA2)⁻¹A2ᵀA1H1, with H1 the unit eigenvector of A1ᵀ[I − A2(A2ᵀA2)⁻¹A2ᵀ]A1 associated with the eigenvalue λ, taken as the smallest eigenvalue.
From the obtained H1 and H2, the first two columns and the fourth column of the pose matrix cMr are determined; the third column of cMr is the cross product of the first and second columns. The pose matrix of the world coordinate system relative to the camera coordinate system then follows from cMw = cMr·rMw, where rMw = wMr⁻¹.
Further, in step 4, with the coarse value of cMw obtained in step 3 as the initial value, the RLS optimization algorithm iterates to the accurate value of cMw as follows:
A recursive least squares (RLS) update with forgetting factor ρ is iterated from n = 0 to n = N − 1, one feature point per iteration, and the final estimate ζ_N is used to compute the accurate value of cMw; in the iterative formula, ρ is the forgetting factor and I is the identity matrix, and the remaining regressor and measurement terms are formed from the coordinates of the feature points, i = 1, 2, …, n.
The initial value of ζ is

ζ_0 = [cη′_rx, cη′_ry, cη′_rz, co′_rx, co′_ry, co′_rz, ca′_rx, ca′_ry, ca′_rz, cp′_rx, cp′_ry]ᵀ

computed from the coarse value of cMw obtained in step 3 as

cη′_rx = cη_rx/cp_rz, cη′_ry = cη_ry/cp_rz, cη′_rz = cη_rz/cp_rz,
co′_rx = co_rx/cp_rz, co′_ry = co_ry/cp_rz, co′_rz = co_rz/cp_rz,
ca′_rx = ca_rx/cp_rz, ca′_ry = ca_ry/cp_rz, ca′_rz = ca_rz/cp_rz,
cp′_rx = cp_rx/cp_rz, cp′_ry = cp_ry/cp_rz,

i.e. the entries of cMr normalized by cp_rz; k is the enhancement coefficient.
Further, in step 3-1, the reference coordinate system is established as follows: among the three feature points, the point with the smallest abscissa u_i in the pixel coordinate system is taken as the origin of the reference coordinate system; if the three feature points share the same u_i, the point with the smaller ordinate v_i is taken as the origin. The X_r axis lies along the line from the origin to the feature point farthest from it, pointing from the origin toward that farthest point. The Y_r axis is the perpendicular to the X_r axis through the origin, directed toward the negative v direction of the pixel coordinate system, and the Z_r axis direction is uniquely determined by the right-hand rule, giving the reference coordinate system O_r X_r Y_r Z_r.
Advantageous effects
The invention provides an underwater monocular subpixel relative pose estimation method for the underwater docking task: a PnP pose estimation algorithm is designed around light source target images acquired by an underwater monocular vision system (a monocular RGB camera) mounted on the underwater robot, and iterative optimization outputs the pose of the underwater robot relative to the target docking device in real time.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the invention will become apparent and may be better understood from the following description of embodiments taken in conjunction with the accompanying drawings in which:
fig. 1: the PnP algorithm coordinate systems;
fig. 2: flow chart of the relative pose estimation.
Detailed Description
The following detailed description of embodiments of the invention is exemplary and intended to be illustrative of the invention and not to be construed as limiting the invention.
The basic principle of the invention is as follows:
firstly, a light source target is arranged on a target docking device, the light source target is provided with at least 5 characteristic light sources, the centers of 3 characteristic light sources are positioned on the same straight line, and three-dimensional coordinates of the characteristic light sources under a certain set world coordinate system are acquired in advance.
And secondly, acquiring a light source target image by using an underwater monocular vision system, and acquiring coordinates of each characteristic point in the image under a pixel coordinate system.
Thirdly, taking the world-coordinate and pixel-coordinate positions of the feature points as inputs, a coarse value of the relative pose matrix cMw of the world coordinate system relative to the camera coordinate system is obtained from the camera intrinsic matrix of the underwater monocular vision system using the coplanar PnP algorithm. The world coordinate system reflects the pose of the target docking device, while the camera coordinate system reflects the pose of the underwater robot carrying the underwater monocular vision system.
Finally, with the coarse value of cMw as the initial value, an RLS optimization algorithm is iterated to obtain the accurate value of cMw.
The specific steps are as follows:
step 1: collecting target data of an underwater light source:
the target docking device is provided with a light source target, the light source target is provided with at least 5 characteristic light sources, the centers of the 3 characteristic light sources are positioned on the same straight line, a three-dimensional positioning system is adopted to measure and obtain three-dimensional coordinates of the centers of the characteristic light sources in a world coordinate system, and the world coordinate system in the embodiment takes the centers of the characteristic light sources of the light source target positioned at the upper left corner as an origin.
The underwater robot with the underwater monocular vision system is placed in water, in the embodiment, the underwater monocular vision system adopts a monocular RGB camera, is installed below the underwater robot, and is kept parallel to the underwater robot, and the pixels of each frame of image are higher than 800 ten thousand pixels.
The method comprises the steps of collecting a light source target image through an underwater monocular vision system, preprocessing the image, including filtering, corroding, extracting the light spot edge outline of a characteristic light source and the like, and extracting the coordinates of the center of the characteristic light source in a pixel coordinate system in the next step.
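The preprocessing chain described above (filter, erode, extract spot edge contours) can be sketched in a few lines. The following is a minimal illustrative stand-in, not the patent's implementation: it assumes a grayscale image as a NumPy array and takes the "contour" to be every lit pixel that has at least one dark 4-neighbour; the function name and threshold are hypothetical.

```python
import numpy as np

def spot_edge_pixels(img, thresh=128):
    """Binarise the image, then keep lit pixels with at least one dark
    4-neighbour: a cheap stand-in for the filter/erode/contour chain."""
    lit = img >= thresh
    pad = np.pad(lit, 1, constant_values=False)
    # a pixel is interior if all four 4-neighbours are lit
    interior = (pad[:-2, 1:-1] & pad[2:, 1:-1] &
                pad[1:-1, :-2] & pad[1:-1, 2:]) & lit
    edge = lit & ~interior
    return np.argwhere(edge)  # rows are (v, u) pixel coordinates
```

In a real pipeline the thresholding would follow the filtering and erosion steps named in the text; this sketch only shows the edge-pixel selection they feed into.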
Step 2: sub-pixel feature point extraction is carried out under a pixel coordinate system of a light source target image, wherein the pixel coordinate system takes the upper left corner of the image as an origin, the image acts as a u axis, and the image columns as v axes.
Since the extracted spot contour of a characteristic light source is approximately circular, consider the equation of a circle in the pixel coordinate system
(u − a)² + (v − b)² = r² (1)
and take as the residual of each contour point
ε_i = (u_i − a)² + (v_i − b)² − r² (2)
where i ∈ E, E is the set of edge contour points of the spot of a given characteristic light source, and (u_i, v_i) are the coordinates of a point on the edge contour in the pixel coordinate system.
The sum of squares of the residuals is
E_s = Σ_{i∈E} ε_i² (3)
By the least squares principle,
∂E_s/∂a = 0, ∂E_s/∂b = 0, ∂E_s/∂r = 0 (4)
Expanding these conditions (5) and arranging them with c = a² + b² − r² gives
S_20 + S_02 − 2a·S_10 − 2b·S_01 + c·S_00 = 0
S_30 + S_12 − 2a·S_20 − 2b·S_11 + c·S_10 = 0
S_21 + S_03 − 2a·S_11 − 2b·S_02 + c·S_01 = 0 (6)
where each parameter is expressed as
S_αβ = Σ_{i∈E} u_i^α·v_i^β (7)
with the superscripts α and β taking the values 0, 1, 2, 3 to give the symbols in formula (6).
Eliminating the quadratic term c from (6) and simplifying yields the linear system
2(S_00·S_20 − S_10²)·a + 2(S_00·S_11 − S_10·S_01)·b = S_00·(S_30 + S_12) − S_10·(S_20 + S_02)
2(S_00·S_11 − S_10·S_01)·a + 2(S_00·S_02 − S_01²)·b = S_00·(S_03 + S_21) − S_01·(S_20 + S_02) (8)
whose solution gives the expressions for a and b (9).
The resulting a and b are the coordinates of the center point of the characteristic point light source in the image, i.e. the sub-pixel coordinates of the feature point in the pixel coordinate system.
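The sub-pixel center extraction above reduces to solving a 2×2 linear normal system in (a, b) built from the contour-point moments. A compact sketch of that solve, assuming the edge contour points are given as an array of (u, v) pixel coordinates (the function name is illustrative):

```python
import numpy as np

def circle_center(edge_pts):
    """Kasa-style least-squares circle fit: solve the 2x2 normal system
    in (a, b).  edge_pts: (m, 2) array of (u, v) contour coordinates."""
    u, v = edge_pts[:, 0].astype(float), edge_pts[:, 1].astype(float)
    S = lambda p, q: np.sum(u**p * v**q)          # moments S_{alpha beta}
    A = 2.0 * np.array([
        [S(0,0)*S(2,0) - S(1,0)**2,     S(0,0)*S(1,1) - S(1,0)*S(0,1)],
        [S(0,0)*S(1,1) - S(1,0)*S(0,1), S(0,0)*S(0,2) - S(0,1)**2],
    ])
    rhs = np.array([
        S(0,0)*(S(3,0) + S(1,2)) - S(1,0)*(S(2,0) + S(0,2)),
        S(0,0)*(S(0,3) + S(2,1)) - S(0,1)*(S(2,0) + S(0,2)),
    ])
    a, b = np.linalg.solve(A, rhs)
    return a, b
```

Because the fit is linear in the moments, it is cheap enough to run per light source per frame, which is what makes the sub-pixel center practical for real-time docking.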
Step 3: obtaining a relative pose matrix of a world coordinate system relative to a camera coordinate system by adopting a coplanar PnP algorithm c M w Is a coarse value of (a).
Step 3-1: Establish the PnP algorithm coordinate systems.
The geometric topology of all feature points in the light source target is known; let the coordinates of the i-th feature point in the world coordinate system be P_i = (x_wi, y_wi, z_wi) and its coordinates in the pixel coordinate system be (u_i, v_i), with i = 1, 2, …, n.
Select three feature points P_1, P_2 and P_3 lying on the same straight line in the light source target, and construct the reference coordinate system from these 3 feature points.
In this embodiment the reference coordinate system is established as follows: among the three feature points, the point with the smallest abscissa u_i in the pixel coordinate system is taken as the origin; if the three feature points share the same u_i, the point with the smaller ordinate v_i is taken as the origin. The X_r axis lies along the line from the origin to the feature point farthest from it, pointing from the origin toward that farthest point. The Y_r axis is the perpendicular to the X_r axis through the origin, directed toward the negative v direction of the pixel coordinate system, and the Z_r axis direction is uniquely determined by the right-hand rule, giving the PnP algorithm coordinate system O_r X_r Y_r Z_r (as shown in fig. 1).
Each feature point P_i then has coordinates P_ri = (x_ri, y_ri, 0) in the reference coordinate system, and the pose transformation matrix of the reference coordinate system relative to the world coordinate system is wMr.
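A rough sketch of the frame construction just described, under two simplifying assumptions not stated in the text: the three collinear feature points lie in the world plane z_w = 0, and that plane's normal is known. The in-plane sign of Y_r (toward negative image v) is not enforced here; resolving it would require the camera projection. All names are illustrative:

```python
import numpy as np

def reference_frame(world_pts, pixel_pts):
    """Sketch of O_r X_r Y_r Z_r from three collinear feature points.
    world_pts: (3,3) world coords, assumed to lie in the z_w = 0 plane;
    pixel_pts: (3,2) their (u, v) pixel coords, used only to pick the
    origin (smallest u, ties broken by smaller v) and the X_r direction."""
    order = sorted(range(3), key=lambda i: (pixel_pts[i][0], pixel_pts[i][1]))
    o = order[0]
    d = np.linalg.norm(world_pts - world_pts[o], axis=1)
    far = int(np.argmax(d))
    xr = (world_pts[far] - world_pts[o]) / d[far]   # toward farthest point
    nz = np.array([0.0, 0.0, 1.0])                  # assumed plane normal
    yr = np.cross(nz, xr)
    yr /= np.linalg.norm(yr)
    zr = np.cross(xr, yr)                           # right-hand rule
    wMr = np.eye(4)
    wMr[:3, 0], wMr[:3, 1], wMr[:3, 2] = xr, yr, zr
    wMr[:3, 3] = world_pts[o]
    return wMr
```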
Step 3-2: solving the coplanar P4P problem.
Take four feature points P_1 to P_4 and solve the coplanar P4P problem. The coordinates P_i = (x_wi, y_wi, z_wi) of P_1 to P_4 in the world coordinate system and (u_i, v_i) in the pixel coordinate system are known. Considering the pinhole model of the camera, the coordinates (x_1ci, y_1ci) of point P_i on the imaging plane of the camera, i.e. in the normalized image coordinate system, are
x_1ci = (u_i − u_0)/f_x, y_1ci = (v_i − v_0)/f_y (10)
where f_x, f_y, u_0 and v_0 are the entries of the camera intrinsic matrix, a known quantity obtained from camera calibration.
Substituting the coordinates P_ri = (x_ri, y_ri, 0) of feature point P_i in the reference coordinate system into the extrinsic model of the camera, the coordinates of P_i in the camera coordinate system are expressed as
[x_ci, y_ci, z_ci, 1]ᵀ = cMr·[x_ri, y_ri, 0, 1]ᵀ (11)
where cMr = [cη_r  co_r  ca_r  cp_r]; the first three columns of cMr are the rotation parameters in the X, Y and Z directions respectively, and the fourth column is the position parameter.
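The normalization in (10) is just the inversion of the intrinsic mapping [u, v, 1]ᵀ ~ K·[x_1c, y_1c, 1]ᵀ, assuming a zero-skew pinhole model; as a small sketch (function name illustrative):

```python
import numpy as np

def pixel_to_normalized(u, v, K):
    """Invert the zero-skew pinhole intrinsic mapping to get (x_1c, y_1c)."""
    fx, fy, u0, v0 = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    return (u - u0) / fx, (v - v0) / fy
```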
Substituting formula (10) into formula (11) and eliminating z_ci gives, for each feature point,
x_1ci·(cη_rz·x_ri + co_rz·y_ri + cp_rz) = cη_rx·x_ri + co_rx·y_ri + cp_rx
y_1ci·(cη_rz·x_ri + co_rz·y_ri + cp_rz) = cη_ry·x_ri + co_ry·y_ri + cp_ry (12)
The matrix form is
A1H1 + A2H2 = 0 (13)
where
H1 = [cη_rx, cη_ry, cη_rz]ᵀ
H2 = [co_rx, co_ry, co_rz, cp_rx, cp_ry, cp_rz]ᵀ
A1 is a 2n×3 matrix, A2 is a 2n×6 matrix, and n is the number of feature points.
Construct the index function
F = ||A1H1 + A2H2||² + λ(1 − ||H1||²) (14)
Through the index function F, solving for the relative pose parameters in cMr is converted into an optimization problem: find the matrices H1 and H2 that minimize F. Setting ∂F/∂H2 = 0 gives
H2 = −(A2ᵀA2)⁻¹A2ᵀA1H1 (15)
and setting ∂F/∂H1 = 0 shows that H1 is the unit eigenvector of A1ᵀ[I − A2(A2ᵀA2)⁻¹A2ᵀ]A1 associated with the eigenvalue λ, which is taken as the smallest eigenvalue so that F is minimized.
Once H1 and H2 are found, the pose matrix cMr is determined: its first two columns and fourth column come directly from H1 and H2, and its third column is determined by cross-multiplying the first column and the second column. Denote the pose matrix of the world coordinate system relative to the camera coordinate system as cMw; then
cMw = cMr·rMw (16)
where rMw is the pose matrix of the world coordinate system relative to the reference coordinate system; from the wMr shown in fig. 1,
rMw = wMr⁻¹ (17)
With 4 known spatial characteristic point light sources, A1 is an 8×3 matrix and A2 is an 8×6 matrix, so equation (13) provides 8 equations for the 8 independent unknowns (the 9 entries of H1 and H2 less the scale fixed by ||H1|| = 1); H1 and H2 can then be solved through (14) and (15), yielding the pose matrix cMw of the world coordinate system relative to the camera coordinate system.
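The coarse coplanar PnP solve of (12) through (15) can be sketched end to end: build A1 and A2 row by row, recover H1 as the eigenvector of the smallest eigenvalue, back-substitute for H2, and fix the overall sign so the target lies in front of the camera. This is an illustrative reconstruction of the linear solve, not the patent's exact code; the sign-fixing step is an added assumption:

```python
import numpy as np

def coplanar_pnp(pts_ref, pts_norm):
    """Coarse coplanar PnP.  pts_ref: (n,2) in-plane reference coords
    (z_r = 0); pts_norm: (n,2) normalized image coords (x_1c, y_1c).
    Returns rotation R = [n o a] and translation p of cMr."""
    n = pts_ref.shape[0]
    A1 = np.zeros((2 * n, 3))
    A2 = np.zeros((2 * n, 6))
    for i, ((x, y), (x1, y1)) in enumerate(zip(pts_ref, pts_norm)):
        # x-row: n_x*x + o_x*y + p_x - x1*(n_z*x + o_z*y + p_z) = 0
        A1[2 * i] = [x, 0.0, -x1 * x]
        A2[2 * i] = [y, 0.0, -x1 * y, 1.0, 0.0, -x1]
        # y-row: n_y*x + o_y*y + p_y - y1*(n_z*x + o_z*y + p_z) = 0
        A1[2 * i + 1] = [0.0, x, -y1 * x]
        A2[2 * i + 1] = [0.0, y, -y1 * y, 0.0, 1.0, -y1]
    A2pinv = np.linalg.inv(A2.T @ A2) @ A2.T
    # H1 minimizes ||A1 H1 + A2 H2||^2 subject to ||H1|| = 1
    M = A1.T @ A1 - A1.T @ A2 @ A2pinv @ A1
    _, V = np.linalg.eigh(M)          # eigenvalues in ascending order
    H1 = V[:, 0]                      # eigenvector of the smallest one
    H2 = -A2pinv @ A1 @ H1
    if H2[5] < 0:                     # enforce p_z > 0 (target in front)
        H1, H2 = -H1, -H2
    n_col, o_col, p = H1, H2[:3], H2[3:]
    R = np.column_stack([n_col, o_col, np.cross(n_col, o_col)])
    return R, p
```

With exact, noise-free correspondences the smallest eigenvalue is zero and the pose is recovered exactly; with noisy image points the result is only the coarse value that the RLS stage then refines.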
Step 4: RLS optimization algorithm
In step 3, 4 of the 5 feature points are selected to form a coplanar P4P problem, whose solution yields the matrices H1 and H2 and finally the relative pose matrix cMw of the world coordinate system relative to the camera coordinate system. To determine the relative pose matrix more accurately, an RLS optimization algorithm performs iterative refinement on the parameter vector
ζ = [cη′_rx, cη′_ry, cη′_rz, co′_rx, co′_ry, co′_rz, ca′_rx, ca′_ry, ca′_rz, cp′_rx, cp′_ry]ᵀ (20)
where cη′_rx = cη_rx/cp_rz, cη′_ry = cη_ry/cp_rz, cη′_rz = cη_rz/cp_rz, co′_rx = co_rx/cp_rz, co′_ry = co_ry/cp_rz, co′_rz = co_rz/cp_rz, ca′_rx = ca_rx/cp_rz, ca′_ry = ca_ry/cp_rz, ca′_rz = ca_rz/cp_rz, cp′_rx = cp_rx/cp_rz and cp′_ry = cp_ry/cp_rz, i.e. the entries of cMr normalized by cp_rz. k is an enhancement coefficient; different values of k are used under different conditions, the magnitude of the attitude error during iteration is determined by k, and a reasonably configured k eliminates the attitude error to the greatest extent and guarantees the accuracy and robustness of the system.
The relative pose parameters are computed iteratively by an RLS method with a forgetting factor: the recursion runs from n = 0 to n = N − 1 = 4 (N being the number of feature points), using one feature point per iteration; ρ is the forgetting factor, and the initial value of ζ is given by formula (20) from the coarse value of cMw.
Substituting all feature points one by one into the RLS recursion yields the accurate value of ζ, i.e. the accurate pose transformation matrix of the world coordinate system relative to the camera coordinate system.
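The extract does not preserve the patent's exact regressor and measurement terms for this recursion, so the following shows only the generic recursive-least-squares update with forgetting factor ρ that the iteration is built on, applied to an arbitrary linear model (all names illustrative):

```python
import numpy as np

def rls_fit(Phi, y, rho=0.98):
    """Generic RLS with forgetting factor rho.  Phi: (N, m) regressor
    rows; y: (N,) measurements.  Returns the parameter estimate zeta
    after one pass, processing one sample per iteration."""
    m = Phi.shape[1]
    zeta = np.zeros(m)
    P = 1e6 * np.eye(m)                        # large initial covariance
    for phi, yn in zip(Phi, y):
        Pphi = P @ phi
        k = Pphi / (rho + phi @ Pphi)          # gain vector
        zeta = zeta + k * (yn - phi @ zeta)    # innovation update
        P = (P - np.outer(k, Pphi)) / rho      # covariance update
    return zeta
```

In the patent's setting the samples would be the feature-point equations and ζ the normalized pose vector of formula (20), warm-started from the coarse coplanar PnP solution rather than from zero.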
Although embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives, and variations may be made in the above embodiments by those skilled in the art without departing from the spirit and principles of the invention.

Claims (5)

1. An underwater monocular subpixel relative pose estimation method is characterized in that: the method comprises the following steps:
step 1: setting a light source target on the target docking device, wherein the light source target is provided with at least 5 characteristic light sources, the centers of the 3 characteristic light sources are positioned on the same straight line, and three-dimensional coordinates of the characteristic light sources under a certain set world coordinate system are obtained in advance;
step 2: acquiring a light source target image by using an underwater monocular vision system, and acquiring coordinates of each characteristic point in the image under a pixel coordinate system;
step 3: taking the coordinates of the feature points in the world coordinate system and the coordinates of the feature points in the pixel coordinate system as inputs, and obtaining the coarse values of a relative pose matrix cMw of the world coordinate system relative to the camera coordinate system by utilizing an internal reference matrix of the underwater monocular vision system camera and adopting a coplanar PnP algorithm;
step 4: taking the obtained cMw coarse value as an initial value, and adopting an RLS optimization algorithm to obtain a cMw accurate value through iteration;
for the spot edge contour of a certain characteristic light source in the light source target image, the coordinates (a, b) of the center point of the characteristic light source in the pixel coordinate system are determined by the formula
2(S00·S20 − S10²)·a + 2(S00·S11 − S10·S01)·b = S00·(S30 + S12) − S10·(S20 + S02)
2(S00·S11 − S10·S01)·a + 2(S00·S02 − S01²)·b = S00·(S03 + S21) − S01·(S20 + S02)
wherein, according to the coordinates (ui, vi) of the points on the edge profile of the characteristic light source light spot in the pixel coordinate system, the symbols in the center point coordinate formula are calculated by the formula Sαβ = Σ_{i∈E} ui^α·vi^β, E is the set of characteristic light source light spot edge contour points, and the superscripts α and β correspondingly take 0, 1, 2, 3;
in the step 3, a coplanar PnP algorithm is adopted, and the specific process of obtaining the coarse value of the relative pose matrix cMw of the world coordinate system relative to the camera coordinate system is as follows:
step 3-1: establishing a PnP algorithm coordinate system:
knowing the geometric topological positions of all the feature points in the light source target, let the coordinates of the i-th feature point in the world coordinate system be Pi = (xwi, ywi, zwi) and its coordinates in the pixel coordinate system be (ui, vi), where i = 1, 2, …, n; three feature points P1, P2 and P3 lying on the same straight line in the light source target are selected, and a reference coordinate system is constructed from these 3 feature points;
thereby the coordinates Pri = (xri, yri, 0) of each feature point Pi in the reference coordinate system are obtained, together with the pose transformation matrix wMr of the reference coordinate system relative to the world coordinate system;
step 3-2: solving the coplanar PnP problem:
n feature points P1 to Pn, with n not less than 4, are taken to solve the coplanar PnP problem, the coordinates Pi = (xwi, ywi, zwi) of P1 to Pn in the world coordinate system and their coordinates (ui, vi) in the pixel coordinate system being known; the coordinates of each feature point Pi in the image coordinate system are obtained from the camera intrinsic parameter matrix of the underwater monocular vision system;
for the pose transformation matrix cMr of the reference coordinate system relative to the camera coordinate system, the projection equations are obtained
and converted into the matrix form A1H1 + A2H2 = 0, wherein:
A1 is a 2n×3 matrix and A2 is a 2n×6 matrix;
an index function
F = ||A1H1 + A2H2||² + λ(1 − ||H1||²)
is constructed; by solving the stationarity conditions ∂F/∂H1 = 0 and ∂F/∂H2 = 0, the H1 and H2 that minimize the index function F are obtained, where λ is an arbitrarily set value;
according to the obtained H1 and H2, the first two columns and the fourth column of the pose transformation matrix cMr are determined, and the third column of cMr is obtained as the cross product of the first and second columns; the pose transformation matrix cMw of the world coordinate system relative to the camera coordinate system is then derived from cMw = cMr · rMw, where rMw = wMr⁻¹; in cMr, the pose transformation matrix of the reference coordinate system relative to the camera coordinate system, the first three columns are the rotation parameters in the X, Y and Z directions respectively, and the fourth column is the position parameter;
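The minimization of the index function F can be made concrete: setting ∂F/∂H2 = 0 eliminates H2, and ∂F/∂H1 = 0 then reduces to an eigenvalue problem in which H1 is the unit eigenvector with the smallest eigenvalue (and the multiplier λ coincides with that eigenvalue at the optimum). Below is a minimal numerical sketch of this constrained least-squares step; the A1 and A2 used in the test are random stand-ins, not the matrices actually built from image points in the claim.

```python
import numpy as np

def solve_h1_h2(A1, A2):
    """Minimize ||A1 H1 + A2 H2||^2 subject to ||H1|| = 1.

    From dF/dH2 = 0:  H2 = -(A2'A2)^-1 A2' A1 H1.
    Substituting back, dF/dH1 = 0 becomes M H1 = lambda H1 with
    M = A1' (I - P) A1, where P projects onto the column space of A2;
    the minimizer is the eigenvector of M with the smallest eigenvalue.
    """
    P = A2 @ np.linalg.pinv(A2.T @ A2) @ A2.T          # projector onto col(A2)
    M = A1.T @ (np.eye(A1.shape[0]) - P) @ A1
    eigvals, eigvecs = np.linalg.eigh(M)               # ascending eigenvalues
    H1 = eigvecs[:, 0]                                 # unit-norm minimizer
    H2 = -np.linalg.pinv(A2.T @ A2) @ A2.T @ A1 @ H1
    return H1, H2, eigvals[0]
```

The returned smallest eigenvalue equals the residual ||A1H1 + A2H2||², which gives a cheap consistency check on the solution.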
in step 4, with the coarse value of cMw obtained in step 3 as the initial value, the specific process of obtaining the accurate value of cMw by iteration with the RLS optimization algorithm is as follows:
the iterative formula is applied from n = 0 to n = N − 1, and the accurate value of cMw is finally calculated from the resulting ζN;
in the iterative formula, ρ is a forgetting factor and I is the identity matrix; the remaining parameters are determined according to the following formula, with i = 1, 2, …, n;
the initial value of ζ is calculated from the coarse value of cMw obtained in step 3, where k is a reinforcement coefficient.
2. The underwater monocular subpixel relative pose estimation method according to claim 1, wherein in step 1 a three-dimensional positioning system is used to measure the three-dimensional coordinates of the center of each characteristic light source in the world coordinate system, and the world coordinate system takes the center of the characteristic light source at the upper-left corner of the light source target as its origin.
3. The underwater monocular subpixel relative pose estimation method according to claim 1, wherein the underwater monocular vision system uses a monocular RGB camera mounted below the underwater robot and kept parallel to it, each frame having a resolution above 8 million pixels.
4. The underwater monocular subpixel relative pose estimation method according to claim 1, wherein in step 1, after the underwater monocular vision system collects the light source target image, the image is preprocessed, including filtering, erosion, and extraction of the spot edge contour of each characteristic light source, in preparation for extracting the coordinates of the characteristic light source centers in the pixel coordinate system in the next step.
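The filtering/erosion/contour-extraction preprocessing of claim 4 can be sketched with plain NumPy operations (a real pipeline would typically use an image library such as OpenCV; `preprocess_spot` and the 3×3 cross structuring element are illustrative choices, not specified by the patent):

```python
import numpy as np

def preprocess_spot(img, thresh=128):
    """Threshold a grayscale image, erode the bright region with a 3x3
    cross, and return the spot edge contour as (u, v) pixel coordinates.
    Border pixels are handled laxly; this is a sketch, not a full pipeline."""
    b = (img >= thresh)
    # 3x3 cross erosion via shifted logical ANDs against the original mask
    e = b.copy()
    e[1:, :] &= b[:-1, :]
    e[:-1, :] &= b[1:, :]
    e[:, 1:] &= b[:, :-1]
    e[:, :-1] &= b[:, 1:]
    edge = b & ~e                       # boundary = mask minus its erosion
    vs, us = np.nonzero(edge)           # row index = v, column index = u
    return np.column_stack([us, vs])
```

The returned contour points are exactly the set E consumed by the moment computation in claim 1.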
5. The underwater monocular subpixel relative pose estimation method according to claim 1, wherein in step 3-1 the reference coordinate system is established as follows: the feature point with the smallest abscissa ui in the pixel coordinate system is taken as the origin of the reference coordinate system, and if the three feature points have the same ui, the one with the smaller ordinate vi is taken as the origin; starting from the origin, the line connecting the origin with the feature point among the three that is farthest from it in the ui direction is taken as the Xr axis, directed from the origin toward that farthest feature point; with the Xr axis as a base, the perpendicular to the Xr axis is taken as the Yr axis, directed toward the negative v direction of the pixel coordinate system, and the Zr axis direction is uniquely determined by the right-hand rule, giving the reference coordinate system OrXrYrZr.
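As a concrete illustration of the construction in claim 5, the origin and the two in-plane axes can be selected from the three feature points' pixel coordinates as below. This is a sketch of one plausible reading of the claim (`build_reference_frame` is a hypothetical name), with the Zr axis left implicit via the right-hand rule:

```python
import numpy as np

def build_reference_frame(pts):
    """Select the reference-frame origin and in-plane axes from three
    feature points given as (u, v) pixel coordinates, per claim 5:
    origin = smallest u (ties broken by smaller v); Xr from the origin
    toward the feature point farthest in the u direction; Yr is the
    perpendicular to Xr pointing toward negative v (v grows downward).
    """
    pts = np.asarray(pts, dtype=float)
    # origin: smallest u, ties broken by smaller v (lexicographic sort)
    origin = pts[np.lexsort((pts[:, 1], pts[:, 0]))[0]]
    # Xr: unit vector toward the point farthest from the origin along u
    far = pts[np.argmax(np.abs(pts[:, 0] - origin[0]))]
    xr = (far - origin) / np.linalg.norm(far - origin)
    # Yr: in-plane perpendicular to Xr, flipped to point toward negative v
    yr = np.array([-xr[1], xr[0]])
    if yr[1] > 0:
        yr = -yr
    return origin, xr, yr
```

Fixing the frame by these image-measurable rules makes the choice of reference coordinate system repeatable across frames, which the subsequent wMr and cMr computations rely on.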
CN202011464862.1A 2020-12-12 2020-12-12 Underwater monocular subpixel relative pose estimation method Active CN112700501B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011464862.1A CN112700501B (en) 2020-12-12 2020-12-12 Underwater monocular subpixel relative pose estimation method

Publications (2)

Publication Number Publication Date
CN112700501A CN112700501A (en) 2021-04-23
CN112700501B true CN112700501B (en) 2024-03-05

Family

ID=75507629

Country Status (1)

Country Link
CN (1) CN112700501B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113744342B (en) * 2021-08-04 2023-03-24 上海宏景智驾信息科技有限公司 Monocular camera external parameter calibration system and method
CN113592958A (en) * 2021-08-13 2021-11-02 大连海事大学 Monocular vision based AUV docking station optical guiding method
CN114332244A (en) * 2021-12-31 2022-04-12 北京德火科技有限责任公司 Camera positioning method and system for virtual studio
CN115790539B (en) * 2022-11-22 2024-02-13 深圳大学 Cooperative target underwater photogrammetry method
CN116485917B (en) * 2023-06-19 2023-09-22 擎翌(上海)智能科技有限公司 Combined calibration method, system, equipment and medium for shooting device and radar device
CN117474993B (en) * 2023-10-27 2024-05-24 哈尔滨工程大学 Underwater image feature point sub-pixel position estimation method and device

Citations (7)

Publication number Priority date Publication date Assignee Title
CN108734737A (en) * 2018-06-14 2018-11-02 哈尔滨工业大学 The method that view-based access control model SLAM estimation spaces rotate noncooperative target shaft
CN108844459A (en) * 2018-05-03 2018-11-20 华中科技大学无锡研究院 A kind of scaling method and device of leaf digital template detection system
CN110332887A (en) * 2019-06-27 2019-10-15 中国地质大学(武汉) A kind of monocular vision pose measurement system and method based on characteristic light punctuate
CN111415391A (en) * 2020-02-28 2020-07-14 中国民航大学 Multi-view camera external orientation parameter calibration method adopting inter-shooting method
US10719953B1 (en) * 2018-03-27 2020-07-21 Facebook Technologies, Llc Passive object tracking using camera
CN111612794A (en) * 2020-04-15 2020-09-01 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Multi-2D vision-based high-precision three-dimensional pose estimation method and system for parts
CN112066879A (en) * 2020-09-11 2020-12-11 哈尔滨工业大学 Air floatation motion simulator pose measuring device and method based on computer vision

Non-Patent Citations (2)

Title
Gao, Jian. "A Hybrid Approach for Visual Servo Control of Underwater Vehicles." OCEANS 2016 MTS/IEEE Monterey, 2016, full text. *
Li, Hai. "A monocular vision system for online pose measurement of a 3RRR planar parallel manipulator." Journal of Intelligent &amp; Robotic Systems, 2018, full text. *

Similar Documents

Publication Publication Date Title
CN112700501B (en) Underwater monocular subpixel relative pose estimation method
CN107301654B (en) Multi-sensor high-precision instant positioning and mapping method
CN111089569B (en) Large box body measuring method based on monocular vision
CN109598762B (en) High-precision binocular camera calibration method
CN110570449B (en) Positioning and mapping method based on millimeter wave radar and visual SLAM
CN101488187A (en) System and method for deformable object recognition
CN103192397A (en) Off-line visual programming method and system for robot
Chatterjee et al. Algorithms for coplanar camera calibration
CN111563878A (en) Space target positioning method
CN106157322B (en) A kind of camera installation site scaling method based on plane mirror
CN111415391A (en) Multi-view camera external orientation parameter calibration method adopting inter-shooting method
CN112017248B (en) 2D laser radar camera multi-frame single-step calibration method based on dotted line characteristics
CN114474056B (en) Monocular vision high-precision target positioning method for grabbing operation
CN112362034B (en) Solid engine multi-cylinder section butt joint guiding measurement method based on binocular vision
CN117893610B (en) Aviation assembly robot gesture measurement system based on zoom monocular vision
CN105096341A (en) Mobile robot pose estimation method based on trifocal tensor and key frame strategy
Perdigoto et al. Calibration of mirror position and extrinsic parameters in axial non-central catadioptric systems
CN114001651B (en) Large-scale slender barrel type component pose in-situ measurement method based on binocular vision measurement and priori detection data
CN115218889A (en) Multi-sensor indoor positioning method based on dotted line feature fusion
CN102789644A (en) Novel camera calibration method based on two crossed straight lines
JP5462662B2 (en) Position / orientation measurement apparatus, object identification apparatus, position / orientation measurement method, and program
CN106980601B (en) High-precision basic matrix solving method based on trinocular polar line constraint
CN116817920A (en) Visual positioning method and device for plane mobile robot without three-dimensional map model
CN109059761B (en) EIV model-based handheld target measuring head calibration method
CN113324538B (en) Cooperative target remote high-precision six-degree-of-freedom pose measurement method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant