CN109829939B - Method for narrowing search range of multi-view image matching same-name image points - Google Patents


Info

Publication number
CN109829939B
Authority
CN
China
Prior art keywords
image
candidate
homonymous
points
point
Prior art date
Legal status
Active
Application number
CN201910048674.1A
Other languages
Chinese (zh)
Other versions
CN109829939A (en)
Inventor
张卡
盛业华
闾国年
Current Assignee
Nanjing Fanzai Geographic Information Industry Research Institute Co ltd
Nanjing Normal University
Original Assignee
Nanjing Fanzai Geographic Information Industry Research Institute Co ltd
Nanjing Normal University
Priority date
Filing date
Publication date
Application filed by Nanjing Fanzai Geographic Information Industry Research Institute Co ltd, Nanjing Normal University
Priority to CN201910048674.1A
Publication of CN109829939A
Application granted
Publication of CN109829939B

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a method for narrowing the search range of homonymous image points in multi-view image matching. First, an initial candidate homonymous image point set is determined. Second, the image point to be matched and all candidate points in its initial candidate set are taken as observation data, a multi-view forward intersection adjustment is computed, and the normalized residual of each group of candidate homonymous image points is calculated from the adjustment result. Finally, the initial candidate set is iteratively refined according to the normalized residual distribution curve of the candidate points and the actual positions of the homonymous image points of the point to be matched, yielding a quantitative numerical index that reflects the positional characteristics of homonymous image points in multi-view images and a homonymous image point search range whose interval length is greatly reduced. The method enables accurate searching of homonymous image points in multi-view image matching, greatly reducing the number of candidate homonymous image points to be searched and improving the computational efficiency and accuracy of dense multi-view image matching.

Description

Method for reducing the search range of multi-view image matching same-name image points
Technical Field
The invention belongs to the fields of digital photogrammetry, geographic information systems, and computer vision, and in particular relates to a method for narrowing the search range of homonymous image points in multi-view image matching.
Background
Photogrammetry is the science of studying the shape, position, size, characteristics, and mutual spatial relationships of photographed objects from images. It has long been a principal technical means of acquiring three-dimensional geospatial information and is widely applied in many fields of national economic construction and social development, such as basic surveying and mapping, geographic condition monitoring, resource and environment surveys, urban planning, and smart-city construction. In the post-processing of photogrammetric data, multi-view image matching is a key step whose purpose is to automatically find the homonymous image points of the same ground object on multiple images. The homonymous image points obtained by multi-view image matching are an important data basis for photogrammetric processing steps such as automatic aerial triangulation, geographic information acquisition, and three-dimensional scene reconstruction. In addition, multi-view image matching plays an important role in applications such as video data processing, multi-source data fusion, and robot visual navigation. Image matching has therefore long been a research hotspot in digital photogrammetry and computer vision.
Multi-view matching involves two basic problems: computing the matching similarity measure and determining the search range of the homonymous image points. The matching measure is the basis for judging whether image points on different images are homonymous, and it strongly affects the accuracy and robustness of multi-view image matching. The search range of the homonymous image points determines the number of candidate homonymous points that participate in the matching computation, and its choice strongly affects the reliability and computational efficiency of multi-view image matching. Existing image matching research focuses on improving the robustness of the matching measure and on constraining the search range of homonymous image points by using image-space information (gray values, feature vectors, etc.) within a local matching window and by introducing various constraints (epipolar lines, plane homography, adjacency relations between image points, etc.). However, image-space information is neither unique nor invariant and cannot be used to construct a feature that uniquely identifies a homonymous image point. Moreover, owing to occlusion, geometric deformation, and similar factors, the image points within a matching window often fail to satisfy constraints such as plane homography and preservation of spatial relations. Existing image matching methods therefore cannot overcome the ambiguity of the matching measure caused by the phenomena of "same object, different spectrum" and "same spectrum, different object", nor the changes in spatial relations between image points caused by occlusion and perspective distortion; in regions that are difficult to match, such as repeated texture, weak texture, disparity discontinuities, and geometric deformation, they suffer from low matching rates and unreliable matching results.
This raises several questions. Is there some unique regularity in the positions of homonymous image points in multi-view images? Is there a numerical index that uniquely indicates the location of a homonymous image point? How can such positional regularity be described quantitatively? Can this regularity be used to further narrow the search range of homonymous image points during multi-view image matching, or even to determine the homonymous image points uniquely? Studying these questions is of great significance for overcoming the shortcomings of existing image matching methods.
Disclosure of Invention
The purpose of the invention is as follows: aiming at the problem of accurately searching for homonymous image points in multi-view matching, and based on the reliability theory of surveying adjustment systems, the invention provides a method for narrowing the search range of homonymous image points in multi-view matching, so as to obtain a numerical index reflecting the positional characteristics of homonymous image points in multi-view images and thereby provide methodological support for greatly reducing the number of candidate homonymous image points to be searched.
The technical scheme is as follows: a method for reducing the search range of homonymous image points in multi-view image matching comprises the following steps:
(1) For an image point to be matched on the reference image, determine on the search images an initial candidate homonymous image point set with a uniform grouping rule, using the geometric imaging model of the images and object-space elevation information;
(2) Taking the image point to be matched and all candidate points in its initial candidate homonymous image point set as observation data, perform a least-squares adjustment for the unknown object-space three-dimensional coordinates of the homonymous point based on the multi-view forward intersection principle, and compute the normalized residual of each group of candidate homonymous image points from the adjustment result;
(3) Iteratively refine the initial candidate homonymous image point set according to the normalized residual distribution curve of the candidate points and the actual positions of the homonymous image points of the point to be matched;
(4) According to the results of each iterative computation in step (3), determine a quantitative numerical index reflecting the positional characteristics of the homonymous image points in the multi-view images and a homonymous image point search range whose interval length is greatly reduced.
Further, in step (1), the reference image is any one image in the image set, the search images are the remaining images in the image set other than the reference image, and the image point to be matched is any image point in the reference image.
The step (1) comprises the following steps:
(11) Input N multi-view images with known exterior orientation elements (for spaceborne images acquired by linear-array push-broom imaging, the rational polynomial coefficients are regarded as the exterior orientation elements) and the approximate object-space elevation range of the area covered by the images, expressed as [maximum elevation Z_max, minimum elevation Z_min];
(12) For the image point p to be matched on the reference image I_0, determine, using the geometric imaging model of the images and the approximate elevation range of the image area, the M_k candidate homonymous image points q_k^i lying on the homonymous epipolar line of p in each remaining search image I_k. The geometric imaging model is expressed as:
(r, c) = F(X, Y, Z)
where r and c are the row and column numbers of an image point in the image, X, Y, Z are the object-space three-dimensional coordinates of the ground point corresponding to the image point, F is the function expressing the geometric imaging model, k = 1, 2, ..., N-1, and i = 1, 2, ..., M_k. For aerial or close-range images with central perspective projection the collinearity condition equation model is used, and for multi-linear-array push-broom spaceborne satellite images the rational function model is used;
(13) Based on object-space coordinate constraints, the differing numbers of candidate homonymous image points on the individual search images are made consistent, and the mutually independent candidate homonymous image points on the search images are uniformly grouped by object-space coordinates into M_z groups, where M_z is the maximum of M_k. The initial set of image points is:
{[b_1^1, b_2^1, ..., b_{N-1}^1], [b_1^2, b_2^2, ..., b_{N-1}^2], ..., [b_1^j, b_2^j, ..., b_{N-1}^j], ..., [b_1^{M_z}, b_2^{M_z}, ..., b_{N-1}^{M_z}]}, j ∈ {1, 2, ..., M_z}.
Further, the step (2) comprises the following steps:
(21) For the M_z groups of candidate homonymous image points determined in step (1), eliminate the repeated image points on each search image according to the row and column numbers of the candidate points;
(22) Taking the image point to be matched and all candidate homonymous image points remaining after de-duplication as observed image points, construct, based on the geometric imaging model of the images, the error equation matrix V_{2M×1} = A_{2M×3}·X_{3×1} - L_{2M×1} for solving the correction X_{3×1} = [dX dY dZ]^T to the approximate unknown object-space three-dimensional coordinates of the point;
(23) Perform an adjustment of the error equation matrix based on the least-squares principle, and from the adjustment result compute the normalized residual value of the j-th of the M_z groups of candidate image points:
ω_j = (1/(N-1)) · Σ_{k=1}^{N-1} ω(b_k^j)
where ω(b_k^j) is the normalized residual value of the candidate homonymous image point b_k^j on the k-th search image I_k in the j-th group of candidate points.
Further, the step (3) comprises the following steps:
(31) Plot the normalized residual distribution curve of the M_z groups of candidate homonymous image points, and, according to the actual positions of the homonymous image points of the point to be matched, select from the initial candidate set the n_z groups (n_z ≤ M_z) that contain the true homonymous image points: {[b_1^1, b_2^1, ..., b_{N-1}^1], [b_1^2, b_2^2, ..., b_{N-1}^2], ..., [b_1^j, b_2^j, ..., b_{N-1}^j], ..., [b_1^{n_z}, b_2^{n_z}, ..., b_{N-1}^{n_z}]}, j ∈ {1, 2, ..., n_z}, as the new candidate homonymous image point set;
(32) If n_z ≠ M_z, take the n_z groups of new candidate homonymous image points as the observed image points of a new adjustment, perform the normalized residual computation of step (2) again, and again select the t groups of new candidate points containing the true homonymous image points; this process iterates until the number of image points in the new candidate homonymous image point set no longer changes;
(33) When the iterative computation ends, the output value of t is 2.
Beneficial effects: compared with the prior art, the method obtains a unique error index that reflects the distribution characteristics distinguishing the true homonymous image point of a point to be matched from the other candidate points during multi-view image matching. This is very helpful for further and accurately determining the homonymous image point search range, and it is expected to provide a new research idea, at the level of algorithmic principle, for narrowing the homonymous image point search range in multi-view image matching, thereby greatly reducing the number of candidate homonymous image points to be searched and improving the computational efficiency and accuracy of image matching.
Drawings
FIG. 1 is a method framework diagram of an embodiment of the invention;
FIG. 2 (a) is a schematic diagram of determining the initial candidate homonymous image points of multi-view images based on object-space constraints when the epipolar line tends toward the horizontal direction;
FIG. 2 (b) is a schematic diagram of determining the initial candidate homonymous image points of multi-view images based on object-space constraints when the epipolar line tends toward the vertical direction;
FIG. 3 is a diagram illustrating a unified grouping of initial candidate homonymous pixels of a multi-view image based on object space constraints according to an embodiment of the present invention;
FIG. 4 (a) is a diagram of the positions of the pixels to be matched located on the ground of the reference image and the actual pixels with the same name located on the two search images in the embodiment;
FIG. 4 (b) is a diagram of the positions of the image points to be matched located on the roof of the reference image and their actual corresponding image points on the two search images;
FIG. 4 (c) is a diagram of the positions of the image points to be matched located on the wall of the house of the reference image and the actual image points with the same name located on the two search images;
FIG. 5 is a graph of the distribution of error indicators for all candidate image points of the same name calculated for the image point to be matched in FIG. 4 (a);
FIG. 6 is a graph of the distribution of error indicators for all candidate image points of the same name calculated for the image point to be matched in FIG. 4 (b);
fig. 7 is a graph of the distribution of the error index of all candidate image points of the same name calculated for the image point to be matched in fig. 4 (c).
Detailed Description
For the purpose of explaining the technical solution disclosed in the present invention in detail, the following description is further made with reference to the accompanying drawings and specific embodiments.
The invention discloses a method for narrowing the search range of homonymous image points in multi-view image matching. First, using the imaging model of the images and prior object-space elevation information, an initial candidate homonymous image point set with a uniform grouping rule is determined on each search image for an image point to be matched on the reference image. Second, the image point to be matched and all candidate points of its initial candidate set are taken as observation data, a multi-view forward intersection adjustment is performed for the unknown object-space three-dimensional coordinates of the homonymous point, and the normalized residual of each group of candidate homonymous image points is computed from the adjustment result. Finally, the initial candidate set is iteratively refined according to the normalized residual distribution curve of the candidate points and the actual positions of the homonymous image points of the point to be matched, so that, from the result of each iteration, a quantitative numerical index reflecting the positional characteristics of homonymous image points in multi-view images and a homonymous image point search range with greatly reduced interval length are obtained. The method enables accurate searching of homonymous image points in multi-view image matching, greatly reducing the number of candidate homonymous image points to be searched and improving the computational efficiency and accuracy of dense multi-view image matching.
The basic calculation process of the invention is as follows:
(1) From N multi-view aerial images with known interior and exterior orientation elements, manually designate one image as the reference image I_0 and the remaining N-1 images as the search images I_1, I_2, ..., I_{N-1}; on the reference image and the search images, select an image point to be matched together with its N-1 corresponding homonymous image points;
(2) For a given image point to be matched, determining a candidate homonymous image point search range on each search image based on epipolar line and object space approximate elevation range constraint; carrying out consistency processing on the different search ranges to uniformly group the discrete candidate homonymous image points on the different search images so as to form an initial m groups of candidate homonymous image point sets;
(3) Performing least square adjustment calculation aiming at the unknown object space three-dimensional coordinates of the image points by utilizing the image space image plane coordinates of the image points to be matched and the m groups of candidate homonymous image points of the image points to be matched based on the multi-view front intersection principle; then according to the adjustment result, calculating the standardized residual errors of all the groups of candidate image points with the same name, and drawing a candidate image point standardized residual error distribution map;
(4) Selecting a new n groups of candidate homonymous image point sets containing real homonymous image points from the m groups of candidate homonymous image point sets according to the standardized residual distribution map and the actual homonymous image point positions of the image points to be matched;
(5) If n is not equal to m, taking the new n groups of candidate homonymous image points as an initial candidate homonymous image point set, and performing the standardized residual calculation in the step (3) and the determination of the new candidate homonymous image point set in the step (4) again;
(6) Iterating the calculation process in the step (5) until the number of the image points in the new candidate homonymous image point set is not changed any more; and outputting t groups of candidate homonymous image points at the end of iteration, namely a candidate homonymous image point set refined according to the standardized residual error, wherein t is far smaller than m.
As shown in fig. 1, a method for narrowing the search range of matching the same-name image points of the multi-view image mainly comprises three parts:
(a) Determining a plurality of groups of candidate homonymous image point sets with uniform grouping rules of image points to be matched on the reference image on each search image;
(b) Calculating the standardized residual error of each group of candidate homonymous image points based on the measurement adjustment principle;
(c) Iterative refinement of the set of candidate points based on the normalized residual. The specific implementation steps are as follows:
the first step is as follows: and determining a plurality of groups of candidate homonymous image point sets with uniform grouping rules of the points to be matched on each search image.
Based on object space constraints such as epipolar lines and object space approximate elevation ranges, a specific process for determining a plurality of candidate homonymous image point sets with uniform grouping rules of image points to be matched on each search image is as follows:
(1) From N multi-view aerial images with known interior and exterior orientation elements, manually designate one image as the reference image I_0 and the remaining N-1 images as the search images I_1, I_2, ..., I_k, ..., I_{N-1} (k = 1, 2, ..., N-1). The imaging centres of the reference image and of search image I_k are S_0 and S_k, with exterior orientation elements (X_S0, Y_S0, Z_S0, φ_0, ω_0, κ_0) and (X_Sk, Y_Sk, Z_Sk, φ_k, ω_k, κ_k), respectively. On the reference image and the search images, one image point p to be matched and its N-1 corresponding homonymous image points are selected (the image-space row and column numbers of p are denoted (r, c) and its image-plane coordinates (x, y)).
(2) From prior knowledge, the approximate elevation range of the area covered by the reference image can be determined: minimum elevation Z_min and maximum elevation Z_max (the elevation range need not be precise, as long as it contains the actual elevation of the image point to be matched). The ground point P corresponding to the image point p then lies on the segment P_min P_max of the imaging ray S_0 P. Using the inverse form of the geometric imaging model (r, c) = F(X, Y, Z) (see formula (1), which takes the collinearity equation model of a central-projection image as an example), the object-space plane coordinates (X_min, Y_min) and (X_max, Y_max) of the points P_min and P_max are calculated from the image-plane coordinates of the image point and the object-space elevations:

X = X_S0 + (Z - Z_S0) · (a_1·x + a_2·y - a_3·f) / (c_1·x + c_2·y - c_3·f)
Y = Y_S0 + (Z - Z_S0) · (b_1·x + b_2·y - b_3·f) / (c_1·x + c_2·y - c_3·f)        (1)

where f is the principal distance among the interior orientation elements, and (a_1, a_2, a_3; b_1, b_2, b_3; c_1, c_2, c_3) are the nine direction cosines of the rotation matrix determined by the exterior orientation angle elements (φ_0, ω_0, κ_0) of image I_0.
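To make this back-projection step concrete, the following minimal Python sketch intersects the imaging ray of an image point with a horizontal plane of elevation Z, following the collinearity convention of formula (1); the function name `ground_point_at_elevation`, the argument layout, and all numeric values are illustrative assumptions, not part of the patent.

```python
import numpy as np

def ground_point_at_elevation(x, y, Z, S, R, f):
    """Intersect the imaging ray of image point (x, y) with the horizontal plane of
    elevation Z.  S is the camera centre (X_S, Y_S, Z_S), R the 3x3 rotation matrix
    whose rows are (a1, a2, a3), (b1, b2, b3), (c1, c2, c3), f the principal distance.
    Returns the object-space point (X, Y, Z), i.e. the inverse form of formula (1)."""
    d = R @ np.array([x, y, -f])      # direction of the imaging ray in object space
    t = (Z - S[2]) / d[2]             # scale factor that brings the ray to elevation Z
    return np.array([S[0] + t * d[0], S[1] + t * d[1], Z])

# Illustrative use: back-project the point p onto the planes Z_min and Z_max
# (all numbers are invented for the example).
S0 = np.array([500.0, 500.0, 1500.0])   # imaging centre of the reference image
R0 = np.eye(3)                          # near-nadir camera for simplicity
f0 = 0.12                               # principal distance [m]
x_p, y_p = 0.010, -0.020                # image-plane coordinates of p [m]
P_min = ground_point_at_elevation(x_p, y_p, 20.0, S0, R0, f0)
P_max = ground_point_at_elevation(x_p, y_p, 80.0, S0, R0, f0)
print(P_min, P_max)
```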
(3) The points P_min(X_min, Y_min, Z_min) and P_max(X_max, Y_max, Z_max) are projected onto each search image I_k with the geometric imaging model of the image (see formula (2), again taking the collinearity equation model of a central-projection image as an example), giving the corresponding image points p_k^min and p_k^max with image-plane coordinates (x_k^min, y_k^min) and (x_k^max, y_k^max). The homonymous point p_k of the point p to be matched on image I_k must then lie on the segment p_k^min p_k^max (this segment is in fact the homonymous epipolar line defined by the exterior orientation elements, as shown in FIG. 2), and all M_k image points on this homonymous epipolar segment are the initial candidate homonymous image points of p on image I_k.

x = -f · [a_1(X - X_Sk) + b_1(Y - Y_Sk) + c_1(Z - Z_Sk)] / [a_3(X - X_Sk) + b_3(Y - Y_Sk) + c_3(Z - Z_Sk)]
y = -f · [a_2(X - X_Sk) + b_2(Y - Y_Sk) + c_2(Z - Z_Sk)] / [a_3(X - X_Sk) + b_3(Y - Y_Sk) + c_3(Z - Z_Sk)]        (2)

where (a_1, a_2, a_3; b_1, b_2, b_3; c_1, c_2, c_3) are the nine direction cosines of the rotation matrix determined by the exterior orientation angle elements (φ_k, ω_k, κ_k) of search image I_k.
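The projection of P_min and P_max into a search image can be sketched in the same spirit; the function name `collinearity_project` and all numeric values below are illustrative assumptions rather than the patent's own implementation.

```python
import numpy as np

def collinearity_project(P, S, R, f):
    """Project object-space point P into the image of a camera with centre S, rotation
    matrix R (rows (a1,a2,a3),(b1,b2,b3),(c1,c2,c3)) and principal distance f, i.e.
    the collinearity equations of formula (2).  Returns image-plane coordinates (x, y)."""
    u = R.T @ (np.asarray(P, dtype=float) - S)   # (X_bar, Y_bar, Z_bar)
    return np.array([-f * u[0] / u[2], -f * u[1] / u[2]])

# Illustrative use: project P_min and P_max into one search image to obtain the end
# points of the homonymous epipolar segment (camera values invented for the example).
S1 = np.array([900.0, 500.0, 1500.0])
R1 = np.eye(3)
f1 = 0.12
P_min = np.array([623.3, 253.3, 20.0])
P_max = np.array([618.3, 263.3, 80.0])
q_min = collinearity_project(P_min, S1, R1, f1)
q_max = collinearity_project(P_max, S1, R1, f1)
print(q_min, q_max)   # the homonymous point of p lies on the segment q_min q_max
```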
Using the principal point coordinates and the pixel size among the interior orientation elements, the image-space row and column numbers (r_k^min, c_k^min) and (r_k^max, c_k^max) of the image points p_k^min and p_k^max can be calculated, which gives the homonymous epipolar segment p_k^min p_k^max; from the straight-line equation of this epipolar line, the row and column numbers (r_k^i, c_k^i) of the i-th candidate point q_k^i on the epipolar line can then be calculated. The straight-line equation of the homonymous epipolar line is obtained as follows:

(1) When abs(c_k^max - c_k^min) ≥ abs(r_k^max - r_k^min), the epipolar line tends toward the horizontal direction (see FIG. 2 (a)), and its straight-line equation is r = k_e·c + b_e, with slope k_e = (r_k^max - r_k^min)/(c_k^max - c_k^min) and intercept b_e = r_k^min - k_e·c_k^min; the independent variable c takes the integer column values in the range [min(c_k^min, c_k^max), max(c_k^min, c_k^max)], and the total number of candidate homonymous image points is M_k = int(abs(c_k^max - c_k^min)) + 1, where int() is the rounding function, abs() the absolute-value function, min() the minimum function, and max() the maximum function.

(2) When abs(c_k^max - c_k^min) < abs(r_k^max - r_k^min), the epipolar line tends toward the vertical direction (see FIG. 2 (b)), and its straight-line equation is c = k_e·r + b_e, with slope k_e = (c_k^max - c_k^min)/(r_k^max - r_k^min) and intercept b_e = c_k^min - k_e·r_k^min; the independent variable r takes the integer row values in the range [min(r_k^min, r_k^max), max(r_k^min, r_k^max)], and the total number of candidate homonymous image points is M_k = int(abs(r_k^max - r_k^min)) + 1.
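A small sketch of this candidate enumeration follows, stepping one pixel at a time along the longer axis of the epipolar segment as in the two cases above; the endpoint values and the helper name `epipolar_candidates` are invented for illustration.

```python
import numpy as np

def epipolar_candidates(rc_min, rc_max):
    """Enumerate pixel positions (row, col) along the homonymous epipolar segment whose
    end points rc_min, rc_max are the (row, col) images of P_min and P_max.
    Steps along the longer axis, one candidate per pixel, as in FIG. 2 (a)/(b)."""
    r0, c0 = rc_min
    r1, c1 = rc_max
    if abs(c1 - c0) >= abs(r1 - r0):                      # epipolar line tends to horizontal
        k = (r1 - r0) / (c1 - c0)                         # slope
        b = r0 - k * c0                                   # intercept
        cols = np.arange(int(min(c0, c1)), int(max(c0, c1)) + 1)
        return [(int(round(k * c + b)), int(c)) for c in cols]
    else:                                                 # epipolar line tends to vertical
        k = (c1 - c0) / (r1 - r0)
        b = c0 - k * r0
        rows = np.arange(int(min(r0, r1)), int(max(r0, r1)) + 1)
        return [(int(r), int(round(k * r + b))) for r in rows]

candidates = epipolar_candidates((1020.4, 850.7), (1037.9, 1281.3))
print(len(candidates), candidates[:3])    # M_k candidates, one per column in this example
```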
(4) In step (3), the number M_k of initial candidate homonymous image points of the point p to be matched differs from one search image to another, the candidate points on different search images are mutually independent, and it is difficult to relate them by their sequence numbers. To facilitate analysis of the distribution regularity of the candidate homonymous image points, the numbers of candidate points on the search images are made consistent as follows, so that the independent, discrete candidate homonymous image points are uniformly grouped into candidate point sets according to the object-space coordinate rule:

(1) The search image with the largest number of candidate homonymous image points, M_z = max(M_k | k = 1, 2, ..., N-1), is taken as the main search image (e.g., image I_1 in FIG. 3); the remaining N-2 search images are auxiliary search images.

(2) For the j-th (j ∈ {1, 2, ..., M_z}) candidate homonymous point b_1^j on the homonymous epipolar segment b_1^1 b_1^{M_z} of the main search image, first compute, by the two-image forward intersection of the ray S_0 p from the reference image and the ray S_1 b_1^j from the main search image, the object-space three-dimensional coordinates (X_j, Y_j, Z_j) of the corresponding ground point P_j; then, using the collinearity equations of formula (2) together with the straight-line equation of the homonymous epipolar line on the t-th auxiliary search image I_t (t ∈ {2, ..., N-1}), project the point P_j onto each auxiliary search image (as indicated by the dashed lines S_t P_1, ..., S_t P_j, ..., S_t P_{M_z} in FIG. 3), obtaining on image I_t the candidate homonymous point b_t^j corresponding to b_1^j. The points [b_1^j, b_2^j, ..., b_{N-1}^j] then constitute the j-th group of candidate homonymous image points of the point to be matched on all search images.

(3) After all candidate homonymous image points on the main search image have been processed by step (2), the number of candidate homonymous image points of the image point p to be matched is unified to M_z on each of the N-1 search images, and candidate points with the same sequence number belong to the same group; that is, the candidates on the original search images are unified into M_z groups: {[b_1^1, b_2^1, ..., b_{N-1}^1], [b_1^2, b_2^2, ..., b_{N-1}^2], ..., [b_1^{M_z}, b_2^{M_z}, ..., b_{N-1}^{M_z}]}. It should be noted that, among the M_z groups of candidate homonymous image points obtained after this unification, the M_z candidate points b_t^1, b_t^2, ..., b_t^{M_z} of an auxiliary search image I_t (i.e., any search image other than the main search image) may contain repeated points.
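The unification step can be sketched as follows. The two-image forward intersection is implemented here as a simple least-squares closest-point computation between the two imaging rays, which is a stand-in for (not necessarily identical to) the patent's adjustment-based intersection; all function names (`forward_intersect`, `unify_groups`, ...) are illustrative.

```python
import numpy as np

def project(P, S, R, f):
    """Collinearity projection (formula (2)) of object point P to image-plane coordinates."""
    u = R.T @ (np.asarray(P, dtype=float) - S)
    return np.array([-f * u[0] / u[2], -f * u[1] / u[2]])

def ray_dir(xy, R, f):
    """Unit object-space direction of the imaging ray through image point xy."""
    d = R @ np.array([xy[0], xy[1], -f])
    return d / np.linalg.norm(d)

def forward_intersect(S0, d0, S1, d1):
    """Two-image forward intersection as a least-squares closest point of the two rays.
    Unknowns (X, Y, Z, t0, t1) with P - t_i * d_i = S_i for i = 0, 1."""
    A = np.zeros((6, 5))
    b = np.concatenate([S0, S1])
    A[0:3, 0:3] = np.eye(3); A[0:3, 3] = -d0
    A[3:6, 0:3] = np.eye(3); A[3:6, 4] = -d1
    return np.linalg.lstsq(A, b, rcond=None)[0][:3]      # (X_j, Y_j, Z_j)

def unify_groups(p_xy, ref_cam, main_cam, main_candidates, aux_cams):
    """For every candidate b_1^j on the main search image (image-plane coordinates in
    main_candidates), intersect its ray with the ray of p, then reproject the ground
    point P_j into every auxiliary search image; the reprojected points are the
    group-j candidates b_t^j of those images."""
    S0, R0, f0 = ref_cam
    S1, R1, f1 = main_cam
    groups = []
    for b1j in main_candidates:                          # j = 1 .. M_z
        Pj = forward_intersect(S0, ray_dir(p_xy, R0, f0), S1, ray_dir(b1j, R1, f1))
        group = [np.asarray(b1j, dtype=float)]           # candidate on the main search image
        group += [project(Pj, S, R, f) for (S, R, f) in aux_cams]
        groups.append(group)                             # [b_1^j, b_2^j, ..., b_{N-1}^j]
    return groups
```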
The second step: and (3) calculating the standardized residual error of each group of candidate homonymous image points based on the measurement adjustment principle.
For the image point p to be matched on the reference image and its uniformly grouped candidate homonymous image point sets {[b_1^i, b_2^i, ..., b_{N-1}^i] | i ∈ {1, 2, ..., M_z}}, the normalized residual of each group of candidate homonymous image points is computed, based on the surveying adjustment principle, according to the following procedure, giving the normalized residual value ω_i of the i-th group:
(1) Remove repeated candidate homonymous image points. From the above determination principle of the candidate homonymous image points on the search images, the M_z candidate points b_k^1, b_k^2, ..., b_k^{M_z} on the k-th search image may contain repeated points, and repeated candidates can only occur at consecutive sequence numbers. Therefore, starting from the 2nd candidate point, each point is compared one by one with its preceding neighbour and marked as a repeated point if the two coincide. After all M_z candidate points have been examined and marked, the M_k candidate points without repetition are obtained. In addition, once the repeated points have been identified, the correspondence between the sequence number j (j ∈ {1, 2, ..., M_z}) of a point in the original point set and its sequence number j' (j' ∈ {1, 2, ..., M_k}) in the de-duplicated point set is obtained as j' = Cor(j) (the function Cor expresses that the j-th point of the original set and the j'-th point of the de-duplicated set are the same point). At this stage, the total number of homonymous image points on the reference image and the N-1 search images is

M = 1 + Σ_{k=1}^{N-1} M_k.
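A minimal sketch of the de-duplication and of the mapping j → j' = Cor(j); the list-based representation and the helper name `remove_duplicates` are assumptions made for illustration.

```python
def remove_duplicates(cands_k):
    """Remove consecutive repeated candidates on one search image and build the mapping
    j -> j' = Cor(j) from the original group number to the index after de-duplication.
    `cands_k` is the list of M_z (row, col) candidates b_k^1 .. b_k^{M_z}."""
    unique, cor = [], []
    for pt in cands_k:
        if not unique or pt != unique[-1]:     # repeats can only be consecutive in j
            unique.append(pt)
        cor.append(len(unique))                # 1-based index j' of pt in the reduced list
    return unique, cor                         # len(unique) == M_k, cor[j-1] == Cor(j)

pts = [(10, 20), (10, 20), (10, 21), (11, 21), (11, 21), (11, 22)]
uniq, cor = remove_duplicates(pts)
print(uniq)   # [(10, 20), (10, 21), (11, 21), (11, 22)]
print(cor)    # [1, 1, 2, 3, 3, 4]
```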
(2) Construct the error equations of all image points. Taking the image-plane coordinates of each image point as the observed values, the geometric imaging model is linearised by a Taylor series with respect to the unknown object-space three-dimensional coordinates (X, Y, Z), giving the error equation (3) for solving the object-space coordinates (illustrated with the collinearity equations of the central-projection image, formula (2)):

v_x = a_11·dX + a_12·dY + a_13·dZ - (x - (x)_0)
v_y = a_21·dX + a_22·dY + a_23·dZ - (y - (y)_0)        (3)

where (x, y) are the observed image-plane coordinates of the image point, (v_x, v_y) are the residuals of the observations, ((x)_0, (y)_0) are the approximate image-plane coordinates computed by formula (2) from the approximate object-space coordinates (X_0, Y_0, Z_0) of the point, and (dX, dY, dZ) are the corrections to the approximate object-space three-dimensional coordinates to be solved. The six coefficients (a_11, a_12, a_13, a_21, a_22, a_23) of the error equation are computed as

a_11 = -(f·a_1 + a_3·x)/Z_bar,  a_12 = -(f·b_1 + b_3·x)/Z_bar,  a_13 = -(f·c_1 + c_3·x)/Z_bar
a_21 = -(f·a_2 + a_3·y)/Z_bar,  a_22 = -(f·b_2 + b_3·y)/Z_bar,  a_23 = -(f·c_2 + c_3·y)/Z_bar        (4)

with Z_bar = a_3(X - X_S) + b_3(Y - Y_S) + c_3(Z - Z_S), where (a_1, a_2, a_3; b_1, b_2, b_3; c_1, c_2, c_3) are the nine direction cosines of the rotation matrix of the image and (X_S, Y_S, Z_S) are the object-space three-dimensional coordinates of its imaging centre.

According to formula (3), the error equations of all M homonymous image points on the reference image and the search images are listed one by one, in order from the reference image through each search image, forming the error equation matrix of formula (5):

V_{2M×1} = A_{2M×3}·X_{3×1} - L_{2M×1}        (5)

where V is the residual matrix of the 2M image-plane coordinate observations (2M rows × 1 column), X is the correction matrix of the approximate object-space three-dimensional coordinates to be solved (3 rows × 1 column), and A and L are the coefficient matrix and the constant-term matrix of the error equations (2M rows × 3 columns and 2M rows × 1 column, respectively). Their specific form, formula (6), stacks for each image point the coefficient block [a_11 a_12 a_13; a_21 a_22 a_23] into A, the residuals (v_x, v_y) into V, and the constant terms (x - (x)_0, y - (y)_0) into L; the superscript of each element indicates the image number and the subscript the data item and the point number.
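The assembly of the error equation matrices can be sketched as below, computing the partial derivatives of the collinearity equations directly in vector form (equivalent to the coefficients of formula (4) under the stated rotation-matrix convention); names such as `build_error_equations` and the numbers in the example are illustrative assumptions.

```python
import numpy as np

def error_equation_rows(P0, xy_obs, S, R, f):
    """Linearised collinearity observation equations of one image point (formula (3)):
    returns the 2x3 coefficient block (partial derivatives of x, y with respect to
    X, Y, Z at the approximate ground point P0) and the constants (x - (x)_0, y - (y)_0)."""
    u = R.T @ (P0 - S)                               # (X_bar, Y_bar, Z_bar)
    x0, y0 = -f * u[0] / u[2], -f * u[1] / u[2]      # approximate image coordinates
    dx_du = np.array([-f / u[2], 0.0, f * u[0] / u[2] ** 2])
    dy_du = np.array([0.0, -f / u[2], f * u[1] / u[2] ** 2])
    A_i = np.vstack([dx_du, dy_du]) @ R.T            # chain rule: d u / d P = R^T
    l_i = np.array([xy_obs[0] - x0, xy_obs[1] - y0])
    return A_i, l_i

def build_error_equations(P0, observations):
    """Stack the blocks of all M observed points (reference image first, then the
    de-duplicated candidates of each search image) into A (2M x 3) and L (2M,),
    so that V = A X - L as in formulas (5)-(6)."""
    blocks = [error_equation_rows(P0, xy, S, R, f) for (xy, S, R, f) in observations]
    A = np.vstack([blk[0] for blk in blocks])
    L = np.concatenate([blk[1] for blk in blocks])
    return A, L

# Tiny illustration with one observation and invented camera values.
obs = [((0.010, -0.020), np.array([500.0, 500.0, 1500.0]), np.eye(3), 0.12)]
A, L = build_error_equations(np.array([620.0, 260.0, 50.0]), obs)
print(A.shape, L)
```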
(3) Compute the reliability matrix. From the coefficient matrix A, the constant-term matrix L, and the weight matrix P_ll of the observations in the error equations (the observations here are mutually independent, so P_ll is the identity matrix E), the reliability matrix R is computed as

R = E - A·(A^T·P_ll·A)^{-1}·A^T·P_ll.        (7)
(4) Solve the error equations. The error equation matrix of formula (5) is solved by the least-squares principle, giving the unknown matrix X, the residual matrix V of all observations, and the unit-weight standard error σ_0:

X = (A^T·P_ll·A)^{-1}·A^T·P_ll·L,  V = A·X - L,  σ_0 = sqrt(V^T·P_ll·V / (2M - 3)).        (8)
(5) Compute the normalized residual ω(i) of the i-th of the M homonymous image points. Since each image point has two image-plane coordinate observations x and y, the average of the normalized residuals of its x and y observations is taken as the normalized residual of the point (formula (9)):

ω(i) = (1/2)·[ |v_x^i| / (σ_0·sqrt(R_{2i-1,2i-1})) + |v_y^i| / (σ_0·sqrt(R_{2i,2i})) ]        (9)

where v_x^i and v_y^i are the residuals of the two observations of the i-th point and R_{2i-1,2i-1}, R_{2i,2i} are the corresponding diagonal elements of the reliability matrix R.
(6) Compute the normalized residual value ω(b_k^j) of the j-th candidate homonymous image point b_k^j on the k-th search image I_k. Since the sequence number j of the point b_k^j corresponds to the sequence number j' = Cor(j) after removal of the repeated points, and the total number of homonymous image points from the reference image up to the (k-1)-th search image is 1 + Σ_{t=1}^{k-1} M_t, the value ω(b_k^j) is computed as

ω(b_k^j) = ω(1 + Σ_{t=1}^{k-1} M_t + Cor(j)).        (10)
(7) Compute the normalized residual value ω_i of the i-th group of candidate homonymous image points. Since the i-th group [b_1^i, b_2^i, ..., b_{N-1}^i] consists of N-1 candidate image points on the N-1 search images, the average of their normalized residuals is taken as ω_i:

ω_i = (1/(N-1))·Σ_{k=1}^{N-1} ω(b_k^i).        (11)
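The adjustment, the reliability matrix, and the normalized residuals of formulas (7)-(11) can be sketched together as follows, assuming unit weights and the standard redundancy-number form of the normalized residual; the data layout (`M_k_list`, `cor_maps`) and the function name are illustrative choices, not the patent's.

```python
import numpy as np

def group_normalized_residuals(A, L, M_k_list, cor_maps):
    """Least-squares solution of V = A X - L, reliability matrix R = E - A (A^T A)^{-1} A^T
    (unit weights, formula (7)), per-point normalized residuals (formulas (8)-(9)) and the
    group values omega_j averaged over the N-1 search images (formulas (10)-(11)).
    M_k_list[k] is the de-duplicated candidate count of the (k+1)-th search image and
    cor_maps[k][j-1] the mapping j -> j' = Cor(j) on that image (1-based)."""
    X = np.linalg.solve(A.T @ A, A.T @ L)                    # corrections (dX, dY, dZ)
    V = A @ X - L                                            # residuals of the 2M observations
    m2 = A.shape[0]                                          # 2M
    sigma0 = max(float(np.sqrt(V @ V / (m2 - 3))), 1e-12)    # unit-weight standard error
    Rmat = np.eye(m2) - A @ np.linalg.inv(A.T @ A) @ A.T     # reliability (redundancy) matrix
    r = np.clip(np.diag(Rmat), 1e-12, None)                  # redundancy numbers
    w = np.abs(V) / (sigma0 * np.sqrt(r))                    # normalized residual, per observation
    w_pt = 0.5 * (w[0::2] + w[1::2])                         # formula (9): average of x and y

    n_search = len(M_k_list)                                 # N - 1 search images
    M_z = len(cor_maps[0])
    offsets = np.cumsum([1] + list(M_k_list))[:-1]           # 1 + sum_{t<k} M_t, per search image
    omega = np.empty(M_z)
    for j in range(1, M_z + 1):
        vals = [w_pt[offsets[k] + cor_maps[k][j - 1] - 1] for k in range(n_search)]
        omega[j - 1] = np.mean(vals)                         # formula (11)
    return omega, sigma0
```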
The third step: iterative refinement of the set of candidate points based on the normalized residual.
In the second step, the normalized residual value ω_i (i ∈ {1, 2, ..., M_z}) of each of the M_z groups of candidate homonymous image points {[b_1^1, b_2^1, ..., b_{N-1}^1], [b_1^2, b_2^2, ..., b_{N-1}^2], ..., [b_1^i, b_2^i, ..., b_{N-1}^i], ..., [b_1^{M_z}, b_2^{M_z}, ..., b_{N-1}^{M_z}]} of the image point p to be matched on the N-1 search images has been calculated. The candidate point set is then iteratively refined as follows:

(1) With the group number i on the horizontal axis and the normalized residual value ω_i on the vertical axis, plot the normalized residual distribution curve of the M_z groups of candidate homonymous image points.

(2) According to the shape of the normalized residual distribution curve of the M_z groups of candidate homonymous image points (the curve presents an approximately V-shaped distribution with a single lowest point) and the actual positions of the homonymous image points of the point p to be matched on the N-1 search images, select from the M_z groups of candidate homonymous image points the n_z groups (n_z ≤ M_z) that contain the true homonymous image points, {[b_1^1, b_2^1, ..., b_{N-1}^1], [b_1^2, b_2^2, ..., b_{N-1}^2], ..., [b_1^{n_z}, b_2^{n_z}, ..., b_{N-1}^{n_z}]} (j ∈ {1, 2, ..., n_z}), as the new candidate homonymous image point set.

(3) If n_z ≠ M_z, perform the normalized-residual computation of the second step again on the n_z groups of new candidate homonymous image points and again select t groups of new candidate points containing the true homonymous image points; if t ≠ n_z, continue computing the normalized residuals of the candidate point set and selecting the next t groups of new candidates; this process iterates until the number of image points in the new candidate homonymous image point set no longer changes.
(4) The t groups of candidate homonymous image points output when the iterative computation process is finished are a candidate homonymous image point set after iterative refinement according to the standardized residual error.
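One plausible reading of this refinement loop is sketched below: at each iteration the groups between the minimum-residual group and the group of the true homonymous point (known to the analysis) are kept, until the count stops changing. The toy residual function only imitates the V-shaped curve; no real adjustment is performed, and all names are illustrative.

```python
import numpy as np

def refine_candidates(groups, true_index, compute_omega):
    """Iteratively shrink the candidate set: keep the portion of the V-shaped residual
    curve between the minimum-residual group and the true homonymous group, and repeat
    until the number of groups no longer changes."""
    idx = list(range(len(groups)))                     # indices into the original groups
    while True:
        omega = compute_omega([groups[i] for i in idx])
        k_min = int(np.argmin(omega))                  # lowest point of the V-shaped curve
        k_true = idx.index(true_index)                 # position of the true homonymous group
        lo, hi = (k_true, k_min) if k_true <= k_min else (k_min, k_true)
        new_idx = idx[lo:hi + 1]                       # side of the minimum holding the true point
        if len(new_idx) == len(idx):
            return idx                                 # count unchanged: iteration ends
        idx = new_idx

# Toy illustration with a synthetic V-shaped residual curve (no real adjustment here).
rng = np.random.default_rng(0)
def fake_omega(sub):                                   # stand-in for the adjustment-based omega
    n = len(sub)
    return np.abs(np.arange(n) - 0.55 * n) + 0.1 * rng.random(n)
groups = [f"group {j}" for j in range(60)]
print(refine_candidates(groups, true_index=20, compute_omega=fake_omega))
```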
According to the above steps, the experimental data of fig. 4 (a) -4 (c) are selected for experimental verification of the method of the present invention. Fig. 4 shows 3 image points to be matched located on the ground, the top of a building, and the wall facade of the building, and the positions of the image points with the same name on 2 search images (the straight lines on the search images are the epipolar lines); iterative refinement calculation of the candidate homonymous image point set of the invention is performed on the 3 image points to be matched in fig. 4, and the normalized residual error distribution curve of the candidate homonymous image points calculated in each iteration is shown in fig. 5, 6 and 7.
According to the results of fig. 5, 6 and 7, the present invention summarizes the position characteristics of the same-name image points in the multi-view image as follows: (1) for the image point to be matched on one image in the multi-view images, the standardized residual values of all candidate homonymous image points in the search interval on the homonymous epipolar lines of the rest images present a V-shaped distribution phenomenon. (2) On the normalized residual value distribution curve of all candidate homonymous image points, the minimum residual value point is located substantially at the middle of all candidate points, and the true homonymous image point is located between the minimum residual value point and the start point (or end point). (3) And (3) from the V-shaped distribution curve, taking a local point set on the left (or right) side of the minimum residual error point containing the true image points with the same name as a new candidate point set, and iteratively performing adjustment calculation on the standardized residual errors of the new point set until the number of the image points in the new candidate point set is not changed any more. In each iteration process, the standardized residual values of the candidate homonymous image points still obey the distribution phenomena of the rules (1) and (2); moreover, after the iteration is finished, the number of candidate homonymous image points on each search image is reduced to 2.
Therefore, the method can obtain the unique error index and the distribution characteristics for distinguishing the true homonymous image point of the image point to be matched from other candidate image points in the multi-view image matching process. Through the finite times of iterative computation of the method, the number of candidate homonymous image points of the image points to be matched on each search image on the reference image is finally reduced to 2, so that the search number of the candidate homonymous image points in the multi-view image matching process is greatly reduced, and the method is very favorable for improving the computing efficiency and accuracy of multi-view image matching (especially multi-view intensive matching).

Claims (3)

1. A method for reducing the searching range of multi-view image matching same-name image points is characterized in that: the method comprises the following steps:
(1) Determining an initial candidate homonymous image point set with a uniform grouping rule on a search image for an image point to be matched on a reference image through a geometric imaging model and object elevation information of the image;
(2) Taking an image point to be matched and all candidate points in an initial candidate homonymous image point set thereof as observation data, performing least square adjustment calculation aiming at the unknown object space three-dimensional coordinates of the homonymous image points based on a multi-view front intersection principle, and calculating the standardized residual error of each group of candidate homonymous image points according to the adjustment result of the multi-view front intersection;
(3) Performing iterative refinement processing on the initial candidate homonymous image point set according to the standardized residual distribution map of the candidate image points and the actual homonymous image point positions of the image points to be matched;
(4) According to the results of each iterative computation in the step (3), determining a quantitative numerical index reflecting the position characteristics of the homonymous image points in the multi-view image and a homonymous image point searching range with greatly reduced interval length,
the step (1) comprises the following steps:
(11) Input N multi-view images with exterior orientation elements and the approximate object-space elevation range of the area covered by the images, the approximate object-space elevation range being expressed as [maximum elevation Z_max, minimum elevation Z_min];
(12) For the image point p to be matched on the reference image I_0, determine, using the geometric imaging model of the images and the approximate elevation range of the image area, the M_k candidate homonymous image points q_k^i on the homonymous epipolar line of p in each remaining search image I_k; the geometric imaging model is expressed as:
(r, c) = F(X, Y, Z)
wherein r and c are the row and column numbers of an image point in the image, X, Y, Z are the object-space three-dimensional coordinates of the ground point corresponding to the image point, F is the function expressing the geometric imaging model, k = 1, 2, ..., N-1, and i = 1, 2, ..., M_k; for aerial or close-range images with central perspective projection the collinearity condition equation model is used, and for multi-linear-array push-broom spaceborne satellite images the rational function model is used;
(13) Based on object-space coordinate constraints, the differing numbers of candidate homonymous image points on the search images are made consistent, and the mutually independent candidate homonymous image points on the search images are uniformly grouped by object-space coordinates into M_z groups, M_z being the maximum of M_k, the initial set of image points being:
{[b_1^1, b_2^1, ..., b_{N-1}^1], [b_1^2, b_2^2, ..., b_{N-1}^2], ..., [b_1^j, b_2^j, ..., b_{N-1}^j], ..., [b_1^{M_z}, b_2^{M_z}, ..., b_{N-1}^{M_z}]}, j ∈ {1, 2, ..., M_z},
The step (2) comprises the following steps:
(21) For the M_z groups of candidate homonymous image points determined in step (1), eliminate the repeated image points on each search image according to the row and column numbers of the candidate points;
(22) Taking the image point to be matched and all candidate homonymous image points remaining after de-duplication as observed image points, construct, based on the geometric imaging model of the images, the error equation matrix V_{2M×1} = A_{2M×3}·X_{3×1} - L_{2M×1} for solving the correction X_{3×1} = [dX dY dZ]^T to the approximate unknown object-space three-dimensional coordinates of the point;
(23) Perform an adjustment of the error equation matrix based on the least-squares principle, and from the adjustment result compute the normalized residual value of the j-th of the M_z groups of candidate image points:
ω_j = (1/(N-1))·Σ_{k=1}^{N-1} ω(b_k^j)
wherein ω(b_k^j) is the normalized residual value of the candidate homonymous image point b_k^j on the k-th search image I_k in the j-th group of candidate points,
the step (3) comprises the following steps:
(31) Plot the normalized residual distribution curve of the M_z groups of candidate homonymous image points, and, according to the actual positions of the homonymous image points of the point to be matched, select from the initial candidate set the n_z groups that contain the true homonymous image points, where n_z ≤ M_z: {[b_1^1, b_2^1, ..., b_{N-1}^1], [b_1^2, b_2^2, ..., b_{N-1}^2], ..., [b_1^j, b_2^j, ..., b_{N-1}^j], ..., [b_1^{n_z}, b_2^{n_z}, ..., b_{N-1}^{n_z}]}, j ∈ {1, 2, ..., n_z}, as the new candidate homonymous image point set;
(32) If n_z ≠ M_z, take the n_z groups of new candidate homonymous image points as the observed image points of a new adjustment, perform the normalized-residual computation of the candidate point set in step (2) again, and again select the t groups of new candidate points containing the true homonymous image points; this process iterates until the number of image points in the new candidate homonymous image point set no longer changes;
(33) When the iterative computation ends, the output value of t is 2.
2. The method of claim 1, wherein the method further comprises the step of: in the step (1), the reference image is any one image in the image set, the search image is other images except the reference image in the image set, and the image point to be matched is any point in the reference image.
3. The method of claim 1, wherein the method further comprises the step of: if the space photography image is linear array push-broom type imaging in the step (11), rational polynomial coefficients can be considered as external orientation elements of the image.
CN201910048674.1A 2019-01-18 2019-01-18 Method for narrowing search range of multi-view image matching same-name image points Active CN109829939B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910048674.1A CN109829939B (en) 2019-01-18 2019-01-18 Method for narrowing search range of multi-view image matching same-name image points

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910048674.1A CN109829939B (en) 2019-01-18 2019-01-18 Method for narrowing search range of multi-view image matching same-name image points

Publications (2)

Publication Number Publication Date
CN109829939A CN109829939A (en) 2019-05-31
CN109829939B true CN109829939B (en) 2023-03-24

Family

ID=66860940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910048674.1A Active CN109829939B (en) 2019-01-18 2019-01-18 Method for narrowing search range of multi-view image matching same-name image points

Country Status (1)

Country Link
CN (1) CN109829939B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117664088B (en) * 2024-01-31 2024-04-02 中国人民解放军战略支援部队航天工程大学 Method, system and equipment for determining homonymy point by ultra-wide vertical orbit circular scanning satellite image
CN117664087B (en) * 2024-01-31 2024-04-02 中国人民解放军战略支援部队航天工程大学 Method, system and equipment for generating vertical orbit circular scanning type satellite image epipolar line

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103604417B (en) * 2013-11-15 2015-08-05 南京师范大学 The multi-view images bi-directional matching strategy that object space is information constrained
CN103606151B (en) * 2013-11-15 2016-05-04 南京师范大学 Based on the virtual geographical scene method for auto constructing on a large scale of imaging point cloud
CN104318566B (en) * 2014-10-24 2017-04-05 南京师范大学 Can return to the new multi-view images plumb line path matching method of multiple height values

Also Published As

Publication number Publication date
CN109829939A (en) 2019-05-31

Similar Documents

Publication Publication Date Title
CN109410256B (en) Automatic high-precision point cloud and image registration method based on mutual information
CN107767440B (en) Cultural relic sequence image fine three-dimensional reconstruction method based on triangulation network interpolation and constraint
CN113592989B (en) Three-dimensional scene reconstruction system, method, equipment and storage medium
CN105354841B (en) A kind of rapid remote sensing image matching method and system
CN109829939B (en) Method for narrowing search range of multi-view image matching same-name image points
US10432915B2 (en) Systems, methods, and devices for generating three-dimensional models
CN111369607A (en) Prefabricated part assembling and matching method based on picture analysis
CN112946679B (en) Unmanned aerial vehicle mapping jelly effect detection method and system based on artificial intelligence
CN112489099A (en) Point cloud registration method and device, storage medium and electronic equipment
Hong et al. Rapid three-dimensional detection approach for building damage due to earthquakes by the use of parallel processing of unmanned aerial vehicle imagery
CN112929626A (en) Three-dimensional information extraction method based on smartphone image
CN104318566B (en) Can return to the new multi-view images plumb line path matching method of multiple height values
JP6411188B2 (en) Stereo matching device, stereo matching program, and stereo matching method
CN116518864A (en) Engineering structure full-field deformation detection method based on three-dimensional point cloud comparison analysis
CN115330876A (en) Target template graph matching and positioning method based on twin network and central position estimation
CN109087344B (en) Image selection method and device in three-dimensional reconstruction
CN107504959B (en) Method for measuring house wall base outline by utilizing inclined aerial image
CN114529615A (en) Radar calibration method, device and storage medium
CN112132950B (en) Three-dimensional point cloud scene updating method based on crowdsourcing image
CN110580468B (en) Single wood structure parameter extraction method based on image matching point cloud
Duan et al. A combined image matching method for Chinese optical satellite imagery
CN112819900B (en) Method for calibrating internal azimuth, relative orientation and distortion coefficient of intelligent stereography
CN115700760A (en) Multi-mode data-based total-space laser radar scattering cross section calculation method
CN116091562A (en) Building point cloud automatic registration method based on two-dimensional projection line segments
CN115326025A (en) Binocular image measuring and predicting method for sea waves

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant