CN116612165A - Registration method for large-view-angle difference SAR image - Google Patents

Registration method for large-view-angle difference SAR image

Info

Publication number
CN116612165A
Authority
CN
China
Prior art keywords
point
image
sar
algorithm
sift
Prior art date
Legal status
Pending
Application number
CN202310420988.6A
Other languages
Chinese (zh)
Inventor
王英华 (Wang Yinghua)
刘俊 (Liu Jun)
刘宏伟 (Liu Hongwei)
张晓婷 (Zhang Xiaoting)
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University
Priority application: CN202310420988.6A
Publication: CN116612165A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T7/33 — Image analysis; determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06T3/147 — Transformations for image registration using affine transformations
    • G06V10/462 — Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06T2207/10032 — Satellite or aerial image; remote sensing


Abstract

The invention relates to a registration method for SAR images with large view-angle differences. The SAR-SIFT algorithm, a multi-scale Harris-Affine algorithm and a multi-scale MSER algorithm are used to extract key points from the reference image and the image to be registered, descriptors are established and matched, and the LSS-Flow algorithm then performs refined matching, yielding three matching point pair sets of different properties. Next, registration and refinement are performed with an affine-invariant A-SAR-SIFT algorithm by constructing Voronoi polygons, producing a further matching point set. Finally, the matching point sets obtained in the first two stages are pooled and screened, and a global transformation model is calculated. The method accounts for the speckle noise in SAR images and for the severe geometric and radiometric distortion between images, and can remarkably improve SAR image registration performance under large view-angle differences.

Description

Registration method for large-view-angle difference SAR image
Technical Field
The invention belongs to the technical field of SAR image registration, and particularly relates to a registration method for large-view-angle difference SAR images.
Background
Synthetic aperture radar (SAR) is an active microwave remote sensing imaging system with all-weather, all-day, high-resolution imaging capability, able to acquire ground observation data of a target of interest under different polarization modes, wave bands and viewing angles. Registration methods for SAR images have therefore become a research hotspot for many scholars.
Most existing algorithms, such as the SAR-SIFT algorithm, achieve good registration accuracy and efficiency on general SAR image registration tasks, and also perform well in subsequent tasks such as data classification, three-dimensional reconstruction, change detection and image fusion. However, these methods exploit only a single type of feature shared by the reference image and the image to be registered, such as corner features, and ignore complementary features such as lines and planes; when the imaging scene contains little feature information, or the image is blurred by poor imaging quality, these methods fail because sufficient correct matching point pairs cannot be found. In addition, when there are many mismatched points, existing methods struggle to screen a large number of correct matching point pairs out of the mismatched ones and to converge them to more accurate positions with a refined matching algorithm, so the resulting global transformation model is inaccurate. Finally, owing to the side-looking mechanism of the SAR sensor, severe geometric distortion often exists between the reference image and the image to be registered, and existing methods register SAR images poorly under large view-angle differences.
The paper "Registrating Oblique SAR Images Based on Complementary Integrated Filtering and Multilevel Matching" provides an image registration method that combines the MSER algorithm with the Harris-Affine algorithm. It does not, however, consider in depth that many matching point pairs insensitive to view-angle changes still exist between the reference image and the image to be registered. When matching with the SAR-SIFT algorithm, it exploits only the affine invariance of the matching point pairs obtained in the previous step by the MSER and Harris-Affine algorithms, ignoring the correspondence and spatial position information between them; lacking spatial guidance, the matching suffers considerable interference. Moreover, only the fast sample consensus (FSC) algorithm is used to remove mismatched point pairs, so the algorithm runs long and inefficiently, and the refined matching effect of the proposed refinement algorithm still leaves room for improvement.
Disclosure of Invention
Aiming at the defects of existing SAR image registration methods, the invention provides a registration method for SAR images with large view-angle differences, registering them with a three-stage framework. First, matching point pairs of different properties between the reference image and the image to be registered are fully extracted with different algorithms and refined. Then, with the affine-invariant A-SAR-SIFT algorithm, descriptors are established and local matching is performed by constructing Voronoi polygons, after which all obtained matching point pairs pass through a mismatch removal step and a refined matching step. Finally, all matching point pairs from the first two stages are pooled, mismatched pairs are removed, and a global transformation model is calculated. The method completes the registration task for SAR images under large view-angle differences well and further improves SAR image registration performance.
A registration method for large-view-angle difference SAR images comprises the following steps:
(1) Key points in the reference image and the image to be registered are extracted with the SAR-SIFT algorithm, a multi-scale Harris-Affine algorithm and a multi-scale MSER algorithm respectively; by establishing SAR-SIFT descriptors and matching, three matching point pair sets of different properties are obtained: a point pair set S_{S,c} that is not susceptible to view-angle changes, a corner point pair set S_{H,c} with affine invariance, and a region point pair set S_{M,c} with affine invariance.
(2) The LSS-Flow algorithm performs refined matching on the point pair sets S_{S,c}, S_{H,c} and S_{M,c}, yielding the more accurate point pair sets S_{S,p}, S_{H,p} and S_{M,p}.
(3) Key points on the reference image and the image to be registered are extracted and descriptors established with the A-SAR-SIFT algorithm; in each pair of corresponding small polygons on the two images, initial matching point pairs are found with the nearest-neighbour principle; mismatched pairs are preliminarily removed from the whole-image matching set with the LSOR algorithm and screened a second time with the FSC algorithm to obtain the matching point pair set S_{A,c}; refined matching with the LSS-Flow algorithm finally yields the matching point pair set S_{A,p}.
(4) The point pair sets S_{S,p}, S_{H,p}, S_{M,p} and S_{A,p} are pooled, possible mismatched pairs are removed with the LSOR algorithm to obtain the final matching point pair set S_{final}, and a global transformation model is calculated from S_{final}.
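The three-stage flow of steps (1)-(4) can be sketched schematically in Python. All helper names below are hypothetical stand-ins for the algorithms named above and are supplied by the caller; matching point pair sets are represented as (N, 4) arrays of (x_ref, y_ref, x_sen, y_sen):

```python
import numpy as np

def register_large_view_angle(ref_img, sen_img,
                              extract_stage1, refine_lss_flow,
                              a_sar_sift_match, lsor_filter,
                              fit_transform):
    """Schematic three-stage pipeline; every callable is an illustrative stub."""
    # Stage 1: three complementary point pair sets, then LSS-Flow refinement.
    S_Sc, S_Hc, S_Mc = extract_stage1(ref_img, sen_img)
    S_Sp = refine_lss_flow(S_Sc, ref_img, sen_img)
    S_Hp = refine_lss_flow(S_Hc, ref_img, sen_img)
    S_Mp = refine_lss_flow(S_Mc, ref_img, sen_img)
    # Stage 2: affine-invariant A-SAR-SIFT matching guided by Voronoi polygons.
    S_Ap = a_sar_sift_match(ref_img, sen_img, S_Hp, S_Mp)
    # Stage 3: pool all pairs, remove outliers with LSOR, fit the global model.
    pooled = np.vstack([S_Sp, S_Hp, S_Mp, S_Ap])
    S_final = lsor_filter(pooled)
    return fit_transform(S_final)
```

The sketch only fixes the data flow between the stages; each stub corresponds to one of the steps detailed below.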
Further, the specific steps for obtaining the point pair set S_{S,c} not susceptible to view-angle changes, the affine-invariant corner point pair set S_{H,c} and the affine-invariant region point pair set S_{M,c} are as follows:
1a) Point pair set S_{S,c} not susceptible to view-angle changes:
Key points in the images are extracted with the SAR-SIFT algorithm and SAR-SIFT descriptors established; the NNDR algorithm finds an initial matching point pair set between the reference image and the image to be registered, mismatched pairs are removed with the FSC algorithm, and the matching point pair set S_{S,c} is finally obtained.
1b) Affine-invariant corner point pair set S_{H,c}:
Key points in the images are extracted with the multi-scale Harris-Affine algorithm, which also yields an affine transformation matrix at each key point; the key-point extraction step is the same as in the SAR-SIFT algorithm. The image is then transformed at each key point with the local affine transformation matrix, SAR-SIFT descriptors are established, the NNDR algorithm finds an initial matching point pair set between the reference image and the image to be registered, mismatched pairs are removed with the FSC algorithm, and the matching point pair set S_{H,c} is finally obtained.
1c) Affine-invariant region point pair set S_{M,c}:
Key points in the images are extracted with the multi-scale MSER algorithm, which also yields an affine transformation matrix at each key point; the image is transformed at each key point with the local affine transformation matrix, SAR-SIFT descriptors are established, the NNDR algorithm finds an initial matching point pair set between the reference image and the image to be registered, mismatched pairs are removed with the FSC algorithm, and the matching point pair set S_{M,c} is finally obtained.
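Steps 1a)-1c) all end with NNDR matching followed by FSC screening. A minimal NumPy sketch of the NNDR (nearest-neighbour distance ratio) rule is given below; the 0.8 ratio threshold is a common default and an assumption, not a value stated in the patent:

```python
import numpy as np

def nndr_match(desc1, desc2, ratio=0.8):
    """Match descriptors by the nearest-neighbour distance ratio (NNDR) rule.

    A pair (i, j) is accepted only if the nearest descriptor in desc2 is
    sufficiently closer than the second nearest. ratio=0.8 is a common
    default, not taken from the patent.
    """
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)   # Euclidean distances
        order = np.argsort(dists)
        nearest, second = order[0], order[1]
        if dists[nearest] < ratio * dists[second]:
            matches.append((i, int(nearest)))
    return matches
```

The FSC screening that follows in the patent is a separate consensus step and is not sketched here.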
Further, in the above registration method for large view-angle difference SAR images, in step (3) key points on the reference image and the image to be registered are extracted and descriptors established with the A-SAR-SIFT algorithm; the specific steps are as follows: first, key points on the reference image and the image to be registered are extracted with the SAR-SIFT algorithm; then Voronoi polygons are established with the sets S_{H,p} and S_{M,p} as objects to obtain the local affine transformation matrix of each SAR-SIFT key point, and SAR-SIFT descriptors are established after transforming the image with the local affine transformation matrices.
Further, in the above registration method for large view-angle difference SAR images, key points in the images are extracted with the SAR-SIFT algorithm in steps 1a), 1b) and (3); the operation steps are as follows:
2a) A group of exponentially weighted scale factors [α_0, α_1, ..., α_{n-1}] is selected, with initial value α_0 = 2, α_i = α_0·k^i (i ∈ [1, n-1]), k = 2^{1/3} and n = 8;
2b) Starting from the scale sequence α, the horizontal gradient G_{x,α} and vertical gradient G_{y,α} of the image under the different scale factors are calculated with the ROEWA algorithm, from which the gradient magnitude Mag_α and gradient direction Ori_α of the image at each scale are obtained;
2c) From the gradients already obtained, the SAR-Harris matrix C_SH(x, y, α) is calculated at each pixel of the image:

C_SH(x, y, α) = G_{√2·α} * [ (G_{x,α})²         G_{x,α}·G_{y,α} ]
                           [ G_{x,α}·G_{y,α}    (G_{y,α})²      ]

where the parameter α is the scale parameter of the exponential weighting function and √2·α is the standard deviation of the Gaussian window function G_{√2·α};
2d) From C_SH(x, y, α), the SAR-Harris response value R_SH(x, y, α) is calculated at each pixel of the image:

R_SH(x, y, α) = det(C_SH(x, y, α)) − d·tr(C_SH(x, y, α))²
where d is a tunable parameter, generally between 0.04 and 0.06;
2e) Within the same layer of the scale-space image, the SAR-Harris response of each point is compared with the responses of the points in its 8-neighbourhood and with a global threshold d_SH; if the response of the centre point is the largest in the neighbourhood and greater than d_SH, the point is a detected key point. The global threshold d_SH is typically taken as 0.85.
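Steps 2b)-2e) can be sketched as follows. For brevity the ROEWA ratio gradients are replaced by Gaussian-derivative gradients, so this is an illustrative approximation rather than the patent's exact detector; the √2·α Gaussian weighting follows the standard SAR-Harris construction:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sar_harris_response(img, alpha, d=0.04):
    """SAR-Harris response at one scale (steps 2b-2d), with Gaussian-derivative
    gradients standing in for the ROEWA gradients."""
    gx = gaussian_filter(img, alpha, order=(0, 1))   # horizontal gradient
    gy = gaussian_filter(img, alpha, order=(1, 0))   # vertical gradient
    w = np.sqrt(2.0) * alpha                          # weighting window std
    cxx = gaussian_filter(gx * gx, w)
    cxy = gaussian_filter(gx * gy, w)
    cyy = gaussian_filter(gy * gy, w)
    det = cxx * cyy - cxy ** 2
    tr = cxx + cyy
    return det - d * tr ** 2

def local_maxima_keypoints(resp, d_sh=0.85):
    """Step 2e: 8-neighbourhood maxima above the global threshold d_SH."""
    kps = []
    h, w = resp.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = resp[y - 1:y + 2, x - 1:x + 2]
            if resp[y, x] == patch.max() and resp[y, x] > d_sh:
                kps.append((x, y))
    return kps
```

A faithful implementation would compute G_{x,α} and G_{y,α} with the ROEWA ratio-of-exponentially-weighted-averages operator, which is better suited to multiplicative speckle noise.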
Further, in the above registration method for large view-angle difference SAR images, SAR-SIFT descriptors are established at the found key points in steps 1a), 1b), 1c) and (3); the operation steps are as follows:
3a) Using the gradient magnitude Mag_α and gradient direction Ori_α, a circular neighbourhood of radius 6α centred on the key point is taken; within it, 0-360° is divided into 36 bins of 10° each, the abscissa being the gradient direction angle and the ordinate the gradient magnitude. All pixels in the neighbourhood are traversed: for each pixel, the histogram bin corresponding to its gradient direction angle is found and its gradient magnitude is accumulated into that bin, giving the main-direction histogram of the key point, which is then smoothed;
3b) The direction angle corresponding to the peak of the smoothed histogram is taken as the main direction of the key point, and direction angles whose bin energy exceeds 80% of the peak are taken as auxiliary directions;
3c) A circular neighbourhood of radius r_max = 12α centred on the key point is taken and rotated with the main direction angle as reference so that the feature descriptor is rotation invariant. A gradient direction histogram is then built: the circular neighbourhood is divided into 17 sub-regions by concentric circles of radii 0.25·r_max, 0.75·r_max and r_max from inside to outside, with 0-360° divided equally into 8 sectors; the gradient direction histogram of each sub-region is calculated, and the histogram vectors of the 17 sub-regions are concatenated and normalised to generate the 136-dimensional SAR-SIFT descriptor.
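A sketch of the 17-region log-polar histogram of steps 3a)-3c), assuming precomputed gradient magnitude and direction maps; interpolation and boundary weighting are omitted for brevity:

```python
import numpy as np

def log_polar_descriptor(mag, ori, cx, cy, r_max, main_angle):
    """136-D SAR-SIFT-style descriptor: 17 log-polar sub-regions
    (1 centre disk + 2 rings of 8 sectors), 8 orientation bins each.
    mag/ori are gradient magnitude and direction (radians) maps."""
    hist = np.zeros((17, 8))
    h, w = mag.shape
    r = int(np.ceil(r_max))
    for y in range(max(0, int(cy) - r), min(h, int(cy) + r + 1)):
        for x in range(max(0, int(cx) - r), min(w, int(cx) + r + 1)):
            dx, dy = x - cx, y - cy
            rho = np.hypot(dx, dy)
            if rho > r_max:
                continue
            # rotate by the main direction for rotation invariance
            theta = (np.arctan2(dy, dx) - main_angle) % (2 * np.pi)
            sector = int(theta / (2 * np.pi / 8)) % 8
            if rho <= 0.25 * r_max:
                region = 0                # centre disk
            elif rho <= 0.75 * r_max:
                region = 1 + sector       # inner ring
            else:
                region = 9 + sector       # outer ring
            obin = int(((ori[y, x] - main_angle) % (2 * np.pi))
                       / (2 * np.pi / 8)) % 8
            hist[region, obin] += mag[y, x]
    v = hist.ravel()                      # 17 x 8 = 136 dimensions
    n = np.linalg.norm(v)
    return v / n if n > 0 else v
```

The 17-region layout (concentric radii 0.25·r_max, 0.75·r_max, r_max; 8 sectors per ring) mirrors the step 3c) partition; the histogram smoothing of step 3a) is not reproduced.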
Further, in the above registration method for large view-angle difference SAR images, in step 1b) the affine transformation matrix at a key point is calculated with the multi-scale Harris-Affine algorithm; the operation steps are as follows:
6a) For a key point (x_i, y_i) in the reference image I_1, the number of iterations is set to K = 15 and the shape-adaptive matrix U^(1) is initialised to the identity matrix E;
6b) In the k-th iteration, the reference image I_1 and the key point (x_i, y_i) are transformed with the shape-adaptive matrix U^(k) to obtain the transformed image and key point coordinates:

(I_1^(k), (x_i^(k), y_i^(k))) = T((I_1, (x_i, y_i)), U^(k))

where T is the mapping operation that transforms the image or coordinates with the shape-adaptive matrix U^(k);
6c) Centred on (x_i^(k), y_i^(k)), a square region W of side length 4α is taken in the transformed image I_1^(k), where α is the scale of the image layer containing the key point;
6d) The horizontal gradient G_{x,α} and vertical gradient G_{y,α} of the square region W at scale factor α are calculated with the ROEWA algorithm, giving the gradient magnitude Mag_α of W.
From the obtained gradients, the SAR-Harris matrix C_SH(x, y, α) is calculated at each pixel of W as in step 2c), and from it the SAR-Harris response at each pixel of W:

R_SH(x, y, α) = det(C_SH(x, y, α)) − d·tr(C_SH(x, y, α))²

where d is a tunable parameter, generally between 0.04 and 0.06.
The point with the maximum SAR-Harris response in the square region W is taken as the new key point coordinate (x_i^(k)_new, y_i^(k)_new);
6e) The key point coordinates in the original reference image I_1 are updated by mapping the new coordinates back with the inverse shape-adaptive matrix:

(x_i, y_i) = T((x_i^(k)_new, y_i^(k)_new), (U^(k))^{-1})
6f) Centred on (x_i^(k)_new, y_i^(k)_new), a square region W_new of side length 4α is re-selected in the transformed image, and the SAR-Harris matrix μ^(k) of the centre pixel in the new region is calculated;
6g) The shape-adaptive matrix is updated:

U^(k+1) = (μ^(k))^{-1}·U^(k)

and normalised so that its maximum eigenvalue equals 1;
6h) The convergence rate at the k-th iteration is calculated:

ratio = 1 − λ_min(μ^(k)) / λ_max(μ^(k))

where λ_min(μ^(k)) and λ_max(μ^(k)) are the minimum and maximum eigenvalues of the matrix μ^(k);
6i) When ratio < 0.1 the loop is exited, and the affine transformation matrix at the key point (x_i, y_i) is Matrix_i = U^(k); otherwise the iteration count is increased by 1 and the loop continues, recomputing the convergence rate ratio at the key point (x_i, y_i) with the updated shape-adaptive matrix. If ratio < 0.1 cannot be satisfied after K iterations, the key point is discarded;
6j) The same operations are performed for every key point in the reference image I_1 and in the image to be registered I_2.
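The iteration of steps 6a)-6i) reduces to the loop below; compute_mu is a hypothetical callback standing in for steps 6b)-6f) (warping the image with the current U and measuring the symmetric 2x2 SAR-Harris matrix at the key point):

```python
import numpy as np

def shape_adapt(compute_mu, max_iter=15, tol=0.1):
    """Iterative shape adaptation (steps 6a-6i), schematically.

    compute_mu(U) must return the symmetric 2x2 SAR-Harris matrix mu measured
    in the image warped by the current shape-adaptive matrix U.
    Returns the affine matrix, or None if the loop never converges (step 6i).
    """
    U = np.eye(2)                                    # step 6a: U^(1) = E
    for _ in range(max_iter):
        mu = compute_mu(U)                           # steps 6b-6f
        U = np.linalg.inv(mu) @ U                    # step 6g: U <- mu^-1 U
        U = U / np.abs(np.linalg.eigvals(U)).max()   # max eigenvalue = 1
        evals = np.linalg.eigvalsh(mu)
        ratio = 1.0 - evals.min() / evals.max()      # step 6h convergence rate
        if ratio < tol:
            return U
    return None                                      # step 6i: discard point
```

The loop converges when the second-moment matrix measured in the adapted frame is nearly isotropic (eigenvalue ratio close to 1).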
Further, in the above registration method for large view-angle difference SAR images, in step 1c) key points in the images are extracted with the multi-scale MSER algorithm and the affine transformation matrix at each key point is calculated; the operation steps are as follows:
8a) A set of scale-space factors [α_0, α_1, ..., α_{n-1}] is selected, with initial value α_0 = 2, α_i = α_0·k^i (i ∈ [1, n-1]), k = 2^{1/3} and n = 4; α is the scale of the Gaussian kernel, the window length is chosen as w = 4α, and the scale space is built with Gaussian blur kernels;
8b) The pixels of each image layer are sorted by grey value (a colour image is first converted to greyscale). A node is allocated in advance to each pixel, the node index being the grey value of that pixel. Following the sorting result, the pixels are placed into a component tree one by one in the order of their node indices; when a pixel is placed, its four-neighbourhood is checked, and if neighbouring nodes exist, their root nodes are found and the two node regions are merged. After all pixels have been placed in the component tree, all extremal regions of the image are obtained. An extremal region is defined as follows: if the grey values of all pixels in a region are larger than those of its boundary pixels, it is a maximum extremal region; if they are all smaller than those of the boundary pixels, it is a minimum extremal region;
8c) MSER regions are obtained with the maximum-stability criterion: if Q_1, ..., Q_{i−1}, Q_i, ... is a sequence of mutually nested extremal regions, i.e. Q_i ⊂ Q_{i+1}, then the extremal region Q_i is a maximally stable extremal region if and only if the region change rate

q(i) = (|Q_{i+Δ}| − |Q_{i−Δ}|) / |Q_i|

attains a local minimum at i, where |·| denotes the area of a region, the subscript i ∈ [0, 255] denotes the grey level, and Δ denotes a small grey-level variation;
8d) The irregular maximally stable region is approximately fitted with an elliptical region.
First, the centre of gravity of the maximally stable region is taken as the ellipse centre. The geometric zero-order and first-order moments of the region are calculated:

m_00 = Σ I_e(x, y)
m_01 = Σ y·I_e(x, y)
m_10 = Σ x·I_e(x, y)

where m_00 is the geometric zero-order moment and m_01, m_10 the geometric first-order moments of the maximally stable region, and I_e(x, y) denotes the region. The ellipse centre, i.e. the key point coordinates (x_c, y_c) detected by the MSER algorithm, is then

x_c = m_10 / m_00,  y_c = m_01 / m_00
The geometric second-order central moments of the maximally stable region are calculated:

μ_20 = Σ (x − x_c)²·I_e(x, y)
μ_02 = Σ (y − y_c)²·I_e(x, y)
μ_11 = Σ (x − x_c)(y − y_c)·I_e(x, y)
The two eigenvalues of the second-order moment matrix [μ_20, μ_11; μ_11, μ_02]/m_00 are calculated:

λ_{1,2} = ( (μ_20 + μ_02) ± √((μ_20 − μ_02)² + 4·μ_11²) ) / (2·m_00)

from which the semi-major axis w, semi-minor axis l and major-axis direction φ of the ellipse are obtained:

w = 2·√λ_max,  l = 2·√λ_min,  φ = (1/2)·arctan( 2·μ_11 / (μ_20 − μ_02) )
8e) Using the semi-major axis w, the semi-minor axis l and the major-axis direction φ, the affine transformation matrix at the key point (x_c, y_c) is calculated; it maps the elliptical neighbourhood to a circular one and can be written (up to normalisation) as

Matrix = R(φ)·diag(w, l)·R(φ)^T

where R(φ) is the rotation matrix of angle φ;
8f) Repeating the above operations by inverting the gray scale of the input image;
8g) The same is done for each layer of images in the scale space.
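The moment-based ellipse fit of steps 8d)-8e) can be sketched as follows for a binary region given as pixel coordinates. The whitening matrix returned as the affine transformation is a common construction (it maps the elliptical neighbourhood to a circular one) and is an assumption where the patent's exact formula was not legible:

```python
import numpy as np

def ellipse_from_region(points):
    """Fit an ellipse to an MSER region by its geometric moments (step 8d).

    points: (N, 2) array-like of (x, y) pixel coordinates of the region.
    Returns centre (xc, yc), semi-axes (w, l) with w >= l, orientation phi,
    and a whitening affine matrix A (region ellipse -> circle, step 8e).
    """
    pts = np.asarray(points, dtype=float)
    xc, yc = pts.mean(axis=0)                        # centre from 1st moments
    dx, dy = pts[:, 0] - xc, pts[:, 1] - yc
    mu20, mu02 = (dx ** 2).mean(), (dy ** 2).mean()  # normalised 2nd moments
    mu11 = (dx * dy).mean()
    cov = np.array([[mu20, mu11], [mu11, mu02]])
    evals, evecs = np.linalg.eigh(cov)               # ascending eigenvalues
    l, w = 2 * np.sqrt(evals[0]), 2 * np.sqrt(evals[1])
    phi = np.arctan2(evecs[1, 1], evecs[0, 1])       # major-axis direction
    # Whitening: A @ cov @ A.T = I, so A maps the ellipse to a circle.
    A = np.linalg.inv(np.linalg.cholesky(cov)) if np.all(evals > 0) else np.eye(2)
    return (xc, yc), (w, l), phi, A
```

Dividing the moments by the pixel count (the `.mean()` calls) corresponds to the m_00 normalisation in the formulas above.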
Further, in the above registration method for large-view-angle difference SAR images, the fine matching is performed on the point pair set by using the LSS-FLOW algorithm in the steps (2) and (3), so as to obtain a more accurate point pair set, and the operation steps are as follows:
9a) Using the reference point pair set, i.e. the union of S_{S,c}, S_{H,c} and S_{M,c}, the transformation model θ from the image to be registered to the reference image is calculated. With reference image I_1 and image to be registered I_2, I_2 is deformed:

I_2^T = T(I_2, θ)

where I_2^T denotes the image obtained by transforming the image to be registered with the transformation model θ;
9b) For each point of the point pair set to be refined S_coarse that lies in the image to be registered I_2, its corresponding position on the transformed image I_2^T is calculated; for step (2) the sets to be refined are S_{S,c}, S_{H,c} and S_{M,c}, and for step (3) the set to be refined is S_{A,c}:

(x_i^T, y_i^T) = T((x_i, y_i), θ);
9c) The horizontal gradient G_x and vertical gradient G_y of the images I_1 and I_2^T at scale factor α = 2 are calculated with the ROEWA algorithm, giving the gradient magnitude Mag and gradient direction Ori of each image.
For each pixel of I_1 and I_2^T, a circular neighbourhood of radius r = 24 centred on the pixel is taken and divided into 17 sub-regions by concentric circles of radii 0.25·r, 0.75·r and r from inside to outside; in each sub-region 0-360° is divided into 8 sectors of 45°, the abscissa being the gradient direction angle and the ordinate the gradient magnitude. All pixels of each sub-region are traversed, the gradient magnitude of each pixel being accumulated into the histogram bin of its direction angle, giving the gradient direction histogram of the sub-region as a one-dimensional vector; the histogram vectors of the 17 sub-regions are concatenated and normalised to generate a 136-dimensional SAR-SIFT descriptor.
The SAR-SIFT descriptors of all pixels of I_1 and I_2^T are assembled into the SAR-SIFT description maps I_1_desc and I_2^T_desc;
9d) For a matching point (x_i, y_i) of the point pair set to be refined S_coarse in the reference image I_1, a square description sub-image I_1_squ of side length 2·r_lf + 1 centred on it is taken from the SAR-SIFT description map I_1_desc; the same operation on its corresponding point (x_i^T, y_i^T) gives the square description sub-image I_2^T_squ, where r_lf is taken as 61;
9e) The loss function E(w) is

E(w) = Σ_p min( ||I_1_squ(p) − I_2^T_squ(p + w(p))||_1 , t )                      (1)
     + Σ_p η·( |u(p)| + |v(p)| )                                                  (2)
     + Σ_{(p,q)∈ε} [ min( α·|u(p) − u(q)|, d ) + min( α·|v(p) − v(q)|, d ) ]      (3)

where p = (x, y) is a pixel of the image and w(p) = (u(p), v(p)) is the optical-flow vector at p, u(p) being the horizontal and v(p) the vertical offset; ε is the region formed by p and its 8 neighbours and q is a point in that region; the parameters η and α are 0.001 and 0.01 respectively, and the parameters t and d are both 1.
Term (1) is the data term, constraining the SAR-SIFT descriptors to match each other along the optical-flow vector w(p); term (2) is the small-displacement term, constraining the optical-flow vectors to be as small as possible when no other information is available; term (3) is the smoothness term, constraining the optical-flow vectors of adjacent pixels to be similar;
The flow w minimising the loss function E(w) is solved with a belief propagation algorithm;
9f) The new coordinates of the point (x_i^T, y_i^T) on the image I_2^T are calculated:

x_i^T_new = x_i^T + u(r_lf + 1, r_lf + 1)
y_i^T_new = y_i^T + v(r_lf + 1, r_lf + 1)
9g) Using the transformation model θ, the coordinate position of the point (x_i^T_new, y_i^T_new) in the original image to be registered I_2 is calculated:

(x_i_new, y_i_new) = T((x_i^T_new, y_i^T_new), θ^{-1})
9h) After traversing every matching point pair of the set to be refined S_coarse, a more accurate point pair set S_precise is obtained; for step (2) the more accurate sets are S_{S,p}, S_{H,p} and S_{M,p}, and for step (3) the more accurate set is S_{A,p}.
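The loss function of step 9e) can be evaluated directly for a candidate flow field, as sketched below on small descriptor patches; the belief-propagation minimisation itself is not shown, and the smoothness term here is accumulated over right/below neighbour edges so each edge is counted once (the patent defines ε over the 8-neighbourhood):

```python
import numpy as np

def lss_flow_energy(desc1, desc2, u, v, eta=0.001, alpha=0.01, t=1.0, d=1.0):
    """Evaluate the LSS-Flow loss E(w) of step 9e for a given flow field.

    desc1/desc2: (H, W, D) descriptor patches; u, v: integer flow fields.
    Parameter values eta, alpha, t, d follow the patent text.
    """
    h, w_, _ = desc1.shape
    E = 0.0
    for y in range(h):
        for x in range(w_):
            y2 = min(max(y + int(v[y, x]), 0), h - 1)
            x2 = min(max(x + int(u[y, x]), 0), w_ - 1)
            # (1) truncated data term: descriptors should match along the flow
            E += min(np.abs(desc1[y, x] - desc2[y2, x2]).sum(), t)
            # (2) small-displacement term
            E += eta * (abs(u[y, x]) + abs(v[y, x]))
            # (3) truncated smoothness term over right/below neighbours
            for ny, nx in ((y, x + 1), (y + 1, x)):
                if ny < h and nx < w_:
                    E += min(alpha * abs(u[y, x] - u[ny, nx]), d)
                    E += min(alpha * abs(v[y, x] - v[ny, nx]), d)
    return E
```

A zero flow field over identical patches gives E = 0, which matches the intuition that refinement leaves already-aligned points in place.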
Further, in the above registration method for large view-angle difference SAR images, in step (3) Voronoi polygons are established with S_{H,p} and S_{M,p} as objects to obtain the local affine transformation matrix of each SAR-SIFT key point; the operation steps are as follows:
10a) Using the point pair sets S_{H,p} and S_{M,p}, a Delaunay triangulation is automatically constructed on the reference image I_1 with all the affine-invariant discrete points as vertices; the discrete points and the resulting triangles are numbered, and the three discrete points composing each triangle are recorded;
10b) Finding out the numbers of all triangles adjacent to each discrete point, and recording;
10c) The triangles adjacent to each discrete point are ordered and recorded in the clockwise direction so that Voronoi polygons can be generated by the subsequent connection step;
10d) Calculating the circle center of the circumscribed circle of each triangle, and recording;
10e) Connecting the circle centers of the circumscribed circles of the adjacent triangles according to the adjacent triangles of each discrete point to obtain a Voronoi polygon;
10f) Each SAR-SIFT key point inside a small Voronoi polygon is endowed with the same affine transformation matrix as the affine-invariant discrete point of that polygon, giving the local affine transformation matrix of each SAR-SIFT key point;
10g) The same operations are performed on the image to be registered I_2 with all the affine-invariant discrete points of the point pair sets S_{H,p} and S_{M,p}, obtaining the local affine transformation matrix of each SAR-SIFT key point.
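Steps 10a)-10g) assign to each SAR-SIFT key point the affine matrix of the affine-invariant discrete point whose Voronoi polygon contains it. Since a point lies in the Voronoi cell of its nearest anchor, the assignment of step 10f) can be sketched with a nearest-neighbour query instead of explicit polygon construction:

```python
import numpy as np
from scipy.spatial import cKDTree

def assign_local_affine(anchor_pts, anchor_affines, keypoints):
    """Give each SAR-SIFT keypoint the affine matrix of the anchor whose
    Voronoi cell contains it (step 10f), via nearest-anchor lookup.

    anchor_pts: (M, 2) affine-invariant discrete points from S_{H,p}/S_{M,p};
    anchor_affines: their M affine matrices; keypoints: (N, 2) query points.
    """
    tree = cKDTree(np.asarray(anchor_pts, dtype=float))
    _, idx = tree.query(np.asarray(keypoints, dtype=float))
    return [anchor_affines[i] for i in idx]
```

The equivalence holds because membership in a Voronoi polygon is exactly the nearest-neighbour relation; the explicit Delaunay/circumcentre construction of steps 10a)-10e) is only needed if the polygon geometry itself is required.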
Further, in the above registration method for large view-angle difference SAR images, mismatched point pairs on the whole image are preliminarily removed with the LSOR algorithm in steps (3) and (4); the operation steps are as follows:
11a) Using the reference point pair set, the transformation model θ' from the image to be registered to the reference image is calculated; for step (3) the reference point pair set is the union of S_{S,p}, S_{H,p} and S_{M,p}, and for step (4) it is the union of S_{S,p}, S_{H,p}, S_{M,p} and S_{A,p}.
With reference image I_1 and image to be registered I_2, I_2 is deformed with the transformation model θ' to obtain the image I_2^T;
11b) For a point (x_s, y_s) of the reference point pair set in the image to be registered I_2, its corresponding position coordinates (x_s^T, y_s^T) on the image I_2^T transformed with θ' are calculated:

(x_s^T, y_s^T) = T((x_s, y_s), θ')

Applying the same transformation to all point pairs of the reference set gives a new point pair set S_ref^T;
11c) The length and slope of the connecting line of each matching pair in the point pair set S_ref^T are calculated, summed and averaged to obtain the average length dist_ave and average slope slope_ave;
11d) For a point (x_i, y_i) of the point pair set to be screened S_fil in the image to be registered I_2, its corresponding position coordinates (x_i^T, y_i^T) on the transformed image I_2^T are calculated with the transformation model θ'. For step (3) the set to be screened is the initial whole-image matching point pair set found with the nearest-neighbour principle in the previous step; for step (4) it is the union of S_{S,p}, S_{H,p}, S_{M,p} and S_{A,p}:

(x_i^T, y_i^T) = T((x_i, y_i), θ')
11e) The length dist_i and slope slope_i of the line connecting (x_i, y_i) and (x_i^T, y_i^T) are calculated;
11f) Screen all point pairs in the set to be screened S_fil with the set thresholds, keeping the matching point pairs that satisfy the conditions:
|dist_i - dist_ave| < Th_d
|slope_i - slope_ave| < Th_s
where Th_d is taken as 0.1 and Th_s as 5°.
The invention has the beneficial effects that:
1. The method analyzes key points of different natures between the reference image and the image to be registered, and proposes combining a multi-scale MSER algorithm, a multi-scale Harris-Affine algorithm and the SAR-SIFT algorithm to jointly extract the key points in the images and build descriptors for matching. This effectively overcomes the difficulty that registration fails when the imaging scene contains little feature information, or when matching points cannot be found because of image blurring and other effects of poor imaging quality;
2. The invention proposes establishing Voronoi polygons on the reference image and the image to be registered around the obtained affine-invariant multi-scale MSER and multi-scale Harris-Affine matching point pairs, giving the SAR-SIFT key points inside each small Voronoi polygon the same affine transformation as the affine-invariant matching point in that polygon, and registering by building descriptors. Unlike the traditional SAR-SIFT algorithm, the proposed A-SAR-SIFT algorithm gives every key point affine invariance, so more correct matching point pairs can be found for the registration of large-view-angle difference SAR images;
3. For the problem that correct matching point pairs are difficult to screen out when there are many mismatched point pairs, the invention proposes the LSOR algorithm, which effectively removes mismatched point pairs from the point pair set by thresholding the lengths and slopes of their connecting lines; for the problem that the positions of correct matching point pairs are inaccurate and the errors between them are large, the LSS-Flow refinement algorithm is proposed by borrowing the idea of optical flow. Applying the mismatch-removal and refinement methods makes the finally obtained global transformation model more accurate;
4. For the problem of SAR image registration under large view angle difference, the invention proposes a three-stage registration framework, which effectively improves the accuracy of image registration.
Drawings
FIG. 1 is a flow chart of an implementation of the present embodiment;
FIG. 2 is an image of building SAR-SIFT descriptors;
FIG. 3 is an image of gradient calculation using the ROEWA algorithm;
FIG. 4 is a reference diagram of the input of Test 1;
FIG. 5 is a diagram to be registered of the input of Test 1;
FIG. 6 is a reference diagram of the input of Test 2;
FIG. 7 is a diagram to be registered of the input of Test 2;
FIG. 8 is a post-registration checkerboard diagram of Test 1;
FIG. 9 is a correct matching point pair diagram of Test 1;
FIG. 10 is a post-registration checkerboard diagram of Test 2;
FIG. 11 is a correct matching point pair diagram of Test 2;
FIG. 12 is a flowchart of an implementation of the present embodiment;
FIG. 13 is a flowchart of an implementation of the A-SAR-SIFT algorithm;
FIG. 14 is an input explanatory diagram of the LSOR algorithm.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
Referring to fig. 1, 2, 3 and 12, the present embodiment provides a registration method for large view angle difference SAR images, which includes the steps of:
(1) Extract key points between the reference image and the image to be registered with the SAR-SIFT (SAR scale-invariant feature transform) algorithm, a multi-scale Harris-Affine algorithm and a multi-scale maximally stable extremal regions (Maximally Stable Extremal Regions, MSER) algorithm respectively, and by building SAR-SIFT descriptors and matching, finally obtain three matching point pair sets of different natures: the point pair set S_S,c that is insensitive to viewing-angle changes, the corner point pair set S_H,c with affine invariance, and the point pair set S_M,c of affine-invariant regions.
1a) Point pair set S_S,c insensitive to viewing-angle changes:
Extract key points in the images with the SAR-SIFT algorithm and build SAR-SIFT descriptors; find an initial matching point pair set between the reference image and the image to be registered with the nearest neighbor distance ratio (NNDR) algorithm; remove mismatched point pairs with the FSC algorithm; and finally obtain the matching point pair set S_S,c.
1b) Corner point pair set S_H,c with affine invariance:
Extract key points in the images with a multi-scale Harris-Affine algorithm and obtain an affine transformation matrix at each key point, where the key-point extraction step is the same as in the SAR-SIFT algorithm; then transform the image at each key point with the local affine transformation matrix and build SAR-SIFT descriptors; find an initial matching point pair set between the reference image and the image to be registered with the NNDR algorithm; remove mismatched point pairs with the FSC algorithm; and finally obtain the matching point pair set S_H,c.
1c) Point pair set S_M,c of affine-invariant regions:
Extract key points in the images with a multi-scale MSER algorithm and obtain an affine transformation matrix at each key point; transform the image at each key point with the local affine transformation matrix and build SAR-SIFT descriptors; find an initial matching point pair set between the reference image and the image to be registered with the NNDR algorithm; remove mismatched point pairs with the FSC algorithm; and finally obtain the matching point pair set S_M,c.
(2) Refine the point pair sets S_S,c, S_H,c and S_M,c with the Local SAR-SIFT Flow (LSS-Flow) algorithm to obtain the more accurate point pair sets S_S,p, S_H,p and S_M,p.
(3) Extract key points on the reference image and the image to be registered with the A-SAR-SIFT algorithm and build descriptors. Specifically, first extract the key points in the two images with the SAR-SIFT algorithm; then, taking all points of the sets S_H,p and S_M,p as objects, establish Voronoi polygons on the reference image and the image to be registered to obtain a local affine transformation matrix for each SAR-SIFT key point, and build SAR-SIFT descriptors after transforming the images with the local affine transformation matrices. Next, find initial matching point pairs within each corresponding small polygon of the two images with the nearest-neighbor principle, preliminarily remove mismatched point pairs from the matching point pair set on the whole image with the outlier removal based on length and slope (LSOR) algorithm, then perform secondary screening with the FSC algorithm to obtain the matching point pair set S_A,c, and refine it with the LSS-Flow algorithm to finally obtain the matching point pair set S_A,p.
(4) Pool the point pair sets S_S,p, S_H,p, S_M,p and S_A,p, remove possible mismatched point pairs with the LSOR algorithm to finally obtain the matching point pair set S_final, and calculate a global transformation model from S_final.
This embodiment compensates for the shortcomings of existing SAR image registration methods and proposes registering SAR images under large view angle difference with a three-stage registration framework. In the first-stage task, matching point pairs between the reference image and the image to be registered are searched preliminarily with the multi-scale MSER algorithm, the multi-scale Harris-Affine algorithm and the SAR-SIFT algorithm, and all obtained matching point pairs are refined with the refinement algorithm. In the second-stage task, local matching is performed with the affine-invariant A-SAR-SIFT algorithm by constructing Voronoi polygons, yielding a large number of matching point pairs; correct matching point pairs are then fully retained with the mismatch-removal algorithm and the FSC algorithm, and finally refined with the refinement algorithm. In the third-stage task, all matching point pairs obtained in the first and second stages are screened and the global transformation model is calculated. The method takes into account the speckle noise in SAR images and the severe geometric and radiometric distortions between images, and can significantly improve SAR image registration performance under large view angle difference.
In the SAR image registration method of the present embodiment, the key points in the images are extracted by using the SAR-SIFT algorithm in steps 1 a), 1 b) and (3), and the operation steps are as follows:
2a) Select a group of exponentially weighted scale factors [α_0, α_1, ..., α_{n-1}], where the initial value α_0 = 2, α_i = α_0·k^i (i ∈ [1, n-1]), k = 2^{1/3}, and n = 8.
2b) For each α in the scale sequence, calculate the horizontal gradient G_x,α and vertical gradient G_y,α of the image with the ratio of exponentially weighted averages (Ratio of Exponentially Weighted Averages, ROEWA) algorithm, thereby obtaining the gradient magnitude Mag_α and gradient direction Ori_α of the image at the different scales.
2c) From the obtained gradients, calculate the SAR-Harris matrix C_SH(x, y, α) at each pixel of the image:
C_SH(x, y, α) = G_{√2·α} ∗ [ (G_x,α)², G_x,α·G_y,α ; G_x,α·G_y,α, (G_y,α)² ]
where the parameter α is the scale parameter of the exponential weighting function, and G_{√2·α} is a Gaussian kernel whose standard deviation is √2·α.
2d) From C_SH(x, y, α), calculate the SAR-Harris response value R_SH(x, y, α) at each pixel of the image:
R_SH(x, y, α) = det(C_SH(x, y, α)) - d·tr(C_SH(x, y, α))²
where d is an empirical parameter, generally taken between 0.04 and 0.06.
2e) Within the same layer of the scale-space image, compare the SAR-Harris response value of each point with the response values of the points in its 8-neighborhood and with a global threshold d_SH. If the response value of the center point is the maximum in the neighborhood and greater than d_SH, that point is a detected key point. The global threshold d_SH is typically taken as 0.85.
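The non-maximum suppression of step 2e) can be sketched as follows (a minimal illustration on a precomputed response map; the function name and the toy data are not from the patent):

```python
import numpy as np

def detect_keypoints(response, d_sh=0.85):
    """Pick points whose SAR-Harris response is the strict maximum of their
    8-neighborhood and also exceeds the global threshold d_sh."""
    h, w = response.shape
    keypoints = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = response[y - 1:y + 2, x - 1:x + 2]
            center = response[y, x]
            # the center must beat the global threshold and all 8 neighbors
            if center > d_sh and center == patch.max() and (patch == center).sum() == 1:
                keypoints.append((x, y))
    return keypoints

# Toy response map with a single strong peak at (x, y) = (2, 1)
resp = np.zeros((4, 5))
resp[1, 2] = 1.0
print(detect_keypoints(resp))  # -> [(2, 1)]
```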
In the SAR image registration method of the present embodiment, the SAR-SIFT descriptors are created at the found key points in steps 1 a), 1 b), 1 c) and (3), and the operation steps thereof are as follows:
3a) Using the gradient magnitude Mag_α and gradient direction Ori_α, take a circular neighborhood of radius 6α centered on a key point. Within this neighborhood, divide 0-360° into 36 bins of 10° each, with the abscissa the gradient direction angle and the ordinate the gradient magnitude. Traverse all pixels in the neighborhood: for each pixel, find the histogram bin corresponding to its gradient direction angle and accumulate its gradient magnitude onto that bin, thereby obtaining the main-direction histogram of the key point; then smooth the histogram.
3b) Take the direction angle corresponding to the peak of the smoothed histogram as the main direction of the key point, and take the direction angles of bins whose energy exceeds 80% of the peak as auxiliary directions.
3c) Take a circular neighborhood of radius r_max = 12α centered on the key point, and rotate the neighborhood to the main direction angle so that the feature descriptor is rotation invariant; then build gradient direction histograms. As shown in FIG. 2, the circular neighborhood is divided into 17 sub-regions, with concentric-circle radii from inside to outside of 0.25·r_max, 0.75·r_max and r_max, and 0-360° equally divided into 8 parts. Calculate the gradient direction histogram of each sub-region, then concatenate and normalize the histogram vectors of the 17 sub-regions to generate the 136-dimensional SAR-SIFT descriptor.
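The 17-subregion pooling of step 3c) can be sketched as follows (assuming the usual log-polar layout of one inner disc plus two rings of 8 sectors each; the function name, bin conventions and toy inputs are illustrative, not the patent's implementation):

```python
import numpy as np

def sar_sift_descriptor(mag, ori, cx, cy, r_max):
    """SAR-SIFT-style descriptor sketch: 17 subregions x 8 orientation
    bins = 136 dims. Assumed layout: inner disc (r < 0.25*r_max) plus two
    rings (0.25..0.75 and 0.75..1.0 of r_max), each split into 8 sectors."""
    desc = np.zeros((17, 8))
    h, w = mag.shape
    for y in range(max(0, int(cy - r_max)), min(h, int(cy + r_max) + 1)):
        for x in range(max(0, int(cx - r_max)), min(w, int(cx + r_max) + 1)):
            dx, dy = x - cx, y - cy
            r = np.hypot(dx, dy)
            if r > r_max:
                continue
            theta = np.degrees(np.arctan2(dy, dx)) % 360.0
            if r < 0.25 * r_max:
                region = 0                               # inner disc
            else:
                ring = 0 if r < 0.75 * r_max else 1
                region = 1 + ring * 8 + int(theta // 45) % 8
            ori_bin = int(ori[y, x] % 360.0 // 45) % 8   # 8 orientation bins
            desc[region, ori_bin] += mag[y, x]           # magnitude-weighted vote
    v = desc.ravel()
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

mag = np.ones((64, 64))
ori = np.full((64, 64), 90.0)   # all gradients point the same way
d = sar_sift_descriptor(mag, ori, 32, 32, 24)
print(d.shape)  # -> (136,)
```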
In the SAR image registration method of the present embodiment, the initial matching point pair set between the reference map and the map to be registered is found by using NNDR algorithm in steps 1 a), 1 b) and 1 c), and the operation steps are as follows:
4a) In the key-point space, the Euclidean distance between feature vectors measures the closeness of two key points: the closer the distance, the more similar they are. Find the nearest and second-nearest key points, with the corresponding distances denoted d_min and d_nd respectively.
4b) If d_min/d_nd < distRatio, the key point and its nearest key point form a correct matching point pair. When the threshold distRatio is taken as 0.8, a smaller point pair set C_h is obtained; when distRatio is taken as 0.9, a larger point pair set C_l is obtained.
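The NNDR test of steps 4a)-4b) can be sketched as follows (the function name and toy descriptors are illustrative):

```python
import numpy as np

def nndr_match(desc1, desc2, dist_ratio=0.8):
    """Match rows of desc1 to rows of desc2 with the nearest-neighbor
    distance ratio test: keep pairs with d_min / d_nd < dist_ratio."""
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        order = np.argsort(dists)
        d_min, d_nd = dists[order[0]], dists[order[1]]
        if d_nd > 0 and d_min / d_nd < dist_ratio:
            matches.append((i, int(order[0])))
    return matches

a = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([[0.9, 0.1], [0.5, 0.5], [0.0, 1.0]])
print(nndr_match(a, b))  # -> [(0, 0), (1, 2)]
```

Lowering dist_ratio keeps only unambiguous matches (the patent's C_h); raising it to 0.9 admits more candidates (C_l) for the later FSC screening.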
In the SAR image registration method of the present embodiment, the steps 1 a), 1 b), 1 c) and (3) remove the mismatching point pair by using the FSC algorithm, and the operation steps thereof are as follows:
5a) Set the number of iterations of the algorithm to N. In the t-th iteration, randomly select three matching point pairs from the point pair set C_h.
5b) Calculate a transformation model θ_t of the image using the three point pairs.
5c) Using the obtained transformation model, calculate the transformation error of each point pair c_i in the set C_l:
c_i = {(x_i, y_i), (x_i', y_i')}
e(c_i, θ_t) = ||(x_i, y_i) - T((x_i', y_i'), θ_t)||
where (x_i, y_i) are the coordinates of the key point of pair c_i on the reference image and (x_i', y_i') the coordinates of its corresponding point on the image to be registered; T((x_i', y_i'), θ_t) is the position of (x_i', y_i') mapped by the transformation model, and e(c_i, θ_t) is the matching error of pair c_i under the model θ_t.
5d) Traverse every matching point pair in C_l and collect all pairs whose transformation error e is less than 3 into the set C_t.
5e) After N iterations the algorithm ends, yielding C_1, C_2, ..., C_N; take the set with the most point pairs as the matching point pair set finally obtained by the algorithm.
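Steps 5a)-5e) amount to a RANSAC-style consensus search. A simplified sketch (a single point set plays both the C_h and C_l roles, and a least-squares affine fit stands in for the patent's transformation model; all names and data are illustrative):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine model dst ~= [x, y, 1] @ theta from >= 3 pairs."""
    A = np.hstack([src, np.ones((len(src), 1))])       # (n, 3)
    theta, *_ = np.linalg.lstsq(A, dst, rcond=None)    # (3, 2)
    return theta

def fsc(src, dst, n_iter=200, thresh=3.0, seed=0):
    """Repeatedly fit an affine model from 3 random pairs and keep the
    largest consensus set (transformation error < thresh)."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(src), dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(len(src), 3, replace=False)
        theta = fit_affine(src[idx], dst[idx])
        pred = np.hstack([src, np.ones((len(src), 1))]) @ theta
        inliers = np.linalg.norm(pred - dst, axis=1) < thresh
        if inliers.sum() > best.sum():
            best = inliers
    return best

# Synthetic test: a rotation plus shift, with two gross outliers appended
rng = np.random.default_rng(1)
src = rng.uniform(0, 100, (20, 2))
R = np.array([[np.cos(0.3), -np.sin(0.3)], [np.sin(0.3), np.cos(0.3)]])
dst = src @ R.T + np.array([10.0, -5.0])
src = np.vstack([src, [[0, 0], [100, 100]]])
dst = np.vstack([dst, [[500, 500], [-300, 200]]])
mask = fsc(src, dst)
print(mask.sum())  # -> 20
```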
In the SAR image registration method of the embodiment, in the step 1 b), an Affine transformation matrix at key points is calculated by using a multi-scale Harris-Affine algorithm, and the operation steps are as follows:
6a) For a key point (x_i, y_i) on the reference image I_1, set the number of iterations K = 15 and initialize the shape-adaptive matrix U^(1) to the identity matrix E.
6b) In the k-th iteration, transform the reference image I_1 and the key point (x_i, y_i) with the shape-adaptive matrix U^(k) to obtain the transformed image and key point coordinates:
I_1^(k) = T(I_1, U^(k)),  (x_i^(k), y_i^(k)) = T((x_i, y_i), U^(k))
where T denotes the mapping operation that transforms an image or coordinates with the shape-adaptive matrix U^(k);
6c) Centered on (x_i^(k), y_i^(k)) in the transformed image, take a square region W of side length 4α, where α denotes the scale of the image layer in which the key point lies.
6d) Calculate the horizontal gradient G_x,α and vertical gradient G_y,α of the square region W at scale factor α with the ROEWA algorithm, thereby obtaining the gradient magnitude Mag_α of W.
From the obtained gradients, calculate the SAR-Harris matrix C_SH(x, y, α) at each pixel of W, where the parameter α is the parameter of the exponential weighting function and the Gaussian window has standard deviation √2·α.
From C_SH(x, y, α), calculate the SAR-Harris response value at each pixel of W:
R_SH(x, y, α) = det(C_SH(x, y, α)) - d·tr(C_SH(x, y, α))²
where d is an empirical parameter, generally taken between 0.04 and 0.06.
Take the point with the maximum SAR-Harris response value in the square region W as the new key point coordinates (x_i^(k)_new, y_i^(k)_new).
6e) Update the key point's coordinates in the original reference image I_1 by mapping the new coordinates back through the inverse of the shape-adaptive matrix:
(x_i, y_i) = T((x_i^(k)_new, y_i^(k)_new), (U^(k))^{-1})
6f) Centered on (x_i^(k)_new, y_i^(k)_new) in the transformed image, re-select a square region W_new of side length 4α, and calculate the SAR-Harris matrix μ^(k) of the central pixel in the new region.
6g) Update the shape-adaptive matrix:
U^(k+1) = (μ^(k))^{-1} · U^(k)
and normalize the shape-adaptive matrix so that its maximum eigenvalue equals 1.
6h) Calculate the convergence ratio at the k-th iteration:
ratio = 1 - λ_min(μ^(k))/λ_max(μ^(k))
where λ_min(μ^(k)) and λ_max(μ^(k)) denote the minimum and maximum eigenvalues of the matrix μ^(k), respectively.
6i) When ratio < 0.1, exit the loop; the result is the affine transformation matrix Matrix_i = U^(k) at the key point (x_i, y_i). Otherwise, increase the iteration count by 1, continue the loop, and recalculate the convergence ratio at the key point (x_i, y_i) with the updated shape-adaptive matrix. If ratio < 0.1 still cannot be satisfied after K iterations, discard the key point;
6j) Perform the same operations on every key point of the reference image I_1 and the image to be registered I_2.
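The stopping test of steps 6h)-6i) can be illustrated as follows, assuming the standard isotropy measure ratio = 1 - λ_min/λ_max of the second-moment matrix (the function name is illustrative):

```python
import numpy as np

def convergence_ratio(mu):
    """Isotropy measure used to stop the shape-adaptation loop: how far
    the second-moment matrix mu is from a scaled identity (0 = round)."""
    eigvals = np.linalg.eigvalsh(mu)      # ascending for a symmetric matrix
    lam_min, lam_max = eigvals[0], eigvals[-1]
    return 1.0 - lam_min / lam_max

print(convergence_ratio(np.eye(2)))            # -> 0.0 (isotropic, converged)
print(convergence_ratio(np.diag([1.0, 4.0])))  # -> 0.75 (still anisotropic)
```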
In the SAR image registration method of this embodiment, the images are transformed with the affine transformation matrix at each key point in steps 1b), 1c) and (3). For a key point (x_i, y_i) on the reference image I_1, transform the gradient magnitude map Mag_α and gradient direction map Ori_α of the image and the key point coordinates with the affine transformation matrix Matrix_i:
Mag_α^T = T(Mag_α, Matrix_i)
Ori_α^T = T(Ori_α, Matrix_i)
(x_i^T, y_i^T) = T((x_i, y_i), Matrix_i).
in the SAR image registration method of the present embodiment, in step 1 c), key points in an image are extracted by using a multi-scale MSER algorithm, and an affine transformation matrix at each key point is calculated, and the operation steps are as follows:
8a) Select a group of scale-space factors [α_0, α_1, ..., α_{n-1}], i.e. the scales of the Gaussian kernels, where the initial value α_0 = 2, α_i = α_0·k^i (i ∈ [1, n-1]), k = 2^{1/3} and n = 4; select window length w = 4α, and build the scale space with Gaussian blur kernels;
8b) Sort each layer's pixels by gray value, converting to a grayscale image first if the image is in color. Allocate a node to each pixel in advance, with the node index being the pixel's gray value. Following the sorted order, place the pixels into a component tree one by one; when placing a pixel, check its four-neighborhood positions, and if neighboring nodes exist, find their respective root nodes and merge the two node regions. After all pixels have been placed into the component tree, all extremal regions of the image are obtained. An extremal region is defined as follows: if the gray values of all pixels in a region are larger than those of its boundary pixels, it is a maximal extremal region; if they are all smaller than those of the boundary pixels, it is a minimal extremal region;
8c) Obtain the MSER regions with the maximal-stability criterion: if Q_1, ..., Q_{i-1}, Q_i, ... is a sequence of mutually nested extremal regions, i.e. Q_i ⊂ Q_{i+1}, then the extremal region Q_i is a maximally stable extremal region if and only if the region change rate q(i) = |Q_{i+Δ} - Q_{i-Δ}|/|Q_i| attains a local minimum at i, where |·| denotes region area, the subscript i ∈ [0, 255] denotes the gray level, and Δ denotes a small gray-level change;
8d) The irregular region of extremely stable values is approximately fit to an elliptical region.
First, take the center of gravity of the maximally stable region as the center of the ellipse. Calculate the geometric zeroth-order and first-order moments of the region:
m_00 = Σ I_e(x, y)
m_01 = Σ y·I_e(x, y)
m_10 = Σ x·I_e(x, y)
where m_00, m_01 and m_10 are the geometric zeroth-order and first-order moments of the maximally stable region, and I_e(x, y) denotes the maximally stable region. The center coordinates of the ellipse, i.e. the coordinates (x_c, y_c) of the key point detected by the MSER algorithm, are then:
x_c = m_10/m_00,  y_c = m_01/m_00
Calculate the geometric second-order central moments of the maximally stable region:
μ_20 = Σ (x - x_c)²·I_e(x, y)
μ_02 = Σ (y - y_c)²·I_e(x, y)
μ_11 = Σ (x - x_c)·(y - y_c)·I_e(x, y)
Calculate the two eigenvalues of the second-order moment matrix [μ_20, μ_11; μ_11, μ_02]:
λ_{1,2} = ((μ_20 + μ_02) ± √((μ_20 - μ_02)² + 4·μ_11²)) / 2
Then calculate the semi-major axis w, the semi-minor axis l and the major-axis direction φ of the ellipse, with w and l proportional to √(λ_max/m_00) and √(λ_min/m_00) respectively and φ = (1/2)·arctan(2·μ_11/(μ_20 - μ_02)).
8e) Using the ellipse semi-major axis w, semi-minor axis l and major-axis direction φ, calculate the affine transformation matrix at the key point (x_c, y_c):
8f) Invert the gray scale of the original image and repeat the above operations;
8g) Do the same for each layer of images in the scale space.
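Step 8d) can be illustrated with a moment-based ellipse fit on a synthetic binary region (a sketch only: the names and the normalization "semi-axis = 2·√(second central moment)" follow the common convention and are assumptions, not the patent's exact formulas):

```python
import numpy as np

def fit_ellipse(mask):
    """Fit an ellipse to a binary region from its geometric moments:
    center from the zeroth/first moments, axes and orientation from the
    eigen-decomposition of the second central moment matrix."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)                               # zeroth-order moment
    xc, yc = xs.mean(), ys.mean()               # m10/m00, m01/m00
    mu20 = ((xs - xc) ** 2).sum() / m00
    mu02 = ((ys - yc) ** 2).sum() / m00
    mu11 = ((xs - xc) * (ys - yc)).sum() / m00
    cov = np.array([[mu20, mu11], [mu11, mu02]])
    eigvals, eigvecs = np.linalg.eigh(cov)      # ascending eigenvalues
    l, w = 2 * np.sqrt(eigvals)                 # semi-minor, semi-major
    phi = np.arctan2(eigvecs[1, -1], eigvecs[0, -1])  # major-axis direction
    return (xc, yc), w, l, phi

# Axis-aligned synthetic ellipse: semi-axes 30 (x) and 10 (y), center (50, 40)
yy, xx = np.mgrid[0:80, 0:100]
mask = ((xx - 50) / 30.0) ** 2 + ((yy - 40) / 10.0) ** 2 <= 1.0
center, w, l, phi = fit_ellipse(mask)
print(center, round(w, 1), round(l, 1))
```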
In the SAR image registration method of the present embodiment, the fine matching is performed on the point pair set by using the LSS-FLOW algorithm in steps (2) and (3), so as to obtain a more accurate point pair set, and the operation steps are as follows:
9a) Using the reference point pair set, calculate the transformation model θ from the image to be registered to the reference image; for both step (2) and step (3), the reference point pair set is the union of S_S,c, S_H,c and S_M,c. The reference image is I_1 and the image to be registered is I_2; warp the image to be registered:
I_2^T = T(I_2, θ)
where I_2^T denotes the image obtained by transforming the image to be registered with the transformation model θ.
9b) For each point (x_i, y_i) of the point pair set to be refined S_coarse located in the image to be registered I_2, calculate its corresponding position in the transformed image I_2^T, where for step (2) the set to be refined is the union of S_S,c, S_H,c and S_M,c, and for step (3) it is S_A,c:
(x_i^T, y_i^T) = T((x_i, y_i), θ).
9c) Calculate the horizontal gradient G_x and vertical gradient G_y of the images I_1 and I_2^T at scale factor α = 2 with the ROEWA algorithm, thereby obtaining the gradient magnitude Mag and gradient direction Ori of each image.
For each pixel of I_1 and I_2^T, take a circular neighborhood of radius r = 24 centered on the pixel and divide it into 17 sub-regions as shown in FIG. 2, with concentric-circle radii from inside to outside of 0.25·r, 0.75·r and r. In each sub-region, divide 0-360° into 8 parts of 45° each, with the abscissa the gradient direction angle and the ordinate the gradient magnitude. Traverse all pixels in the sub-region: for each pixel, find the histogram bin corresponding to its gradient direction angle, then accumulate its gradient magnitude onto that bin, obtaining the gradient direction histogram of the sub-region; convert it into a one-dimensional histogram vector, and concatenate and normalize the histogram vectors of the 17 sub-regions to generate a 136-dimensional SAR-SIFT descriptor;
Compute the SAR-SIFT descriptor of every pixel of I_1 and I_2^T and stack them to form the SAR-SIFT descriptor maps I_1_desc and I_2^T_desc of the two images.
9d) For a matching point (x_i, y_i) of the point pair set to be refined S_coarse located on the reference image I_1, take from the SAR-SIFT descriptor map I_1_desc a square region of side length 2·r_lf + 1 centered on it, giving the descriptor sub-image I_1_squ; do the same for its corresponding point (x_i^T, y_i^T) to obtain its square region descriptor sub-image, where r_lf is taken as 61;
9e) The loss function E(w) is:
E(w) = Σ_p min(||I_1_desc(p) - I_2^T_desc(p + w(p))||_1, t)  (1)
  + Σ_p η·(|u(p)| + |v(p)|)  (2)
  + Σ_p Σ_{q∈ε(p)} [min(α·|u(p) - u(q)|, d) + min(α·|v(p) - v(q)|, d)]  (3)
where p = (x, y) denotes a pixel of the image and w(p) = (u(p), v(p)) the optical flow vector at p, with u(p) the horizontal offset and v(p) the vertical offset; ε denotes the region formed by p and its 8 neighboring points, and q denotes a point in this region. The parameters η and α are taken as 0.001 and 0.01 respectively, and the parameters t and d are both 1.
Term (1) is the data term, which constrains the SAR-SIFT descriptors to match each other along the optical flow vector w(p); term (2) is the small-displacement term, which constrains the optical flow vectors to be as small as possible when no other information is available; term (3) is the smoothness term, which constrains the optical flow vectors of adjacent pixels to be similar.
9f) Calculate the new coordinates of the point (x_i^T, y_i^T) on the image I_2^T:
x_i^T_new = x_i^T + u(r_lf + 1, r_lf + 1)
y_i^T_new = y_i^T + v(r_lf + 1, r_lf + 1)
9g) Using the transformation model θ of the image, calculate the coordinate position of the point (x_i^T_new, y_i^T_new) on the original image to be registered I_2:
(x_i_new, y_i_new) = T((x_i^T_new, y_i^T_new), θ^{-1})
9h) After traversing every matching point pair of the set to be refined S_coarse, a more accurate point pair set S_precise is obtained; for step (2), the more accurate point pair sets are S_S,p, S_H,p and S_M,p, and for step (3), it is S_A,p.
In the SAR image registration method of this embodiment, in step (3), Voronoi polygons are established with the point pair sets S_H,p and S_M,p as objects to obtain the local affine transformation matrix of each SAR-SIFT key point. The operation steps are as follows:
10a) Taking all the affine-invariant discrete points of the point pair sets S_H,p and S_M,p on the reference image I_1 as objects, automatically construct a Delaunay triangulation; for these discrete points and the numbered triangles formed, record which three discrete points each triangle consists of.
10b) The numbers of all triangles adjacent to each discrete point are found and recorded.
10c) Triangles adjacent to each discrete point are ordered and recorded in a clockwise direction for the next step of connection to generate Voronoi polygons.
10d) Calculate and record the center of the circumscribed circle of each triangle.
10e) For each discrete point, connect the circumscribed-circle centers of its adjacent triangles to obtain the Voronoi polygons.
10f) Give the SAR-SIFT key points located inside each small Voronoi polygon the same affine transformation matrix as the affine-invariant discrete point of that polygon, thereby obtaining the local affine transformation matrix of each SAR-SIFT key point.
10g) Perform the same operations on the image to be registered I_2 with all the affine-invariant discrete points of the point pair sets S_H,p and S_M,p, obtaining the local affine transformation matrix of each SAR-SIFT key point there.
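Because a point lies in the Voronoi cell of whichever seed point is nearest to it, the assignment of steps 10f)-10g) reduces to a nearest-seed query. A minimal sketch (all names and data are illustrative):

```python
import numpy as np

def assign_affine_by_voronoi(keypoints, seeds, seed_matrices):
    """Each SAR-SIFT keypoint inherits the affine matrix of the
    affine-invariant seed whose Voronoi cell contains it, i.e. of its
    nearest seed point."""
    keypoints = np.asarray(keypoints, dtype=float)   # (n, 2)
    seeds = np.asarray(seeds, dtype=float)           # (m, 2)
    # squared distances from every keypoint to every seed
    d2 = ((keypoints[:, None, :] - seeds[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)                      # (n,)
    return [seed_matrices[j] for j in nearest]

seeds = [(0.0, 0.0), (10.0, 0.0)]
mats = [np.eye(2), 2 * np.eye(2)]        # one local affine matrix per seed
kps = [(1.0, 1.0), (9.0, -1.0)]
assigned = assign_affine_by_voronoi(kps, seeds, mats)
print(assigned[0][0, 0], assigned[1][0, 0])  # -> 1.0 2.0
```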
In the SAR image registration method of this embodiment, mismatched point pairs are preliminarily removed from the matching point pairs on the whole image with the LSOR algorithm in steps (3) and (4). The operation steps are as follows:
11a) Using the reference point pair set S_ref, calculate a transformation model θ' from the image to be registered to the reference image. For step (3), the reference point pair set is the union of S_S,p, S_H,p and S_M,p; for step (4), it is the union of S_S,p, S_H,p, S_M,p and S_A,p;
The reference image is I_1 and the image to be registered is I_2. Warp I_2 with the transformation model θ' to obtain the image I_2^T, where I_2^T denotes the image obtained by transforming the image to be registered with the transformation model θ';
11b) For a point (x_s, y_s) of the reference point pair set located in the image to be registered I_2, calculate its corresponding position coordinates (x_s^T, y_s^T) in the transformed image I_2^T using the transformation model θ':
(x_s^T, y_s^T) = T((x_s, y_s), θ')
Apply the same transformation to all point pairs in the reference point pair set to obtain a new point pair set S_ref^T.
11c) Calculate the length and the slope of the connecting line of each matching point pair in S_ref^T, then sum and average them to obtain the average length dist_ave and the average slope slope_ave;
11d) For a point (x_i, y_i) of the point pair set to be screened S_fil located in the image to be registered I_2, calculate its corresponding position coordinates (x_i^T, y_i^T) in the transformed image I_2^T using the transformation model θ'. For step (3), the point pair set to be screened is the initial matching point pair set on the whole image found with the nearest-neighbor principle in the previous step; for step (4), it is the union of S_S,p, S_H,p, S_M,p and S_A,p:
(x_i^T, y_i^T) = T((x_i, y_i), θ')
11e) Calculate the length dist_i and the slope slope_i of the line connecting (x_i, y_i) and (x_i^T, y_i^T);
11f) Screen all point pairs in the set to be screened S_fil with the thresholds, keeping the matching point pairs that satisfy the conditions:
|dist_i - dist_ave| < Th_d
|slope_i - slope_ave| < Th_s
where Th_d is taken as 0.1 and Th_s as 5°.
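The screening of steps 11a)-11f) can be sketched as follows (a simplified illustration: here the connecting line is taken between each reference-image point and its warped partner, θ' is represented as a 3x2 affine matrix acting on [x, y, 1], and all names are illustrative):

```python
import numpy as np

def lsor_filter(ref_pairs, cand_pairs, theta, th_d=0.1, th_s=np.deg2rad(5)):
    """LSOR-style screening: warp each to-be-registered point through the
    model theta, measure the length and slope (angle) of the line joining
    it to its reference-image partner, and keep candidates close to the
    reference pairs' average length and slope."""
    def line_stats(pairs):
        p_ref = np.array([p[0] for p in pairs], dtype=float)
        p_mov = np.array([p[1] for p in pairs], dtype=float)
        p_t = np.hstack([p_mov, np.ones((len(p_mov), 1))]) @ theta
        d = p_t - p_ref
        return np.linalg.norm(d, axis=1), np.arctan2(d[:, 1], d[:, 0])

    dist_ref, slope_ref = line_stats(ref_pairs)
    dist_ave, slope_ave = dist_ref.mean(), slope_ref.mean()
    dist_c, slope_c = line_stats(cand_pairs)
    keep = (np.abs(dist_c - dist_ave) < th_d) & (np.abs(slope_c - slope_ave) < th_s)
    return [p for p, k in zip(cand_pairs, keep) if k]

theta = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # identity warp
ref_pairs = [((0, 0), (1, 0)), ((10, 10), (11, 10))]     # (reference, moving)
cand_pairs = [((2, 2), (3, 2)), ((4, 4), (9, 9))]        # second one is an outlier
kept = lsor_filter(ref_pairs, cand_pairs, theta)
print(kept)  # -> [((2, 2), (3, 2))]
```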
In the SAR image registration method of this embodiment, in steps 2b), 6d) and 9c), the horizontal gradient G_x,α and vertical gradient G_y,α of the image at different scale factors are calculated with the ROEWA algorithm. The operation steps are as follows:
12a) Calculate the horizontal gradient G_x,α: for any pixel (i, j), first calculate the exponentially weighted averages M_1 and M_2 of the gray values within the (4α+1) x 2α windows on its right and left sides respectively, then take the logarithm of their quotient to obtain the horizontal gradient, where α is the exponential weighting factor:
G_x,α = log(M_1/M_2)
where I(·) denotes the gray value of a pixel in the SAR image, from which the weighted averages are computed.
12b) Calculating the vertical gradient G y,α : for any pixel (i, j), firstly calculating the index weighted average value of the gray values of the pixel within the range of 2α× (4α+1) on the upper side and the lower side of the pixelAnd->Then will->And->Taking the logarithm by quotient to obtain the vertical gradient G y,α Where α is an exponential weighting factor, vertical gradient G y,α The calculation formula of (2) is as follows:
wherein I (·) represents the gray value of the pixel in the SAR image.
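A minimal sketch of the ROEWA gradient computation described above, assuming wrap-around borders and a separable exponential kernel for brevity (all function and variable names are ours):

```python
import numpy as np

def roewa_gradients(img, alpha=2.0, eps=1e-8):
    """Ratio-of-exponentially-weighted-averages gradients (steps 12a)-12b)).
    Returns G_x, G_y plus the derived gradient magnitude and direction.
    Borders wrap around (np.roll) to keep the sketch short."""
    n = int(2 * alpha)
    t = np.arange(1, n + 1)
    w = np.exp(-t / alpha)                  # one-sided exponential weights
    s = np.arange(-n, n + 1)
    w_sym = np.exp(-np.abs(s) / alpha)      # symmetric weights along the window

    def smooth(a, axis):                    # weighted average along one axis
        return sum(wk * np.roll(a, -k, axis=axis) for wk, k in zip(w_sym, s))

    # one-sided exponentially weighted means right/left and below/above ...
    right = sum(wk * np.roll(img, -k, axis=1) for wk, k in zip(w, t))
    left  = sum(wk * np.roll(img,  k, axis=1) for wk, k in zip(w, t))
    down  = sum(wk * np.roll(img, -k, axis=0) for wk, k in zip(w, t))
    up    = sum(wk * np.roll(img,  k, axis=0) for wk, k in zip(w, t))
    # ... smoothed along the orthogonal direction to span the (4a+1) x 2a window
    m_r, m_l = smooth(right, 0), smooth(left, 0)
    m_d, m_u = smooth(down, 1), smooth(up, 1)

    gx = np.log((m_r + eps) / (m_l + eps))  # horizontal gradient G_{x,alpha}
    gy = np.log((m_d + eps) / (m_u + eps))  # vertical gradient G_{y,alpha}
    return gx, gy, np.hypot(gx, gy), np.arctan2(gy, gx)
```

Because the gradient is a log-ratio of local means rather than a difference, it stays bounded under the multiplicative speckle noise of SAR images, which is the motivation for using ROEWA instead of differential gradients.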
To make up for the deficiencies of existing SAR image registration methods, the invention provides a three-stage registration framework for registering SAR images under large view-angle differences. In the first stage, matching point pairs between the reference image and the image to be registered are initially searched with the multi-scale MSER algorithm, the multi-scale Harris-Affine algorithm and the SAR-SIFT algorithm, and all the obtained matching point pairs are refined with a fine-matching algorithm. In the second stage, the affine-invariant A-SAR-SIFT algorithm performs local matching within constructed Voronoi polygons to obtain a large number of matching point pairs; correct matches are then retained as fully as possible with a mismatch-removal algorithm and the FSC algorithm, and finally refined with the fine-matching algorithm. In the third stage, all matching point pairs obtained in the first two stages are screened and the global transformation model is computed. Because the method accounts for the speckle noise inherent in SAR images and for the severe geometric and radiometric distortions between images, it markedly improves SAR image registration performance under large view-angle differences.
Two published data sets are selected (from the paper "KAZE-SAR: SAR Image Registration Using KAZE Detector and Modified SURF Descriptor for Tackling Speckle Noise", Mohammadreza Pourfard et al., 2021). To further verify the performance of the registration method on large view-angle difference SAR images, the images to be registered in both data sets are deformed so as to simulate a registration environment with large view-angle differences.
Figs. 4-7 are the two sets of registration images used in the experiments of the present invention. Fig. 4 shows the input reference image of the first set of experiments and Fig. 5 the corresponding image to be registered; Fig. 6 shows the input reference image of the second set of experiments and Fig. 7 the corresponding image to be registered.
Figs. 8-11 are the registration results of the present invention on the two sets of images. Fig. 8 shows the checkerboard mosaic after registration for the first set of experiments and Fig. 9 the correct matching point pairs found in its two images; Fig. 10 shows the checkerboard mosaic after registration for the second set of experiments and Fig. 11 the correct matching point pairs found in its two images.
Table 1 below shows the registration results of the method of the present invention on the two data sets and compares them with the existing Affine-SIFT registration algorithm (ASIFT for short, from the paper "ASIFT: A New Framework for Fully Affine Invariant Image Comparison", SIAM Journal on Imaging Sciences, J.M. Morel et al., 2009). In Table 1, Precision denotes accuracy, i.e. the percentage of correct point pairs among the final matches, and RMSE denotes the root mean square error. The reference image and the image to be registered input to Test1 are shown in Figs. 4 and 5 respectively, and those input to Test2 in Figs. 6 and 7. The results show that the proposed method achieves better registration performance.
Table 1 Comparison of registration performance between the method of the invention and the existing method
The foregoing description covers only preferred embodiments of the present invention and is not intended to limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the invention shall fall within its scope of protection.

Claims (10)

1. The registration method for large view-angle difference SAR images is characterized by comprising the following steps:
(1) Using the SAR-SIFT algorithm, the multi-scale Harris-Affine algorithm and the multi-scale MSER algorithm, extract key points between the reference image and the image to be registered, establish descriptors and match them, finally obtaining three matching point pair sets of different natures: a point pair set S_{S,c} that is insensitive to view-angle changes, a corner point pair set S_{H,c} with affine invariance, and a region point pair set S_{M,c} with affine invariance;
(2) Refine the point pair sets S_{S,c}, S_{H,c} and S_{M,c} with the LSS-Flow algorithm to obtain the more accurate point pair sets S_{S,p}, S_{H,p} and S_{M,p};
(3) Extract key points on the reference image and the image to be registered with the A-SAR-SIFT algorithm and establish descriptors; find initial matching point pairs within each corresponding small polygon of the two images by the nearest-neighbour principle; preliminarily remove mismatched pairs from the matching point pair set over the whole image with the LSOR algorithm, then perform a second screening with the FSC algorithm to obtain the matching point pair set S_{A,c}; refine it with the LSS-Flow algorithm to finally obtain the matching point pair set S_{A,p};
(4) After gathering the point pair sets S_{S,p}, S_{H,p}, S_{M,p} and S_{A,p}, remove possible mismatched pairs with the LSOR algorithm to finally obtain the matching point pair set S_final, and calculate the global transformation model from S_final.
2. The registration method for large view-angle difference SAR images according to claim 1, characterized in that the point pair set S_{S,c} insensitive to view-angle changes, the corner point pair set S_{H,c} with affine invariance and the region point pair set S_{M,c} with affine invariance are obtained as follows:
1a) Point pair set S_{S,c} insensitive to view-angle changes:
Extract key points in the images with the SAR-SIFT algorithm and establish SAR-SIFT descriptors; find the initial matching point pair set between the reference image and the image to be registered with the NNDR algorithm, remove mismatched pairs with the FSC algorithm, and finally obtain the matching point pair set S_{S,c};
1b) Corner point pair set S_{H,c} with affine invariance:
Extract key points in the images with the multi-scale Harris-Affine algorithm and obtain an affine transformation matrix at each key point, the key-point extraction step being the same as in the SAR-SIFT algorithm; then, at each key point, transform the image with the local affine transformation matrix and build the SAR-SIFT descriptor; find the initial matching point pair set between the reference image and the image to be registered with the NNDR algorithm, remove mismatched pairs with the FSC algorithm, and finally obtain the matching point pair set S_{H,c};
1c) Region point pair set S_{M,c} with affine invariance:
Extract key points in the images with the multi-scale MSER algorithm and obtain an affine transformation matrix at each key point; at each key point, transform the image with the local affine transformation matrix and establish the SAR-SIFT descriptor; find the initial matching point pair set between the reference image and the image to be registered with the NNDR algorithm, remove mismatched pairs with the FSC algorithm, and finally obtain the matching point pair set S_{M,c}.
3. The registration method for large view-angle difference SAR images according to claim 2, characterized in that in step (3) the key points on the reference image and the image to be registered are extracted and descriptors established with the A-SAR-SIFT algorithm as follows: first extract key points on both images with the SAR-SIFT algorithm; then, taking the sets S_{H,p} and S_{M,p} as objects, build Voronoi polygons to obtain a local affine transformation matrix for every SAR-SIFT key point, and establish the SAR-SIFT descriptors after transforming the images with the local affine transformation matrices.
4. The registration method for large view-angle difference SAR images according to claim 3, characterized in that the key points in the images are extracted by the SAR-SIFT algorithm in steps 1a), 1b) and (3) as follows:
2a) Select a set of exponentially weighted scale factors [α_0, α_1, ..., α_{n-1}], where the initial value α_0 = 2, α_i = α_0·k^i (i ∈ [1, n−1]), k = 2^{1/3}, n = 8;
2b) For each scale α in the sequence, calculate the horizontal gradient G_{x,α} and the vertical gradient G_{y,α} of the image with the ROEWA algorithm, thereby obtaining the gradient magnitude Mag_α and the gradient direction Ori_α of the image at each scale;
2c) From the obtained gradients, calculate the SAR-Harris matrix C_SH(x, y, α) at each pixel in the image:

C_SH(x, y, α) = g_{√2·α} ∗ [ (G_{x,α})²,  G_{x,α}·G_{y,α} ;  G_{x,α}·G_{y,α},  (G_{y,α})² ]

where the parameter α is the scale parameter of the exponential weighting function and g_{√2·α} is a Gaussian function with standard deviation √2·α;
2d) From C_SH(x, y, α), calculate the SAR-Harris response value R_SH(x, y, α) at each pixel in the image:

R_SH(x, y, α) = det(C_SH(x, y, α)) − d·tr(C_SH(x, y, α))²

where d is an empirical parameter, generally between 0.04 and 0.06;
2e) Within the same layer of the scale-space image, compare the SAR-Harris response value of each point with the response values of its 8-neighbourhood and with a global threshold d_SH; if the response value of the centre point of the neighbourhood is the largest and exceeds the global threshold d_SH, the point is a detected key point, where the global threshold d_SH is typically taken as 0.85.
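Steps 2c)-2e) can be sketched as below, assuming the ROEWA gradients are already available and a Gaussian of standard deviation √2·α for the smoothing window, as in the standard SAR-Harris formulation; `sar_harris_keypoints` is an illustrative name of ours:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def sar_harris_keypoints(gx, gy, alpha=2.0, d=0.04, d_sh=0.85):
    """SAR-Harris response and 8-neighbourhood maxima (steps 2c)-2e)).
    gx, gy are the ROEWA gradients of one scale-space layer at scale alpha."""
    sigma = np.sqrt(2.0) * alpha            # std of the Gaussian smoothing
    a = gaussian_filter(gx * gx, sigma)     # smoothed gradient outer products
    b = gaussian_filter(gy * gy, sigma)
    c = gaussian_filter(gx * gy, sigma)
    # R_SH = det(C_SH) - d * tr(C_SH)^2, evaluated per pixel
    r = (a * b - c * c) - d * (a + b) ** 2
    # key points: 8-neighbourhood local maxima above the global threshold
    # (plateaus all compare equal to the filter output; fine for a sketch)
    local_max = (r == maximum_filter(r, size=3))
    return np.argwhere(local_max & (r > d_sh))
```

The returned array holds (row, col) coordinates; in the full method this is repeated for every layer of the exponentially weighted scale space.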
5. The registration method for large view-angle difference SAR images according to claim 3, characterized in that SAR-SIFT descriptors are established at the found key points in steps 1a), 1b), 1c) and (3) as follows:
3a) Using the gradient magnitude Mag_α and the gradient direction Ori_α, take a circular neighbourhood of radius 6α centred at the key point; divide 0-360° into 36 bins of 10° each, with the abscissa of the histogram representing the gradient direction angle and the ordinate the gradient magnitude; traverse all pixels in the neighbourhood, and for each pixel find the histogram bin corresponding to its gradient direction angle and accumulate its gradient magnitude into that bin, thereby obtaining the main-direction histogram of the key point; then smooth the histogram;
3b) Take the direction angle corresponding to the peak of the smoothed gradient histogram as the main direction of the key point, and the direction angles of bins whose energy exceeds 80% of the peak as auxiliary directions;
3c) Take a circular neighbourhood centred at the key point with radius r_max = 12α and rotate it with the main-direction angle as reference so that the feature descriptor is rotation invariant; then build gradient direction histograms: divide the circular neighbourhood into 17 sub-regions with concentric-circle radii, from inside to outside, of 0.25·r_max, 0.75·r_max and r_max, and divide 0-360° equally into 8 parts; calculate the gradient direction histogram of each sub-region, and concatenate and normalize the histogram vectors of the 17 sub-regions to generate the 136-dimensional SAR-SIFT descriptor.
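The 17-sub-region log-polar descriptor of step 3c) can be sketched as follows (the rotation to the main direction is omitted for brevity; all names are ours):

```python
import numpy as np

def log_polar_descriptor(mag, ori, cx, cy, r_max, n_ori=8):
    """136-D descriptor: 1 inner disc + 2 rings x 8 sectors = 17 sub-regions,
    8 orientation bins each, gradient magnitudes accumulated per bin."""
    h, w = mag.shape
    y, x = np.mgrid[0:h, 0:w]
    dx, dy = x - cx, y - cy
    r = np.hypot(dx, dy)
    theta = np.mod(np.arctan2(dy, dx), 2 * np.pi)
    inside = r <= r_max
    ring = np.digitize(r, [0.25 * r_max, 0.75 * r_max])    # 0, 1 or 2
    sector = (theta / (2 * np.pi) * 8).astype(int) % 8
    # sub-region index: 0 = inner disc, 1..16 = (ring, sector)
    region = np.where(ring == 0, 0, 1 + (ring - 1) * 8 + sector)
    obin = (np.mod(ori, 2 * np.pi) / (2 * np.pi) * n_ori).astype(int) % n_ori
    desc = np.zeros((17, n_ori))
    np.add.at(desc, (region[inside], obin[inside]), mag[inside])
    v = desc.ravel()                       # concatenate the 17 histograms
    n = np.linalg.norm(v)
    return v / n if n > 0 else v           # normalize to unit length
```

17 sub-regions × 8 orientation bins gives the 136 dimensions stated in the claim.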
6. The registration method for large view-angle difference SAR images according to claim 2, characterized in that the affine transformation matrices at the key points are calculated with the multi-scale Harris-Affine algorithm in step 1b) as follows:
6a) For a key point (x_i, y_i) on the reference image I_1, set the maximum number of iterations K = 15 and initialize the shape-adaptive matrix U^{(1)} to the identity matrix E;
6b) In the k-th iteration, transform the reference image I_1 and the key point (x_i, y_i) on it with the shape-adaptive matrix U^{(k)} to obtain the transformed image I_1^{(k)} and the transformed key point coordinates (x_i^{(k)}, y_i^{(k)}):

I_1^{(k)} = T(I_1, U^{(k)}),   (x_i^{(k)}, y_i^{(k)}) = T((x_i, y_i), U^{(k)})

where T is the mapping operation that transforms the image or coordinates with the shape-adaptive matrix U^{(k)};
6c) Centred at (x_i^{(k)}, y_i^{(k)}), take a square region W of side 4α in the image I_1^{(k)}, where α denotes the scale of the image layer in which the key point lies;
6d) Calculate the horizontal gradient G_{x,α} and the vertical gradient G_{y,α} of the square region W at scale factor α with the ROEWA algorithm, thereby obtaining the gradient magnitude Mag_α of W;
From the obtained gradients, calculate the SAR-Harris matrix C_SH(x, y, α) at each pixel in W, where the parameter α is the scale parameter of the exponential weighting function and the Gaussian function has standard deviation √2·α;
From C_SH(x, y, α), calculate the SAR-Harris response value R_SH(x, y, α) at each pixel in W:

R_SH(x, y, α) = det(C_SH(x, y, α)) − d·tr(C_SH(x, y, α))²

where d is an empirical parameter, generally between 0.04 and 0.06;
Take the point with the largest SAR-Harris response value in the square region W as the new key point coordinates (x_i^{new}, y_i^{new});
6e) Update the key point coordinates in the original reference image I_1:

(x_i, y_i) = T^{-1}((x_i^{new}, y_i^{new}), U^{(k)})

6f) Centred at (x_i^{new}, y_i^{new}), re-select a square region W_new of side 4α in the image I_1^{(k)}, and calculate the SAR-Harris matrix μ^{(k)} of the centre pixel in the new region;
6g) Update the shape-adaptive matrix:

U^{(k+1)} = (μ^{(k)})^{-1}·U^{(k)}

and normalize the shape-adaptive matrix so that its maximum eigenvalue equals 1;
6h) Calculate the convergence rate at the k-th iteration:

ratio = 1 − λ_min(μ^{(k)}) / λ_max(μ^{(k)})

where λ_min(μ^{(k)}) and λ_max(μ^{(k)}) denote the minimum and maximum eigenvalues of the matrix μ^{(k)};
6i) When ratio < 0.1, exit the loop; the affine transformation matrix at the key point (x_i, y_i) is M_i = U^{(k)}. Otherwise, increase the iteration count by 1 and continue the loop, recalculating the convergence rate ratio at the key point (x_i, y_i) with the updated shape-adaptive matrix. If the condition ratio < 0.1 is still not satisfied after K iterations, discard the key point;
6j) Perform the same operations for every key point on the reference image I_1 and on the image to be registered I_2.
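The iteration 6a)-6i) reduces to the skeleton below, where `second_moment(U, p)` is a hypothetical hook that returns the 2×2 SAR-Harris matrix measured around the key point p after warping the image by U (the warp-and-measure steps 6b)-6f) are hidden behind it):

```python
import numpy as np

def shape_adapt(second_moment, p, k_max=15, tol=0.1):
    """Iterative shape adaptation (sketch of steps 6a)-6i)).
    Returns the affine matrix U on convergence, or None to discard the point."""
    U = np.eye(2)                               # 6a) start from the identity
    for _ in range(k_max):
        mu = second_moment(U, p)                # 6f) matrix at the warped point
        U = np.linalg.inv(mu) @ U               # 6g) update, as in the text
        U /= np.abs(np.linalg.eigvals(U)).max() # normalize: max eigenvalue 1
        lam = np.linalg.eigvalsh(mu)            # mu is symmetric
        ratio = 1.0 - lam.min() / lam.max()     # 6h) convergence rate
        if ratio < tol:                         # 6i) converged
            return U
    return None                                 # never converged: discard
```

When the measured second-moment matrix becomes isotropic (equal eigenvalues), the local structure has been normalized to a circular one and the loop stops; a persistently anisotropic measurement never converges and the key point is dropped.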
7. The registration method for large view-angle difference SAR images according to claim 2, characterized in that in step 1c) the key points in the images are extracted with the multi-scale MSER algorithm and the affine transformation matrix at each key point is calculated as follows:
8a) Select a set of scale-space factors [α_0, α_1, ..., α_{n-1}], where the initial value α_0 = 2, α_i = α_0·k^i (i ∈ [1, n−1]), k = 2^{1/3}, n = 4; establish the scale space with Gaussian blur kernels of window length w = 4α_i;
8b) Sort the pixels of each layer image by gray value, converting colour images to gray images first. Assign each pixel a node in advance, the node index being the gray value of that pixel. According to the sorting result, insert the pixels one by one into a component tree in the order of their node indices. When inserting a pixel, first place it, then check its 4-neighbourhood positions: if nodes already exist there, find their respective root nodes and merge the two node regions. After all pixels have been inserted into the component tree, all extremal regions of the image are obtained. An extremal region is defined as follows: if the gray values of all pixels in a region are greater than the gray values of its boundary pixels, the region is a maximal extremal region; if the gray values of all pixels in the region are smaller than the gray values of the boundary pixels, it is a minimal extremal region;
8c) Obtain the MSER regions with the maximum-stability criterion: let Q_1, ..., Q_{i-1}, Q_i, ... be a series of mutually nested extremal regions, i.e. Q_{i-1} ⊆ Q_i; the extremal region Q_{i*} is a maximally stable extremal region if and only if the region change rate q(i) = |Q_{i+Δ} − Q_{i−Δ}| / |Q_i| attains a local minimum at i*, where |·| denotes the area of a region, the subscript i ∈ [0, 255] represents the gray level, and Δ represents a small gray-level variation;
8d) Approximately fit each irregular maximally stable region with an elliptical region:
First take the centre of gravity of the maximally stable region as the centre of the ellipse. Calculate the geometric zero-order and first-order moments of the region:

m_00 = Σ I_e(x, y)

m_01 = Σ y·I_e(x, y)

m_10 = Σ x·I_e(x, y)

where m_00, m_01 and m_10 are respectively the geometric zero-order and first-order moments of the maximally stable region and I_e(x, y) denotes the region. The centre coordinates of the ellipse, i.e. the coordinates (x_c, y_c) of the key point detected by the MSER algorithm, are then:

x_c = m_10 / m_00,   y_c = m_01 / m_00

Calculate the geometric second-order central moments of the maximally stable region:

μ_20 = Σ (x − x_c)²·I_e(x, y)

μ_02 = Σ (y − y_c)²·I_e(x, y)

μ_11 = Σ (x − x_c)(y − y_c)·I_e(x, y)

Calculate the two eigenvalues of the second-order moment matrix:

λ_{1,2} = (μ_20 + μ_02 ± √((μ_20 − μ_02)² + 4·μ_11²)) / (2·m_00)

Calculate the semi-major axis w, the semi-minor axis l and the major-axis direction φ of the ellipse:

w = 2·√λ_1,   l = 2·√λ_2,   φ = (1/2)·arctan(2·μ_11 / (μ_20 − μ_02))

8e) With the semi-major axis w, the semi-minor axis l and the major-axis direction φ of the ellipse, calculate the affine transformation matrix at the key point (x_c, y_c), i.e. the matrix that normalizes the fitted ellipse to a circle;
8f) Invert the gray values of the original image and repeat the above operations;
8g) Do the same for each layer image in the scale space.
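The moment-based ellipse fit of steps 8d)-8e) can be sketched as follows; the final matrix M (symmetric mapping of the unit circle onto the fitted ellipse) is one common choice for the affine normalization and is our assumption, not necessarily the patent's exact matrix:

```python
import numpy as np

def fit_ellipse(region_mask):
    """Fit an ellipse to a maximally stable region via geometric moments.
    region_mask is a boolean image marking the region I_e(x, y)."""
    ys, xs = np.nonzero(region_mask)
    m00 = len(xs)                            # zero-order moment (area)
    xc, yc = xs.mean(), ys.mean()            # centre = centroid = m10/m00, m01/m00
    # second-order central moments
    mu20 = np.sum((xs - xc) ** 2)
    mu02 = np.sum((ys - yc) ** 2)
    mu11 = np.sum((xs - xc) * (ys - yc))
    cov = np.array([[mu20, mu11], [mu11, mu02]]) / m00
    lam, vec = np.linalg.eigh(cov)           # eigenvalues ascending
    w_axis = 2.0 * np.sqrt(lam[1])           # semi-major axis
    l_axis = 2.0 * np.sqrt(lam[0])           # semi-minor axis
    phi = np.arctan2(vec[1, 1], vec[0, 1])   # major-axis direction
    # affine matrix mapping the unit circle onto the fitted ellipse
    R = np.array([[np.cos(phi), -np.sin(phi)], [np.sin(phi), np.cos(phi)]])
    M = R @ np.diag([w_axis, l_axis]) @ R.T
    return (xc, yc), (w_axis, l_axis), phi, M
```

For a disc of radius r the second-order moments give semi-axes of exactly r, which is a convenient sanity check on the 2·√λ scaling.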
8. The registration method for large view-angle difference SAR images according to claim 3, characterized in that the point pair sets are finely matched with the LSS-Flow algorithm in steps (2) and (3) to obtain more accurate point pair sets, as follows:
9a) Using a reference point pair set, calculate the transformation model θ from the image to be registered to the reference image; for both step (2) and step (3), the reference point pair set is the union of the matching point pairs S_{S,c}, S_{H,c} and S_{M,c}. With the reference image denoted I_1 and the image to be registered denoted I_2, deform the image to be registered I_2:

I_2^θ = T(I_2, θ)

where I_2^θ denotes the image obtained by transforming the image to be registered with the transformation model θ;
9b) For the points of the point pair set S_coarse to be refined that lie in the image to be registered I_2, calculate their corresponding positions on the transformed image I_2^θ; for step (2), the sets to be refined are S_{S,c}, S_{H,c} and S_{M,c} respectively, and for step (3) the set to be refined is S_{A,c};
9c) Calculate the horizontal gradient G_x and the vertical gradient G_y of the images I_1 and I_2^θ at scale factor α = 2 with the ROEWA algorithm, thereby obtaining the gradient magnitude Mag and the gradient direction Ori of each image.
For each pixel of the images I_1 and I_2^θ, take a circular neighbourhood centred at the pixel and divide it into 17 sub-regions, with concentric-circle radii from inside to outside of 0.25·r, 0.75·r and r; in each sub-region divide 0-360° into 8 parts of 45° each, with the abscissa representing the gradient direction angle and the ordinate the gradient magnitude; traverse all pixels in the sub-region, and for each pixel find the histogram bin corresponding to its gradient direction angle and accumulate its gradient magnitude into that bin, thereby obtaining the gradient direction histogram of the sub-region; convert it into a one-dimensional histogram vector, and concatenate and normalize the histogram vectors of the 17 sub-regions to generate a 136-dimensional SAR-SIFT descriptor.
Calculate the SAR-SIFT descriptor of every pixel of I_1 and I_2^θ and assemble them into the SAR-SIFT descriptor maps I_1_desc and I_2^θ_desc;
9d) For a matching point (x_i, y_i) of the set S_coarse on the reference image I_1, take from the SAR-SIFT descriptor map I_1_desc a square region descriptor image I_1_squ of side 2·r_lf + 1; perform the same operation for its corresponding point on the transformed image to obtain the square region descriptor image I_2^θ_squ, where r_lf is taken as 61;
9e) The loss function E(w) is:

E(w) = Σ_p min(‖s_1(p) − s_2(p + w(p))‖_1, t)   (1)
     + Σ_p η·(|u(p)| + |v(p)|)   (2)
     + Σ_{(p,q)∈ε} min(α·|u(p) − u(q)|, d) + min(α·|v(p) − v(q)|, d)   (3)

where p = (x, y) denotes a pixel on the image and w(p) = (u(p), v(p)) denotes the optical-flow vector at p, u(p) being the offset of p in the horizontal direction and v(p) the offset in the vertical direction; s_1 and s_2 denote the SAR-SIFT descriptors of the two square region images; ε denotes the region formed by p and its 8 neighbouring points, and q denotes a point in that region; the parameters η and α are taken as 0.001 and 0.01 respectively, and the parameters t and d as 1.
Term (1) is the data term, which constrains the SAR-SIFT descriptors to match each other along the optical-flow vector w(p); term (2) is the small-displacement term, which constrains the optical-flow vectors to be as small as possible in the absence of other information; term (3) is the smoothness term, which constrains the optical-flow vectors of adjacent pixels to be similar.
Solve for the w that minimizes the loss function E(w) with a belief propagation algorithm;
9f) Calculate the new coordinates of the point (x_i^T, y_i^T) on the image I_2^θ:

(x_i^θ, y_i^θ) = (x_i^T + u(p_i), y_i^T + v(p_i))

9g) Using the transformation model θ of the image, calculate the coordinate position of the point (x_i^θ, y_i^θ) in the original image to be registered I_2:

(x_i', y_i') = T^{-1}((x_i^θ, y_i^θ), θ)

9h) After traversing every matching point pair of the set S_coarse to be refined, the more accurate point pair set S_precise is obtained, where for step (2) the more accurate sets are S_{S,p}, S_{H,p} and S_{M,p} respectively, and for step (3) the more accurate set is S_{A,p}.
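The loss of step 9e) can be evaluated for a candidate integer flow field as below; this is a sketch of the energy only, with border pixels clamped, while the patent minimizes it with belief propagation (not shown). All names are ours:

```python
import numpy as np

def flow_energy(desc1, desc2, u, v, eta=0.001, alpha=0.01, t=1.0, d=1.0):
    """Truncated-L1 matching energy E(w) of step 9e) for an integer flow (u, v).
    desc1/desc2 are HxWxD descriptor maps; u, v are HxW integer offset fields."""
    h, w_, _ = desc1.shape
    yy, xx = np.mgrid[0:h, 0:w_]
    y2 = np.clip(yy + v, 0, h - 1)          # clamp p + w(p) to the image
    x2 = np.clip(xx + u, 0, w_ - 1)
    # (1) data term: descriptors matched along the flow, truncated at t
    data = np.minimum(np.abs(desc1 - desc2[y2, x2]).sum(-1), t).sum()
    # (2) small-displacement term
    small = eta * (np.abs(u) + np.abs(v)).sum()
    # (3) smoothness term over horizontal/vertical neighbours, truncated at d
    def smooth(f):
        return (np.minimum(alpha * np.abs(np.diff(f, axis=0)), d).sum()
                + np.minimum(alpha * np.abs(np.diff(f, axis=1)), d).sum())
    return data + small + smooth(u) + smooth(v)
```

With descriptor maps related by a known shift, the energy of the true flow is far below that of the zero flow, which is what the minimization exploits.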
9. The registration method for large view-angle difference SAR images according to claim 3, characterized in that in step (3) Voronoi polygons are established with S_{H,p} and S_{M,p} as objects to obtain the local affine transformation matrix of each SAR-SIFT key point, as follows:
10a) Using the point pair sets S_{H,p} and S_{M,p}, automatically construct a Delaunay triangulation with all the affine-invariant discrete points on the reference image I_1 as objects; number the discrete points and the formed triangles, and record which three discrete points make up each triangle;
10b) Find and record the numbers of all triangles adjacent to each discrete point;
10c) Sort and record the triangles adjacent to each discrete point in the clockwise direction, so that the Voronoi polygons can be generated by the subsequent connection step;
10d) Calculate and record the circumcentre of each triangle;
10e) For each discrete point, connect the circumcentres of its adjacent triangles to obtain the Voronoi polygons;
10f) Assign to the SAR-SIFT key points inside each small Voronoi polygon the same affine transformation matrix as that of the affine-invariant discrete point within that polygon, thereby obtaining the local affine transformation matrix of each SAR-SIFT key point;
10g) Using the point pair sets S_{H,p} and S_{M,p}, perform the same operations with all the affine-invariant discrete points on the image to be registered I_2 to obtain the local affine transformation matrix of each SAR-SIFT key point there.
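Since a point always lies in the Voronoi polygon of its nearest seed, the assignment of step 10f) can be sketched without building the polygons explicitly (the patent constructs the polygons via the Delaunay circumcentres described above; names here are ours):

```python
import numpy as np

def assign_local_affine(keypoints, seed_points, seed_affines):
    """Give each SAR-SIFT key point the affine matrix of the affine-invariant
    seed whose Voronoi polygon contains it. Voronoi-cell membership is
    equivalent to nearest-seed membership, so a distance argmin suffices."""
    kp = np.asarray(keypoints, float)[:, None, :]     # (K, 1, 2)
    sp = np.asarray(seed_points, float)[None, :, :]   # (1, S, 2)
    nearest = np.argmin(((kp - sp) ** 2).sum(-1), axis=1)
    return [seed_affines[i] for i in nearest]
```

The explicit polygon construction is still useful when the cell boundaries themselves are needed (e.g. for the per-polygon local matching of step (3)), but for the matrix assignment alone the nearest-seed rule gives the same result.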
10. The registration method for large view-angle difference SAR images according to claim 3, characterized in that mismatched point pairs are preliminarily removed from the matching point pairs over the whole image with the LSOR algorithm in steps (3) and (4), as follows:
11a) Using a reference point pair set S_ref, calculate the transformation model θ' from the image to be registered to the reference image, where for step (3) the reference point pair set is the union of S_{S,p}, S_{H,p} and S_{M,p}, and for step (4) it is the union of S_{S,p}, S_{H,p}, S_{M,p} and S_{A,p}.
With the reference image denoted I_1 and the image to be registered denoted I_2, deform the image to be registered I_2 with the transformation model θ' to obtain the image I_2^θ';
11b) For a point (x_s, y_s) of the reference point pair set S_ref that lies in the image to be registered I_2, calculate its corresponding position coordinates (x_s^T, y_s^T) on the image I_2^θ' transformed with the model θ':

(x_s^T, y_s^T) = T((x_s, y_s), θ')

Apply the same transformation to all point pairs of the reference set S_ref to obtain a new point pair set S_ref^T;
11c) Calculate the length and the slope of the connecting line of every matching point pair in S_ref^T, and sum and average them to obtain the average length dist_ave and the average slope slope_ave;
11d) For a point (x_i, y_i) of the point pair set S_fil to be screened that lies in the image to be registered I_2, calculate its corresponding position coordinates (x_i^T, y_i^T) on the image I_2^θ' transformed with the model θ', where for step (3) the set to be screened is the initial matching point pair set found over the whole image by the nearest-neighbour principle in the previous step, and for step (4) it is the union of S_{S,p}, S_{H,p}, S_{M,p} and S_{A,p}:

(x_i^T, y_i^T) = T((x_i, y_i), θ')

11e) Calculate the length dist_i and the slope slope_i of the line connecting (x_i, y_i) and (x_i^T, y_i^T);
11f) Screen all point pairs in the set S_fil to be screened by thresholding, and keep the matching point pairs that satisfy:

|dist_i − dist_ave| < Th_d

|slope_i − slope_ave| < Th_s

where Th_d is taken as 0.1 and Th_s as 5°.
CN202310420988.6A 2023-04-19 2023-04-19 Registration method for large-view-angle difference SAR image Pending CN116612165A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310420988.6A CN116612165A (en) 2023-04-19 2023-04-19 Registration method for large-view-angle difference SAR image

Publications (1)

Publication Number Publication Date
CN116612165A true CN116612165A (en) 2023-08-18

Family

ID=87675450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310420988.6A Pending CN116612165A (en) 2023-04-19 2023-04-19 Registration method for large-view-angle difference SAR image

Country Status (1)

Country Link
CN (1) CN116612165A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117830301A (en) * 2024-03-04 2024-04-05 青岛正大正电力环保设备有限公司 Slag dragging region detection method based on infrared and visible light fusion characteristics
CN117830301B (en) * 2024-03-04 2024-05-14 青岛正大正电力环保设备有限公司 Slag dragging region detection method based on infrared and visible light fusion characteristics


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination