CN106373147A - SAR image registration method based on improved Laplacian multi-extremum suppression - Google Patents

SAR image registration method based on improved Laplacian multi-extremum suppression

Info

Publication number
CN106373147A
CN106373147A (application CN201610702232.0A)
Authority
CN
China
Prior art keywords
corner point
SAR
SAR image
to be matched
reference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610702232.0A
Other languages
Chinese (zh)
Inventor
李亚超
李玥
苏星
苏星一
全英汇
邢孟道
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201610702232.0A priority Critical patent/CN106373147A/en
Publication of CN106373147A publication Critical patent/CN106373147A/en
Pending legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10032: Satellite or aerial image; Remote sensing
    • G06T 2207/10044: Radar image

Landscapes

  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of radar SAR image registration and discloses a SAR image registration method based on improved Laplacian multi-extremum suppression. The method comprises the following steps: obtaining a reference SAR image and a SAR image to be matched, and establishing the SAR-Harris corner response space of each of the two images; extracting the corner coordinates of each SAR-Harris corner response space; localizing the corner coordinates of each response space, determining the grey-level statistics of the local neighbourhood of each corner, determining the principal orientation of each corner, and forming the corner feature vectors of the corners of the two images; comparing the corner feature vectors of the corners of the two images to obtain the matching point pairs of the two images; and registering the SAR image to be matched according to the matching point pairs, thereby improving the real-time performance and accuracy of SAR image matching.

Description

SAR image registration method based on improved Laplacian multi-extremum suppression
Technical field
The present invention relates to the technical field of radar SAR image registration, and more particularly to a SAR image registration method based on improved Laplacian multi-extremum suppression, which achieves real-time SAR image registration of complex terrain scenes for high-precision platform localization.
Background technology
With synthetic aperture radar (SAR) technology becoming increasingly mature, research on SAR-image-based applications has become a focus. Among them, SAR image feature matching is widely pursued because of its high civil and military value. In the field of radar guidance in particular, realizing high-precision platform localization through SAR image matching has become a key technology.
Because the error of a traditional inertial navigation system (INS) accumulates over time, it cannot meet the demand of long-duration high-precision localization, so new localization methods are needed to break through this bottleneck. Research shows that methods based on image matching can achieve the goal of all-weather, high-accuracy trajectory correction. Among them, registration methods based on image grey-level correlation place high demands on SAR image quality, show limitations when the images exhibit grey-level differences and geometric deformation, and are computationally expensive; registration methods based on image features can solve the problems of traditional matching while guaranteeing matching precision, and are invariant to grey-level changes, affine transformation and the like. Matching techniques based on image features therefore occupy an increasingly important position in SAR image applications.
Moreover, because SAR images are formed in the slant plane, different degrees of geometric deformation, such as rotation, offset, stretching, projection and affine transformation, appear when they are corrected to the ground plane; in addition, the imaging mechanism of the radar system introduces severe random coherent speckle noise, which lowers the signal-to-noise ratio (SNR) of SAR images.
Content of the invention
In view of the above deficiencies of the prior art, the object of the present invention is to provide a SAR image registration method based on improved Laplacian multi-extremum suppression, so as to improve the real-time performance and accuracy of SAR image matching and to provide registration position information for subsequent high-precision platform localization. The present invention does not restrict the image type, and is also applicable to registration between other remote-sensing images.
To achieve the above object, the embodiments of the invention adopt the following technical solution:
A SAR image registration method based on improved Laplacian multi-extremum suppression, comprising the following steps:
Step 1: obtain a reference SAR image and a SAR image to be matched, and establish the SAR-Harris corner response space of the reference SAR image and the SAR-Harris corner response space of the SAR image to be matched;
Step 2: extract the corner coordinates of the SAR-Harris corner response space of the reference SAR image and the corner coordinates of the SAR-Harris corner response space of the SAR image to be matched;
Step 3: localize the corner coordinates of the two SAR-Harris corner response spaces, determine the grey-level statistics of the local neighbourhood of each corner, and thereby determine the principal orientation of each corner and form the corner feature vectors of the corners of the reference SAR image and of the SAR image to be matched;
Step 4: compare the corner feature vectors of the corners of the reference SAR image with the corner feature vectors of the corners of the SAR image to be matched, and obtain the matching point pairs of the reference SAR image and the SAR image to be matched;
Step 5: register the SAR image to be matched according to the matching point pairs of the reference SAR image and the SAR image to be matched.
The technical solution provided by the present invention first establishes the corner response space models of the SAR image to be matched and of the reference SAR image, then detects the corner response positions with an improved Laplacian operator on the basis of multi-extremum suppression. After accurate corner coordinates are obtained, the principal orientation of each corner is derived from the grey-level statistics of its local neighbourhood, and finally corner region feature vectors are generated to describe the corner regions. The present invention uses multi-extremum suppression (24-extremum suppression) combined with exponential filtering and an improved Laplacian operator to accurately localize the corner positions in the Laplacian space obtained after exponential filtering of the SAR image. Compared with traditional SAR image registration methods, the present invention achieves accurate real-time SAR image registration and has advantages in both registration speed and accuracy.
Compared with the prior art, the present invention has the following advantages: (1) because the SAR-Harris corner detection operator, which suppresses SAR speckle noise, is used to construct a corner response space with three layers of SAR-Harris corner response maps, the method has high real-time performance; (2) during the precise detection of the corner coordinates, the image is first filtered with an exponential filter, which reduces the impact of noise on the Laplace transform; (3) because multi-extremum (24-extremum) suppression is combined with the improved Laplacian applied to the exponentially filtered SAR image, the amplitude information between corners is well preserved, unnecessary edge points are not detected, and the accuracy of corner detection is improved.
Brief description of the drawings
In order to illustrate the embodiments of the present invention or the technical solutions of the prior art more clearly, the drawings required in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; other drawings can be obtained from them by those of ordinary skill in the art without creative work.
Fig. 1 is a schematic flow chart of a SAR image registration method based on improved Laplacian multi-extremum suppression provided by an embodiment of the present invention;
Fig. 2 shows the SAR image to be matched and the reference SAR image selected in the present invention, where Fig. 2(a) is the SAR image to be matched and Fig. 2(b) is the reference SAR image;
Fig. 3 shows the horizontal filtering maps of the SAR image to be matched and of the reference SAR image, where Fig. 3(a) is the horizontal filtering map of the SAR image to be matched and Fig. 3(b) is that of the reference SAR image;
Fig. 4 shows the vertical filtering maps of the SAR image to be matched and of the reference SAR image, where Fig. 4(a) is the vertical filtering map of the SAR image to be matched and Fig. 4(b) is that of the reference SAR image;
Fig. 5 shows the three corner response maps of the SAR image to be matched and of the reference SAR image, where Fig. 5(a) shows the three corner response maps of the SAR image to be matched and Fig. 5(b) those of the reference SAR image;
Fig. 6 shows the three Laplacian maps of the SAR image to be matched and of the reference SAR image, where Fig. 6(a) shows the three Laplacian maps of the SAR image to be matched and Fig. 6(b) those of the reference SAR image;
Fig. 7 shows the corner detection results of the SAR image to be matched and of the reference SAR image obtained with the method of the invention, where the corners are marked as white dots; Fig. 7(a) is the corner detection map of the SAR image to be matched and Fig. 7(b) that of the reference SAR image;
Fig. 8 shows the preliminary matching point pairs and the purified matching point pairs of the SAR image to be matched and the reference SAR image obtained with the method of the invention, connected by black lines; Fig. 8(a) shows the preliminary matching point pairs and Fig. 8(b) the purified matching point pairs;
Fig. 9 shows the corner detection results of the SAR image to be matched and of the reference SAR image obtained with the SIFT-like SAR image matching method, where the corners are marked as white dots; Fig. 9(a) is the corner detection map of the SAR image to be matched and Fig. 9(b) that of the reference SAR image;
Fig. 10 shows the preliminary matching point pairs and the purified matching point pairs of the SAR image to be matched and the reference SAR image obtained with the SIFT-like SAR image matching method, connected by black lines; Fig. 10(a) shows the preliminary matching point pairs and Fig. 10(b) the purified matching point pairs;
Fig. 11 shows the point-matching error curves of the matching method of this application and of the SIFT-like SAR image matching method, split into the range and azimuth directions, where the range error corresponds to the x-axis direction and the azimuth error to the y-axis direction; Fig. 11(a) is the point-matching error curve of the SIFT-like SAR image matching method and Fig. 11(b) that of the method of the invention.
Specific embodiment
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work fall within the scope of protection of the present invention.
An embodiment of the present invention provides a SAR image registration method based on improved Laplacian multi-extremum suppression. With reference to Fig. 1, the method comprises the following steps:
Step 1: obtain a reference SAR image and a SAR image to be matched, and establish the SAR-Harris corner response space of the reference SAR image and the SAR-Harris corner response space of the SAR image to be matched.
Step 1 specifically includes the following sub-steps:
(1a) Determine a set of exponential weighting parameter values [α0, α1, …, αi, …, αn−1], where α0 = 2, αi = α0·k^i (i ∈ [1, n−1]), k = 2^(1/3), and n is the number of exponential weighting parameter values, with n = 3;
(1b) Determine the exponential weighting filter function e^{−(|a|+|b|)/α} according to the exponential weighting parameter value α, where α takes the values α0, α1, …, αi, …, αn−1 in turn, giving n exponential weighting filter functions; e is the base of the exponential function;
(1c) Filter the reference SAR image and the SAR image to be matched horizontally and vertically with each of the n exponential weighting filter functions, obtaining the n horizontal filtering maps and n vertical filtering maps of the reference SAR image and the n horizontal filtering maps and n vertical filtering maps of the SAR image to be matched;
where the horizontal filtering functions in the exponential weighting filter are:
$$m_{x+,\alpha}=\iint_{a\in R^{+},\,b\in R} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db,\qquad m_{x-,\alpha}=\iint_{a\in R^{-},\,b\in R} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db$$
and the vertical filtering functions in the exponential weighting filter are:
$$m_{y+,\alpha}=\iint_{a\in R,\,b\in R^{+}} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db,\qquad m_{y-,\alpha}=\iint_{a\in R,\,b\in R^{-}} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db$$
where m_{x+,α} denotes the horizontal filtering function along the positive x direction, m_{x−,α} the horizontal filtering function along the negative x direction, m_{y+,α} the vertical filtering function along the positive y direction, and m_{y−,α} the vertical filtering function along the negative y direction; I(x, y) is the value of the pixel at (x, y) in the image, x and y denote the horizontal and vertical image coordinates, a denotes the pixel offset in the horizontal direction, and b denotes the offset in the vertical direction;
(1d) From the n horizontal filtering functions and n vertical filtering functions of the reference SAR image, compute the n horizontal gradient maps and n vertical gradient maps of the reference SAR image; from the n horizontal filtering functions and n vertical filtering functions of the SAR image to be matched, compute the n horizontal gradient maps and n vertical gradient maps of the SAR image to be matched;
where the horizontal gradient g_{x,α} of each pixel in the n horizontal gradient maps is computed as
$$g_{x,\alpha}=\log\!\left(\frac{m_{x+,\alpha}}{m_{x-,\alpha}}\right)$$
and the vertical gradient g_{y,α} of each pixel in the n vertical gradient maps is computed as
$$g_{y,\alpha}=\log\!\left(\frac{m_{y+,\alpha}}{m_{y-,\alpha}}\right);$$
(1e) From the n horizontal gradient and n vertical gradient computation formulas of the reference SAR image, construct the n SAR-Harris matrices of the reference SAR image; from the n horizontal gradient and n vertical gradient computation formulas of the SAR image to be matched, construct the n SAR-Harris matrices of the SAR image to be matched:
$$C_{sh}(x,y,\alpha)=G_{\sqrt{2}\cdot\alpha}*\begin{bmatrix}(g_{x,\alpha})^{2} & g_{x,\alpha}\cdot g_{y,\alpha}\\[2pt] g_{x,\alpha}\cdot g_{y,\alpha} & (g_{y,\alpha})^{2}\end{bmatrix}$$
where C_{sh}(x, y, α) denotes the SAR-Harris matrix, G_{√2·α} denotes a Gaussian function with standard deviation √2·α, * denotes the convolution operator, and · denotes the product operation;
(1f) From the n SAR-Harris matrices of the reference SAR image, construct the SAR-Harris corner response function of the reference SAR image, compute the corner response value of every pixel from this function, and form the corner response maps of the reference SAR image, thereby forming the SAR-Harris corner response space of the reference SAR image, which comprises n layers of corner response maps. Likewise, from the n SAR-Harris matrices of the SAR image to be matched, construct the SAR-Harris corner response function of the SAR image to be matched, compute the corner response value of every pixel from it, and form the corner response maps of the SAR image to be matched, thereby forming the SAR-Harris corner response space of the SAR image to be matched, which also comprises n layers of corner response maps;
The SAR-Harris corner response function r_{sh}(x, y, α) is:
$$r_{sh}(x,y,\alpha)=\det\!\big(C_{sh}(x,y,\alpha)\big)-d\cdot \mathrm{trace}\!\big(C_{sh}(x,y,\alpha)\big)^{2}$$
where det denotes the matrix determinant, trace denotes the matrix trace, and d is the parameter of the SAR-Harris corner response function, with value 0.4; the exponential weighting parameter α takes the values α0, α1, …, αn−1 in turn, so that the SAR-Harris corner response space of the reference SAR image comprises n layers of corner response maps and the SAR-Harris corner response space of the SAR image to be matched comprises n layers of corner response maps; x denotes the horizontal coordinate of the pixel and y its vertical coordinate.
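For illustration only, sub-steps (1a)–(1f) can be sketched in Python roughly as follows. This is a minimal sketch and not the reference implementation of the claimed method: the function names, the kernel radius, the use of scipy.ndimage, and the √2·α smoothing scale of the Gaussian weighting are assumptions made for the example.

```python
# Minimal sketch (illustrative only) of sub-steps (1a)-(1f): one layer of the
# SAR-Harris corner response space for a given exponential weighting value alpha.
# The kernel radius and the smoothing scale sqrt(2)*alpha are assumptions.
import numpy as np
from scipy.ndimage import correlate, gaussian_filter

def exp_weight_kernel(alpha, radius=5):
    """Exponential weighting window e^{-(|a|+|b|)/alpha} on a (2r+1)x(2r+1) grid."""
    offsets = np.arange(-radius, radius + 1)
    A, B = np.meshgrid(offsets, offsets, indexing="ij")
    return np.exp(-(np.abs(A) + np.abs(B)) / alpha)

def directional_means(img, alpha, radius=5):
    """m_{x+}, m_{x-}, m_{y+}, m_{y-}: exponentially weighted means over half-windows."""
    w = exp_weight_kernel(alpha, radius)
    c = radius
    wxp, wxm, wyp, wym = w.copy(), w.copy(), w.copy(), w.copy()
    wxp[:c + 1, :] = 0   # keep offsets a > 0 only
    wxm[c:, :] = 0       # keep offsets a < 0 only
    wyp[:, :c + 1] = 0   # keep offsets b > 0 only
    wym[:, c:] = 0       # keep offsets b < 0 only
    eps = 1e-6           # avoid division by zero in the gradient ratios
    return [correlate(img, k, mode="nearest") + eps for k in (wxp, wxm, wyp, wym)]

def sar_harris_response(img, alpha, d=0.4, radius=5):
    """One layer r_sh(x, y, alpha) of the SAR-Harris corner response space."""
    mxp, mxm, myp, mym = directional_means(img.astype(np.float64), alpha, radius)
    gx = np.log(mxp / mxm)          # ratio-of-averages horizontal gradient
    gy = np.log(myp / mym)          # ratio-of-averages vertical gradient
    sigma = np.sqrt(2.0) * alpha    # assumed standard deviation of the Gaussian weighting
    a = gaussian_filter(gx * gx, sigma)
    b = gaussian_filter(gx * gy, sigma)
    c2 = gaussian_filter(gy * gy, sigma)
    return (a * c2 - b * b) - d * (a + c2) ** 2   # det(C_sh) - d * trace(C_sh)^2

def response_space(img, alpha0=2.0, k=2.0 ** (1.0 / 3.0), n=3):
    """n-layer corner response space, one layer per exponential weighting value."""
    alphas = [alpha0 * k ** i for i in range(n)]
    return alphas, [sar_harris_response(img, a) for a in alphas]
```

In this sketch, response_space would be called once for the reference SAR image and once for the SAR image to be matched to obtain the two three-layer response spaces used in step 2.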
Step 2: extract the corner coordinates of the SAR-Harris corner response space of the reference SAR image and the corner coordinates of the SAR-Harris corner response space of the SAR image to be matched.
Step 2 specifically includes the following sub-steps:
(2a) The SAR-Harris corner response space of the reference SAR image comprises n layers of corner response maps, as does the SAR-Harris corner response space of the SAR image to be matched. Coarsely extract the corners of the n layers of corner response maps of the reference SAR image by selecting, in each layer, the corners that satisfy a preset condition; likewise coarsely extract the corners of the n layers of corner response maps of the SAR image to be matched by selecting, in each layer, the corners that satisfy the preset condition;
The preset condition is: r_{sh}(x, y, α) > d_{sh}
where r_{sh}(x, y, α) denotes the SAR-Harris corner response function, d_{sh} is a preset threshold with value 0.01, x and y denote the horizontal and vertical image coordinates, and α is the exponential weighting parameter value;
(2b) For each corner satisfying the preset condition in each layer of corner response maps of the reference SAR image and of the SAR image to be matched, take its 8 surrounding corners to form a square neighbourhood of nine corners. If the corner response value of the corner satisfying the preset condition is the maximum of the response values of the nine corners in this square neighbourhood, keep the corner; otherwise discard it. This yields the coarsely extracted corner coordinates of each layer of corner response maps of the reference SAR image and the coarsely extracted corner coordinates of each layer of corner response maps of the SAR image to be matched;
(2c) For each layer of corner response maps of the reference SAR image and of the SAR image to be matched, determine the corresponding exponential weighting filter function e^{−(|a|+|b|)/α} from the exponential weighting parameter value α of that layer, and filter the corresponding corner response map with it, obtaining the filtered corner response map;
The exponential weighting function template for α = α0 is as follows:
0.41 0.51 0.64 0.51 0.41
0.51 0.64 0.80 0.64 0.51
0.64 0.80 1 0.80 0.64
0.51 0.64 0.80 0.64 0.51
0.41 0.51 0.64 0.51 0.41
(2d) For each filtered corner response map of the reference SAR image and each filtered corner response map of the SAR image to be matched, perform the Laplace transform with the improved Laplacian filter function, obtain the Laplacian value of each layer of corner response maps, and form the Laplacian map of each layer. The n layers of corner response maps of the reference SAR image form the Laplacian scale space of the reference SAR image, and the n layers of corner response maps of the SAR image to be matched form the Laplacian scale space of the SAR image to be matched;
The improved Laplacian filter function ∇f has the form:
$$\nabla f=\Big[\sum_{i,j\in[-2,2]}f(x+i,\,y+j)-f(x+2,\,y+2)-f(x+2,\,y-2)-f(x-2,\,y-2)-f(x-2,\,y+2)\Big]-20\,f(x,\,y)$$
The template of the improved Laplacian filter function is as follows:
0 1 1 1 0
1 1 1 1 1
1 1 -20 1 1
1 1 1 1 1
0 1 1 1 0
(2e) For the coarsely extracted corner coordinates of each layer of corner response maps of the reference SAR image and of the SAR image to be matched, relocate the corner coordinates in the Laplacian scale space with the 24-extremum suppression method, obtaining the candidate corner coordinates. The 24-extremum suppression method takes a coarsely extracted corner coordinate as the centre point and compares the Laplacian value of this centre point with the Laplacian values of the 20 surrounding corners; if the Laplacian value of the centre point is the maximum, it is taken as a candidate corner coordinate, otherwise the corner is discarded;
The 24-extremum suppression template used in the 24-extremum suppression method is constructed as follows:
take a 5 × 5 pixel region that contains any one coarsely extracted corner and is centred on it, set the weights of the four pixels at the four corners of the region to 0 and the weights of the other 21 pixels to 1, and check whether the Laplacian value of the coarsely extracted corner is the maximum of the Laplacian values of all pixels in the 5 × 5 region; if so, keep the coarsely extracted corner as a candidate corner, otherwise discard it.
The 24-extremum suppression template is as follows:
0 1 1 1 0
1 1 1 1 1
1 1 1 1 1
1 1 1 1 1
0 1 1 1 0
In other words, the 24-extremum suppression method compares the Laplacian value of the centre point (here the coarsely extracted corner position) with the Laplacian values of the 20 surrounding coordinate points; if it is the maximum, the point becomes a candidate corner position, otherwise the corner is discarded.
(2f) Compare the Laplacian value of each candidate corner coordinate of every layer of corner response maps of the reference SAR image with the Laplacian values at the corresponding coordinate positions of the corner response maps of the adjacent layers, and keep the candidate corner coordinates whose Laplacian value is the maximum as the corner coordinates of the SAR-Harris corner response space of the reference SAR image;
The comparison is expressed by the following formula:
laplace(x, αn) ≥ laplace(x, αl),  l ∈ {n−1, n+1}
Similarly, compare the Laplacian value of each candidate corner coordinate of every layer of corner response maps of the SAR image to be matched with the Laplacian values at the corresponding coordinate positions of the corner response maps of the adjacent layers, and keep the candidate corner coordinates whose Laplacian value is the maximum as the corner coordinates of the SAR-Harris corner response space of the SAR image to be matched.
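As an illustration of sub-steps (2d)–(2f), the two 5 × 5 templates above can be applied as in the following sketch; the boundary handling, the function names and the use of scipy.ndimage are assumptions of the example, not part of the claimed method.

```python
# Minimal sketch (illustrative only) of sub-steps (2d)-(2f): filtering each corner
# response map with the improved Laplacian template, then keeping the coarsely
# extracted corners that survive 24-extremum suppression and the adjacent-layer check.
import numpy as np
from scipy.ndimage import correlate

# Improved Laplacian template: 0 at the four corners, -20 at the centre, 1 elsewhere.
LAPLACIAN_MASK = np.ones((5, 5))
LAPLACIAN_MASK[[0, 0, 4, 4], [0, 4, 0, 4]] = 0
LAPLACIAN_MASK[2, 2] = -20

# 24-extremum suppression template: the 5x5 window with its four corner pixels removed.
SUPPRESSION_MASK = np.ones((5, 5), dtype=bool)
SUPPRESSION_MASK[[0, 0, 4, 4], [0, 4, 0, 4]] = False

def laplacian_space(filtered_response_maps):
    """Apply the improved Laplacian template to every filtered corner response map."""
    return [correlate(r, LAPLACIAN_MASK, mode="nearest") for r in filtered_response_maps]

def survives_24_extremum(lap, x, y):
    """True if (x, y) is the maximum of the 21 retained pixels of its 5x5 window."""
    h, w = lap.shape
    if x < 2 or y < 2 or x >= h - 2 or y >= w - 2:
        return False
    window = lap[x - 2:x + 3, y - 2:y + 3]
    return lap[x, y] >= window[SUPPRESSION_MASK].max()

def refine_corners(coarse_corners, lap_maps):
    """coarse_corners: one list of (x, y) per layer; returns (layer, x, y) corner coordinates."""
    refined = []
    for layer, corners in enumerate(coarse_corners):
        lap = lap_maps[layer]
        neighbours = [lap_maps[l] for l in (layer - 1, layer + 1) if 0 <= l < len(lap_maps)]
        for (x, y) in corners:
            if not survives_24_extremum(lap, x, y):
                continue
            # cross-layer check: laplace(x, alpha_n) >= laplace(x, alpha_l), l in {n-1, n+1}
            if all(lap[x, y] >= nb[x, y] for nb in neighbours):
                refined.append((layer, x, y))
    return refined
```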
Step 3: localize the corner coordinates of the SAR-Harris corner response space of the reference SAR image and the corner coordinates of the SAR-Harris corner response space of the SAR image to be matched, determine the grey-level statistics of the local neighbourhood of each corner, and thereby determine the principal orientation of each corner and form the corner feature vectors of the corners of the reference SAR image and the corner feature vectors of the corners of the SAR image to be matched.
Step 3 specifically includes the following sub-steps:
(3a) For the corner coordinates of the SAR-Harris corner response spaces of the reference SAR image and of the SAR image to be matched, compute the fractional part of each corner coordinate by sub-pixel interpolation: expanding the SAR-Harris corner response function r_{sh}(x, y, α) in a Taylor series gives:
$$r_{sh}(x)=r_{sh}(x_{0})+\frac{\partial r_{sh}^{T}}{\partial x}(x_{0})\,x+\frac{1}{2}\,x^{T}\,\frac{\partial^{2} r_{sh}}{\partial x^{2}}(x_{0})\,x$$
$$r_{sh}(y)=r_{sh}(y_{0})+\frac{\partial r_{sh}^{T}}{\partial y}(y_{0})\,y+\frac{1}{2}\,y^{T}\,\frac{\partial^{2} r_{sh}}{\partial y^{2}}(y_{0})\,y$$
where r_{sh}(x) denotes the Taylor expansion at pixel x and r_{sh}(y) the Taylor expansion at pixel y, x0 and y0 denote the initial coordinate values of the corner, ∂r_{sh}/∂x and ∂²r_{sh}/∂x² denote the first and second derivatives of the corner response function at pixel x, ∂r_{sh}/∂y and ∂²r_{sh}/∂y² denote the first and second derivatives of the corner response function at pixel y, and (·)^T denotes the transposition operation;
This yields the offset x̂ of each corner coordinate in the x direction of the extremum position and the offset ŷ in the y direction:
$$\hat{x}=-\left(\frac{\partial^{2} r_{sh}}{\partial x^{2}}\right)^{-1}\frac{\partial r_{sh}}{\partial x},\qquad \hat{y}=-\left(\frac{\partial^{2} r_{sh}}{\partial y^{2}}\right)^{-1}\frac{\partial r_{sh}}{\partial y}$$
where (·)^{−1} denotes the inversion operation;
(3b) Reject the corner coordinates whose offset x̂ or ŷ is larger than 1 pixel unit, and add the corresponding offsets to the remaining corner coordinates, obtaining corner coordinates containing a fractional part;
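As an illustration of sub-steps (3a)–(3b), the offsets can be estimated from finite differences of the corner response map. The sketch below treats the x and y directions independently, as in the formulas above; the use of central differences and the function names are assumptions of the example.

```python
# Minimal sketch (illustrative only) of sub-steps (3a)-(3b): sub-pixel offsets
# from finite differences of the response map r, and rejection of offsets > 1 pixel.
import numpy as np

def subpixel_offset(r, x, y):
    """(dx, dy) = -(d2r)^{-1} * dr, computed per direction with central differences."""
    dr_dx = (r[x + 1, y] - r[x - 1, y]) / 2.0
    d2r_dx2 = r[x + 1, y] - 2.0 * r[x, y] + r[x - 1, y]
    dr_dy = (r[x, y + 1] - r[x, y - 1]) / 2.0
    d2r_dy2 = r[x, y + 1] - 2.0 * r[x, y] + r[x, y - 1]
    dx = -dr_dx / d2r_dx2 if d2r_dx2 != 0 else 0.0
    dy = -dr_dy / d2r_dy2 if d2r_dy2 != 0 else 0.0
    return dx, dy

def refine_to_subpixel(r, corners):
    """Keep corners whose offsets are at most one pixel unit and add the offsets."""
    refined = []
    for (x, y) in corners:
        dx, dy = subpixel_offset(r, x, y)
        if abs(dx) <= 1.0 and abs(dy) <= 1.0:
            refined.append((x + dx, y + dy))
    return refined
```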
(3c) For each corner coordinate containing a fractional part, take the region within a neighbourhood radius of 3 × 1.5α around the corner, compute the gradient magnitude and gradient argument of every pixel in the region, and obtain the gradient statistics histogram of all pixels within this neighbourhood radius; the horizontal axis of the gradient statistics histogram is the gradient argument and the vertical axis is the weighted accumulation of the gradient magnitudes; here α is the exponential weighting parameter value of the corner response map to which the corner belongs;
The gradient argument is computed as ori_α = arctan(g_{y,α} / g_{x,α});
The gradient magnitude is computed as mag_α = sqrt((g_{x,α})² + (g_{y,α})²);
(3d) Take the direction corresponding to the highest peak of the gradient statistics histogram as the principal orientation of the corresponding corner. If the gradient statistics histogram contains other peaks whose energy exceeds 80% of the highest peak, also record the corresponding corner together with the direction of each such peak. The principal orientation of every corner is thus determined, and the corner feature vectors of the corners of the reference SAR image and of the SAR image to be matched are formed.
Forming the corner feature vectors of the corners of the reference SAR image and of the SAR image to be matched in step 3 specifically includes:
(3e) After all corner coordinates and the corresponding principal orientations in the reference SAR image and all corner coordinates and the corresponding principal orientations in the SAR image to be matched have been determined, take each corner as a centre and choose a square corner neighbourhood of a predetermined size around it;
(3f) With each corner as the centre, rotate its corner neighbourhood clockwise by the principal orientation angle θ, where (x1, y1) denotes a coordinate before rotation and (x', y') the corresponding coordinate after rotation given by the rotation matrix;
(3g) Divide the rotated corner neighbourhood into d1 × d1 sub-regions, i.e. into 16 sub-regions, compute the gradient magnitude and gradient argument of every pixel in each sub-region, and obtain the pixel histogram of all pixels in the sub-region; the horizontal axis of the pixel histogram is the gradient argument and the vertical axis is the weighted accumulation of the gradient magnitudes. The pixel histogram of each sub-region contains 8 bins, each bin representing the corner amplitude values over a 45° range of directions, i.e. each pixel histogram has amplitude values for 8 directions;
(3h) Take the amplitude values of the pixel histograms of the 16 sub-regions as the corner feature vector of the corresponding corner, thereby obtaining the corner feature vectors of all corners of the reference SAR image and the corner feature vectors of all corners of the SAR image to be matched.
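The following sketch illustrates sub-steps (3c)–(3h): a 36-bin orientation histogram for the principal orientation and a 4 × 4 × 8 = 128-dimensional descriptor. For brevity it rotates only the gradient orientations instead of resampling the rotated neighbourhood, and the patch sizes and normalization are assumptions of the example, not values prescribed by the method.

```python
# Minimal sketch (illustrative only) of sub-steps (3c)-(3h): principal orientation
# from a 36-bin gradient histogram and a 128-dimensional corner descriptor.
import numpy as np

def principal_orientations(mag, ori, x, y, radius, n_bins=36, peak_ratio=0.8):
    """Directions of histogram peaks; peaks above peak_ratio of the maximum are kept too."""
    m = mag[x - radius:x + radius + 1, y - radius:y + radius + 1]
    o = ori[x - radius:x + radius + 1, y - radius:y + radius + 1]
    hist, edges = np.histogram(o, bins=n_bins, range=(-np.pi, np.pi), weights=m)
    centres = (edges[:-1] + edges[1:]) / 2.0
    return centres[hist >= peak_ratio * hist.max()]

def corner_descriptor(mag, ori, x, y, theta, half=8, d1=4, n_bins=8):
    """d1 x d1 sub-regions, n_bins orientation bins each -> 4*4*8 = 128 values."""
    m = mag[x - half:x + half, y - half:y + half]
    o = (ori[x - half:x + half, y - half:y + half] - theta) % (2.0 * np.pi)
    feat = []
    step = 2 * half // d1
    for i in range(d1):
        for j in range(d1):
            ms = m[i * step:(i + 1) * step, j * step:(j + 1) * step]
            os = o[i * step:(i + 1) * step, j * step:(j + 1) * step]
            h, _ = np.histogram(os, bins=n_bins, range=(0.0, 2.0 * np.pi), weights=ms)
            feat.extend(h)
    feat = np.asarray(feat, dtype=np.float64)
    norm = np.linalg.norm(feat)
    return feat / norm if norm > 0 else feat
```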
Step 4: compare the corner feature vectors of the corners of the reference SAR image with the corner feature vectors of the corners of the SAR image to be matched, and obtain the matching point pairs of the reference SAR image and the SAR image to be matched.
Step 4 specifically includes the following sub-steps:
(4a) For a first corner of the reference SAR image, compute the Euclidean distances between its corner feature vector and the corner feature vectors of all corners of the SAR image to be matched, and determine the nearest-neighbour value and the second-nearest-neighbour value from these distances. If the ratio of the nearest-neighbour value to the second-nearest-neighbour value is smaller than a preset threshold, take the corner of the SAR image to be matched corresponding to the nearest-neighbour value together with the first corner as a preliminary matching point pair; the first corner is any corner of the reference SAR image. Repeating the computation with every corner of the reference SAR image as the first corner yields all preliminary matching point pairs of the reference SAR image and the SAR image to be matched. In the same way, taking every corner of the SAR image to be matched as the first corner and computing the Euclidean distances between its corner feature vector and the corner feature vectors of all corners of the reference SAR image finally yields all preliminary matching point pairs of the SAR image to be matched and the reference SAR image;
The preset threshold can be set to 0.8.
(4b) Purify the preliminary matching point pairs with the RANSAC (random sample consensus) purification method, delete the erroneous matching point pairs, and obtain the final matching point pairs between the reference SAR image and the SAR image to be matched.
Specifically, take all preliminary matching point pairs of the reference SAR image and the SAR image to be matched and all preliminary matching point pairs of the SAR image to be matched and the reference SAR image, and keep the preliminary matching point pairs that appear in both sets as the final matching point pairs between the reference SAR image and the SAR image to be matched.
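A brute-force sketch of sub-step (4a) is given below; it applies the nearest-neighbour / second-nearest-neighbour ratio test in both matching directions and keeps the pairs common to both, as described above. The 0.8 threshold follows the text; the function names and everything else are illustrative assumptions.

```python
# Minimal sketch (illustrative only) of sub-step (4a): ratio-test matching of two
# descriptor sets (rows are corner feature vectors), intersected in both directions.
import numpy as np

def ratio_matches(desc_a, desc_b, ratio=0.8):
    """Pairs (i, j): descriptor i of A whose nearest neighbour j in B passes the ratio test."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)   # Euclidean distances to all of B
        order = np.argsort(dists)
        nearest, second = dists[order[0]], dists[order[1]]
        if second > 0 and nearest / second < ratio:
            matches.append((i, int(order[0])))
    return matches

def preliminary_pairs(desc_ref, desc_sen, ratio=0.8):
    """Preliminary matching point pairs: pairs found in both matching directions."""
    ab = set(ratio_matches(desc_ref, desc_sen, ratio))
    ba = {(i, j) for (j, i) in ratio_matches(desc_sen, desc_ref, ratio)}
    return sorted(ab & ba)
```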
Step 5: register the SAR image to be matched according to the matching point pairs of the reference SAR image and the SAR image to be matched.
Simulation experiment 1:
With reference to Figs. 2, 3, 4 and 5: the SAR image to be matched is 512 × 256 and the reference SAR image is 512 × 256; the SAR image to be matched is a SAR image containing obvious speckle noise, the reference SAR image is a spaceborne SAR image, and radiometric deformation exists between the SAR image to be matched and the reference SAR image.
Against the speckle noise of SAR images, the present invention uses the SAR-Harris corner detection operator, which suppresses speckle noise, to construct a corner response space consisting of three layers of SAR image corner response maps, so that the SAR image registration method is robust to speckle noise. The SAR-Harris corner detection operator and the SAR-Harris corner response space are formed as follows.
Step 1: choose the exponential weighting parameter values α0 = 2, α1 = 2·2^(1/3), α2 = 2·2^(2/3), obtaining a sequence of exponential weighting parameters.
Step 2: from the exponential weighting filter function e^{−(|a|+|b|)/α} and the three exponential weighting values of step 1, generate three exponential filter functions; construct the horizontal filtering functions and the vertical filtering functions from these three exponential filter functions, and filter the SAR image to be matched and the reference SAR image horizontally and vertically. The horizontal filtering functions and vertical filtering functions are as follows:
Horizontal filtering functions:
$$m_{x+,\alpha}=\iint_{a\in R^{+},\,b\in R} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db,\qquad m_{x-,\alpha}=\iint_{a\in R^{-},\,b\in R} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db$$
Vertical filtering function is as follows:
$$m_{y+,\alpha}=\iint_{a\in R,\,b\in R^{+}} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db,\qquad m_{y-,\alpha}=\iint_{a\in R,\,b\in R^{-}} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db$$
where I(x, y) is the pixel value of the image, a and b denote the offsets in the horizontal and vertical directions respectively, and α is the exponential weighting parameter value.
Step 3: for each of the three exponential weighting values α, compute the horizontal gradient and the vertical gradient of the two SAR images, and the total gradient magnitude and gradient argument of the image. The computing formulas are as follows:
Horizontal gradient and vertical gradient:
$$g_{x,\alpha}=\log\!\left(\frac{m_{x+,\alpha}}{m_{x-,\alpha}}\right),\qquad g_{y,\alpha}=\log\!\left(\frac{m_{y+,\alpha}}{m_{y-,\alpha}}\right)$$
Total gradient magnitude and argument:
$$mag_{\alpha}=\sqrt{(g_{x,\alpha})^{2}+(g_{y,\alpha})^{2}},\qquad ori_{\alpha}=\arctan\!\left(\frac{g_{y,\alpha}}{g_{x,\alpha}}\right)$$
where mag_α and ori_α are the gradient magnitude and argument under the corresponding parameter α.
Step 4: compute the products of the two directional gradients of the two SAR images, g_{x,α}·g_{y,α}, g_{x,α}·g_{x,α} and g_{y,α}·g_{y,α}, construct the SAR-Harris matrix, and weight it with a Gaussian function of standard deviation √2·α. The SAR-Harris matrix is as follows:
$$C_{sh}(x,y,\alpha)=G_{\sqrt{2}\cdot\alpha}*\begin{bmatrix}(g_{x,\alpha})^{2} & g_{x,\alpha}\cdot g_{y,\alpha}\\[2pt] g_{x,\alpha}\cdot g_{y,\alpha} & (g_{y,\alpha})^{2}\end{bmatrix}$$
where G_{√2·α} is the Gaussian function with standard deviation √2·α and * is the convolution operator.
Step 5: construct the SAR-Harris corner response function as the SAR-Harris corner detection operator, obtain the three layers of corner response maps of the two images, and form the SAR-Harris corner response space (different exponential weighting values α act as different scales). The corner response function is:
r_{sh}(x, y, α) = det(C_{sh}(x, y, α)) − d·trace(C_{sh}(x, y, α))²
where det is the matrix determinant, trace is the matrix trace, and d is the parameter of the SAR-Harris corner response function, taken as 0.4 in this method.
Fig. 3 shows the horizontal gradient maps of the two SAR images for exponential weighting parameter α0 = 2 in this simulation; the horizontal axis of each image is the horizontal axis of the SAR image and the vertical axis is the vertical axis of the SAR image.
Fig. 4 shows the vertical gradient maps of the two SAR images for exponential weighting parameter α0 = 2 in this simulation; the horizontal axis of each image is the horizontal axis of the SAR image and the vertical axis is the vertical axis of the SAR image.
Fig. 5 shows the corner response spaces of the two SAR images in this simulation; the corner response space of each SAR image contains three layers of corner response maps.
Simulation experiment 2:
With reference to Figs. 6 and 7, the SAR image to be matched and the reference SAR image are the same as in simulation experiment 1. Before the Laplace transform in the precise corner localization, the present invention filters the image with the exponential filter, reducing the impact of SAR image noise on the Laplace transform, and combines 24-extremum suppression with the improved Laplacian template to precisely extract the image corners. The simulation proceeds as follows:
Step 1: after obtaining the corner response spaces of the SAR image to be matched and of the reference SAR image, coarsely extract the corners: in each layer of the SAR-Harris corner response space (different layers correspond to different exponential weighting values α, and different α values act as different scales), select the coordinate points of the corner response map whose corner response values satisfy the following condition, where the threshold d_{sh} is 0.01.
r_{sh}(x, y, α) > d_{sh}
Step 2: for every corner selected in step 1, take its 8 surrounding points and compare the r_{sh} value of the corner with the r_{sh} values of the 8 surrounding points; if the r_{sh} value of the corner is the maximum, the corresponding coordinate is considered a coarsely extracted corner position; otherwise the corner is discarded.
Step 3: construct the exponential filter function e^{−(|a|+|b|)/α} with the chosen exponential parameter and filter the two SAR images with it.
Step 4: perform the Laplace transform on the filtered images with the improved Laplacian template, forming the Laplacian scale space.
The improved Laplacian filter function has the form:
$$\nabla f=\Big[\sum_{i,j\in[-2,2]}f(x+i,\,y+j)-f(x+2,\,y+2)-f(x+2,\,y-2)-f(x-2,\,y-2)-f(x-2,\,y+2)\Big]-20\,f(x,\,y)$$
Step 5: in each layer of the Laplacian space, take a coarsely extracted corner position as the centre and choose the surrounding neighbourhood to form a 5 × 5 pixel matrix; check whether the Laplacian value of the central corner is the maximum among all pixels in the matrix; if it is, the point is a candidate corner position, otherwise the point is discarded.
Step 6: compare the Laplacian value of every candidate corner coordinate of each layer obtained in step 5 with the Laplacian values at the corresponding coordinate positions of the adjacent layers; if it is the maximum, keep the corner coordinate as a precisely extracted corner position. The comparison is expressed by the following formula:
laplace(x, αn) ≥ laplace(x, αl),  l ∈ {n−1, n+1}
Fig. 6 shows the Laplacian scale spaces of the SAR image to be matched and of the reference SAR image.
Fig. 7 shows the SAR image to be matched and the reference SAR image with all precisely extracted corner positions, where the corner positions are the white dots marked on the images. 432 corners are extracted from the SAR image to be matched and 368 corners from the reference SAR image.
Simulation experiment 3
With reference to Fig. 8, the SAR image to be matched and the reference SAR image are the same as in simulation experiment 1. In this experiment the corner feature vector descriptors are generated only after the principal orientations of the extracted corners have been computed; the corners of the two images are coarsely matched using the Euclidean distance as the feature vector similarity criterion, and the matching point pairs are finally purified with the RANSAC (random sample consensus) purification method. The implementation steps are:
Step 1: perform a Taylor expansion at each corner position and compute the offset of the corner position, which yields the fractional part of the corner position information. The offset is computed as follows:
$$\hat{x}=-\left(\frac{\partial^{2} r_{sh}}{\partial x^{2}}\right)^{-1}\frac{\partial r_{sh}}{\partial x}$$
Step 2: remove the corners whose offset is larger than 1;
Step 3: add the offset values to the position coordinates of the remaining corners, obtaining corner coordinate information with fractional parts;
Step 4: take the region within a corner neighbourhood radius of 3 × 1.5α, compute the gradient magnitude and gradient argument of every pixel in the region (in this simulation experiment the gradient argument has 36 ranges of 10° each) and generate the gradient statistics histogram;
Step 5: take the direction corresponding to the peak of the statistics histogram as the principal orientation of the corner; the directions corresponding to peaks whose values exceed 80% of the maximum are auxiliary orientations of the corner. Every corner with an auxiliary orientation is duplicated once, and the auxiliary orientation is used as the principal orientation of the duplicate;
Step 6: with each corner as the centre, choose a neighbourhood region of a predetermined size and rotate this region clockwise by the principal orientation angle of the corresponding corner;
Step 7: divide the corner neighbourhood into d × d (d = 4) sub-regions, compute the gradient magnitude and gradient argument of the pixels of every sub-region (in this step the gradient argument has 8 ranges of 45° each) and generate the statistics histograms;
Step 8: arrange the amplitudes of the statistics histograms of all sub-regions of a corner in order, forming a 128-dimensional feature vector;
Step 9: compute the Euclidean distances between the feature vectors of all corners of the SAR image to be matched and all corner feature vectors of the reference SAR image, and compute the nearest-neighbour and second-nearest-neighbour values; if their ratio is smaller than 0.8, the pair is a coarse matching point pair;
Step 10: purify the matching point pairs of step 9 with the RANSAC purification method (see the sketch after this list). The purification steps are:
check whether the current number of iterations has reached the allowed maximum; if not, proceed to the next step;
randomly select 4 coarse matching point pairs and compute the image transformation matrix H;
compute, according to the fault tolerance, the set of corners among the remaining corners that satisfy the transformation matrix H;
check whether the corner set of the previous step is the optimal corner set (i.e. whether its number of corners is the largest among all corner sets obtained so far);
check whether the transformation matrix H lies within the allowed range of the error probability;
recompute the optimal consensus corner set according to the transformation matrix H, and recompute the new image transformation matrix H satisfied by the consensus set;
obtain the optimal consensus set and the corresponding image transformation matrix H.
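A sketch of the RANSAC purification loop of step 10 follows. For brevity it fits an affine transformation by least squares to each random sample of 4 pairs; the exact transformation model and the error-probability test used by the method are not reproduced here, so this is only an assumption-laden illustration.

```python
# Minimal sketch (illustrative only) of the RANSAC purification loop of step 10.
# An affine model fitted by least squares stands in for the transformation matrix H.
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src (N x 2) onto dst (N x 2)."""
    A = np.hstack([src, np.ones((len(src), 1))])   # N x 3
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)    # 3 x 2
    return M

def apply_affine(M, pts):
    return np.hstack([pts, np.ones((len(pts), 1))]) @ M

def ransac_purify(src, dst, tol=3.0, max_iter=1000, seed=None):
    """Return the indices of the largest consensus set and the refitted model."""
    rng = np.random.default_rng(seed)
    best_inliers, best_M = np.array([], dtype=int), None
    n = len(src)
    for _ in range(max_iter):
        sample = rng.choice(n, size=4, replace=False)
        M = fit_affine(src[sample], dst[sample])
        err = np.linalg.norm(apply_affine(M, src) - dst, axis=1)   # fault tolerance test
        inliers = np.flatnonzero(err < tol)
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
            best_M = fit_affine(src[inliers], dst[inliers])        # refit on the consensus set
    return best_inliers, best_M
```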
Fig. 8(a) shows the coarse matching point pairs computed in step 9 of this simulation experiment; the matching points in the two images are connected by black lines. There are 152 matching point pairs.
Fig. 8(b) shows the purified matching point pairs after step 10 of this simulation experiment; the matching points in the two images are connected by black lines. There are 81 matching point pairs.
Simulation experiment 4
With reference to Figs. 9 and 10, a comparison simulation of the matching results of this method and of the SIFT-like matching method (which uses 8 layers of SAR-Harris corner response maps to form a SAR-Harris scale space) is carried out. The SAR image to be matched and the reference SAR image are the same as in simulation experiment 1.
As shown in Fig. 9, the corners extracted by the SIFT-like matching method are 896 corners in the SAR image to be matched and 644 corners in the reference SAR image. It can be seen that the corners are densely distributed and that unnecessary edge points are extracted. In this method, processing the images with 24-extremum suppression combined with the improved Laplacian template preserves the amplitude information between corners and rejects the excess edge points.
As shown in Fig. 10(a), the SIFT-like matching method produces 206 coarse matching point pairs; the matching points in the two images are connected by black lines. It can be seen that there are many erroneous matching point pairs.
As shown in Fig. 10(b), the SIFT-like matching method produces 76 matching point pairs after purification.
Table 1. Comparison of the matching results of the SIFT-like method and the method of this application
Simulation experiment 5
With reference to Fig. 11, a matching-error comparison simulation of this method and of the SIFT-like matching method (which uses 8 layers of SAR-Harris corner response maps to form a SAR-Harris scale space) is carried out. The SAR image to be matched and the reference SAR image are the same as in simulation experiment 1.
As shown in Fig. 11, the error simulation plots of the purified matching points are given for this method and for the SIFT-like matching method; the vertical axis of the plots is in pixel units and the horizontal axis indexes the matching points. The data of Table 2 below can be obtained from them; the method of this application outperforms the SIFT-like method in the errors of the purified matching points.
Table 2. Matching-error comparison of the SIFT-like method and the method of this application
The above is only a specific embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any change or replacement that those familiar with the technical field can readily conceive within the technical scope disclosed by the present invention shall be covered by the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be defined by the scope of the claims.

Claims (8)

1. A SAR image registration method based on improved Laplacian multi-extremum suppression, characterized in that the method comprises the following steps:
Step 1: obtain a reference SAR image and a SAR image to be matched, and establish the SAR-Harris corner response space of the reference SAR image and the SAR-Harris corner response space of the SAR image to be matched;
Step 2: extract the corner coordinates of the SAR-Harris corner response space of the reference SAR image and the corner coordinates of the SAR-Harris corner response space of the SAR image to be matched;
Step 3: localize the corner coordinates of the SAR-Harris corner response space of the reference SAR image and the corner coordinates of the SAR-Harris corner response space of the SAR image to be matched, determine the grey-level statistics of the local neighbourhood of each corner, and thereby determine the principal orientation of each corner and form the corner feature vectors of the corners of the reference SAR image and the corner feature vectors of the corners of the SAR image to be matched;
Step 4: compare the corner feature vectors of the corners of the reference SAR image with the corner feature vectors of the corners of the SAR image to be matched, and obtain the matching point pairs of the reference SAR image and the SAR image to be matched;
Step 5: register the SAR image to be matched according to the matching point pairs of the reference SAR image and the SAR image to be matched.
2. The SAR image registration method based on improved Laplacian multi-extremum suppression according to claim 1, characterized in that step 1 specifically includes the following sub-steps:
(1a) determining a set of exponential weighting parameter values [α0, α1, …, αi, …, αn−1], where α0 = 2, αi = α0·k^i (i ∈ [1, n−1]), k = 2^(1/3), and n is the number of exponential weighting parameter values;
(1b) determining the exponential weighting filter function e^{−(|a|+|b|)/α} according to the exponential weighting parameter value α, where α takes the values α0, α1, …, αi, …, αn−1 in turn, giving n exponential weighting filter functions, e being the base of the exponential function;
(1c) filtering the reference SAR image and the SAR image to be matched horizontally and vertically with each of the n exponential weighting filter functions, obtaining the n horizontal filtering maps and n vertical filtering maps of the reference SAR image and the n horizontal filtering maps and n vertical filtering maps of the SAR image to be matched;
where the horizontal filtering functions in the exponential weighting filter are:
$$m_{x+,\alpha}=\iint_{a\in R^{+},\,b\in R} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db,\qquad m_{x-,\alpha}=\iint_{a\in R^{-},\,b\in R} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db$$
and the vertical filtering functions in the exponential weighting filter are:
$$m_{y+,\alpha}=\iint_{a\in R,\,b\in R^{+}} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db,\qquad m_{y-,\alpha}=\iint_{a\in R,\,b\in R^{-}} I(x+a,\,y+b)\,e^{-\frac{|a|+|b|}{\alpha}}\,da\,db$$
where m_{x+,α} denotes the horizontal filtering function along the positive x direction, m_{x−,α} the horizontal filtering function along the negative x direction, m_{y+,α} the vertical filtering function along the positive y direction, and m_{y−,α} the vertical filtering function along the negative y direction; I(x, y) is the value of the pixel at (x, y) in the image, x and y denote the horizontal and vertical image coordinates, a denotes the pixel offset in the horizontal direction, and b denotes the offset in the vertical direction;
(1d) computing, from the n horizontal filtering functions and n vertical filtering functions of the reference SAR image, the n horizontal gradient maps and n vertical gradient maps of the reference SAR image, and computing, from the n horizontal filtering functions and n vertical filtering functions of the SAR image to be matched, the n horizontal gradient maps and n vertical gradient maps of the SAR image to be matched;
where the horizontal gradient g_{x,α} of each pixel in the n horizontal gradient maps is computed as g_{x,α} = log(m_{x+,α} / m_{x−,α}),
and the vertical gradient g_{y,α} of each pixel in the n vertical gradient maps is computed as g_{y,α} = log(m_{y+,α} / m_{y−,α});
(1e) constructing, from the n horizontal gradient and n vertical gradient computation formulas of the reference SAR image, the n SAR-Harris matrices of the reference SAR image, and constructing, from the n horizontal gradient and n vertical gradient computation formulas of the SAR image to be matched, the n SAR-Harris matrices of the SAR image to be matched:
$$C_{sh}(x,y,\alpha)=G_{\sqrt{2}\cdot\alpha}*\begin{bmatrix}(g_{x,\alpha})^{2} & g_{x,\alpha}\cdot g_{y,\alpha}\\[2pt] g_{x,\alpha}\cdot g_{y,\alpha} & (g_{y,\alpha})^{2}\end{bmatrix}$$
where C_{sh}(x, y, α) denotes the SAR-Harris matrix, G_{√2·α} denotes a Gaussian function with standard deviation √2·α, * denotes the convolution operator, and · denotes the product operation;
(1f) constructing, from the n SAR-Harris matrices of the reference SAR image, the SAR-Harris corner response function of the reference SAR image, computing the corner response value of every pixel from this function, and forming the corner response maps of the reference SAR image, thereby forming the SAR-Harris corner response space of the reference SAR image, which comprises n layers of corner response maps; constructing, from the n SAR-Harris matrices of the SAR image to be matched, the SAR-Harris corner response function of the SAR image to be matched, computing the corner response value of every pixel from it, and forming the corner response maps of the SAR image to be matched, thereby forming the SAR-Harris corner response space of the SAR image to be matched, which comprises n layers of corner response maps;
The SAR-Harris corner response function r_{sh}(x, y, α) is:
r_{sh}(x, y, α) = det(C_{sh}(x, y, α)) − d·trace(C_{sh}(x, y, α))²
where det denotes the matrix determinant, trace denotes the matrix trace, and d is the parameter of the SAR-Harris corner response function, with value 0.4; the exponential weighting parameter α takes the values α0, α1, …, αn−1 in turn, so that the SAR-Harris corner response space of the reference SAR image comprises n layers of corner response maps and the SAR-Harris corner response space of the SAR image to be matched comprises n layers of corner response maps; x denotes the horizontal coordinate of the pixel and y its vertical coordinate.
3. a kind of sar method for registering images based on improvement Laplce's many extreme values suppression according to claim 1, it is special Levy and be, step 2 specifically includes following sub-step:
(2a) the sar-harris angle point response space of benchmark sar image comprises n-layer angle point response diagram, sar image to be matched Sar-harris angle point response space comprises n-layer angle point response diagram, to the angle point in the n-layer angle point response diagram of benchmark sar image Slightly extracted, choose in every layer of angle point response diagram and meet pre-conditioned angle point;The n-layer angle point of sar image to be matched is rung The angle point answering in figure is slightly extracted, and chooses in every layer of angle point response diagram and meets pre-conditioned angle point;
Described pre-conditioned it is: rsh(x,y,α)>dsh
Wherein, rsh(x, y, α) represents sar-harris angle point response equation, dshFor predetermined threshold value, value is 0.01;X, y are respectively Represent abscissa and the vertical coordinate of image, α is exponential weighting parameter value;
(2b) meet each pre-conditioned angle point in every layer of angle point response diagram to benchmark sar image and sar image to be matched, Choosing 8 adjacent about angle points, forming the Square Neighborhood comprising nine angle points, if meeting the angle point of pre-conditioned angle point Response value is the maximum in the angle point response value of nine angle points in described Square Neighborhood, then retain this and meet pre-conditioned angle Point, otherwise abandons this and meets pre-conditioned angle point, thus obtaining the thick extraction angle of every layer of angle point response diagram in benchmark sar image The thick extraction angular coordinate of every layer of angle point response diagram in point coordinates and sar image to be matched;
(2c) every layer of angle point response diagram to benchmark sar image and sar image to be matched, corresponding according to every layer of angle point response diagram Exponential weighting parameter value α, determines the corresponding exponential weighting filter function of every layer of angle point response diagramAccording to institute State exponential weighting filter function corresponding angle point response diagram is filtered, obtain filtered angle point response diagram;
(2d) for each layer's filtered corner response map of the reference SAR image and each layer's filtered corner response map of the SAR image to be matched, performing the Laplacian transform with the improved Laplacian filter function to obtain the Laplacian value of each layer's corner response map and form the Laplacian map of each layer's corner response map; the n layers of corner response maps in the reference SAR image form the Laplacian scale space of the reference SAR image, and the n layers of corner response maps in the SAR image to be matched form the Laplacian scale space of the SAR image to be matched;
The improved Laplacian filter function has the following form:
(2e) for the coarsely extracted corner coordinates of each layer's corner response map in the reference SAR image and in the SAR image to be matched, re-locating the corner coordinates in the Laplacian scale space by the 24-extremum suppression method to obtain candidate corner coordinates; in the 24-extremum suppression method, a coarsely extracted corner coordinate is taken as the center point, and the Laplacian value of this center point is compared with the Laplacian values of the 20 surrounding points; if the Laplacian value of the center point is the maximum, it is taken as a candidate corner coordinate, otherwise the corner is discarded;
The 24-extremum suppression template is as follows:
0 1 1 1 0
1 1 1 1 1
1 1 1 1 1
1 1 1 1 1
0 1 1 1 0
(2f) comparing the Laplacian value of each candidate corner coordinate of each layer's corner response map in the reference SAR image with the Laplacian values at the corresponding coordinate positions in the corner response maps of the adjacent layers, and retaining the candidate corner coordinates with the maximum Laplacian value as the corner coordinates of the SAR-Harris corner response space of the reference SAR image;
comparing the Laplacian value of each candidate corner coordinate of each layer's corner response map in the SAR image to be matched with the Laplacian values at the corresponding coordinate positions in the corner response maps of the adjacent layers, and retaining the candidate corner coordinates with the maximum Laplacian value as the corner coordinates of the SAR-Harris corner response space of the SAR image to be matched.
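A minimal sketch of the coarse corner extraction of sub-steps (2a)-(2b) in claim 3: each pixel must exceed the threshold d_sh = 0.01 and be the maximum of its 3 × 3 (8-neighbor) square neighborhood. The corner response map of one layer is assumed to be given as a NumPy array; the treatment of border pixels is not specified by the claim and is simply skipped here.

import numpy as np

def coarse_corner_extraction(response, d_sh=0.01):
    """Return (row, col) coordinates of corners that exceed d_sh and are
    maxima of the nine values in their 3x3 square neighborhood."""
    rows, cols = response.shape
    corners = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            v = response[r, c]
            if v <= d_sh:
                continue                      # preset condition r_sh > d_sh
            window = response[r - 1:r + 2, c - 1:c + 2]
            if v >= window.max():             # maximum among the nine values
                corners.append((r, c))
    return corners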
4. The SAR image registration method based on improved Laplacian multi-extremum suppression according to claim 3, characterized in that the improved Laplacian filter function is:
where x and y denote the abscissa and ordinate of the image respectively.
5. The SAR image registration method based on improved Laplacian multi-extremum suppression according to claim 3, characterized in that the 24-extremum suppression template in the 24-extremum suppression method is applied as follows:
selecting a 5 × 5 pixel region centered on any coarsely extracted corner, setting the weights of the four pixels at the four corners of the 5 × 5 region to 0 and the weights of the other 21 pixels to 1, and checking whether the Laplacian value of the coarsely extracted corner is the maximum of the Laplacian values of all pixels in the 5 × 5 pixel region; if so, retaining the coarsely extracted corner as a candidate corner, otherwise discarding it.
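The 24-extremum suppression of claim 5 can be sketched as follows, assuming the Laplacian map of one layer and the coarsely extracted corner coordinates are available as inputs; the handling of corners close to the image border is an assumption not addressed by the claim.

import numpy as np

# 5x5 suppression template: the four corner weights are 0, the other 21 are 1.
TEMPLATE_24 = np.ones((5, 5))
TEMPLATE_24[0, 0] = TEMPLATE_24[0, 4] = TEMPLATE_24[4, 0] = TEMPLATE_24[4, 4] = 0

def suppress_24_extremum(laplacian_map, coarse_corners):
    """Keep a coarsely extracted corner only if its Laplacian value is the
    maximum over the 21 template positions of the 5x5 region centered on it."""
    rows, cols = laplacian_map.shape
    candidates = []
    for r, c in coarse_corners:
        if r < 2 or c < 2 or r > rows - 3 or c > cols - 3:
            continue  # skip corners too close to the border (unspecified in the claim)
        patch = laplacian_map[r - 2:r + 3, c - 2:c + 3]
        masked = np.where(TEMPLATE_24 == 1, patch, -np.inf)  # ignore the four corner pixels
        if laplacian_map[r, c] >= masked.max():
            candidates.append((r, c))
    return candidates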
6. The SAR image registration method based on improved Laplacian multi-extremum suppression according to claim 1, characterized in that step 3 specifically comprises the following sub-steps:
(3a) for the corner coordinates of the SAR-Harris corner response spaces of the reference SAR image and of the SAR image to be matched, computing the fractional part of each corner coordinate by the method of sub-pixel interpolation, by expanding the SAR-Harris corner response function r_sh(x, y, α) in a Taylor series to obtain:
where r_sh(x) denotes the Taylor expansion at pixel x, r_sh(y) denotes the Taylor expansion at pixel y, x0 denotes the initial coordinate of the corner, ∂r_sh/∂x and ∂²r_sh/∂x² denote respectively the first and second derivatives of the corner response function at pixel x, ∂r_sh/∂y and ∂²r_sh/∂y² denote respectively the first and second derivatives of the corner response function at pixel y, and (·)^T denotes the transpose operation;
thereby obtaining, for each corner coordinate, the offset Δx of the extremum position in the x direction and the offset Δy of the extremum position in the y direction:
Δx = −(∂²r_sh/∂x²)^(−1) · (∂r_sh/∂x),  Δy = −(∂²r_sh/∂y²)^(−1) · (∂r_sh/∂y)
where (·)^(−1) denotes the inversion operation;
(3b) rejecting corner coordinates whose offset Δx or offset Δy exceeds 1 pixel unit, and adding the corresponding offsets to the remaining corner coordinates to obtain corner coordinates containing fractional parts;
(3c) for each corner coordinate containing a fractional part, selecting the neighborhood of radius 3 × 1.5α around the corner coordinate, computing the gradient magnitude and gradient argument of all pixels in the region, and obtaining the gradient statistical histogram of all pixels within the neighborhood radius of the corner coordinate, where the horizontal axis of the gradient statistical histogram is the gradient argument and the vertical axis is the weighted accumulated sum of gradient magnitudes; here α is the exponential weighting parameter of the corner response map in which the corner coordinate is located;
The computing formula of the gradient argument is:
The computing formula of gradient magnitude is:
(3d) selecting the direction angle corresponding to the maximum peak of the gradient statistical histogram as the dominant orientation of the corresponding corner; if the gradient statistical histogram contains other peaks whose energy exceeds 80% of that of the maximum peak, also recording the corner together with the direction angles of those peaks; the dominant orientation of each corner is thus determined, and the corner feature vectors of the reference SAR image and the corner feature vectors of the SAR image to be matched are formed.
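A hedged sketch of the dominant-orientation selection of sub-steps (3c)-(3d): a magnitude-weighted histogram of gradient arguments is built, and the maximum peak plus any peak above 80% of it is kept. The number of histogram bins (36 here) and the operator used to produce the magnitude and argument arrays are assumptions; the claims do not fix them.

import numpy as np

def dominant_orientations(magnitude, argument, num_bins=36, peak_ratio=0.8):
    """Return the bin-center angles of the highest histogram peak and of any
    peak whose accumulated magnitude is at least peak_ratio of the maximum.

    magnitude, argument: 1-D arrays over the pixels of the corner neighborhood;
    arguments are assumed to be in radians.
    """
    bins = ((argument % (2 * np.pi)) / (2 * np.pi) * num_bins).astype(int) % num_bins
    hist = np.zeros(num_bins)
    np.add.at(hist, bins, magnitude)          # weighted accumulation per bin
    peak = hist.max()
    angles = []
    for i, h in enumerate(hist):
        if peak > 0 and h >= peak_ratio * peak:
            angles.append((i + 0.5) * 2 * np.pi / num_bins)
    return angles                              # dominant orientation(s) of the corner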
7. The SAR image registration method based on improved Laplacian multi-extremum suppression according to claim 4, characterized in that, in step 3, forming the corner feature vectors of the reference SAR image and the corner feature vectors of the SAR image to be matched specifically comprises:
(3e) after all corner coordinates and corresponding dominant orientations in the reference SAR image, and all corner coordinates and corresponding dominant orientations in the SAR image to be matched, have been determined, taking each corner as the center and selecting a corner neighborhood of a given size around the corner;
(3f) for each corner taken as the center, rotating its corresponding corner neighborhood clockwise by the dominant orientation θ, where the rotation acts as follows:
x′ = x1 · cos θ + y1 · sin θ
y′ = −x1 · sin θ + y1 · cos θ
where (x1, y1) is the corner coordinate before rotation and (x′, y′) is the corner coordinate after rotation;
(3g) dividing the rotated corner neighborhood into d1 × d1 sub-regions, i.e. into 16 sub-regions, computing the gradient magnitude and gradient argument of all pixels in each sub-region, and obtaining the pixel histogram of all pixels in that sub-region, where the horizontal axis of the pixel histogram is the gradient argument and the vertical axis is the weighted accumulated sum of gradient magnitudes; the pixel histogram of each sub-region contains 8 bins, each bin representing the corner amplitude over a 45° range, i.e. each pixel histogram holds amplitude values for 8 directions;
(3h) taking the amplitude values of the pixel histograms of the 16 sub-regions as the corner feature vector of the corresponding corner, thereby obtaining the corner feature vectors corresponding to all corners of the reference SAR image and the corner feature vectors corresponding to all corners of the SAR image to be matched.
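Sub-steps (3e)-(3h) amount to a 16 × 8 = 128-dimensional descriptor. The sketch below assumes the corner neighborhood has already been rotated by the dominant orientation, that the gradient arguments are expressed relative to that orientation, and that the magnitude and argument arrays are square with side divisible by d1 = 4; these assumptions, and the function name, are illustrative only.

import numpy as np

def corner_descriptor(magnitude, argument, d1=4, num_bins=8):
    """Concatenate the 8-bin (45 degrees per bin) magnitude-weighted histograms
    of the d1 x d1 sub-regions into a 128-dimensional corner feature vector."""
    size = magnitude.shape[0]
    step = size // d1                           # side length of one sub-region
    descriptor = []
    for i in range(d1):
        for j in range(d1):
            m = magnitude[i * step:(i + 1) * step, j * step:(j + 1) * step].ravel()
            a = argument[i * step:(i + 1) * step, j * step:(j + 1) * step].ravel()
            bins = ((a % (2 * np.pi)) / (2 * np.pi) * num_bins).astype(int) % num_bins
            hist = np.zeros(num_bins)
            np.add.at(hist, bins, m)            # 8 amplitude values, one per 45-degree range
            descriptor.append(hist)
    return np.concatenate(descriptor)           # 16 x 8 = 128 values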
8. The SAR image registration method based on improved Laplacian multi-extremum suppression according to claim 1, characterized in that step 4 specifically comprises the following sub-steps:
(4a) computing the Euclidean distances between the corner feature vector of a first corner in the reference SAR image and the corner feature vectors of all corners in the SAR image to be matched, and selecting the nearest-neighbor distance and the second-nearest-neighbor distance in the SAR image to be matched according to the Euclidean distances; if the ratio of the nearest-neighbor distance to the second-nearest-neighbor distance is less than a preset threshold, taking the corner in the SAR image to be matched corresponding to the nearest-neighbor distance and the first corner as a preliminary matching point pair, where the first corner is any corner in the reference SAR image; performing the computation with every corner in the reference SAR image as the first corner to obtain all preliminary matching point pairs from the reference SAR image to the SAR image to be matched; in the same manner, taking every corner in the SAR image to be matched as the first corner and computing the Euclidean distances between the corner feature vector of the first corner and the feature vectors of all corners in the reference SAR image, finally obtaining all preliminary matching point pairs from the SAR image to be matched to the reference SAR image;
(4b) extracting the preliminary matching point pairs that appear both among all preliminary matching point pairs from the reference SAR image to the SAR image to be matched and among all preliminary matching point pairs from the SAR image to be matched to the reference SAR image, and taking these identical pairs as the final matching point pairs between the reference SAR image and the SAR image to be matched.
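Sub-steps (4a)-(4b) describe a two-way nearest-neighbor ratio test whose intersection gives the final matching point pairs. A minimal sketch follows, assuming each image's corners are summarized as rows of a descriptor matrix; the 0.8 ratio threshold is an assumption, since the claim only requires a preset threshold, and at least two corners per image are assumed.

import numpy as np

def match_corners(desc_ref, desc_tbm, ratio_threshold=0.8):
    """Ratio-test matching in both directions, keeping only pairs found in both.

    desc_ref, desc_tbm: arrays of shape (num_corners, 128) for the reference
    image and the image to be matched.
    """
    def one_way(a, b):
        pairs = set()
        for i, v in enumerate(a):
            dist = np.linalg.norm(b - v, axis=1)   # Euclidean distances to all corners in b
            order = np.argsort(dist)
            nearest, second = dist[order[0]], dist[order[1]]
            if second > 0 and nearest / second < ratio_threshold:
                pairs.add((i, int(order[0])))
            # second == 0 means duplicate descriptors; skipped (not addressed by the claims)
        return pairs

    forward = one_way(desc_ref, desc_tbm)                          # reference -> to-be-matched
    backward = {(i, j) for (j, i) in one_way(desc_tbm, desc_ref)}  # to-be-matched -> reference
    return sorted(forward & backward)                              # final matching point pairs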
CN201610702232.0A 2016-08-22 2016-08-22 Improved Lapras multi-extremum inhibition-based SAR image registration method Pending CN106373147A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610702232.0A CN106373147A (en) 2016-08-22 2016-08-22 Improved Lapras multi-extremum inhibition-based SAR image registration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610702232.0A CN106373147A (en) 2016-08-22 2016-08-22 Improved Lapras multi-extremum inhibition-based SAR image registration method

Publications (1)

Publication Number Publication Date
CN106373147A true CN106373147A (en) 2017-02-01

Family

ID=57879214

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610702232.0A Pending CN106373147A (en) 2016-08-22 2016-08-22 Improved Lapras multi-extremum inhibition-based SAR image registration method

Country Status (1)

Country Link
CN (1) CN106373147A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080317383A1 (en) * 2005-12-22 2008-12-25 Koninklijke Philips Electronics, N.V. Adaptive Point-Based Elastic Image Registration
CN102722731A (en) * 2012-05-28 2012-10-10 南京航空航天大学 Efficient image matching method based on improved scale invariant feature transform (SIFT) algorithm
CN103310453A (en) * 2013-06-17 2013-09-18 北京理工大学 Rapid image registration method based on sub-image corner features
CN105631872A (en) * 2015-12-28 2016-06-01 西安电子科技大学 Remote sensing image registration method based on multiple feature points

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
FLORA DELLINGER et al.: "SAR-SIFT: A SIFT-Like Algorithm for Applications on SAR Images", 2012 IEEE International Geoscience and Remote Sensing Symposium *
FLORA DELLINGER et al.: "SAR-SIFT: A SIFT-Like Algorithm for SAR Images", IEEE Transactions on Geoscience and Remote Sensing *
XIAOPING YU et al.: "The Application of Improved SIFT Algorithm in High Resolution SAR Image Matching in Mountain Areas", 2011 International Symposium on Image and Data Fusion *
PAN Zi'ang: "Research on Image Matching Based on the SIFT Algorithm", China Master's Theses Full-text Database, Information Science and Technology Series *
WANG Mingyin et al.: "Research on a Laplacian-Based Image Sharpening Algorithm", Proceedings of the 16th Information Theory Annual Conference of the Chinese Institute of Electronics *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108230375A (en) * 2017-12-27 2018-06-29 南京理工大学 Visible images and SAR image registration method based on structural similarity fast robust
CN108230375B (en) * 2017-12-27 2022-03-22 南京理工大学 Registration method of visible light image and SAR image based on structural similarity rapid robustness
CN108304883A (en) * 2018-02-12 2018-07-20 西安电子科技大学 Based on the SAR image matching process for improving SIFT
CN108629343A (en) * 2018-04-28 2018-10-09 湖北民族学院 A kind of license plate locating method and system based on edge detection and improvement Harris Corner Detections
CN110531351A (en) * 2019-08-16 2019-12-03 山东工商学院 A kind of GPR image hyperbolic wave crest point detecting method based on Fast algorithm
CN110531351B (en) * 2019-08-16 2023-09-26 山东工商学院 GPR image hyperbolic wave top detection method based on Fast algorithm
CN113766209A (en) * 2020-05-29 2021-12-07 上海汉时信息科技有限公司 Camera offset processing method and device
CN113766209B (en) * 2020-05-29 2024-04-30 上海汉时信息科技有限公司 Camera offset processing method and device
CN117670958A (en) * 2024-01-31 2024-03-08 中国人民解放军国防科技大学 Registration method, device and equipment for SAR image of small aperture sequence

Similar Documents

Publication Publication Date Title
CN106373147A (en) Improved Lapras multi-extremum inhibition-based SAR image registration method
CN110992263B (en) Image stitching method and system
CN103426186B (en) A kind of SURF fast matching method of improvement
CN102800097B (en) The visible ray of multi-feature multi-level and infrared image high registration accuracy method
CN102800148B (en) RMB sequence number identification method
US11710307B2 (en) Urban remote sensing image scene classification method in consideration of spatial relationships
CN107808386A (en) A kind of sea horizon detection method based on image, semantic segmentation
CN106529591A (en) Improved MSER image matching algorithm
CN102819839B (en) High-precision registration method for multi-characteristic and multilevel infrared and hyperspectral images
CN105608667A (en) Method and device for panoramic stitching
CN102800099B (en) Multi-feature multi-level visible light and high-spectrum image high-precision registering method
CN112084869A (en) Compact quadrilateral representation-based building target detection method
CN112254656B (en) Stereoscopic vision three-dimensional displacement measurement method based on structural surface point characteristics
CN103955950B (en) Image tracking method utilizing key point feature matching
CN106204415A (en) A kind of novel method for registering images
CN108009551A (en) Suitable for the power knife switch division position state identification method of electric operating robot
CN106203261A (en) Unmanned vehicle field water based on SVM and SURF detection and tracking
CN104599288A (en) Skin color template based feature tracking method and device
CN109934857A (en) A kind of winding detection method based on convolutional neural networks Yu ORB feature
CN106447662A (en) Combined distance based FCM image segmentation algorithm
CN116228539A (en) Unmanned aerial vehicle remote sensing image stitching method
CN107766866A (en) Set direction profile testing method based on receptive field subregion
CN114581307A (en) Multi-image stitching method, system, device and medium for target tracking identification
CN113052110B (en) Three-dimensional interest point extraction method based on multi-view projection and deep learning
CN106934395B (en) Rigid body target tracking method adopting combination of SURF (speeded Up robust features) and color features

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20170201