CN104318548B - Rapid image registration implementation method based on space sparsity and SIFT feature extraction - Google Patents


Publication number
CN104318548B
CN104318548B (granted patent; application CN201410531222.6A)
Authority
CN
China
Prior art keywords
matched
image
matrix
point
sift feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201410531222.6A
Other languages
Chinese (zh)
Other versions
CN104318548A (en)
Inventor
钟桦
焦李成
王海明
王爽
侯彪
田小林
熊涛
刘红英
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201410531222.6A priority Critical patent/CN104318548B/en
Publication of CN104318548A publication Critical patent/CN104318548A/en
Application granted granted Critical
Publication of CN104318548B publication Critical patent/CN104318548B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/35 Determination of transform parameters for the alignment of images, i.e. image registration using statistical methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a rapid image registration method based on spatial sparsity and SIFT feature extraction, aimed mainly at the feature mismatching caused by the unstable feature points that the classical SIFT extraction algorithm frequently produces in smooth and textured regions. The method comprises the steps of 1) extracting the sparse regions of a reference image and an image to be matched, respectively; 2) extracting SIFT feature points from the sparse regions of the two images; 3) coarsely matching the two sets of extracted SIFT feature points; 4) filtering mismatches out of the coarse matching result with the random sample consensus (RANSAC) algorithm; and 5) registering the two SAR images with the affine transformation coefficients computed from the final matched point pairs of the reference image and the image to be matched. The method improves registration efficiency while preserving the accuracy of the classical SIFT registration algorithm, and can be applied to the registration of SAR images.

Description

A fast image registration method based on spatial sparsity and SIFT feature extraction
Technical field
The invention belongs to the technical field of image processing and relates to image registration, specifically a fast image registration method based on spatial sparsity and SIFT feature extraction. It can be used for the early-stage registration work in change detection, fusion, mosaicking, and similar processing of SAR images.
Background technology
Images formed by synthetic aperture radar (SAR) offer all-weather, day-and-night operation, high resolution, strong penetration capability, and similar characteristics. SAR imagery is therefore widely used in target recognition, change detection, and surface surveillance, and has increasingly become one of the most representative means of Earth observation. Before SAR images can be mosaicked, fused, or used for change detection, images of the same area acquired at the same time from different viewpoints, or by the same sensor at different times, must be spatially registered to eliminate the translation, rotation, scaling, and local deformation between images caused by differences in acquisition time, viewing angle, environment, and sensor imaging mechanism.
As one of the basic tasks of remote sensing image processing, registration is the process of geometrically aligning two or more images of the same area acquired at different times, from different viewpoints, or by different sensors. For SAR images, the speckle noise formed during coherent imaging disturbs the gray-level distribution; in addition, differences in imaging conditions, shooting time and seasonal change, and changes of the ground objects in the photographed scene give the images a complicated and unstable intensity distribution, so that algorithms and techniques successfully applied to optical remote sensing images repeatedly fail when applied to SAR images.
At present, a large number of registration methods have been proposed. According to the feature space used, registration algorithms can generally be divided into two classes: gray-level-based methods and feature-based methods. Gray-level-based methods typically use the intensity information of the whole image directly: a similarity measure between pixels is set up to weigh how well the ground-surface reflectance attributes of the overlapping parts of the two images agree, and the translation, rotation, scaling, and other transformation parameters of the best match are then searched for. The similarity measures of gray-level-based image registration include the cross-correlation function, Fourier-based functions, the mutual information function, and others; the transformation corresponding to the maximum similarity is applied to finally realize the registration between the images. When gray-level-based methods are used to register SAR images, they are easily affected by the gray-level differences and the noise introduced in the SAR imaging process. Feature-based image registration methods detect salient and prominent targets selected from the images, extract the features of these feature points, establish the transformation relation between the images through the matching between these features, and finally realize image registration. Feature-based methods can solve the registration problems between images with larger geometric distortion and gray-level differences. Common image features include feature points (corners, high-curvature points, inflection points, etc.), straight-line segments, edges, contours, closed regions, and feature structures. A feature-based registration method first extracts feature points, edges, and the like from the images to be registered, then searches for matches between corresponding features or feature sets, and from them derives the image transformation parameters. Registration methods based on edge features need edge-detection operators to extract edges and then match them, but the extracted edges are often too fragmented to represent the structural features of the image well, degrading the accuracy and speed of subsequent matching. Point features, compared with line and surface features, are easy to extract, fast, accurate, and stable, and therefore enjoy wider application.
Feature-based image registration is more resistant to noise and deformation and therefore more robust. One of the most commonly used feature-based registration methods is registration based on SIFT feature extraction. Lowe summarized the main steps of SIFT as: (1) scale-space extremum detection; (2) keypoint localization; (3) orientation assignment; (4) keypoint description. SIFT has strong rotation and scale invariance for smaller rotation angles and, after normalization, is also robust to illumination change. However, because SIFT describes each feature point with a 128-dimensional vector, the computational cost and time complexity grow when feature points are numerous, and finding the best match requires good discrimination between features. When SAR images are processed with the SIFT feature extraction algorithm, many SIFT feature points are extracted not only in the regions of sparse structures such as straight lines, polylines, and intersections, but also in smooth and textured regions where features are not distinctive. Around the feature points of those smooth and textured regions there are many similar points, so these points consume more time, and their indistinct features easily produce mismatches.
Content of the invention
The object of the invention is to overcome the shortcomings of the prior art by proposing a SAR image registration method that combines the spatial sparsity structure with SIFT feature extraction, so that SIFT feature points are extracted only in the sparse-structure regions of the SAR images, lowering the mismatch rate while reducing the computational cost and running time of the algorithm and improving the efficiency of image registration.
The technical idea for realizing the object of the invention is: input the reference image and the image to be matched; obtain the sparse region of each image; extract SIFT feature points from the sparse regions of the reference image and the image to be matched; take the nearest-neighbor to second-nearest-neighbor distance ratio (NNDR) criterion as the similarity measure to obtain coarse matching pairs; remove mismatches with the random sample consensus (RANSAC) method; and finally solve the affine transformation parameters by affine transformation, thereby registering the reference image and the image to be matched.
The technical scheme of the invention is a fast image registration method based on spatial sparsity and SIFT feature extraction, comprising the following steps:
(1) Input two multi-temporal SAR images of the same area acquired at different times, denoted the reference image and the image to be matched respectively; compute the sparsity of each pixel of the reference image and the image to be matched, and select the regions where the pixels of high sparsity are located as the sparse regions, denoted Mask_R and Mask_S respectively.
(2) Extract SIFT feature points from the sparse region Mask_R of the reference image and the sparse region Mask_S of the image to be matched, obtaining the SIFT feature point sets of the two images, denoted {R_m} and {S_n} respectively;
(3) Match the SIFT feature point sets {R_m} and {S_n} extracted from the reference image and the image to be matched, obtaining the coarse matching pairs {(pq)_i}; the coarse matched point sets in the reference image and the image to be matched are {p_i} and {q_i} respectively, where i is the index of a matching pair;
(4) Remove mismatched pairs from the coarse matching result with the random sample consensus (RANSAC) algorithm; the set of feature point pairs remaining in the reference image and the image to be matched after mismatch removal is denoted {(PQ)_i}, and the retained feature points in the two images are {P_i} and {Q_i} respectively, where i is the index of a matching pair;
(5) Using the matched point sets {P_i} and {Q_i} of the reference image and the image to be matched, obtain the affine transformation parameters by affine transformation, and finally register the two images according to these parameters.
Step (1) above comprises the following steps:
1) Apply a two-level wavelet decomposition to the reference image R and the image to be matched S respectively, with the db1 wavelet basis, obtaining the detail coefficient maps of the two images, denoted R_db1 and S_db1;
2) Compute the coefficient of variation for each pixel of the reference detail map and the to-be-matched detail map: taking a neighborhood of size m × m centered on the current pixel, compute the variance, which gives the coefficient of variation CV of the current pixel. Set a threshold T_cv that is linear in the coefficient of variation of the SAR image, T_cv = β·CV, where β is a linear coefficient. Thresholding yields the binarized coefficient-of-variation matrix CV_mask:

CV_mask(i, j) = 1 if CV(i, j) > T_cv, and 0 otherwise

The binarized coefficient-of-variation matrices of the reference image and the image to be matched are computed in this way and denoted CV_mask_R and CV_mask_S;
3) Thin the binarized coefficient-of-variation matrix of the reference image to obtain the thinned matrix T_R, and likewise thin the coefficient-of-variation matrix of the image to be matched to obtain the thinned matrix T_S;
4) Compute the sparsity matrices of the reference image and the image to be matched;
5) Select the top 20% of pixels with the largest sparsity values in the sparsity matrix as the sparse points, and dilate the sparse points to obtain the sparse region.
Step 4) above comprises the following steps:
(a) Compute the gradient magnitude and direction of each pixel in the thinned matrix;
(b) Analyze the gradient direction of each pixel and count the direction features of the current pixel, obtaining the direction feature matrices of the reference image and the image to be matched, denoted D_R and D_S respectively;
(c) From the thinned matrix and the direction feature matrix, compute the sparsity of the reference image and the image to be matched. The sparsity is defined as:

S(i, j) = (1/N) · Σ_{(x,y)∈Ω} D(x, y) · T(x, y)

where (i, j) is the index of the current pixel, Ω is the r × r neighborhood centered on the current pixel, N is the total number of pixels in the neighborhood Ω, D is the direction feature matrix, and T is the thinned matrix. This yields the sparsity matrices S_R and S_S of the reference image and the image to be matched.
Step (b) above comprises the following steps:
a) Construct 16 direction templates of size (2r+1) × (2r+1); the angular range of the i-th sub-direction of the template is [(i−1)·22.5°, i·22.5°), i ∈ [1,16], where r is the direction template radius;
b) Analyze the direction feature of each thinned edge point in the thinned matrix with the 16 direction templates;
c) Normalize the direction feature matrices of the reference image and the image to be matched to obtain the direction feature matrices D_R and D_S.
Beneficial effects of the invention: the invention uses the SAR coefficient of variation to discriminate the smooth regions, textured regions, and sparse-structure regions of the image. It excludes the smooth and textured regions whose indistinct features easily cause mismatches, extracts SIFT feature points and performs feature matching only in the sparse regions where high-sparsity pixels such as polyline corners and intersections are located, and realizes image registration by solving the affine transformation parameters. Compared with the prior art, the invention has the following advantages:
1. While the invention uses the classical SIFT operator, the Gaussian blurring applied to the image while building the Gaussian scale space during SIFT feature extraction also suppresses SAR speckle noise, improving the distinguishability of each pixel within its neighborhood and reducing mismatches to a certain extent.
2. The invention designs a sparsity definition method that can accurately identify the sparse regions with distinct features such as polylines, corners, and intersections, and exclude the low-sparsity, feature-indistinct non-sparse regions such as smooth and textured areas.
3. The invention extracts SIFT feature points only from the sparse regions of the reference image and the image to be matched, excluding the low-sparsity, feature-indistinct non-sparse regions such as smooth and textured areas, which reduces the mismatch probability and makes the algorithm faster.
4. The invention is simple to implement and clear in structure, demands little overlap between the reference image and the image to be matched, and the SIFT operator retains high registration accuracy when there is rotation, scale change, or changed regions.
Brief description
Fig. 1 is the flowchart of the present invention;
Fig. 2 is the sub-flowchart of sparse region computation in the present invention;
Fig. 3 shows the two SAR images used as experimental input, Fig. 3(a) the reference image and Fig. 3(b) the image to be matched;
Fig. 4 shows the sparse regions Mask_A and Mask_B of the reference image and the image to be matched;
Fig. 5 shows the distribution of feature points extracted from the reference image and the image to be matched by the classical SIFT algorithm;
Fig. 6 shows the distribution of SIFT feature points extracted from the sparse regions Mask_A and Mask_B;
Fig. 7 shows the registration result of the reference image and the image to be matched obtained from the affine transformation parameters, displayed with the checkerboard method.
Specific embodiment
With reference to Fig. 1, the concrete implementation steps of the invention are as follows:
Step 1: input two multi-temporal SAR images R and S of the same area acquired at different times. For ease of description, image R is called the reference image and image S the image to be matched. Compute the sparse regions Mask_R and Mask_S of the two images respectively. With reference to Fig. 2, the concrete steps are as follows:
(1a) Apply a two-level wavelet decomposition to the reference image R and the image to be matched S respectively, with the db1 wavelet basis; the detail coefficient maps of the two images are denoted R_db1 and S_db1.
(1b) Compute the coefficient of variation CV for each pixel of the reference detail map R_db1 and the to-be-matched detail map S_db1. CV is a measure of the dispersion of a probability distribution, defined as the ratio of the standard deviation to the mean:

CV = σ_x / μ_x

where σ_x is the standard deviation and μ_x the mean of the r × r neighborhood centered on the current pixel. Then, taking a neighborhood of size m × m centered on the current pixel, compute the variance, which gives the coefficient of variation CV of the current pixel; in the present example r = 3 and m = 5. In this way the coefficient-of-variation matrix CV_R of the reference detail map R_db1 is obtained, and likewise the matrix CV_S of the to-be-matched detail map S_db1. Set a threshold T_cv related to the coefficient of variation CV of the SAR image (CV = 1/sqrt(L) for an intensity SAR image and CV = 0.523/sqrt(L) for an amplitude SAR image, where L is the number of looks of the SAR image); by common experience the threshold is taken linear in the coefficient of variation, i.e. T_cv = β·CV, and in the invention the coefficient β is set to 0.55. Thresholding yields the binarized coefficient-of-variation matrix CV_mask:

CV_mask(i, j) = 1 if CV(i, j) > T_cv, and 0 otherwise

The binarized coefficient-of-variation matrices of the reference image and the image to be matched are computed in this way and denoted CV_mask_R and CV_mask_S.
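As a hedged illustration of step (1b), the sliding-window coefficient of variation and its binarization can be sketched in NumPy alone. The window size m = 5 and coefficient β = 0.55 follow the values above, while the intensity-SAR speckle value CV = 1/sqrt(L) and the number of looks used here are assumptions for the demonstration, not values taken from the patent's experiments:

```python
import numpy as np

def coefficient_of_variation(img, m=5):
    """Per-pixel CV = sigma/mu over an m x m window (edge-padded borders)."""
    pad = m // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    # gather all m*m shifted views and reduce over them (simple O(m^2) passes)
    stack = np.stack([p[i:i + img.shape[0], j:j + img.shape[1]]
                      for i in range(m) for j in range(m)])
    mu = stack.mean(axis=0)
    sigma = stack.std(axis=0)
    return np.where(mu > 0, sigma / mu, 0.0)

def binarize_cv(cv, n_looks=4, beta=0.55):
    """CV_mask = 1 where CV exceeds T_cv = beta * (1/sqrt(L)), intensity SAR.
    n_looks is an assumed value for the demonstration."""
    t_cv = beta * (1.0 / np.sqrt(n_looks))
    return (cv > t_cv).astype(np.uint8)
```

On a homogeneous patch the CV stays at zero and the mask is empty; a strong step edge pushes the CV above the threshold, which is the behavior the binarization is meant to capture.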
(1c) Thin the reference matrix CV_mask_R to obtain the thinned matrix T_R, and thin the to-be-matched matrix CV_mask_S to obtain the thinned matrix T_S.
(1d) Compute the sparsity matrices of the reference image and the image to be matched respectively. The concrete steps for computing a sparsity matrix are as follows:
1d1) Compute the gradient magnitude and direction of each pixel in the thinned matrix:

m(x, y) = sqrt( (d(x+1, y) − d(x−1, y))² + (d(x, y+1) − d(x, y−1))² )
k(x, y) = arctan( (d(x, y+1) − d(x, y−1)) / (d(x+1, y) − d(x−1, y)) )

where (x, y) is the index of the current pixel in the thinned matrix, d is the gray value of a pixel in the thinned matrix, m(x, y) is the gradient magnitude, and k(x, y) is the gradient direction.
1d2) Construct 16 direction templates of size 15 × 15; the angular range of the i-th sub-direction of the template is [(i−1)·22.5°, i·22.5°), i ∈ [1,16]. The total number of pixels on each sub-direction of the template is N, and the direction template radius is r (r = 7 for a 15 × 15 template).
1d3) Analyze the direction feature of each thinned edge point of the thinned matrix with the 16 direction templates. In the thinned matrix, take a 15 × 15 neighborhood centered on the current thinned edge point and analyze the gradient direction of each pixel on each sub-direction, counting the number s_i of pixels whose gradient direction agrees with that sub-direction of the template; from these statistics the normalized direction feature vector of the current thinned edge point is obtained.
1d4) Compute the direction feature matrices of the reference image and the image to be matched. Set a threshold T_hist, determined by the total number N of pixels on each sub-direction of the template, the template radius r, and a threshold coefficient β, which is set to 1.8 in the invention. Analyze the direction feature of each thinned edge point: if h_i ≥ T_hist, that sub-direction is one of the principal directions. The number of principal directions of each edge point serves as its direction feature, giving the direction feature matrices D_R and D_S of the reference image and the image to be matched.
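The direction-feature counting of steps 1d2) to 1d4) can be sketched as follows. Since the text does not reproduce the exact threshold formula, this version substitutes an assumed fraction of the edge pixels in the window (the `frac` parameter) for the β-based threshold, and quantizes gradient directions directly into 16 bins of 22.5° rather than using explicit direction templates:

```python
import numpy as np

def direction_feature_matrix(thin, grad_dir, r=7, n_bins=16, frac=0.2):
    """For each thinned edge point, histogram the gradient directions of the
    edge pixels in a (2r+1)^2 window into 16 sub-directions and count how
    many bins exceed a threshold; that count is the point's direction
    feature. The threshold (frac of the edge pixels in the window) is an
    assumption standing in for the patent's beta-based rule."""
    h, w = thin.shape
    bins = (grad_dir % 360.0 / (360.0 / n_bins)).astype(int) % n_bins
    D = np.zeros((h, w), dtype=int)
    ys, xs = np.nonzero(thin)
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        x0, x1 = max(0, x - r), min(w, x + r + 1)
        win_t = thin[y0:y1, x0:x1]
        win_b = bins[y0:y1, x0:x1][win_t > 0]
        if win_b.size == 0:
            continue
        hist = np.bincount(win_b, minlength=n_bins)
        D[y, x] = int((hist >= frac * win_b.size).sum())
    return D
```

A single straight line whose gradients all point one way yields a direction feature of 1 along the line, which matches the intent: few principal directions means a simple, non-sparse structure, while corners and intersections accumulate several.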
1d5) Compute the sparsity of the reference image and the image to be matched, defined as the sparsity matrix:

S(i, j) = (1/N) · Σ_{(x,y)∈Ω} D(x, y) · T(x, y)

where (i, j) is the index of the current pixel, Ω is the r × r neighborhood centered on the current pixel, N is the total number of pixels in the neighborhood Ω, D is the direction feature matrix, and T is the thinned matrix. This yields the sparsity matrices S_R and S_S of the reference image and the image to be matched.
(1e) Select the top 20% of pixels with the largest sparsity values in the sparsity matrix as the sparse points; dilating the sparse points yields the sparse region.
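Steps 1d5) and (1e) can be sketched together: a neighborhood average of D·T as the sparsity (a reconstruction of the definition above, whose formula image is not reproduced in the text), then selection of the top 20% of nonzero sparsity values followed by a simple 4-connected binary dilation. The function names and the dilation structuring element are illustrative choices, not the patent's:

```python
import numpy as np

def sparsity_matrix(D, T, r=3):
    """S(i, j) = sum of D*T over the r-neighborhood, divided by the
    neighborhood pixel count N = (2r+1)^2 (reconstructed definition)."""
    h, w = D.shape
    prod = (D * T).astype(float)
    N = (2 * r + 1) ** 2
    S = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            S[y, x] = prod[max(0, y - r):y + r + 1,
                           max(0, x - r):x + r + 1].sum() / N
    return S

def sparse_region(S, top=0.2, dilate=1):
    """Keep the top 20% of nonzero sparsity values, then dilate with a
    4-connected (plus-shaped) structuring element."""
    thresh = np.quantile(S[S > 0], 1.0 - top) if (S > 0).any() else np.inf
    out = (S >= thresh) & (S > 0)
    for _ in range(dilate):
        grown = out.copy()
        grown[1:, :] |= out[:-1, :]
        grown[:-1, :] |= out[1:, :]
        grown[:, 1:] |= out[:, :-1]
        grown[:, :-1] |= out[:, 1:]
        out = grown
    return out.astype(np.uint8)
```

The dilation mirrors the "sparse expansion" of step (1e): isolated high-sparsity points grow into small regions so that SIFT has some surrounding context to work with.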
Step 2: extract SIFT feature points from the sparse regions of the reference image R and the image to be matched S respectively. The concrete steps of feature point extraction are as follows:
(2a) Build the Gaussian scale space. The scale space of a two-dimensional image at different scales is obtained by convolving the original image with a Gaussian kernel:

L(x, y, σ) = G(x, y, σ) * I(x, y)

where L(x, y, σ) is the scale space of the image, I(x, y) is the original image, (x, y) is the index of the current pixel of the input image, * denotes convolution, G(x, y, σ) is the variable-scale Gaussian kernel, and σ is the scale-space factor. The kernel is defined as:

G(x, y, σ) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))
The scale space is sampled discretely: the image is convolved with Gaussian kernels at different multiples of σ and downsampled, generating a series of Gaussian images called the Gaussian pyramid (LoG). The pyramid is divided into O octaves, each containing S+3 scale layers; the invention takes O = 4 and S = 2.
Each image of the first octave of the Gaussian scale space is computed first: the input image I_0 is convolved with Gaussian kernels of different sizes (σ_i = k^i·σ) to obtain the output images I_i, i = 0, 1, 2, 3, 4, which form the first octave of the Gaussian scale space.
Each layer of the second octave of the Gaussian pyramid is 1/4 the size of the first-octave images; its input image is obtained by downsampling the S-th layer of the first octave with a sampling rate of 2. With S = 2, the input image I'_0 of the second octave is obtained by downsampling the first-octave image I_1; it is then convolved with Gaussian kernels of different sizes (σ_i = k^i·σ) to obtain the outputs I'_i, i = 0, 1, 2, 3, 4, which form the second octave of the Gaussian scale space. The third and fourth octaves are obtained in the same way, generating the whole Gaussian scale space.
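The octave structure of step (2a) can be sketched with a separable Gaussian blur in NumPy. Here σ_0 = 1.6 is an assumed base scale (the patent does not state one), k = 2^(1/S), and each octave holds S + 3 layers with the next octave built from layer S downsampled by a factor of 2:

```python
import numpy as np

def gaussian_kernel1d(sigma, radius=None):
    """Normalized 1-D Gaussian kernel truncated at ~3 sigma."""
    radius = radius or int(3 * sigma + 0.5)
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x * x / (2 * sigma * sigma))
    return k / k.sum()

def blur(img, sigma):
    """Separable Gaussian blur with edge padding (rows, then columns)."""
    k = gaussian_kernel1d(sigma)
    r = len(k) // 2
    p = np.pad(img, ((r, r), (0, 0)), mode="edge")
    img = sum(k[i] * p[i:i + img.shape[0], :] for i in range(len(k)))
    p = np.pad(img, ((0, 0), (r, r)), mode="edge")
    return sum(k[i] * p[:, i:i + img.shape[1]] for i in range(len(k)))

def gaussian_pyramid(img, n_octaves=4, s=2, sigma0=1.6):
    """O octaves of s+3 layers each; layer i uses sigma0 * k**i with
    k = 2**(1/s). Each new octave starts from layer s of the previous
    one, downsampled by 2 (1/4 the area, as described above)."""
    k = 2.0 ** (1.0 / s)
    pyramid = []
    base = img.astype(float)
    for _ in range(n_octaves):
        octave = [blur(base, sigma0 * k ** i) for i in range(s + 3)]
        pyramid.append(octave)
        base = octave[s][::2, ::2]   # downsample layer s by factor 2
    return pyramid
```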
(2b) Build the difference-of-Gaussian scale space (DoG). Subtracting the scale-space functions of adjacent layers of the Gaussian pyramid gives the difference-of-Gaussian pyramid, denoted D(x, y, σ):

D(x, y, σ) = (G(x, y, kσ) − G(x, y, σ)) * I(x, y) = L(x, y, kσ) − L(x, y, σ)

where (x, y) is the current pixel of the input image and I(x, y) is the image to be processed; that is, the difference-of-Gaussian scale space D(x, y, σ) is obtained by subtracting two adjacent Gaussian scale-space layers separated by the constant factor k.
(2c) Extremum detection. Compare each interior pixel of the difference-of-Gaussian scale space (DoG) with its 8 neighboring pixels in the same layer and the 9 pixels in each of the layers above and below, 26 pixels in total; if the current pixel is a maximum or a minimum, it is marked as a keypoint.
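The 26-neighbour comparison of step (2c) translates directly into code; this brute-force sketch marks strict maxima and minima only and skips borders:

```python
import numpy as np

def dog_extrema(dog):
    """Find pixels that are strict maxima or minima among their 26
    neighbours across three consecutive DoG layers."""
    keypoints = []
    for s in range(1, len(dog) - 1):
        below, cur, above = dog[s - 1], dog[s], dog[s + 1]
        h, w = cur.shape
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                v = cur[y, x]
                cube = np.stack([below[y - 1:y + 2, x - 1:x + 2],
                                 cur[y - 1:y + 2, x - 1:x + 2],
                                 above[y - 1:y + 2, x - 1:x + 2]])
                if v == cube.max() and (cube == v).sum() == 1:
                    keypoints.append((y, x, s, "max"))
                elif v == cube.min() and (cube == v).sum() == 1:
                    keypoints.append((y, x, s, "min"))
    return keypoints
```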
(2d) Accurate localization of keypoints. Because edges and noise strongly affect the difference-of-Gaussian scale space (DoG), the keypoints detected in the previous step must be screened further, accurately locating the scale-invariant feature points and filtering out low-contrast points and edge-response points to improve the robustness and stability of feature matching. To determine the scale and position of a feature point accurately, a three-dimensional quadratic function must be fitted to each detected local extremum. At a local extremum (x_0, y_0, σ), the Taylor expansion of the scale-space function D(x, y, σ) is:

D(X) = D(X_0) + (∂D/∂X)ᵀ X + (1/2) Xᵀ (∂²D/∂X²) X

where X = (x, y, σ)ᵀ represents the position and scale offset of the feature point and X_0 represents the position and scale of the candidate feature point; the first- and second-order partial derivatives are approximated by finite differences of neighboring sample points.
Setting the derivative of this fitted function to zero gives the extremum offset:

X̂ = −(∂²D/∂X²)⁻¹ (∂D/∂X)

Substituting X̂ back into the scale-space function D(x, y, σ) gives:

D(X̂) = D(X_0) + (1/2) (∂D/∂X)ᵀ X̂
If |D(X̂)| falls below a contrast threshold (0.03 in Lowe's paper), the contrast of the point is considered too low and the point is rejected.
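The quadratic fit of step (2d) can be sketched with finite differences: the gradient g and Hessian H of D(x, y, σ) give the offset X̂ = −H⁻¹g and the refined value D(X̂) = D + ½ gᵀX̂. The 0.03 contrast threshold is Lowe's published value, assumed here rather than taken from the patent:

```python
import numpy as np

def refine_extremum(dog, y, x, s):
    """3-D quadratic fit around a candidate extremum using unit-spaced
    central differences; H is assumed invertible here."""
    D = lambda ds, dy, dx: dog[s + ds][y + dy, x + dx]
    g = 0.5 * np.array([D(0, 0, 1) - D(0, 0, -1),
                        D(0, 1, 0) - D(0, -1, 0),
                        D(1, 0, 0) - D(-1, 0, 0)])
    dxx = D(0, 0, 1) - 2 * D(0, 0, 0) + D(0, 0, -1)
    dyy = D(0, 1, 0) - 2 * D(0, 0, 0) + D(0, -1, 0)
    dss = D(1, 0, 0) - 2 * D(0, 0, 0) + D(-1, 0, 0)
    dxy = 0.25 * (D(0, 1, 1) - D(0, 1, -1) - D(0, -1, 1) + D(0, -1, -1))
    dxs = 0.25 * (D(1, 0, 1) - D(1, 0, -1) - D(-1, 0, 1) + D(-1, 0, -1))
    dys = 0.25 * (D(1, 1, 0) - D(1, -1, 0) - D(-1, 1, 0) + D(-1, -1, 0))
    H = np.array([[dxx, dxy, dxs], [dxy, dyy, dys], [dxs, dys, dss]])
    xhat = -np.linalg.solve(H, g)          # offset of the true extremum
    value = D(0, 0, 0) + 0.5 * g @ xhat    # refined extremum value
    return xhat, value

def keep_keypoint(value, contrast_thresh=0.03):
    """Reject low-contrast points: |D(xhat)| below the threshold."""
    return abs(value) >= contrast_thresh
```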
An important step in obtaining stable and accurate feature points is filtering out the edge responses. The following criterion is used: at the edge intersections and peaks of the difference-of-Gaussian function D(x, y, σ), a feature point lying on an image edge has one large principal curvature across the edge but a small principal curvature in the perpendicular direction. The principal curvatures can be obtained from the 2 × 2 Hessian matrix H:

H = [ D_xx  D_xy ; D_xy  D_yy ]

Because the eigenvalues of H are proportional to the principal curvatures of D, only their ratio is needed rather than the eigenvalues themselves. Let α and β be the maximum and minimum eigenvalues and γ = α/β. The sum of the main diagonal elements of H is Tr(H) = D_xx + D_yy = α + β, and the determinant of H is Det(H) = D_xx·D_yy − (D_xy)² = αβ. It therefore suffices to check

Tr(H)² / Det(H) < (γ + 1)² / γ

where γ = 10.
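The edge-response test reduces to a few finite differences; this sketch accepts a point only when Tr(H)²/Det(H) < (γ+1)²/γ with γ = 10 as stated above:

```python
import numpy as np

def passes_edge_test(dog_layer, y, x, gamma=10.0):
    """Hessian-ratio edge test: with H = [[Dxx, Dxy], [Dxy, Dyy]],
    accept the keypoint only if Tr(H)^2 / Det(H) < (gamma+1)^2 / gamma.
    Points on edges (one large, one small curvature) fail the test."""
    d = dog_layer
    dxx = d[y, x + 1] - 2 * d[y, x] + d[y, x - 1]
    dyy = d[y + 1, x] - 2 * d[y, x] + d[y - 1, x]
    dxy = 0.25 * (d[y + 1, x + 1] - d[y + 1, x - 1]
                  - d[y - 1, x + 1] + d[y - 1, x - 1])
    tr = dxx + dyy
    det = dxx * dyy - dxy * dxy
    if det <= 0:               # curvatures of opposite sign: reject
        return False
    return tr * tr / det < (gamma + 1.0) ** 2 / gamma
```

An isotropic blob (both curvatures comparable) passes; a straight ridge (one curvature near zero) fails, which is exactly the edge case the criterion removes.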
(2e) Generate the feature descriptor. To guarantee the rotation invariance of a feature point, the principal orientation of the point must be computed from the gradient distribution of the pixels in its surrounding neighborhood. The gradient magnitude and orientation of each pixel are computed as:

m(x, y) = sqrt( (L(x+1, y) − L(x−1, y))² + (L(x, y+1) − L(x, y−1))² )
θ(x, y) = arctan( (L(x, y+1) − L(x, y−1)) / (L(x+1, y) − L(x−1, y)) )

where m(x, y) is the gradient magnitude of the feature point at (x, y), θ(x, y) its orientation, and L(x, y) the scale of the feature point in the pyramid. In the neighborhood centered on the current feature point, the gradient orientations of the neighborhood pixels are accumulated in a gradient orientation histogram covering [0°, 360°), with one bin every 10°, i.e. 36 bins in all. The peak of the gradient orientation histogram represents the dominant gradient orientation of the neighborhood at the feature point and is taken as the principal orientation of the feature point. If the histogram contains a peak exceeding 80% of the main peak, the orientation corresponding to that peak is taken as an auxiliary orientation; each feature point thus has one principal orientation and possibly several auxiliary orientations.
Once the principal orientation of the feature point is determined, the feature descriptor can be generated. A 16 × 16 window is taken centered on the feature point, and an 8-direction gradient orientation histogram is computed on each 4 × 4 image sub-block, accumulating the value of each gradient direction. Gradient information is collected for each 4 × 4 sub-block, and the 16 × 16 neighborhood yields 4 × 4 sub-blocks, so the feature descriptor has 4 × 4 × 8 = 128 dimensions in total. The descriptor is composed of the gradient orientation histograms of all sub-blocks, finally forming the 128-dimensional SIFT feature vector.
Step 3: match the SIFT feature points of the reference image and the image to be matched, then filter the mismatched pairs out of the matching result with the random sample consensus (RANSAC) method. The concrete steps are as follows:
(3a) carry out Feature Points Matching to reference to the SIFT feature point that figure and figure sparse region to be matched extract, with nearest Neighborhood distance to be realized with secondary nearest neighbor distance ratio (NNDR) criterion.Assume that A is certain characteristic point with reference in figure, DAIts corresponding spy Levy vector.It is characterized point B and time neighbour (Euclidean distance time in the arest neighbors (Euclidean distance is minimum) of in figure characteristic point A subject to registration Minimum) it is characterized point C, their corresponding characteristic vectors are DBAnd DC, nearest neighbor distance is fixed with secondary nearest neighbor distance ratio (NNDR) criterion Justice is:
where T_dis is the distance-ratio threshold, D_A is the 128-dimensional descriptor of the current feature point A in the reference image, D_B is the descriptor of B, the point in the image to be matched most similar to A, and D_C is the descriptor of C, the second most similar point. When the ratio of the nearest-neighbour Euclidean distance to the second-nearest-neighbour Euclidean distance is below the threshold T_dis, the feature-point pair corresponding to the nearest neighbour is accepted as a matched pair. A larger threshold value makes the matching requirement looser and increases the probability of mismatched points; a smaller value makes it stricter, reducing mismatches but also discarding some correct matches. The threshold should therefore be set according to the type of images being registered. After matching, the matched point sets in the reference image and the image to be matched are denoted {p_i} and {q_i} respectively, giving the coarse match set {(pq)_i}, where i indexes the matched pairs.
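As a concrete sketch, the ratio test can be implemented with brute-force nearest-neighbour search. The function name is illustrative, and the acceptance direction (match when the ratio is below T_dis, as in Lowe's original formulation) is assumed:

```python
import numpy as np

def nndr_match(desc_ref, desc_tgt, t_dis=0.75):
    """For each reference descriptor, accept its nearest neighbour in the
    target set only when d(nearest) / d(second nearest) < t_dis."""
    matches = []
    for i, d in enumerate(desc_ref):
        dists = np.linalg.norm(desc_tgt - d, axis=1)
        j1, j2 = np.argsort(dists)[:2]
        if dists[j1] / (dists[j2] + 1e-12) < t_dis:
            matches.append((i, j1))
    return matches
```

For two well-separated clusters of descriptors the nearest neighbour is unambiguous and both points match; when the two nearest distances are similar, the pair is rejected as ambiguous.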
(3b) Because of cluttered background, noise and similar effects, the coarse SIFT match set obtained in the previous step usually contains many wrong pairs, which would introduce errors into the geometric correction of the image. The random sample consensus (RANSAC) method is used to remove the erroneous pairs from the coarse matching result; after removal, the remaining matched point sets in the reference image and the image to be matched are denoted {P(x_i, y_i)} and {Q(x'_i, y'_i)} respectively, where i indexes the matched pairs.
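The text does not spell RANSAC out, so the sketch below shows the standard hypothesize-and-verify loop for an affine model; the iteration count, inlier tolerance and function names are illustrative assumptions:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine fit: dst ~ [src | 1] @ M.T, M a 2x3 matrix."""
    X = np.hstack([src, np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return M.T

def ransac_affine(src, dst, iters=200, tol=1.0, seed=0):
    """Repeatedly fit an affine map to 3 random pairs, keep the consensus set
    with the most inliers, then refit on all inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    ones = np.ones((len(src), 1))
    for _ in range(iters):
        idx = rng.choice(len(src), 3, replace=False)
        M = fit_affine(src[idx], dst[idx])
        pred = np.hstack([src, ones]) @ M.T
        inliers = np.linalg.norm(pred - dst, axis=1) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_affine(src[best_inliers], dst[best_inliers]), best_inliers
```

On a point set related by a pure translation plus one gross outlier, the consensus step isolates the translation and flags the outlier.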
Step 4: using the matched point sets {P(x_i, y_i)} and {Q(x'_i, y'_i)} of the reference image and the image to be matched, solve the affine transformation function for accurate affine transformation parameters. The affine transformation model is:

    x' = a_1·x + b_1·y + c_1
    y' = a_2·x + b_2·y + c_2
The transformation matrix M is:

    M = [ a_1  b_1  c_1
          a_2  b_2  c_2 ]
where a_1, a_2, b_1, b_2 are the rotation/scale variables, c_1 is the horizontal offset and c_2 is the vertical offset. Finally the image is registered according to the solved affine transformation matrix.
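Applying the solved matrix is a single homogeneous-coordinate product. The parameter values below are hypothetical placeholders, not the experimental results:

```python
import numpy as np

# Hypothetical parameter values, for illustration only.
a1, b1, c1 = 0.99, -0.02, 12.5   # first row: rotation/scale + horizontal offset
a2, b2, c2 = 0.02,  0.99, -7.3   # second row: rotation/scale + vertical offset
M = np.array([[a1, b1, c1],
              [a2, b2, c2]])

def warp(points, M):
    """Apply the 2x3 affine matrix to an (n, 2) array of (x, y) coordinates."""
    pts = np.hstack([points, np.ones((len(points), 1))])
    return pts @ M.T
```

The origin maps to the pure offset (c_1, c_2), and the point (1, 0) to (a_1 + c_1, a_2 + c_2), which is a quick sanity check on the row layout.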
The effect of the present invention can be further illustrated by the following experiments:
1. Experimental conditions and content

Hardware platform: Intel(R) Pentium(R) CPU, 2.4 GHz;

Software platform: Windows XP Professional, MATLAB R2010;

The input images used in the test are shown in Fig. 3, where Fig. 3(a) and Fig. 3(b) are SAR images.
Experiment content:

Simulation 1: under the above experimental conditions, SIFT feature points are extracted with the classical SIFT algorithm and with the fast registration algorithm of the present invention; the points extracted by the two methods are then matched according to the nearest-neighbour distance ratio (NNDR) criterion under different distance-ratio thresholds T_dis.

Simulation 2: on the basis of the coarse matching results, mismatches are removed with the random sample consensus (RANSAC) algorithm, and the remaining matches are used to solve the affine transformation parameters and finally register the images.

If the distance-ratio threshold T_dis is set too small, too few correct matches survive in the coarse matching result to satisfy the requirements of the affine transformation; if it is set too large, the proportion of correct matches among all coarse matches becomes too low and the RANSAC algorithm loses robustness. The experiments therefore use the following threshold values: 0.70, 0.75, 0.80, 0.85.
2. Experimental results
The sparse regions obtained by the present invention are shown in Fig. 4; Fig. 4(a) is the sparse region of the reference image and Fig. 4(b) is that of the image to be matched. It can be seen that the sparse regions concentrate on broken-line inflection points, intersections and similar areas, where the sparsity is high and the features are distinctive.
The feature points extracted from the reference image and the image to be matched by the classical SIFT method are shown in Fig. 5, where the white dots mark extracted points; Fig. 5(a) shows the SIFT point distribution of the reference image and Fig. 5(b) that of the image to be matched. The classical SIFT method detects many SIFT points in high-sparsity areas such as line targets and intersection regions, but it also detects some points in smooth and textured regions where features are inconspicuous; around such points there are usually many similar points, so mismatches form easily.
The feature points extracted from the reference image and the image to be matched by the proposed fast registration method based on spatial sparsity and SIFT are shown in Fig. 6, where the white dots mark extracted points; Fig. 6(a) shows the point distribution of the reference image and Fig. 6(b) that of the image to be matched. The points extracted by the proposed method lie only in the high-sparsity, feature-rich areas around broken-line inflection points and intersections; the points in these regions are sparse and stable, so mismatches are unlikely to form.
The points extracted by the two methods are coarsely matched with the Euclidean-distance-ratio matching method, and mismatches are then removed from the coarse result with the RANSAC method. The coarse-matching and mismatch-removal results of the two methods are listed in Table 1, with matching accuracy used as the evaluation index. Matching accuracy is defined as the number of correct matches divided by the total number of coarse matches.
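The accuracy index of Table 1 is a simple ratio; the counts below are illustrative, not the experimental values:

```python
def matching_accuracy(n_correct, n_total):
    """Matching accuracy = correct matches / all coarse matches."""
    return n_correct / n_total if n_total else 0.0

# e.g. 90 correct matches out of 120 coarse matches
print(matching_accuracy(90, 120))
```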
Table 1. Feature-point matching results of the two methods
Table 1 shows that, at each of the distance-ratio thresholds T_dis = 0.70, 0.75, 0.80 and 0.85, the proportion of correct matches among all matched pairs is higher for the present invention than for the classical SIFT algorithm, and the advantage over classical SIFT becomes more pronounced when the mismatch rate of the coarse result is high. The matching-accuracy statistics show that, under the various Euclidean-distance-ratio thresholds, the present invention matches correctly with higher efficiency than the classical SIFT algorithm.
The affine transformation parameters are obtained from the affine transformation; with the threshold T_dis = 0.75, the parameters obtained from the final matching result are shown in Table 2:

Table 2. Affine transformation parameter results
Table 2 shows that the affine transformation parameters obtained by the proposed method differ little from those of the SIFT algorithm, but the time efficiency improves greatly: the running time drops from the original 81.04 seconds to 20.06 seconds, about a quarter of the original, which substantially raises registration efficiency. With T_dis = 0.75, the registration result of the two images is shown in Fig. 7 using the checkerboard display. Because the present invention extracts SIFT points only from the feature-rich sparse regions of the SAR image, such as broken lines, inflection points and cross points, and excludes the low-sparsity, feature-poor, mismatch-prone smooth and textured regions, it raises registration efficiency while reducing the mismatch rate.
The present invention designs a new fast image registration method based on spatial sparsity and SIFT feature extraction: it first analyses the sparsity matrix of the image and extracts SIFT feature points only in the feature-rich sparse regions, improving the efficiency of the algorithm while preserving registration accuracy.
In summary, the present invention uses the coefficient of variation of the SAR image to distinguish the smooth regions, textured regions and sparse-structure regions of the image; it excludes the feature-poor, mismatch-prone smooth and textured regions, extracts SIFT feature points only in the sparse regions where high-sparsity pixels lie, such as broken-line inflection points and intersections, performs feature matching, and realizes image registration by solving the affine transformation parameters. Compared with the prior art, the present invention has the following advantages:
1. The present invention uses the classical SIFT operator; when the Gaussian scale space is built during SIFT feature extraction, blurring the image with the Gaussian function also reduces SAR speckle noise, which improves the distinguishability of each pixel within its neighbourhood and reduces mismatches to a certain extent.

2. The present invention designs a sparsity-definition method that can accurately identify the feature-rich sparse regions of broken lines, inflection points and cross points, and exclude the low-sparsity, feature-poor non-sparse regions such as smooth and textured areas.

3. The present invention extracts SIFT feature points only from the sparse regions of the reference image and the image to be matched, excluding the low-sparsity, feature-poor non-sparse regions such as smooth and textured areas, which reduces the probability of mismatches and makes the algorithm faster.

4. The present invention is simple to implement and clear in structure, places little demand on the overlapping area of the reference image and the image to be matched, and the SIFT operator retains high registration accuracy in the presence of rotation, scale change or changed regions.
The parts of this embodiment not described in detail are conventional means well known in the art and are not described here one by one. The examples above are only illustrations of the present invention and do not limit its scope of protection; every design identical or similar to the present invention falls within its scope of protection.

Claims (3)

1. A fast image registration implementation method based on spatial sparsity and SIFT feature extraction, characterized by comprising the following steps:
(1) Input two multi-temporal SAR images of the same area acquired at different times, denoted the reference image and the image to be matched; compute the sparsity of every pixel of the reference image and the image to be matched, and select the regions where the high-sparsity pixels lie as the sparse regions, denoted Mask_R and Mask_S respectively;
Step (1) specifically comprises the following steps:
1) Perform a two-level wavelet decomposition of the reference image R and the image to be matched S, with db1 as the wavelet basis, to obtain their detail-coefficient images, denoted R_db1 and S_db1 respectively;
2) Compute the coefficient of variation of every pixel of the two detail-coefficient images: take a neighbourhood of size m × m centred on the current pixel and compute its variance to obtain the coefficient of variation CV of the current pixel. Set a threshold T_cv in linear relation to the coefficient of variation of the SAR image, T_cv = β·CV, where β is the linear coefficient. The threshold T_cv yields the binarized coefficient-of-variation matrix CV_mask:
    CV_mask = { 1,  CV > T_cv
                0,  CV ≤ T_cv
Compute the binarized coefficient-of-variation matrices of the reference image and the image to be matched, denoted CV_mask_R and CV_mask_S respectively;
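The coefficient-of-variation step can be sketched directly. The sliding-window handling (reflective padding), the CV = std/mean convention, and taking the image-mean CV as the quantity scaled by β are assumptions where the text is not explicit:

```python
import numpy as np

def cv_mask(detail, m=7, beta=1.0):
    """Coefficient of variation (std/mean) over an m x m window, binarized
    with T_cv = beta * mean(CV) (the linear rule T_cv = beta*CV, applied to
    the image-mean CV, is an assumption here)."""
    h, w = detail.shape
    r = m // 2
    pad = np.pad(detail.astype(float), r, mode='reflect')
    cv = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + m, j:j + m]
            mu = win.mean()
            cv[i, j] = win.std() / mu if mu != 0 else 0.0
    t_cv = beta * cv.mean()
    return (cv > t_cv).astype(np.uint8), cv
```

On a flat patch the CV is zero and the mask stays 0; a single bright pixel (a speckle-like fluctuation) raises the local CV well above the image mean and is flagged.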
3) Thin the coefficient-of-variation matrix of the reference image to obtain the thinned matrix T_R, and likewise thin the coefficient-of-variation matrix of the image to be matched to obtain the thinned matrix T_S;

4) Compute the sparsity matrices of the reference image and the image to be matched;
5) Select the top 20% of pixels by sparsity value in the sparsity matrix as sparse points; dilating these sparse points yields the sparse regions;
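The top-20% selection and dilation of step 5) can be sketched as follows; the square structuring element and its radius are illustrative choices, since the text does not specify the dilation kernel:

```python
import numpy as np

def dilate_mask(mask, r=1):
    """Square-structuring-element dilation via shifted ORs (no wrap-around)."""
    h, w = mask.shape
    padded = np.pad(mask, r)
    out = np.zeros_like(mask)
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out |= padded[dy:dy + h, dx:dx + w]
    return out

def sparse_region(S, top_frac=0.20, r=1):
    """Seed mask = top 20% of sparsity values; dilation grows it into regions."""
    thresh = np.quantile(S, 1.0 - top_frac)
    return dilate_mask(S >= thresh, r).astype(np.uint8)
```

On a 5 × 5 ramp of sparsity values the seed mask is the last row (top 20%), and one round of dilation extends the region to the adjacent row.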
(2) Extract SIFT feature points from the sparse region Mask_R of the reference image and the sparse region Mask_S of the image to be matched; the resulting SIFT feature point sets of the reference image and the image to be matched are denoted {R_m} and {S_n} respectively;
(3) Match the SIFT feature point sets {R_m} and {S_n} extracted from the reference image and the image to be matched to obtain the coarse matched feature pairs {(pq)_i}; the coarse matched point sets in the reference image and the image to be matched are {p_i} and {q_i} respectively, where i indexes the matched pairs;
(4) Remove mismatched pairs from the coarse matching result with the random sample consensus algorithm; after removal, the set of remaining feature-point pairs of the reference image and the image to be matched is denoted {(PQ)_i}, and the remaining feature points in the reference image and the image to be matched are {P_i} and {Q_i} respectively, where i indexes the matched pairs;
(5) Using the matched point sets {P_i} and {Q_i} of the reference image and the image to be matched, obtain the affine transformation parameters from the affine transformation, and finally register the two images according to the affine transformation parameters.
2. The fast image registration implementation method based on spatial sparsity and SIFT feature extraction according to claim 1, characterized in that step 4) comprises the following steps:
(a) Compute the magnitude and direction of the gradient of every pixel in the thinned matrix;

(b) Analyse the direction of the gradient of every pixel and collect the direction features of the current pixel, obtaining the direction-feature matrices of the reference image and the image to be matched, denoted D_R and D_S respectively;
(c) From the thinned matrix and the direction-feature matrix, compute the sparsity of the reference image and of the image to be matched; the sparsity is defined as:
    S = (1/N) · Σ_{(i,j)∈Ω} D(i,j) · T(i,j)²
where (i, j) is the index of the current pixel, Ω is the neighbourhood of size r × r centred on the current pixel, N is the number of pixels in the neighbourhood Ω, D is the direction-feature matrix and T is the thinned matrix; this yields the sparsity matrices S_R and S_S of the reference image and the image to be matched.
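The sparsity definition above is a windowed sum of D·T², which can be sketched as a box filter; zero padding at the borders is an assumption, since the text does not define boundary handling:

```python
import numpy as np

def sparsity(D, T, r=2):
    """S(i,j) = (1/N) * sum over the r-neighbourhood Omega of D(u,v)*T(u,v)^2,
    with D the direction-feature matrix and T the thinned edge matrix."""
    h, w = T.shape
    n = (2 * r + 1) ** 2                    # N = pixels in the neighbourhood
    P = D * T.astype(float) ** 2
    pad = np.pad(P, r)                      # zero padding (an assumption)
    S = np.zeros((h, w))
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            S += pad[dy:dy + h, dx:dx + w]
    return S / n
```

With D ≡ 1 and a single thinned edge pixel, every location within the window radius of that pixel receives the value 1/N, which matches the definition term by term.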
3. The fast image registration implementation method based on spatial sparsity and SIFT feature extraction according to claim 2, characterized in that step (b) comprises the following steps:
a) Construct 16 direction templates of size (2r+1) × (2r+1), where r is the template radius; the angular range corresponding to the i-th sub-direction of the templates is [(i−1)·22.5°, i·22.5°), i ∈ [1, 16];
b) Use the 16 direction templates to analyse the direction feature of every thinned edge point in the thinned matrix;

c) Normalize the direction-feature matrices of the reference image and the image to be matched to obtain the direction-feature matrices D_R and D_S.
CN201410531222.6A 2014-10-10 2014-10-10 Rapid image registration implementation method based on space sparsity and SIFT feature extraction Expired - Fee Related CN104318548B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410531222.6A CN104318548B (en) 2014-10-10 2014-10-10 Rapid image registration implementation method based on space sparsity and SIFT feature extraction

Publications (2)

Publication Number Publication Date
CN104318548A CN104318548A (en) 2015-01-28
CN104318548B true CN104318548B (en) 2017-02-15

Family

ID=52373774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410531222.6A Expired - Fee Related CN104318548B (en) 2014-10-10 2014-10-10 Rapid image registration implementation method based on space sparsity and SIFT feature extraction

Country Status (1)

Country Link
CN (1) CN104318548B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103020647A (en) * 2013-01-08 2013-04-03 西安电子科技大学 Image classification method based on hierarchical SIFT (scale-invariant feature transform) features and sparse coding
CN103413119A (en) * 2013-07-24 2013-11-27 中山大学 Single sample face recognition method based on face sparse descriptors
CN103544711A (en) * 2013-11-08 2014-01-29 国家测绘地理信息局卫星测绘应用中心 Automatic registering method of remote-sensing image
CN103984966A (en) * 2014-05-29 2014-08-13 西安电子科技大学 SAR image target recognition method based on sparse representation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8620093B2 (en) * 2010-03-15 2013-12-31 The United States Of America As Represented By The Secretary Of The Army Method and system for image registration and change detection

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Deng Chaosheng et al.; "Dual-threshold registration algorithm based on local SIFT feature points"; Computer Engineering and Applications; 2014-01-23; pp. 189-193 *

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170215