CN103824302B - SAR image change detection method based on directional wave domain image fusion

SAR image change detection method based on directional wave domain image fusion

Info

Publication number
CN103824302B
CN103824302B
Authority
CN
China
Prior art keywords
pixel
image
difference
pixels
joint
Prior art date
Legal status
Active
Application number
CN201410091118.XA
Other languages
Chinese (zh)
Other versions
CN103824302A (en)
Inventor
Bai Jing
Jiao Licheng
Liu Bin
Ma Wenping
Ma Jingjing
Wang Shuang
Zhang Xiangrong
Hu Bo
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date: 2014-03-12
Filing date: 2014-03-12
Publication date: 2016-08-17
Application filed by Xidian University
Priority to CN201410091118.XA
Publication of CN103824302A
Application granted
Publication of CN103824302B
Legal status: Active

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a SAR image change detection method based on directional wave domain image fusion. It overcomes two problems of the prior art: detail information of the difference maps cannot be well preserved during fusion, and binary segmentation of the difference map causes loss of changed-region information. The steps of the invention are: (1) input images; (2) image preprocessing; (3) difference map generation; (4) joint difference map construction; (5) joint difference map segmentation; (6) image output. The invention describes the boundary between changed and unchanged regions more accurately, has a low missed-detection rate, and can be applied to natural disaster detection in the field of synthetic aperture radar (SAR) image target recognition.

Description

SAR image change detection method based on directional wave domain image fusion
Technical Field
The invention belongs to the technical field of image processing, and further relates to a Synthetic Aperture Radar (SAR) image change detection method based on directional wave domain image fusion in the technical field of SAR image change detection. The method can be used for natural disaster detection in the field of SAR image target identification.
Background
Change detection studies multi-temporal data: imaging data of an observation target acquired at different times, from which change information about the observed object is extracted. Synthetic aperture radar (SAR) is an important Earth observation means. Its active microwave imaging principle allows observation in all weather conditions, day and night, unaffected by weather and illumination; in addition, the penetrating capability of microwaves lets SAR see through forest canopy and detect moving targets hidden in forested regions. SAR images are therefore an important information source for image change detection.
Xidian University's patent application "SAR image change detection method based on neighborhood log-ratio and anisotropic diffusion" (application number CN201110005209, publication number CN102096921A) provides a SAR image change detection method based on the neighborhood log-ratio and anisotropic diffusion. The specific steps are: first, construct a difference map of the two temporal images with a neighborhood log-ratio method; then apply adaptive anisotropic diffusion filtering and maximum inter-class variance segmentation to the difference map to obtain the final change detection result. Although this method maintains the edge information of the changed region well, binary segmentation is ill-suited to describing the pixel classes of the difference map, so changed-region information is lost and the missed-detection rate is high.
Xidian University's patent application "Optical remote sensing image change detection based on image fusion" (application number CN2012102340761, publication number CN102750705A) provides an optical remote sensing image change detection method based on image fusion. The specific steps are: first, fuse two difference maps in the wavelet domain to obtain a joint difference map; then segment the fused difference map with a fuzzy local C-means clustering method to obtain the change detection result. Although this method improves the accuracy of the detection result by improving the difference map of the optical remote sensing images, the wavelet transform extracts line-singularity information poorly, so detail information of the difference maps is not well preserved during fusion, and the boundary between changed and unchanged regions in the detection result is inaccurate.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art by providing a SAR image change detection method based on directional wave domain image fusion. The method improves the accuracy of SAR image change detection results by improving both the quality of the difference map and the strategy used to segment it.
In order to achieve the purpose, the method comprises the following specific steps:
(1) inputting an image:
two SAR images S1 and S2 of the same area, taken at different times, are selected as the SAR images for change detection.
(2) Image preprocessing:
filter the SAR images S1 and S2 with a probabilistic block-matching filtering method to obtain the filtered images X1 and X2.
(3) Generating a difference map:
(3a) apply the ratio operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference map I1;
(3b) apply the mixed operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference map I2.
(4) Constructing a joint difference map:
(4a) apply the directional wave transform to difference maps I1 and I2 to obtain the directional wave low-frequency and high-frequency subband coefficients of I1 and I2;
(4b) fuse the directional wave low-frequency subband coefficients of difference maps I1 and I2 with the following formula to obtain the directional wave low-frequency subband coefficients of joint difference map I3:

$$L_{ij}^{I_3} = \frac{1}{2}\left(L_{ij}^{I_1} + L_{ij}^{I_2}\right), \quad i \in M,\ j \in N$$

where $L_{ij}^{I_3}$ denotes the coefficient in row i, column j of the directional wave low-frequency subband of joint difference map I3, $L_{ij}^{I_1}$ and $L_{ij}^{I_2}$ denote the coefficients in row i, column j of the directional wave low-frequency subbands of difference maps I1 and I2, M denotes the number of rows of I1 and I2, and N denotes their number of columns;
(4c) fuse the directional wave high-frequency subband coefficients of difference maps I1 and I2 with the following formula to obtain the directional wave high-frequency subband coefficients of joint difference map I3:

$$H_{ij}^{I_3} = \max\left(H_{ij}^{I_1},\, H_{ij}^{I_2}\right), \quad i \in M,\ j \in N$$

where $H_{ij}^{I_3}$ denotes the coefficient in row i, column j of the directional wave high-frequency subband of joint difference map I3, max(·) denotes the maximum-value operation, $H_{ij}^{I_1}$ and $H_{ij}^{I_2}$ denote the coefficients in row i, column j of the directional wave high-frequency subbands of difference maps I1 and I2, M denotes the number of rows of I1 and I2, and N denotes their number of columns;
(4d) apply the inverse directional wave transform to the directional wave low-frequency and high-frequency coefficients of joint difference map I3 to obtain the joint difference map I3.
(5) Partitioning the joint difference map:
(5a) with the fuzzy C-means clustering method, divide all pixels of joint difference map I3 into strong-change pixels, weak-change pixels, and unchanged pixels according to the following steps:
(5a1) take the gray value of each pixel of joint difference map I3 as that pixel's feature vector, and arbitrarily select the gray values of 3 pixels of I3 as the 3 initial cluster centers of the 3 pixel classes;
(5a2) update the membership degrees of the pixel feature vectors of joint difference map I3 according to:

$$u_{ik} = \left[\sum_{j=1}^{3}\left(\frac{\|x_k - v_i\|}{\|x_k - v_j\|}\right)^{2}\right]^{-1}, \quad k = 1, 2, \ldots, n$$

where $u_{ik}$ denotes the membership degree of the feature vector of the k-th pixel of I3 to the i-th class, Σ denotes the summation operation, $x_k$ denotes the feature vector of the k-th pixel of I3, $v_i$ denotes the cluster center of the i-th class, $v_j$ denotes the cluster center of the j-th class, k denotes the index of the pixel feature vector, i denotes the current cluster class, j ranges over the 3 cluster classes, n denotes the number of pixel feature vectors of I3, and ‖·‖ denotes the Euclidean distance operation;
(5a3) update the cluster center vectors according to:

$$v_i = \frac{\sum_{k=1}^{n} u_{ik}^{2}\, x_k}{\sum_{k=1}^{n} u_{ik}^{2}}$$

where $v_i$ denotes the cluster center vector of the i-th class of pixels of I3, Σ denotes the summation operation, $u_{ik}$ denotes the membership degree of the feature vector of the k-th pixel to the i-th class, $x_k$ denotes the feature vector of the k-th pixel of I3, i denotes the cluster class, k denotes the index of the pixel feature vector, and n denotes the number of pixel feature vectors of I3;
(5a4) iterate steps (5a1) to (5a3) 20 times to obtain the membership matrix of the pixel feature vectors of joint difference map I3;
(5a5) take the row index of the maximum membership value in each column of the membership matrix as the class of the corresponding pixel, yielding strong-change pixels, weak-change pixels, and unchanged pixels;
(5b) merge the strong-change pixels and weak-change pixels into change pixels;
(5c) take the unchanged pixels from step (5a) as the unchanged pixels of the change detection result image Ir and the change pixels from step (5b) as the change pixels of Ir, obtaining the change detection result image Ir.
(6) Outputting an image:
output the change detection result image Ir.
Compared with the prior art, the method of the invention has the following advantages:
First, because the two difference maps are fused with a directional wave domain image fusion method, the invention overcomes the prior art's inability to preserve difference-map detail during fusion and improves the detail extraction of the joint difference map, so it describes the boundary between changed and unchanged regions more accurately.
Second, because the fuzzy C-means method divides the difference-map pixels into strong-change, weak-change, and unchanged classes, the invention overcomes the loss of changed-region information caused by binary segmentation in the prior art, preserves the changed-region information of the joint difference map more accurately, and achieves a low missed-detection rate.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 compares the SAR image change detection results of the method of the present invention and the prior art on the Bern region;
FIG. 3 compares the SAR image change detection results of the method of the present invention and the prior art on the Ottawa region.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
Referring to the attached figure 1, the implementation steps of the invention are as follows:
step 1: an image is input.
Two SAR images S1 and S2 of the same area, taken at different times, are selected. The synthetic aperture radar SAR images used in the embodiment of the present invention are shown in FIG. 2(a), FIG. 2(b), FIG. 3(a) and FIG. 3(b): FIG. 2(a) is a SAR image of the Bern area from April 1999, of size 301 × 301; FIG. 2(b) is a SAR image of the Bern area from May 1999, of size 301 × 301; FIG. 3(a) is a SAR image of the Ottawa area from May 1997, of size 290 × 350; and FIG. 3(b) is a SAR image of the Ottawa area from August 1997, of size 290 × 350.
Step 2: image preprocessing.
Filter the SAR images S1 and S2 with a probabilistic block-matching filtering method to obtain the filtered images X1 and X2. The specific steps of the probabilistic block-matching filtering method are as follows:
First, set the size of the neighborhood window of each pixel of SAR images S1 and S2 to 7 × 7.
Second, compute the weight between each pixel of SAR images S1 and S2 and every other pixel in its neighborhood window according to:

$$w(q,t) = \exp\left(-\sum_{p=1}^{49} \frac{1}{h}\log\left(\frac{A_{q,p}}{A_{t,r}} + \frac{A_{t,r}}{A_{q,p}}\right)\right)$$

where w(q,t) denotes the weight between a pixel q of S1 or S2 and another pixel t in its neighborhood window, exp(·) denotes the exponential operation, Σ denotes the summation operation, h denotes a smoothing parameter controlling the exponential decay, log(·) denotes the logarithm operation, p denotes the index of a pixel in the 7 × 7 neighborhood of pixel q, r denotes the index of a pixel in the 7 × 7 neighborhood of pixel t with r = p, $A_{q,p}$ denotes the gray value at position p in the neighborhood of pixel q, and $A_{t,r}$ denotes the gray value at position r in the neighborhood of pixel t.
Third, compute the gray-value estimate of each pixel of SAR images S1 and S2 according to:

$$\hat{z}(q) = \frac{\sum_{t \in W_q} w(q,t)\, A_t^2}{\sum_{t \in W_q} w(q,t)}$$

where $\hat{z}(q)$ denotes the gray-value estimate of pixel q of S1 or S2, Σ denotes the summation operation, $W_q$ denotes the 7 × 7 neighborhood window of pixel q, w(q,t) denotes the weight between pixel q and another pixel t in the neighborhood window, and $A_t$ denotes the gray value of pixel t.
Fourth, iterate the first through third steps 25 times to obtain the filtered images X1 and X2 of SAR images S1 and S2.
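For concreteness, a minimal, deliberately unoptimized Python/NumPy sketch of this filtering step is given below. The 7 × 7 window, the patch-comparison weight with r = p, the weighted mean of squared gray values, and the 25 iterations follow the steps above; the value of the smoothing parameter h, the reflect padding, and the epsilon guard against log/division by zero are illustrative assumptions, not part of the patent.

```python
import numpy as np

def ppb_filter(img, h=10.0, win=7, iters=25):
    """Sketch of the probabilistic block-matching filter (steps 1-4 above).

    For each pixel q, every pixel t in q's 7x7 search window receives a
    weight w(q,t) = exp(-(1/h) * sum_p log(A_qp/A_tp + A_tp/A_qp)), where
    p runs over the 49 positions of the two patches (r = p); the filtered
    value is the weighted mean of the squared amplitudes A_t^2.
    """
    half = win // 2
    z = img.astype(np.float64) + 1e-10            # guard log / division by zero
    for _ in range(iters):                        # step 4: iterate 25 times
        pad = np.pad(z, 2 * half, mode='reflect') # room for patch + search window
        out = np.empty_like(z)
        for i in range(z.shape[0]):
            for j in range(z.shape[1]):
                ci, cj = i + 2 * half, j + 2 * half      # centre q in padded image
                patch_q = pad[ci - half:ci + half + 1, cj - half:cj + half + 1]
                num = den = 0.0
                for di in range(-half, half + 1):        # t ranges over W_q
                    for dj in range(-half, half + 1):
                        ti, tj = ci + di, cj + dj
                        patch_t = pad[ti - half:ti + half + 1,
                                      tj - half:tj + half + 1]
                        d = np.sum(np.log(patch_q / patch_t + patch_t / patch_q))
                        w = np.exp(-d / h)               # step 2: weight w(q,t)
                        num += w * pad[ti, tj] ** 2      # step 3: weighted A_t^2
                        den += w
                out[i, j] = num / den
        z = out
    return z
```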
Step 3: generate the difference maps.
Apply the ratio operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference map I1. The ratio operator formula is:

$$D = \frac{\sum_{k=1}^{49} \min\left(X_{1,k}^{N_i},\, X_{2,k}^{N_i}\right)}{\sum_{k=1}^{49} \max\left(X_{1,k}^{N_i},\, X_{2,k}^{N_i}\right)}$$

where D denotes the value of the i-th pixel of difference map I1, Σ denotes the summation operation, min(·) denotes the minimum-value operation, max(·) denotes the maximum-value operation, $N_i$ denotes the 7 × 7 neighborhood centered on the i-th pixel of the filtered images X1 and X2, i denotes the pixel index in X1 and X2, k denotes the pixel index within neighborhood $N_i$ with k = 1, 2, ..., 49, $X_{1,k}^{N_i}$ denotes the k-th pixel of neighborhood $N_i$ in X1, and $X_{2,k}^{N_i}$ denotes the k-th pixel of neighborhood $N_i$ in X2.
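Since the ratio of two window sums equals the ratio of the corresponding window means, the operator can be evaluated for all pixels at once with a uniform filter. A sketch in Python (NumPy/SciPy); the 7 × 7 window follows the formula, while the epsilon guard and boundary handling are added assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def ratio_difference_map(x1, x2, win=7, eps=1e-10):
    """Difference map I1: sum_k min(X1_k, X2_k) / sum_k max(X1_k, X2_k)
    over the 7x7 neighbourhood N_i of each pixel (ratio operator above)."""
    num = uniform_filter(np.minimum(x1, x2), size=win)  # mean of mins = sum/49
    den = uniform_filter(np.maximum(x1, x2), size=win)  # mean of maxes = sum/49
    return num / (den + eps)  # values in (0, 1]; small D indicates change
```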
Apply the mixed operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference map I2. The mixed operator formula is:

$$F = \frac{1}{2}\left\{\log\left(\frac{1}{2}\left(\frac{X_1(i)}{X_2(i)} + \frac{X_2(i)}{X_1(i)}\right)\right) + \frac{1}{49}\sum_{k=1}^{49}\log\left(\frac{1}{2}\left(\frac{X_{1,k}^{N_i}}{X_{2,k}^{N_i}} + \frac{X_{2,k}^{N_i}}{X_{1,k}^{N_i}}\right)\right)\right\}$$

where F denotes the value of the i-th pixel of difference map I2, log(·) denotes the logarithm operation, Σ denotes the summation operation, $X_1(i)$ and $X_2(i)$ denote the values of the i-th pixel of X1 and X2, $N_i$ denotes the 7 × 7 neighborhood centered on the i-th pixel of X1 and X2, i denotes the pixel index in X1 and X2, k denotes the pixel index within neighborhood $N_i$ with k = 1, 2, ..., 49, $X_{1,k}^{N_i}$ denotes the k-th pixel of neighborhood $N_i$ in X1, and $X_{2,k}^{N_i}$ denotes the k-th pixel of neighborhood $N_i$ in X2.
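A corresponding sketch of the mixed operator, again using a uniform filter to supply the neighborhood average of the log symmetric ratio; the epsilon guard is an added assumption.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mixed_difference_map(x1, x2, win=7, eps=1e-10):
    """Difference map I2: average of the point-wise log symmetric ratio and
    its 7x7 neighbourhood mean (mixed operator above)."""
    a = x1.astype(np.float64) + eps
    b = x2.astype(np.float64) + eps
    log_ratio = np.log(0.5 * (a / b + b / a))        # point term at pixel i
    local_mean = uniform_filter(log_ratio, size=win) # (1/49) * sum over N_i
    return 0.5 * (log_ratio + local_mean)
```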
Step 4: construct the joint difference map.
Fuse difference maps I1 and I2 with a directional wave domain image fusion method to obtain the joint difference map I3. The specific steps of the directional wave domain image fusion method are as follows:
First, apply the directional wave transform to difference maps I1 and I2 to obtain their low-frequency and high-frequency subband coefficients.
Second, fuse the directional wave low-frequency subband coefficients of I1 and I2 with the following formula to obtain the directional wave low-frequency subband coefficients of joint difference map I3:

$$L_{ij}^{I_3} = \frac{1}{2}\left(L_{ij}^{I_1} + L_{ij}^{I_2}\right), \quad i \in M,\ j \in N$$

where $L_{ij}^{I_3}$ denotes the coefficient in row i, column j of the directional wave low-frequency subband of joint difference map I3, $L_{ij}^{I_1}$ and $L_{ij}^{I_2}$ denote the coefficients in row i, column j of the directional wave low-frequency subbands of difference maps I1 and I2, M denotes the number of rows of I1 and I2, and N denotes their number of columns.
Third, fuse the directional wave high-frequency subband coefficients of I1 and I2 with the following formula to obtain the directional wave high-frequency subband coefficients of joint difference map I3:

$$H_{ij}^{I_3} = \max\left(H_{ij}^{I_1},\, H_{ij}^{I_2}\right), \quad i \in M,\ j \in N$$

where $H_{ij}^{I_3}$ denotes the coefficient in row i, column j of the directional wave high-frequency subband of joint difference map I3, max(·) denotes the maximum-value operation, $H_{ij}^{I_1}$ and $H_{ij}^{I_2}$ denote the coefficients in row i, column j of the directional wave high-frequency subbands of difference maps I1 and I2, M denotes the number of rows of I1 and I2, and N denotes their number of columns.
Fourth, apply the inverse directional wave transform to the directional wave low-frequency and high-frequency coefficients of joint difference map I3 to obtain the joint difference map I3.
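The two fusion rules are simple coefficient-wise operations. The sketch below illustrates them with PyWavelets' separable 2-D wavelet transform standing in for the directional wave (directionlet) transform, which has no off-the-shelf Python implementation; the wavelet name and decomposition level are arbitrary assumptions, while the averaging of low-frequency subbands and the element-wise maximum of high-frequency subbands follow the second and third steps above.

```python
import numpy as np
import pywt

def fuse_difference_maps(i1, i2, wavelet='db2', level=2):
    """Fusion rules of step 4, illustrated in a wavelet domain as a
    stand-in for the directional wave transform."""
    c1 = pywt.wavedec2(i1, wavelet, level=level)
    c2 = pywt.wavedec2(i2, wavelet, level=level)
    fused = [0.5 * (c1[0] + c2[0])]              # second step: average lows
    for (h1, v1, d1), (h2, v2, d2) in zip(c1[1:], c2[1:]):
        fused.append((np.maximum(h1, h2),        # third step: max of highs
                      np.maximum(v1, v2),
                      np.maximum(d1, d2)))
    i3 = pywt.waverec2(fused, wavelet)           # fourth step: inverse transform
    return i3[:i1.shape[0], :i1.shape[1]]        # crop possible 1-px padding
```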
Step 5: segment the joint difference map.
With the fuzzy C-means clustering method, divide all pixels of joint difference map I3 into strong-change pixels, weak-change pixels, and unchanged pixels. The specific steps of the fuzzy C-means clustering method are as follows:
First, take the gray value of each pixel of joint difference map I3 as that pixel's feature vector, and arbitrarily select the gray values of 3 pixels of I3 as the 3 initial cluster centers of the 3 pixel classes.
Second, update the membership degrees of the pixel feature vectors of joint difference map I3 according to:

$$u_{ik} = \left[\sum_{j=1}^{3}\left(\frac{\|x_k - v_i\|}{\|x_k - v_j\|}\right)^{2}\right]^{-1}, \quad k = 1, 2, \ldots, n$$

where $u_{ik}$ denotes the membership degree of the feature vector of the k-th pixel of I3 to the i-th class, Σ denotes the summation operation, $x_k$ denotes the feature vector of the k-th pixel of I3, $v_i$ denotes the cluster center of the i-th class, $v_j$ denotes the cluster center of the j-th class, k denotes the index of the pixel feature vector, i denotes the current cluster class, j ranges over the 3 cluster classes, n denotes the number of pixel feature vectors of I3, and ‖·‖ denotes the Euclidean distance operation.
Third, update the cluster center vectors according to:

$$v_i = \frac{\sum_{k=1}^{n} u_{ik}^{2}\, x_k}{\sum_{k=1}^{n} u_{ik}^{2}}$$

where $v_i$ denotes the cluster center vector of the i-th class of pixels of I3, Σ denotes the summation operation, $u_{ik}$ denotes the membership degree of the feature vector of the k-th pixel to the i-th class, $x_k$ denotes the feature vector of the k-th pixel of I3, i denotes the cluster class, k denotes the index of the pixel feature vector, and n denotes the number of pixel feature vectors of I3.
Fourth, iterate the first through third steps 20 times to obtain the membership matrix of the pixel feature vectors of joint difference map I3.
Fifth, take the row index of the maximum membership value in each column of the membership matrix as the class of the corresponding pixel, yielding the strong-change pixels, weak-change pixels, and unchanged pixels.
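A compact sketch of this clustering in Python/NumPy, with c = 3 classes and fuzzifier m = 2 as in the formulas above (the squared memberships in the center update correspond to m = 2). The random initial centers, the epsilon guard, and the final ordering of classes by cluster-center gray value (higher center taken as stronger change) are assumptions for illustration.

```python
import numpy as np

def fcm_three_class(i3, n_iter=20, seed=0):
    """Fuzzy C-means of step 5: 3 classes, gray values as feature vectors,
    20 iterations of the membership and centre updates above."""
    x = i3.ravel().astype(np.float64)                 # n feature vectors
    rng = np.random.default_rng(seed)
    v = rng.choice(x, size=3, replace=False)          # 3 arbitrary initial centres
    for _ in range(n_iter):                           # fourth step: 20 iterations
        d = np.abs(x[None, :] - v[:, None]) + 1e-10   # ||x_k - v_i||, shape (3, n)
        inv = d ** -2.0
        u = inv / inv.sum(axis=0, keepdims=True)      # membership update (m = 2)
        w = u ** 2
        v = (w * x[None, :]).sum(axis=1) / w.sum(axis=1)  # centre update
    labels = np.argmax(u, axis=0).reshape(i3.shape)   # fifth step: max membership
    order = np.argsort(v)             # assumed: unchanged < weak < strong centre
    remap = np.empty(3, dtype=int)
    remap[order] = np.arange(3)
    return remap[labels]              # 0 = unchanged, 1 = weak, 2 = strong change
```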
Take the unchanged pixels obtained by the fuzzy C-means clustering method as the unchanged pixels of the change detection result image Ir, merge the strong-change and weak-change pixels into change pixels and take them as the change pixels of Ir, and thereby obtain the change detection result image Ir.
Step 6: output the change detection result image Ir.
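Chaining the sketches above gives an end-to-end driver for steps 1 through 6. One added assumption: the ratio map D is near 1 for unchanged pixels and near 0 for changed ones, so it is inverted here so that both difference maps encode change as large values before fusion; the patent itself does not spell out this polarity handling.

```python
import numpy as np

def change_detection(s1, s2):
    """End-to-end sketch of steps 1-6, using ppb_filter, ratio_difference_map,
    mixed_difference_map, fuse_difference_maps and fcm_three_class above."""
    x1 = ppb_filter(s1)                      # step 2: speckle filtering
    x2 = ppb_filter(s2)
    i1 = 1.0 - ratio_difference_map(x1, x2)  # step 3a (inverted: change = large)
    i2 = mixed_difference_map(x1, x2)        # step 3b
    i3 = fuse_difference_maps(i1, i2)        # step 4: joint difference map
    labels = fcm_three_class(i3)             # step 5a: 3-class clustering
    # steps 5b/5c: strong + weak change classes form the change pixels of I_r
    return (labels >= 1).astype(np.uint8)    # I_r: 1 = changed, 0 = unchanged
```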
The effect of the present invention will be further explained with reference to the simulation diagrams of fig. 2 and fig. 3.
1. Simulation experiment conditions:
The hardware test platform is an Intel Core 2 Duo CPU E8200 at 2.67 GHz with 2 GB of memory; the software platform is the Windows 7 Ultimate 32-bit operating system and Matlab R2012b. The input images are two SAR images of the Bern area and two SAR images of the Ottawa area; the Bern images are 301 × 301 and the Ottawa images are 290 × 350, all in BMP format.
2. Simulation content:
the two methods used in the prior art for comparison are respectively as follows:
the patent technology owned by the university of west' an electronic technology "SAR image change detection method based on neighborhood logarithm ratio and anisotropic diffusion" (patent application number: CN201110005209, publication number: CN 102096921B) provides a synthetic aperture radar SAR image change detection method, which is based on the maximum inter-class variance method for short.
The patented technology of Xidian University, "Optical remote sensing image change detection based on image fusion" (application number CN201210234076, publication number CN102750705), provides an optical remote sensing image change detection method, referred to below as the wavelet domain image fusion method.
3. Simulation result analysis:
FIG. 2 compares the change detection results on the Bern-area SAR images between the present invention and the prior art: FIG. 2(a) is the SAR image of the Bern area from April 1999; FIG. 2(b) is the SAR image of the Bern area from May 1999; FIG. 2(c) is the change detection reference image for the Bern-area SAR images; FIG. 2(d) is the change detection result of the maximum inter-class variance method; FIG. 2(e) is the change detection result of the wavelet domain image fusion method; and FIG. 2(f) is the change detection result of the present invention.
FIG. 3 compares the change detection results on the Ottawa-area SAR images between the present invention and the prior art: FIG. 3(a) is the SAR image of the Ottawa area from May 1997; FIG. 3(b) is the SAR image of the Ottawa area from August 1997; FIG. 3(c) is the change detection reference image for the Ottawa-area SAR images; FIG. 3(d) is the change detection result of the maximum inter-class variance method; FIG. 3(e) is the change detection result of the wavelet domain image fusion method; and FIG. 3(f) is the change detection result of the method of the present invention.
The evaluation parameters of the experimental results are the number of missed detections, the number of false alarms, the total number of errors, and the total error rate. Table 1 gives the statistics of the change detection results in the simulation experiment, where OTSU denotes the maximum inter-class variance method, WDFCD denotes the wavelet domain image fusion method, ME denotes the number of missed detections, FA denotes the number of false alarms, OE denotes the total number of errors, and OER denotes the total error rate.
Table 1. Detection results for the Bern and Ottawa areas
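Given a binary detection result and the reference map, the four evaluation parameters can be computed directly; a short sketch (a hypothetical helper, assuming 1 = changed and 0 = unchanged):

```python
import numpy as np

def evaluation_metrics(result, reference):
    """ME, FA, OE and OER as defined above, for binary change maps."""
    result = result.astype(bool)
    reference = reference.astype(bool)
    me = int(np.sum(reference & ~result))   # missed detections
    fa = int(np.sum(~reference & result))   # false alarms
    oe = me + fa                            # total number of errors
    oer = oe / reference.size               # total error rate
    return me, fa, oe, oer
```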
As can be seen from FIG. 2, FIG. 3 and Table 1, the maximum inter-class variance method gives the worst detection result, mainly through an excessive number of missed detections, in particular many undetected small targets. The wavelet domain image fusion method performs better, reducing both false alarms and missed detections, but still fails to detect some smaller targets. The method of the invention is superior to both in false alarms and missed detections, and in particular detects many smaller point targets, showing higher detection accuracy.
The above simulation experiments show that the method of the invention performs well in robustness to speckle noise and in detection accuracy, overcoming the noise sensitivity and low detection accuracy of existing methods, and is a practical synthetic aperture radar SAR image change detection method.

Claims (1)

1. A SAR image change detection method based on directional wave domain image fusion comprises the following steps:
(1) inputting an image:
two SAR images S1 and S2 of the same area, taken at different times, are selected as the SAR images for change detection;
(2) image preprocessing:
filtering the SAR images S1 and S2 with a probabilistic block-matching filtering method to obtain the filtered images X1 and X2;
the probabilistic block-matching filtering method filters S1 and S2 according to the following steps to obtain the filtered images X1 and X2:
first, setting the size of the neighborhood window of each pixel of SAR images S1 and S2 to 7 × 7;
second, computing the weight between each pixel of S1 and S2 and every other pixel in its neighborhood window according to:

$$w(q,t) = \exp\left(-\sum_{p=1}^{49} \frac{1}{h}\log\left(\frac{A_{q,p}}{A_{t,r}} + \frac{A_{t,r}}{A_{q,p}}\right)\right)$$

where w(q,t) denotes the weight between a pixel q of S1 or S2 and another pixel t in its neighborhood window, exp(·) denotes the exponential operation, Σ denotes the summation operation, h denotes a smoothing parameter controlling the exponential decay, log(·) denotes the logarithm operation, p denotes the index of a pixel in the 7 × 7 neighborhood of pixel q, r denotes the index of a pixel in the 7 × 7 neighborhood of pixel t with r = p, $A_{q,p}$ denotes the gray value at position p in the neighborhood of pixel q, and $A_{t,r}$ denotes the gray value at position r in the neighborhood of pixel t;
third, computing the gray-value estimate of each pixel of S1 and S2 according to:

$$\hat{z}(q) = \frac{\sum_{t \in W_q} w(q,t)\, A_t^2}{\sum_{t \in W_q} w(q,t)}$$

where $\hat{z}(q)$ denotes the gray-value estimate of pixel q of S1 or S2, Σ denotes the summation operation, $W_q$ denotes the 7 × 7 neighborhood window of pixel q, w(q,t) denotes the weight between pixel q and another pixel t in the neighborhood window, and $A_t$ denotes the gray value of pixel t;
fourth, iterating the first through third steps 25 times to obtain the filtered images X1 and X2 of SAR images S1 and S2;
(3) generating difference maps:
(3a) applying the ratio operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference map I1;
the ratio operator formula is:

$$D = \frac{\sum_{k=1}^{49} \min\left(X_{1,k}^{N_i},\, X_{2,k}^{N_i}\right)}{\sum_{k=1}^{49} \max\left(X_{1,k}^{N_i},\, X_{2,k}^{N_i}\right)}$$

where D denotes the value of the i-th pixel of difference map I1, Σ denotes the summation operation, min(·) denotes the minimum-value operation, max(·) denotes the maximum-value operation, $N_i$ denotes the 7 × 7 neighborhood centered on the i-th pixel of the filtered images X1 and X2, i denotes the pixel index in X1 and X2, k denotes the pixel index within neighborhood $N_i$ with k = 1, 2, ..., 49, $X_{1,k}^{N_i}$ denotes the k-th pixel of neighborhood $N_i$ in X1, and $X_{2,k}^{N_i}$ denotes the k-th pixel of neighborhood $N_i$ in X2;
(3b) applying the mixed operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference map I2;
the mixed operator formula is:

$$F = \frac{1}{2}\left\{\log\left(\frac{1}{2}\left(\frac{X_1(i)}{X_2(i)} + \frac{X_2(i)}{X_1(i)}\right)\right) + \frac{1}{49}\sum_{k=1}^{49}\log\left(\frac{1}{2}\left(\frac{X_{1,k}^{N_i}}{X_{2,k}^{N_i}} + \frac{X_{2,k}^{N_i}}{X_{1,k}^{N_i}}\right)\right)\right\}$$

where F denotes the value of the i-th pixel of difference map I2, log(·) denotes the logarithm operation, Σ denotes the summation operation, $X_1(i)$ and $X_2(i)$ denote the values of the i-th pixel of X1 and X2, $N_i$ denotes the 7 × 7 neighborhood centered on the i-th pixel of X1 and X2, i denotes the pixel index in X1 and X2, k denotes the pixel index within neighborhood $N_i$ with k = 1, 2, ..., 49, $X_{1,k}^{N_i}$ denotes the k-th pixel of neighborhood $N_i$ in X1, and $X_{2,k}^{N_i}$ denotes the k-th pixel of neighborhood $N_i$ in X2;
(4) constructing a joint difference map:
(4a) applying the directional wave transform to difference maps I1 and I2 to obtain the directional wave low-frequency and high-frequency subband coefficients of I1 and I2;
(4b) fusing the directional wave low-frequency subband coefficients of I1 and I2 with the following formula to obtain the directional wave low-frequency subband coefficients of joint difference map I3:

$$L_{hv}^{I_3} = \frac{1}{2}\left(L_{hv}^{I_1} + L_{hv}^{I_2}\right), \quad h \in M,\ v \in N$$

where $L_{hv}^{I_3}$ denotes the coefficient in row h, column v of the directional wave low-frequency subband of joint difference map I3, $L_{hv}^{I_1}$ and $L_{hv}^{I_2}$ denote the coefficients in row h, column v of the directional wave low-frequency subbands of difference maps I1 and I2, M denotes the number of rows of I1 and I2, and N denotes their number of columns;
(4c) fusing the directional wave high-frequency subband coefficients of I1 and I2 with the following formula to obtain the directional wave high-frequency subband coefficients of joint difference map I3:

$$H_{hv}^{I_3} = \max\left(H_{hv}^{I_1},\, H_{hv}^{I_2}\right), \quad h \in M,\ v \in N$$

where $H_{hv}^{I_3}$ denotes the coefficient in row h, column v of the directional wave high-frequency subband of joint difference map I3, max(·) denotes the maximum-value operation, $H_{hv}^{I_1}$ and $H_{hv}^{I_2}$ denote the coefficients in row h, column v of the directional wave high-frequency subbands of difference maps I1 and I2, M denotes the number of rows of I1 and I2, and N denotes their number of columns;
(4d) applying the inverse directional wave transform to the directional wave low-frequency and high-frequency coefficients of joint difference map I3 to obtain the joint difference map I3;
(5) segmenting the joint difference map:
(5a) with the fuzzy C-means clustering method, dividing all pixels of joint difference map I3 into strong-change pixels, weak-change pixels, and unchanged pixels according to the following steps:
(5a1) taking the gray value of each pixel of joint difference map I3 as that pixel's feature vector, and arbitrarily selecting the gray values of 3 pixels of I3 as the 3 initial cluster centers of the 3 pixel classes;
(5a2) updating the membership degrees of the pixel feature vectors of joint difference map I3 according to:

$$u_{ik} = \left[\sum_{j=1}^{3}\left(\frac{\|x_k - v_i\|}{\|x_k - v_j\|}\right)^{2}\right]^{-1}, \quad k = 1, 2, \ldots, n$$

where $u_{ik}$ denotes the membership degree of the feature vector of the k-th pixel of I3 to the i-th class, Σ denotes the summation operation, $x_k$ denotes the feature vector of the k-th pixel of I3, $v_i$ denotes the cluster center of the i-th class, $v_j$ denotes the cluster center of the j-th class, k denotes the index of the pixel feature vector, i denotes the current cluster class, j ranges over the 3 cluster classes, n denotes the number of pixel feature vectors of I3, and ‖·‖ denotes the Euclidean distance operation;
(5a3) updating the cluster center vectors according to:

$$v_i = \frac{\sum_{k=1}^{n} u_{ik}^{2}\, x_k}{\sum_{k=1}^{n} u_{ik}^{2}}$$

where $v_i$ denotes the cluster center vector of the i-th class of pixels of I3, Σ denotes the summation operation, $u_{ik}$ denotes the membership degree of the feature vector of the k-th pixel to the i-th class, $x_k$ denotes the feature vector of the k-th pixel of I3, i denotes the cluster class, k denotes the index of the pixel feature vector, and n denotes the number of pixel feature vectors of I3;
(5a4) iterating (5a1) to (5a3) 20 times to obtain the membership matrix of the pixel feature vectors of joint difference map I3;
(5a5) taking the row index of the maximum membership value in each column of the membership matrix as the class of the corresponding pixel, yielding strong-change pixels, weak-change pixels, and unchanged pixels;
(5b) merging the strong-change pixels and weak-change pixels into change pixels;
(5c) taking the unchanged pixels from step (5a) as the unchanged pixels of the change detection result image Ir and the change pixels from step (5b) as the change pixels of Ir, to obtain the change detection result image Ir;
(6) Outputting an image:
outputting the change detection result image Ir.
CN201410091118.XA 2014-03-12 2014-03-12 SAR image change detection method based on directional wave domain image fusion Active CN103824302B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410091118.XA CN103824302B (en) 2014-03-12 2014-03-12 SAR image change detection method based on directional wave domain image fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410091118.XA CN103824302B (en) 2014-03-12 2014-03-12 SAR image change detection method based on directional wave domain image fusion

Publications (2)

Publication Number Publication Date
CN103824302A CN103824302A (en) 2014-05-28
CN103824302B true CN103824302B (en) 2016-08-17

Family

ID=50759343

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410091118.XA Active CN103824302B (en) 2014-03-12 2014-03-12 The SAR image change detection merged based on direction wave area image

Country Status (1)

Country Link
CN (1) CN103824302B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200471B (en) * 2014-08-30 2017-03-01 西安电子科技大学 SAR image change detection based on adaptive weight image co-registration
CN105389817B (en) * 2015-11-04 2017-11-14 河海大学 A kind of two phase remote sensing image variation detection methods
CN105740806B (en) * 2016-01-27 2018-12-28 大连楼兰科技股份有限公司 A kind of perspective transform target's feature-extraction method based on multi-angle of view
CN108764119B (en) * 2018-05-24 2022-03-18 西安电子科技大学 SAR image change detection method based on iteration maximum between-class variance
CN110119782A (en) * 2019-05-14 2019-08-13 西安电子科技大学 SAR image change detection based on FPGA
CN110956602B (en) * 2019-12-17 2022-12-06 内蒙古工业大学 Method and device for determining change area and storage medium
CN115937199B (en) * 2023-01-06 2023-05-23 山东济宁圣地电业集团有限公司 Spraying quality detection method for insulating layer of power distribution cabinet
CN116385866B (en) * 2023-02-09 2024-05-10 中国铁道科学研究院集团有限公司铁道建筑研究所 SAR image-based railway line color steel house change detection method and device

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102254323A (en) * 2011-06-10 2011-11-23 西安电子科技大学 Method for carrying out change detection on remote sensing images based on treelet fusion and level set segmentation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004212256A (en) * 2003-01-06 2004-07-29 National Institute Of Information & Communication Technology Noise reduction processing method and device for sar image

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102254323A (en) * 2011-06-10 2011-11-23 西安电子科技大学 Method for carrying out change detection on remote sensing images based on treelet fusion and level set segmentation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SAR image compression in the lifting directional wave (directionlet) transform domain; Bai Jing et al.; Journal of Infrared and Millimeter Waves; 2009-08-30; Vol. 28, No. 4; pp. 311-315 *

Also Published As

Publication number Publication date
CN103824302A (en) 2014-05-28

Similar Documents

Publication Publication Date Title
CN103824302B (en) SAR image change detection method based on directional wave domain image fusion
CN110472627B (en) End-to-end SAR image recognition method, device and storage medium
Xu et al. Review of video and image defogging algorithms and related studies on image restoration and enhancement
CN108846835B (en) Image change detection method based on depth separable convolutional network
US9147255B1 (en) Rapid object detection by combining structural information from image segmentation with bio-inspired attentional mechanisms
CN102324021B (en) Infrared dim-small target detection method based on shear wave conversion
CN107563433B (en) Infrared small target detection method based on convolutional neural network
CN102722891B (en) Method for detecting image significance
Wu et al. Classification of defects with ensemble methods in the automated visual inspection of sewer pipes
CN107808138B (en) Communication signal identification method based on FasterR-CNN
Ishitsuka et al. Object Detection in Ground‐Penetrating Radar Images Using a Deep Convolutional Neural Network and Image Set Preparation by Migration
CN105096310A (en) Method and System for Segmentation of the Liver in Magnetic Resonance Images Using Multi-Channel Features
Sonawane et al. A brief survey on image segmentation methods
CN108171119B (en) SAR image change detection method based on residual error network
CN113536963B (en) SAR image airplane target detection method based on lightweight YOLO network
CN109344917A (en) A kind of the species discrimination method and identification system of Euproctis insect
CN103093478A (en) Different source image rough edge test method based on rapid nuclear spatial fuzzy clustering
Yang et al. Semantic segmentation in architectural floor plans for detecting walls and doors
Giannarou et al. Optimal edge detection using multiple operators for image understanding
Chen et al. Improved fast r-cnn with fusion of optical and 3d data for robust palm tree detection in high resolution uav images
Wenyan et al. SAR image change detection based on equal weight image fusion and adaptive threshold in the NSST domain
CN104851090B (en) Image change detection method and device
CN109543716B (en) K-line form image identification method based on deep learning
CN116310795A (en) SAR aircraft detection method, system, device and storage medium
Srivatsa et al. Application of least square denoising to improve admm based hyperspectral image classification

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant