CN103824302A - SAR (synthetic aperture radar) image change detecting method based on direction wave domain image fusion - Google Patents



Publication number
CN103824302A
CN103824302A (application CN201410091118.XA)
Authority
CN
China
Prior art keywords
pixel
disparity map
image
associating
represent
Prior art date
Legal status: Granted
Application number
CN201410091118.XA
Other languages: Chinese (zh)
Other versions: CN103824302B (en)
Inventor
白静
焦李成
刘斌
马文萍
马晶晶
王爽
张向荣
胡波
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University
Priority to CN201410091118.XA
Publication of CN103824302A
Application granted
Publication of CN103824302B
Status: Active

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a SAR (synthetic aperture radar) image change detection method based on direction wave (directionlet) domain image fusion. It overcomes two problems of the prior art: detail information of the difference images cannot be well preserved when they are fused, and changed-region information is lost when the difference image is segmented by binarization. The method comprises the following implementation steps: (1) input the images; (2) pre-process the images; (3) generate the difference images; (4) construct the joint difference image; (5) segment the joint difference image; and (6) output the result image. The method describes the boundaries between changed and unchanged regions accurately, has a low missed-detection rate, and can be applied to natural disaster detection in the field of SAR image target recognition.

Description

SAR image change detection method based on directionlet-domain image fusion
Technical field
The invention belongs to the technical field of image processing, and further relates to a synthetic aperture radar (SAR) image change detection method based on direction wave (directionlet) domain image fusion, within the field of SAR image change detection techniques. The invention can be used for disaster detection in the field of SAR image target recognition.
Background technology
The object studied in change detection is multi-temporal data: images of the same observed scene acquired at different times. The goal of change detection is to analyze the multi-temporal data and extract the changes that the observed scene has undergone between acquisitions. Synthetic aperture radar (SAR) imaging is an important means of Earth observation. Its active microwave imaging principle makes it insensitive to weather and illumination, enabling all-day, all-weather observation. In addition, the penetrating ability of microwaves allows SAR to see through forest canopy, giving it the capability to detect maneuvering targets hidden in wooded areas. SAR imagery has therefore become an important information source for image change detection.
Xidian University proposed a SAR image change detection method based on the neighborhood log-ratio and anisotropic diffusion in its patent application "SAR image change detection method based on neighborhood log-ratio and anisotropic diffusion" (application number CN201110005209, publication number CN102096921A). The method first constructs a difference image of the two temporal images with a neighborhood log-ratio operator, then applies adaptive anisotropic diffusion filtering to the difference image and segments it with the maximum between-class variance (Otsu) method to obtain the final change detection result. Although this method preserves the edge information of changed regions well, its remaining shortcoming is that binary segmentation is ill-suited to describing the pixel classes of the difference image; applying binary segmentation therefore loses changed-region information and leads to a high missed-detection rate.
Xidian University proposed a remote sensing image change detection method based on image fusion in its patent application "Remote sensing image change detection based on image fusion" (application number CN2012102340761, publication number CN102750705A). The method first obtains a joint difference image by fusing two difference images in the wavelet domain, then segments the fused difference image with a fuzzy local C-means method to obtain the change detection result. Although this method improves detection accuracy by improving the quality of the difference image, its remaining shortcoming is that the wavelet transform performs poorly at extracting line singularities; detail information of the difference images therefore cannot be well preserved during fusion, and the detected boundaries between changed and unchanged regions are inaccurate.
Summary of the invention
The object of the invention is to overcome the above shortcomings of the prior art by proposing a SAR image change detection method based on directionlet-domain image fusion. By improving the quality of the difference image and the difference-image segmentation strategy, the invention improves the accuracy of SAR image change detection results.
To achieve the above object, the concrete steps of the invention are as follows:
(1) input picture:
Take two SAR images S1 and S2 of the same area acquired at different times as the SAR images to be tested for change.
(2) image pre-service:
Filter SAR images S1 and S2 with the probabilistic patch-based filtering method to obtain the filtered images X1 and X2.
(3) generate disparity map:
(3a) Apply the ratio-operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference image I1;
(3b) Apply the hybrid-operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference image I2.
(4) Construct the joint difference image:
(4a) Apply the directionlet transform to difference images I1 and I2 to obtain their directionlet low-frequency and high-frequency subband coefficients;
(4b) Fuse the directionlet low-frequency subband coefficients of I1 and I2 with the following formula to obtain the low-frequency subband coefficients of the joint difference image I3:

    L_ij^{I3} = (1/2) · (L_ij^{I1} + L_ij^{I2}),   i = 1, …, M,  j = 1, …, N

where L_ij^{I3} is the coefficient in row i, column j of the directionlet low-frequency subband of the joint difference image I3, L_ij^{I1} and L_ij^{I2} are the corresponding low-frequency subband coefficients of I1 and I2, M is the number of rows of I1 and I2, and N is their number of columns;
(4c) Fuse the directionlet high-frequency subband coefficients of I1 and I2 with the following formula to obtain the high-frequency subband coefficients of the joint difference image I3:

    H_ij^{I3} = max(H_ij^{I1}, H_ij^{I2}),   i = 1, …, M,  j = 1, …, N

where H_ij^{I3} is the coefficient in row i, column j of the directionlet high-frequency subband of I3, max(·) takes the maximum, H_ij^{I1} and H_ij^{I2} are the corresponding high-frequency subband coefficients of I1 and I2, M is the number of rows of I1 and I2, and N is their number of columns;
(4d) Apply the inverse directionlet transform to the low-frequency and high-frequency coefficients of I3 to obtain the joint difference image I3.
(5) Segment the joint difference image:
(5a) Use fuzzy C-means clustering to divide all pixels of the joint difference image I3 into strongly changed, weakly changed, and unchanged classes, as follows:
(5a1) Take the gray value of each pixel of I3 as that pixel's feature vector; at the same time, arbitrarily select the gray values of 3 pixels of I3 as the 3 initial cluster centers of the 3 pixel classes;
(5a2) Update the membership degrees of the pixel feature vectors of I3 according to the following formula:

    u_ik = [ Σ_{j=1}^{3} ( ||x_k − v_i|| / ||x_k − v_j|| )^2 ]^(−1)

where u_ik is the degree to which the feature vector of the k-th pixel of I3 belongs to the i-th pixel class, Σ is summation over the cluster classes, x_k is the feature vector of the k-th pixel of I3, v_i is the cluster center of the i-th class, v_j is the cluster center of the j-th class, k indexes the pixel feature vectors of I3, i is the current cluster class, j ranges over all cluster classes, and ||·|| is the Euclidean distance;
(5a3) Update the cluster center vectors according to the following formula:

    v_i = Σ_{k=1}^{n} (u_ik)^2 · x_k / Σ_{k=1}^{n} (u_ik)^2

where v_i is the cluster center vector of the i-th pixel class of I3, Σ is summation over the pixels, u_ik is the degree to which the feature vector of the k-th pixel belongs to the i-th class, x_k is the feature vector of the k-th pixel of I3, i is the cluster class, k indexes the pixel feature vectors, and n is the number of pixel feature vectors of I3;
(5a4) Iterate the membership update and cluster-center update 20 times to obtain the membership matrix of the pixel feature vectors of I3;
(5a5) For each column of the membership matrix, take the row index of the maximum membership as the pixel's class, yielding the strongly changed, weakly changed, and unchanged class pixels;
(5b) Merge the strongly changed and weakly changed class pixels into the changed class;
(5c) Take the unchanged class pixels of step (5a) as the unchanged pixels of the change detection result image I_R, and the changed class pixels of step (5b) as the changed pixels of I_R, obtaining the change detection result image I_R.
(6) Output image:
Output the change detection result image I_R.
Compared with the prior art, the method of the invention has the following advantages:
First, the invention uses directionlet-domain image fusion to fuse the two difference images, overcoming the prior art's inability to preserve difference-image detail information during fusion. This improves the detail-extraction performance of the joint difference image and gives the invention more accurate boundaries between changed and unchanged regions.
Second, the invention uses fuzzy C-means to divide the difference-image pixels into strongly changed, weakly changed, and unchanged classes, overcoming the loss of changed-region information caused by binary segmentation of the difference image in the prior art. This preserves the changed-region information of the joint difference image more accurately and gives the invention a lower missed-detection rate.
Brief description of the drawings
Fig. 1 is the flow chart of the invention;
Fig. 2 compares the change detection results of the invention and the prior art on SAR images of the Bern area;
Fig. 3 compares the change detection results of the invention and the prior art on SAR images of the Ottawa area.
Detailed description
The invention is further described below with reference to the drawings.
With reference to Fig. 1, the implementation steps of the invention are as follows:
Step 1: Input images.
Take two SAR images S1 and S2 of the same area acquired at different times as the SAR images to be tested for change. The SAR images used in the embodiment of the invention are shown in Fig. 2(a), Fig. 2(b) and Fig. 3(a), Fig. 3(b). Fig. 2(a) is a SAR image of the Bern area from April 1999, of size 301 × 301; Fig. 2(b) is a SAR image of the Bern area from May 1999, of size 301 × 301; Fig. 3(a) is a SAR image of the Ottawa area from May 1997, of size 290 × 350; Fig. 3(b) is a SAR image of the Ottawa area from August 1997, of size 290 × 350.
Step 2: Image pre-processing.
Filter SAR images S1 and S2 with the probabilistic patch-based filtering method to obtain the filtered images X1 and X2. The concrete steps are as follows:
First, set the neighborhood window of each pixel of S1 and S2 to size 7 × 7.
Second, compute the weight between each pixel of S1 and S2 and every other pixel in its neighborhood window according to the following formula:

    w(q, t) = exp( − Σ_{p=1}^{49} (1/h) · log( A_{q,p}/A_{t,r} + A_{t,r}/A_{q,p} ) )

where w(q, t) is the weight between pixel q and pixel t in q's neighborhood window, exp(·) is the exponential, Σ is summation, h is the smoothing parameter controlling the decay of the feature similarity, log(·) is the logarithm, p indexes the pixels of the 7 × 7 neighborhood of q, r indexes the pixels of the 7 × 7 neighborhood of t with r = p, A_{q,p} is the gray value at position p of q's neighborhood, and A_{t,r} is the gray value at position r of t's neighborhood.
Third, compute the gray-value estimate of each pixel of S1 and S2 according to the following formula:

    ẑ(q) = Σ_{t∈W_q} w(q, t) · A_t² / Σ_{t∈W_q} w(q, t)

where ẑ(q) is the gray-value estimate of pixel q, Σ is summation, W_q is the 7 × 7 neighborhood window of pixel q, w(q, t) is the weight between q and pixel t in its neighborhood window, and A_t is the gray value of pixel t.
Fourth, iterate the first three steps 25 times to obtain the filtered images X1 and X2 of S1 and S2.
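The weighted estimate above can be sketched in NumPy. This is a minimal single-iteration illustration under simplifying assumptions: 3 × 3 patches and search windows instead of the patent's 7 × 7, reflective border padding, positive amplitudes, and no iteration loop; the function name and defaults are for illustration only.

```python
import numpy as np

def ppb_weights_and_estimate(img, h=1.0, half=1):
    """One simplified iteration of the patch-based weighted estimate.

    For each pixel q, every pixel t in its (2*half+1)^2 search window gets a
    weight exp(-(1/h) * sum_p log(a_p/b_p + b_p/a_p)) computed from the
    patches around q and t, and the estimate is the weighted mean of A_t^2
    normalized by the weights (the patent's third step). Amplitudes must be
    positive for the log-ratio to be defined.
    """
    pad = 2 * half                      # room for window + patch overlap
    padded = np.pad(img.astype(float), pad, mode='reflect')
    H, W = img.shape
    out = np.zeros((H, W), dtype=float)
    for y in range(H):
        for x in range(W):
            qy, qx = y + pad, x + pad
            patch_q = padded[qy-half:qy+half+1, qx-half:qx+half+1]
            num = den = 0.0
            for dy in range(-half, half + 1):
                for dx in range(-half, half + 1):
                    ty, tx = qy + dy, qx + dx
                    patch_t = padded[ty-half:ty+half+1, tx-half:tx+half+1]
                    s = np.sum(np.log(patch_q/patch_t + patch_t/patch_q))
                    w = np.exp(-s / h)  # patch similarity weight
                    num += w * padded[ty, tx] ** 2
                    den += w
            out[y, x] = num / den
    return out
```

On a constant image every weight is equal, so the estimate reduces to the squared amplitude, which is a quick sanity check of the normalization.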
Step 3: Generate the difference images.
Apply the ratio-operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference image I1. The ratio-operator formula is:

    D = Σ_{k=1}^{49} min(X_{1,k}^{N_i}, X_{2,k}^{N_i}) / Σ_{k=1}^{49} max(X_{1,k}^{N_i}, X_{2,k}^{N_i})

where D is the pixel value of difference image I1, Σ is summation, min(·) takes the minimum, max(·) takes the maximum, N_i is the 7 × 7 neighborhood centered on the i-th pixel of the filtered images X1 and X2, i indexes the pixels of X1 and X2, k = 1, 2, …, 49 indexes the pixels of the neighborhood N_i, X_{1,k}^{N_i} is the k-th pixel of neighborhood N_i in X1, and X_{2,k}^{N_i} is the k-th pixel of neighborhood N_i in X2.
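The ratio operator can be sketched as follows. This is a minimal illustration assuming positive-valued images; the border windows are truncated rather than padded, and the function names are illustrative.

```python
import numpy as np

def window_sum(a, half):
    """Sum of each pixel's (2*half+1)^2 neighborhood (truncated at borders)."""
    H, W = a.shape
    out = np.empty((H, W), dtype=float)
    for i in range(H):
        for j in range(W):
            out[i, j] = a[max(0, i-half):i+half+1,
                          max(0, j-half):j+half+1].sum()
    return out

def neighborhood_ratio(x1, x2, half=3):
    """Ratio operator of step 3: sum of pixelwise minima over sum of
    pixelwise maxima in each pixel's neighborhood. Values near 1 indicate
    'unchanged'; smaller values indicate 'changed'."""
    lo = window_sum(np.minimum(x1, x2).astype(float), half)
    hi = window_sum(np.maximum(x1, x2).astype(float), half)
    return lo / hi
```

With identical inputs the operator returns 1 everywhere, and halving one image everywhere drives it to 0.5, matching the intuition that smaller values mark stronger change.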
Apply the hybrid-operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference image I2. The hybrid-operator formula is:

    F = (1/2) · { log( (1/2)(X_{1,k}^{N_i}/X_{2,k}^{N_i} + X_{2,k}^{N_i}/X_{1,k}^{N_i}) ) + (1/49) Σ_{k=1}^{49} log( (1/2)(X_{1,k}^{N_i}/X_{2,k}^{N_i} + X_{2,k}^{N_i}/X_{1,k}^{N_i}) ) }

where F is the pixel value of difference image I2, log(·) is the logarithm, Σ is summation, N_i is the 7 × 7 neighborhood centered on the i-th pixel of X1 and X2, i indexes the pixels of X1 and X2, k = 1, 2, …, 49 indexes the pixels of N_i, X_{1,k}^{N_i} is the k-th pixel of neighborhood N_i in X1, and X_{2,k}^{N_i} is the k-th pixel of neighborhood N_i in X2.
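The hybrid operator can be sketched as below. Two assumptions are made because the translated formula leaves them ambiguous: the first log term is taken at the center pixel, and the neighborhood average uses a truncated window mean at the borders rather than a fixed divisor of 49.

```python
import numpy as np

def hybrid_operator(x1, x2, half=3):
    """Hybrid operator of step (3b): averages a pointwise symmetric
    log-ratio term with its neighborhood mean. Assumes positive images.
    The 'center pixel for the first term' reading is an assumption."""
    x1 = x1.astype(float)
    x2 = x2.astype(float)
    r = np.log(0.5 * (x1 / x2 + x2 / x1))   # symmetric log-ratio per pixel
    H, W = r.shape
    mean = np.empty_like(r)
    for i in range(H):
        for j in range(W):
            win = r[max(0, i-half):i+half+1, max(0, j-half):j+half+1]
            mean[i, j] = win.mean()          # neighborhood average term
    return 0.5 * (r + mean)
```

Identical inputs give log(1) = 0 everywhere, so the operator vanishes where nothing changed.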
Step 4: Construct the joint difference image.
Fuse the difference images I1 and I2 with the directionlet-domain image fusion method to obtain the joint difference image I3. The concrete steps are as follows:
First, apply the directionlet transform to I1 and I2 to obtain their directionlet low-frequency and high-frequency subband coefficients.
Second, fuse the low-frequency subband coefficients of I1 and I2 with the following formula to obtain the low-frequency subband coefficients of I3:

    L_ij^{I3} = (1/2) · (L_ij^{I1} + L_ij^{I2}),   i = 1, …, M,  j = 1, …, N

where L_ij^{I3} is the coefficient in row i, column j of the directionlet low-frequency subband of I3, L_ij^{I1} and L_ij^{I2} are the corresponding low-frequency subband coefficients of I1 and I2, M is the number of rows of I1 and I2, and N is their number of columns.
Third, fuse the high-frequency subband coefficients of I1 and I2 with the following formula to obtain the high-frequency subband coefficients of I3:

    H_ij^{I3} = max(H_ij^{I1}, H_ij^{I2}),   i = 1, …, M,  j = 1, …, N

where H_ij^{I3} is the coefficient in row i, column j of the directionlet high-frequency subband of I3, max(·) takes the maximum, H_ij^{I1} and H_ij^{I2} are the corresponding high-frequency subband coefficients of I1 and I2, M is the number of rows of I1 and I2, and N is their number of columns.
Fourth, apply the inverse directionlet transform to the low-frequency and high-frequency coefficients of I3 to obtain the joint difference image I3.
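The two fusion rules of this step can be sketched as follows. The directionlet transform itself is not shown, since no standard library implements it; the sketch therefore operates on already-transformed subband coefficient arrays, and the function name is illustrative.

```python
import numpy as np

def fuse_subbands(low1, low2, high1, high2):
    """Fusion rules of step 4: average the low-frequency subband
    coefficients and take the elementwise maximum of the high-frequency
    subband coefficients. Averaging preserves the common background;
    the max rule keeps the strongest detail response from either
    difference image."""
    low3 = 0.5 * (low1 + low2)           # L^I3 = (L^I1 + L^I2) / 2
    high3 = np.maximum(high1, high2)     # H^I3 = max(H^I1, H^I2)
    return low3, high3
```

In a full pipeline the fused subbands would then be passed to the inverse transform to reconstruct the joint difference image.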
Step 5: Segment the joint difference image.
Use fuzzy C-means clustering to divide all pixels of the joint difference image I3 into strongly changed, weakly changed, and unchanged classes. The concrete steps are as follows:
First, take the gray value of each pixel of I3 as that pixel's feature vector; at the same time, arbitrarily select the gray values of 3 pixels of I3 as the 3 initial cluster centers of the 3 pixel classes.
Second, update the membership degrees of the pixel feature vectors of I3 according to the following formula:

    u_ik = [ Σ_{j=1}^{3} ( ||x_k − v_i|| / ||x_k − v_j|| )^2 ]^(−1)

where u_ik is the degree to which the feature vector of the k-th pixel of I3 belongs to the i-th pixel class, Σ is summation over the cluster classes, x_k is the feature vector of the k-th pixel of I3, v_i is the cluster center of the i-th class, v_j is the cluster center of the j-th class, k indexes the pixel feature vectors of I3, i is the current cluster class, j ranges over all cluster classes, and ||·|| is the Euclidean distance.
Third, update the cluster center vectors according to the following formula:

    v_i = Σ_{k=1}^{n} (u_ik)^2 · x_k / Σ_{k=1}^{n} (u_ik)^2

where v_i is the cluster center vector of the i-th pixel class of I3, Σ is summation over the pixels, u_ik is the degree to which the feature vector of the k-th pixel belongs to the i-th class, x_k is the feature vector of the k-th pixel of I3, i is the cluster class, k indexes the pixel feature vectors, and n is the number of pixel feature vectors of I3.
Fourth, iterate the membership update and cluster-center update 20 times to obtain the membership matrix of the pixel feature vectors of I3.
Fifth, for each column of the membership matrix, take the row index of the maximum membership as the pixel's class, yielding the strongly changed, weakly changed, and unchanged class pixels.
Take the unchanged class pixels obtained by fuzzy C-means as the unchanged pixels of the change detection result image I_R; merge the strongly changed and weakly changed class pixels into the changed class and take them as the changed pixels of I_R, obtaining the change detection result image I_R.
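The three-class fuzzy C-means procedure above can be sketched as follows. One deliberate deviation from the patent's wording: for reproducibility the cluster centers are initialized at quantiles of the gray values rather than at arbitrarily chosen pixels, and a small constant guards against division by zero when a center coincides with a data point.

```python
import numpy as np

def fcm_gray(values, c=3, iters=20):
    """Fuzzy C-means (fuzzifier m = 2) on scalar gray values, as in step 5.

    Memberships: u_ik = [ sum_j (|x_k - v_i| / |x_k - v_j|)^2 ]^(-1)
    Centers:     v_i  = sum_k u_ik^2 x_k / sum_k u_ik^2
    After a fixed number of iterations each pixel is hardened to the class
    of its largest membership.
    """
    x = np.asarray(values, dtype=float).ravel()
    v = np.percentile(x, np.linspace(10, 90, c))       # deterministic init

    def memberships(v):
        d = np.abs(x[None, :] - v[:, None]) + 1e-12    # (c, n) distances
        # ratio[i, j, k] = d_ik / d_jk; sum over j, then invert
        return 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** 2, axis=1)

    for _ in range(iters):
        u = memberships(v)
        u2 = u ** 2
        v = (u2 @ x) / u2.sum(axis=1)                  # center update
    u = memberships(v)
    labels = np.argmax(u, axis=0)                      # hardening step
    return labels.reshape(np.shape(values)), v
```

In the method of the invention the three resulting classes would be interpreted as strongly changed, weakly changed, and unchanged, with the first two then merged into the changed class.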
Step 6: Output the change detection result image I_R.
The effect of the invention is further described below with reference to the simulations of Fig. 2 and Fig. 3.
1. Simulation conditions:
The hardware platform is an Intel Core 2 Duo CPU E8200 at 2.67 GHz with 2 GB of memory; the software platform is the Windows 7 Ultimate 32-bit operating system and Matlab R2012b. The input images are two SAR images of the Bern area and two SAR images of the Ottawa area; the Bern images are of size 301 × 301 and the Ottawa images of size 290 × 350, all in BMP format.
2. Simulation content:
The two prior-art methods used for comparison are as follows:
The SAR image change detection method proposed in Xidian University's patent "SAR image change detection method based on neighborhood log-ratio and anisotropic diffusion" (application number CN201110005209, publication number CN102096921B), abbreviated below as the maximum between-class variance (Otsu) method.
The remote sensing image change detection method proposed in Xidian University's patent "Remote sensing image change detection based on image fusion" (application number CN201210234076, publication number CN102750705), abbreviated below as the wavelet-domain image fusion method.
3. Analysis of simulation results:
Fig. 2 compares the change detection results of the invention and the prior art on SAR images of the Bern area: Fig. 2(a) is the Bern SAR image from April 1999, Fig. 2(b) the Bern SAR image from May 1999, Fig. 2(c) the Bern change detection reference image, Fig. 2(d) the result of the maximum between-class variance method, Fig. 2(e) the result of the wavelet-domain image fusion method, and Fig. 2(f) the result of the method of the invention.
Fig. 3 compares the change detection results of the invention and the prior art on SAR images of the Ottawa area: Fig. 3(a) is the Ottawa SAR image from May 1997, Fig. 3(b) the Ottawa SAR image from August 1997, Fig. 3(c) the Ottawa change detection reference image, Fig. 3(d) the result of the maximum between-class variance method, Fig. 3(e) the result of the wavelet-domain image fusion method, and Fig. 3(f) the result of the method of the invention.
The evaluation metrics are the number of missed detections, the number of false alarms, the total number of errors, and the total error rate. Table 1 lists the change detection statistics of the simulations, where OSTU denotes the maximum between-class variance method, WDFCD the wavelet-domain image fusion method, ME the number of missed detections, FA the number of false alarms, OE the total number of errors, and OER the total error rate.
Table 1. Detection results for the Bern and Ottawa areas
As Fig. 2, Fig. 3, and Table 1 show, the maximum between-class variance method performs worst: its number of missed detections is far too high, and in particular many small targets go undetected. The wavelet-domain image fusion method outperforms the maximum between-class variance method, with fewer false alarms and missed detections, but smaller targets are still missed. The method of the invention is better than both prior methods in terms of false alarms as well as missed detections; in particular, many small point targets are detected, showing that it surpasses the two prior methods in detection accuracy.
The above simulations show that the method of the invention performs well both in robustness to speckle noise and in detection accuracy, solving the problems of speckle sensitivity and low detection accuracy in existing methods; it is a very practical SAR image change detection method.

Claims (4)

1. A SAR image change detection method based on directionlet-domain image fusion, comprising the steps of:
(1) Input images:
take two SAR images S1 and S2 of the same area acquired at different times as the SAR images to be tested for change;
(2) Image pre-processing:
filter SAR images S1 and S2 with the probabilistic patch-based filtering method to obtain the filtered images X1 and X2;
(3) Generate the difference images:
(3a) apply the ratio-operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference image I1;
(3b) apply the hybrid-operator formula to the pixel values of the filtered images X1 and X2 to obtain the pixel values of difference image I2;
(4) Construct the joint difference image:
(4a) apply the directionlet transform to difference images I1 and I2 to obtain their directionlet low-frequency and high-frequency subband coefficients;
(4b) fuse the directionlet low-frequency subband coefficients of I1 and I2 with the following formula to obtain the low-frequency subband coefficients of the joint difference image I3:

    L_ij^{I3} = (1/2) · (L_ij^{I1} + L_ij^{I2}),   i = 1, …, M,  j = 1, …, N

where L_ij^{I3} is the coefficient in row i, column j of the directionlet low-frequency subband of I3, L_ij^{I1} and L_ij^{I2} are the corresponding low-frequency subband coefficients of I1 and I2, M is the number of rows of I1 and I2, and N is their number of columns;
(4c) fuse the directionlet high-frequency subband coefficients of I1 and I2 with the following formula to obtain the high-frequency subband coefficients of the joint difference image I3:

    H_ij^{I3} = max(H_ij^{I1}, H_ij^{I2}),   i = 1, …, M,  j = 1, …, N

where H_ij^{I3} is the coefficient in row i, column j of the directionlet high-frequency subband of I3, max(·) takes the maximum, H_ij^{I1} and H_ij^{I2} are the corresponding high-frequency subband coefficients of I1 and I2, M is the number of rows of I1 and I2, and N is their number of columns;
(4d) apply the inverse directionlet transform to the low-frequency and high-frequency coefficients of I3 to obtain the joint difference image I3;
(5) Segment the joint difference map:
(5a) Use the fuzzy C-means clustering method, as follows, to divide all pixels of the joint difference map I3 into a strong-change class, a weak-change class, and an unchanged class:
(5a1) Take the grey value of every pixel of the joint difference map I3 as that pixel's feature vector; at the same time, arbitrarily select the grey values of 3 pixels from I3 as the 3 initial cluster centres of the 3 pixel classes;
(5a2) Update the membership degrees of the pixel feature vectors of the joint difference map I3 according to the following formula:
u_{ik} = [ Σ_j ( ||x_k − v_i|| / ||x_k − v_j|| )^2 ]^{−1}, j = 1, 2, 3
where u_{ik} denotes the degree to which the feature vector of the k-th pixel of the joint difference map I3 belongs to the i-th pixel class, Σ denotes summation, x_k denotes the feature vector of the k-th pixel of I3, v_i denotes the cluster centre of the i-th pixel class, v_j denotes the cluster centre of the j-th pixel class, k denotes the index of a pixel feature vector in I3 (k = 1, 2, ..., n, where n is the number of pixel feature vectors in I3), i denotes the current cluster class, j ranges over the cluster classes, and || || denotes the Euclidean distance;
(5a3) Update the cluster centre vectors according to the following formula:
v_i = Σ_k (u_{ik})^2 x_k / Σ_k (u_{ik})^2, k = 1, 2, ..., n
where v_i denotes the cluster centre vector of the i-th pixel class of the joint difference map I3, Σ denotes summation, u_{ik} denotes the degree to which the feature vector of the k-th pixel of I3 belongs to the i-th pixel class, x_k denotes the feature vector of the k-th pixel of I3, i denotes the cluster class, k denotes the index of a pixel feature vector in I3, and n denotes the number of pixel feature vectors in I3;
(5a4) Iterate steps (5a2) and (5a3) 20 times, obtaining the membership matrix of the pixel feature vectors of the joint difference map I3;
(5a5) For each pixel, take the class index of the maximum membership degree in the membership matrix as that pixel's class, obtaining the strong-change, weak-change, and unchanged pixel classes;
(5b) Merge the strong-change and weak-change pixel classes into a single change class;
(5c) Take the unchanged pixels of step (5a) as the unchanged pixels of the change detection result image I_R, and the change-class pixels of step (5b) as the change pixels of I_R, obtaining the change detection result image I_R;
(6) Output the image:
Output the change detection result image I_R.
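The three-class fuzzy C-means loop of step (5a) and the class merging of steps (5b)-(5c) can be sketched as below. A minimal NumPy implementation with fuzzifier m = 2; `fcm_segment` is an illustrative name, and mapping the two brighter cluster centres to the strong- and weak-change classes is an assumption, since the claim does not say how centres correspond to classes:

```python
import numpy as np

def fcm_segment(I3, n_clusters=3, n_iter=20, seed=0):
    """Steps (5a)-(5c): fuzzy C-means (fuzzifier m = 2) on pixel grey
    values, 20 alternating updates of memberships and centres, then the
    two classes with larger centres are merged into 'changed'."""
    x = I3.ravel().astype(float)                  # (5a1): grey value as feature
    rng = np.random.default_rng(seed)
    v = rng.choice(x, n_clusters, replace=False)  # 3 random initial centres
    for _ in range(n_iter):                       # (5a4): 20 iterations
        d = np.abs(x[:, None] - v[None, :]) + 1e-12           # |x_k - v_i|
        u = d ** -2 / (d ** -2).sum(axis=1, keepdims=True)    # (5a2)
        v = (u ** 2 * x[:, None]).sum(axis=0) / (u ** 2).sum(axis=0)  # (5a3)
    labels = u.argmax(axis=1).reshape(I3.shape)   # (5a5): hard assignment
    # (5b)-(5c): assume the two brighter centres are the change classes.
    order = np.argsort(v)
    changed = (labels == order[1]) | (labels == order[2])
    return changed.astype(np.uint8)
```

On a toy image with three well-separated grey levels the darkest pixels land in the unchanged class and the brightest in the change class.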
2. The SAR image change detection method based on direction wave domain image fusion according to claim 1, characterized in that the probabilistic patch-based matched filtering of step (2) filters the SAR images S1 and S2 separately as follows, obtaining the filtered images X1 and X2:
First step: set the neighbourhood window of every pixel of the SAR images S1 and S2 to 7 × 7;
Second step: for each of S1 and S2, compute the weight between any pixel and every other pixel inside its neighbourhood window according to the following formula:
w(q, t) = exp( − Σ_{p=1}^{49} (1/h) log( A_{q,p}/A_{t,r} + A_{t,r}/A_{q,p} ) )
where w(q, t) denotes the weight between any pixel q of S1 or S2 and another pixel t inside q's neighbourhood window, exp() denotes the exponential, Σ denotes summation, h denotes the smoothing parameter controlling the degree of feature attenuation, log() denotes the logarithm, p denotes the index of a pixel inside q's 7 × 7 neighbourhood, r denotes the index of the corresponding pixel inside t's 7 × 7 neighbourhood, with r = p, A_{q,p} denotes the grey value at position p of q's neighbourhood, and A_{t,r} denotes the grey value at position r of t's neighbourhood;
Third step: for each of S1 and S2, compute the grey-level estimate of every pixel according to the following formula:
ẑ(q) = Σ_{t ∈ W_q} w(q, t) A_t^2 / Σ_{t ∈ W_q} w(q, t)
where ẑ(q) denotes the grey-level estimate of any pixel q of S1 or S2, Σ denotes summation, W_q denotes the 7 × 7 neighbourhood window of q, w(q, t) denotes the weight between q and another pixel t inside its neighbourhood window, and A_t denotes the grey value of pixel t;
Fourth step: iterate the first to third steps 25 times, obtaining the filtered images X1 and X2 of S1 and S2.
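A toy single-image sketch of the claim-2 weighting, under stated simplifications: a 3 × 3 patch (`half=1`) instead of 7 × 7, one iteration instead of 25, and a small `eps` to keep the log-ratio defined on zero-valued pixels. `ppb_filter` is an illustrative name, not the patent's implementation; note the estimate ẑ is an intensity (squared-amplitude) value, as in the claim's formula:

```python
import numpy as np

def ppb_filter(img, h=1.0, half=1):
    """Single-iteration sketch of the claim-2 weighting:
      w(q,t) = exp(-(1/h) * sum_p log(A_qp/A_tp + A_tp/A_qp))
      z(q)   = sum_t w(q,t) * A_t^2 / sum_t w(q,t)  (intensity estimate)"""
    eps = 1e-6                                    # keeps ratios finite on zeros
    A = np.pad(img.astype(float), half, mode='reflect') + eps
    H, W = img.shape
    out = np.empty((H, W))
    for y in range(H):
        for x in range(W):
            pq = A[y:y + 2*half + 1, x:x + 2*half + 1]   # patch around q
            num = den = 0.0
            for dy in range(-half, half + 1):            # t runs over q's window
                for dx in range(-half, half + 1):
                    yy, xx = y + dy, x + dx
                    if not (0 <= yy < H and 0 <= xx < W):
                        continue
                    pt = A[yy:yy + 2*half + 1, xx:xx + 2*half + 1]
                    dist = np.log(pq / pt + pt / pq).sum()  # patch distance
                    w = np.exp(-dist / h)
                    num += w * (img[yy, xx] + eps) ** 2
                    den += w
            out[y, x] = num / den
    return out
```

On a constant amplitude image every weight is equal, so the output is simply the squared amplitude everywhere, which checks the normalization.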
3. The SAR image change detection method based on direction wave domain image fusion according to claim 1, characterized in that the ratio operator of step (3a) is as follows:
D = Σ_{k=1}^{49} min( X1_k^{N_i}, X2_k^{N_i} ) / Σ_{k=1}^{49} max( X1_k^{N_i}, X2_k^{N_i} )
where D denotes the pixel value of the difference map I1, Σ denotes summation, min() denotes taking the minimum, max() denotes taking the maximum, N_i denotes the 7 × 7 neighbourhood centred on the i-th pixel of the filtered images X1 and X2, i denotes the pixel index in X1 and X2, k denotes the pixel index within the neighbourhood N_i (k = 1, 2, ..., 49), X1_k^{N_i} denotes the k-th pixel of the neighbourhood N_i in the filtered image X1, and X2_k^{N_i} denotes the k-th pixel of the neighbourhood N_i in the filtered image X2.
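The neighbourhood ratio of claim 3 can be sketched directly. A minimal NumPy version; `ratio_operator` is an illustrative name, and reflect-padding at the image borders is an assumption the claim leaves open. D is near 1 where the two filtered images agree and drops toward 0 where they differ:

```python
import numpy as np

def ratio_operator(X1, X2, half=3):
    """Claim 3: D_i = sum_k min(X1_k, X2_k) / sum_k max(X1_k, X2_k)
    over the 7x7 neighbourhood N_i of each pixel i."""
    eps = 1e-12                          # guards an all-zero neighbourhood
    mn = np.pad(np.minimum(X1, X2), half, mode='reflect')
    mx = np.pad(np.maximum(X1, X2), half, mode='reflect')
    H, W = X1.shape
    D = np.empty((H, W))
    for y in range(H):
        for x in range(W):
            win = (slice(y, y + 2*half + 1), slice(x, x + 2*half + 1))
            D[y, x] = mn[win].sum() / (mx[win].sum() + eps)
    return D
```

Identical inputs give D = 1 everywhere; doubling one image halves the ratio.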
4. The SAR image change detection method based on direction wave domain image fusion according to claim 1, characterized in that the hybrid operator of step (3b) is as follows:
F = (1/2) { log( (1/2)( X1_i/X2_i + X2_i/X1_i ) ) + (1/49) Σ_{k=1}^{49} log( (1/2)( X1_k^{N_i}/X2_k^{N_i} + X2_k^{N_i}/X1_k^{N_i} ) ) }
where F denotes the pixel value of the difference map I2, log() denotes the logarithm, Σ denotes summation, N_i denotes the 7 × 7 neighbourhood centred on the i-th pixel of the filtered images X1 and X2, i denotes the pixel index in X1 and X2, k denotes the pixel index within the neighbourhood N_i (k = 1, 2, ..., 49), X1_k^{N_i} denotes the k-th pixel of the neighbourhood N_i in the filtered image X1, and X2_k^{N_i} denotes the k-th pixel of the neighbourhood N_i in the filtered image X2; the first log term is evaluated at the centre pixel i itself.
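A minimal NumPy sketch of this symmetric log-ratio mixture (per-pixel term plus its 7 × 7 neighbourhood mean); `hybrid_operator` is an illustrative name, and the reflect padding and small `eps` guarding division by zero are assumptions. F is 0 where the images agree and grows with their ratio:

```python
import numpy as np

def hybrid_operator(X1, X2, half=3):
    """Claim 4: F = 1/2 * (point-wise symmetric log-ratio
    + its mean over the 7x7 neighbourhood N_i)."""
    eps = 1e-12
    r = np.log(0.5 * ((X1 + eps) / (X2 + eps) + (X2 + eps) / (X1 + eps)))
    pad = np.pad(r, half, mode='reflect')
    H, W = X1.shape
    n = (2 * half + 1) ** 2              # 49 neighbourhood pixels
    F = np.empty((H, W))
    for y in range(H):
        for x in range(W):
            win = pad[y:y + 2*half + 1, x:x + 2*half + 1]
            F[y, x] = 0.5 * (r[y, x] + win.sum() / n)
    return F
```

Because x + 1/x ≥ 2 with equality at x = 1, the operator is non-negative and vanishes exactly where the two images coincide.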
CN201410091118.XA 2014-03-12 2014-03-12 The SAR image change detection merged based on direction wave area image Active CN103824302B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410091118.XA CN103824302B (en) 2014-03-12 2014-03-12 The SAR image change detection merged based on direction wave area image


Publications (2)

Publication Number Publication Date
CN103824302A true CN103824302A (en) 2014-05-28
CN103824302B CN103824302B (en) 2016-08-17

Family

ID=50759343

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410091118.XA Active CN103824302B (en) 2014-03-12 2014-03-12 The SAR image change detection merged based on direction wave area image

Country Status (1)

Country Link
CN (1) CN103824302B (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004212256A (en) * 2003-01-06 2004-07-29 National Institute Of Information & Communication Technology Noise reduction processing method and device for sar image
CN102254323A (en) * 2011-06-10 2011-11-23 西安电子科技大学 Method for carrying out change detection on remote sensing images based on treelet fusion and level set segmentation


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BAI JING et al.: "SAR image compression based on the lifting directionlet transform domain", Journal of Infrared and Millimeter Waves *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104200471A (en) * 2014-08-30 2014-12-10 西安电子科技大学 SAR image change detection method based on adaptive weight image fusion
CN104200471B (en) * 2014-08-30 2017-03-01 西安电子科技大学 SAR image change detection based on adaptive weight image co-registration
CN105389817B (en) * 2015-11-04 2017-11-14 河海大学 A kind of two phase remote sensing image variation detection methods
CN105389817A (en) * 2015-11-04 2016-03-09 河海大学 Two-time phase remote sensing image change detection method
CN105740806B (en) * 2016-01-27 2018-12-28 大连楼兰科技股份有限公司 A kind of perspective transform target's feature-extraction method based on multi-angle of view
CN105740806A (en) * 2016-01-27 2016-07-06 大连楼兰科技股份有限公司 Target characteristic extraction method based on multi-view perspective transformation
CN108764119A (en) * 2018-05-24 2018-11-06 西安电子科技大学 SAR image change detection based on iteration maximum between-cluster variance
CN108764119B (en) * 2018-05-24 2022-03-18 西安电子科技大学 SAR image change detection method based on iteration maximum between-class variance
CN110119782A (en) * 2019-05-14 2019-08-13 西安电子科技大学 SAR image change detection based on FPGA
CN110956602A (en) * 2019-12-17 2020-04-03 内蒙古工业大学 Method and device for determining change area and storage medium
CN115937199A (en) * 2023-01-06 2023-04-07 山东济宁圣地电业集团有限公司 Spraying quality detection method for insulating layer of power distribution cabinet
CN116385866A (en) * 2023-02-09 2023-07-04 中国铁道科学研究院集团有限公司铁道建筑研究所 SAR image-based railway line color steel house change detection method and device
CN116385866B (en) * 2023-02-09 2024-05-10 中国铁道科学研究院集团有限公司铁道建筑研究所 SAR image-based railway line color steel house change detection method and device

Also Published As

Publication number Publication date
CN103824302B (en) 2016-08-17


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant