CN104899873A - SAR image significance area detection method based on anisotropic diffusion space - Google Patents

SAR image significance area detection method based on anisotropic diffusion space

Info

Publication number
CN104899873A
Authority
CN
China
Prior art keywords
pixel
scale
matrix
row
div
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510254252.1A
Other languages
Chinese (zh)
Other versions
CN104899873B (en)
Inventor
张强
吴艳
王凡
张磊
樊建伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Huoyanwei Optoelectronic Technology Co ltd
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201510254252.1A priority Critical patent/CN104899873B/en
Publication of CN104899873A publication Critical patent/CN104899873A/en
Application granted granted Critical
Publication of CN104899873B publication Critical patent/CN104899873B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G06T 2207/10044 Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a SAR image salient region detection method based on anisotropic diffusion space, which mainly solves the problem that existing algorithms cannot effectively detect salient regions in SAR images corrupted by speckle noise. The method comprises the steps of: (1) calculating the edge strength and diffusion coefficient of each pixel at different scales; (2) constructing scale maps and their comparison maps at the different scales using the row edge parameter matrices and column edge parameter matrices of those scales; (3) constructing the corresponding scale windows on the scale maps and comparison maps, calculating the scale saliency measure, judging saliency accordingly, and determining the saliency measure and saliency scale of each pixel; and (4) obtaining stable salient region coordinates and their extents by iteration. The invention reduces the influence of speckle noise, improves detection accuracy, and effectively gives the extent of salient regions; it can be used for SAR image target detection and target recognition.

Description

SAR image salient region detection method based on anisotropic diffusion space
Technical field
The invention belongs to the technical field of image processing and relates to SAR image salient region detection, which can be used for SAR image target detection and target recognition.
Technical background
As an active microwave imaging radar, the synthetic aperture radar (SAR) system has become an important tool for acquiring remote sensing data because it works in all weather, day and night, and has penetrating capability. With the growth of SAR image data volume and the development of image analysis technology, the demand for automatic SAR image processing is becoming increasingly strong. SAR image target detection, in particular, not only reduces the workload of manual interpretation but is also the basis and key link of SAR automatic target recognition (ATR). Therefore, obtaining SAR image target regions effectively and accurately can improve the efficiency of SAR image target recognition and the positioning accuracy of targets.
Target regions in a SAR image are usually clearly distinguishable from the background. In the human visual system, such regions are exactly the salient regions of low-level vision that are independent of scene content. Therefore, target regions can be obtained by detecting the salient regions in a SAR image. A classical optical-image salient region detection method is the multi-scale approach proposed by Laurent Itti, Christof Koch and Ernst Niebur (L. Itti, C. Koch and E. Niebur, "A Model of Saliency-Based Visual Attention for Rapid Scene Analysis," IEEE Trans. on Pattern Analysis and Machine Intelligence, 1998, 20(11): 1254-1259). The method first performs Gaussian pyramid decomposition on the image; early visual features are obtained from center-surround differences between a fine "center" scale and a coarse "surround" scale; after the center-surround differences are normalized to obtain a saliency map, a winner-take-all strategy is adopted to obtain the salient region positions. Because of its simplicity and strong robustness, the method has been adopted by many optical-image target recognition systems. However, research shows that edge information is the key to judging and locating salient regions, and the Gaussian pyramid decomposition used by this method cannot keep edge positions accurate, which degrades the positioning precision of the salient regions. Moreover, because the method cannot give an explicit salient region size, it cannot accurately mark the extent of salient regions in the image. When the method is applied to SAR images, the large amount of speckle noise produced during SAR imaging alters the real image intensity, causing false edges in homogeneous regions and blurring the true edges of brighter salient regions, so edge information cannot be used accurately to judge salient regions, and the positioning accuracy is reduced.
Summary of the invention
The object of the invention is to overcome the above shortcomings and to propose a SAR image salient region detection method based on anisotropic diffusion space, so as to reduce misjudgments of region saliency and of its position coordinates, improve the accuracy of the algorithm, and effectively give the extent of salient regions, laying a foundation for subsequent target discrimination and recognition.
To achieve the above object, the technical scheme of the present invention comprises the following steps:
(1) Input a SAR image SI of size I × J, and use a rectangular homogeneous region R on the image to calculate the equivalent number of looks ENL of the image;
(2) Given a false-alarm probability p_fa, calculate the initial edge threshold T according to the equivalent number of looks ENL;
(3) Set the maximum scale λ_max, the minimum scale λ_min and the scale interval Δλ. For every integer k from 0 to (λ_max − λ_min)/Δλ, at scale λ_min + k×Δλ, use a (λ_min + k×Δλ) × (λ_min + k×Δλ) detection window to calculate the edge strength g_{i,j,k} of each pixel (i, j), where i is the row of the pixel, j is the column of the pixel, 1 ≤ i ≤ I, 1 ≤ j ≤ J;
(4) According to the edge strength g_{i,j,k}, calculate the diffusion coefficient div_{i,j,k} of each pixel (i, j) at scale λ_min + k×Δλ;
(5) For every integer m from 1 to I and every integer n from 1 to J, calculate, from the diffusion coefficients at scale λ_min + k×Δλ, the row edge parameter matrix A_{m,k} of the m-th row and the column edge parameter matrix A'_{n,k} of the n-th column at that scale;
(6) For every integer k from 0 to (λ_max − λ_min)/Δλ − 1, use the row edge parameter matrices and column edge parameter matrices at scale λ_min + k×Δλ and the additive operator splitting (AOS) scheme to calculate the scale map U_k and the comparison map U'_k at scale λ_min + k×Δλ;
(7) From the scale maps U_k and comparison maps U'_k obtained in (6), calculate the initial saliency matrix Y_t:
7a) calculate the scale saliency measure S_{i,j,k} of each pixel (i, j) at scale λ_min + k×Δλ;
7b) find the maximum of the (λ_max − λ_min)/Δλ scale saliency measures obtained in 7a); if the scale R_{i,j} corresponding to this maximum is λ_min or λ_max − Δλ, then pixel (i, j) has no salient region and its saliency measure is not defined; otherwise pixel (i, j) has a salient region, its saliency measure S_{i,j} is the scale saliency measure at scale R_{i,j}, and the row vector (i, j, R_{i,j}, S_{i,j}) is added to the initial saliency matrix Y_t;
(8) Select the rows of the initial saliency matrix Y_t corresponding to the largest ε% of the saliency measures, 0 < ε ≤ 100, to build a new saliency matrix Y_t', then obtain the stable saliency matrix Y_s by an iterative procedure; take the first two elements of every row of Y_s as the row-column coordinates of a center and the third element as the side length in pixels of a square, and draw the corresponding square salient regions in the SAR image.
Compared with the prior art, the present invention has the following advantages:
(1) The invention judges the edge strength of pixels with detection windows of different scales and builds, by anisotropic diffusion, a scale map corresponding to the edge strength at each scale, which effectively describes the edge information at different scales while keeping edge positions accurate, thereby improving the accuracy of region saliency judgment and localization;
(2) When judging the saliency of a region, the invention calculates the saliency measure with the square window corresponding to each scale and takes the scale of this measure as the saliency scale of the region, thereby giving the actual extent of the salient region in the image.
Simulation results show that, compared with the existing SM salient region detection method, the invention effectively describes the edge information at different scales, increases the detection accuracy of salient regions, and effectively gives the extent of salient regions.
Brief description of the drawings
Fig. 1 is the general flowchart of the implementation of the present invention;
Fig. 2 is the sub-flowchart of calculating the edge strength in the present invention;
Fig. 3 is the sub-flowchart of calculating the diffusion coefficient in the present invention;
Fig. 4 is the sub-flowchart of calculating the row edge parameter matrices and column edge parameter matrices in the present invention;
Fig. 5 is the sub-flowchart of calculating the scale maps and their comparison maps in the present invention;
Fig. 6 is the sub-flowchart of calculating the stable saliency matrix in the present invention;
Fig. 7 shows the scale maps built by anisotropic diffusion of a measured SAR image in the present invention;
Fig. 8 shows the salient region detection results of the present invention on a low-resolution measured SAR image containing vehicle targets;
Fig. 9 shows the salient region detection results of the present invention on a high-resolution measured SAR image containing vehicle targets;
Fig. 10 shows the salient region detection results of the present invention on a measured SAR image containing ship targets.
Embodiments
Embodiments of the invention and their effects are further described below with reference to the accompanying drawings.
Referring to Fig. 1, the specific steps of the embodiment are as follows:
Step 1. Input a SAR image SI of size I × J and use a rectangular homogeneous region R on the image to calculate the equivalent number of looks ENL of the image:
ENL = [mean_{(m,n)∈R}(x_{m,n})]^2 / var_{(m,n)∈R}(x_{m,n}),
where mean(·) denotes the mean, var(·) denotes the variance, (m, n) ∈ R means that pixel (m, n) lies in the region R, x_{m,n} denotes the pixel value of pixel (m, n), I is the number of rows of the image, and J is the number of columns of the image.
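For illustration, a minimal NumPy sketch of this ENL estimate over a user-chosen homogeneous rectangle (the function name, the array name si and the 0-based slicing convention are assumptions, not part of the patent):

```python
import numpy as np

def equivalent_number_of_looks(si, r0, r1, c0, c1):
    """Estimate ENL from a rectangular homogeneous region of a SAR intensity image.

    si     : 2-D array, the SAR image SI
    r0:r1  : row range of the homogeneous region R
    c0:c1  : column range of the homogeneous region R
    """
    region = si[r0:r1, c0:c1].astype(np.float64)
    return region.mean() ** 2 / region.var()

# usage (coordinates are illustrative only)
# enl = equivalent_number_of_looks(si, 58, 226, 109, 269)
```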
Step 2. Given the false-alarm probability p_fa, calculate the initial edge threshold T according to the equivalent number of looks ENL:
T = Qinv(1 − p_fa, ENL) / ENL,
where Qinv(·) is the inverse incomplete gamma function; the false-alarm probability p_fa is set according to the degree of saliency of the target in the image and is set to 10% in this example.
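A corresponding sketch of this step, under the assumption that Qinv denotes the inverse of the regularized lower incomplete gamma function (so that scipy.special.gammaincinv applies):

```python
from scipy.special import gammaincinv

def initial_edge_threshold(enl, p_fa=0.1):
    """Initial edge threshold T = Qinv(1 - p_fa, ENL) / ENL.

    With this reading of Qinv, a pixel in a homogeneous region with local
    mean mu exceeds T * mu with probability p_fa.
    """
    return gammaincinv(enl, 1.0 - p_fa) / enl
```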
Step 3. Use detection windows to calculate the edge strength g_{i,j,k} of each pixel.
Referring to Fig. 2, this step is implemented as follows:
3a) Set the parameters: the maximum scale λ_max, the minimum scale λ_min and the scale interval Δλ; the initial row coordinate of pixel (i, j) is i = 1, the column coordinate is j = 1, and the scale index is k = 0. λ_max, λ_min and Δλ are set according to the possible target sizes in the image; in this example λ_max ≤ 60, λ_min ≥ 2, 2 ≤ Δλ ≤ 10;
3b) Centered on pixel (i, j), use all the pixels in the (λ_min + k×Δλ) × (λ_min + k×Δλ) detection window to calculate the lower mean μ_{i,j,k} and upper mean μ'_{i,j,k} of the pixel at scale λ_min + k×Δλ:
μ_{i,j,k} = ε_{i,j,k} − √( ENL × ψ_{i,j,k}^2 / (ENL + 1) − ε_{i,j,k}^2 )
μ'_{i,j,k} = ε_{i,j,k} + √( ENL × ψ_{i,j,k}^2 / (ENL + 1) − ε_{i,j,k}^2 )
where ε_{i,j,k} is the mean pixel value of the pixels in the (λ_min + k×Δλ) × (λ_min + k×Δλ) detection window centered on pixel (i, j), and ψ_{i,j,k}^2 is the mean of the squared pixel values of the pixels in that window;
3c) Use the initial edge threshold T and the lower mean μ_{i,j,k} to calculate the edge threshold T_{i,j,k} of pixel (i, j) at scale λ_min + k×Δλ:
T_{i,j,k} = μ_{i,j,k} × T;
3d) Count, in the (λ_min + k×Δλ) × (λ_min + k×Δλ) detection window centered on pixel (i, j), the number num_{i,j,k} of pixels greater than or equal to T_{i,j,k} and the number num'_{i,j,k} of pixels less than T_{i,j,k}, and calculate the edge strength g_{i,j,k} of pixel (i, j) at scale λ_min + k×Δλ:
g_{i,j,k} = (μ'_{i,j,k}/μ_{i,j,k} − 1) × min( num_{i,j,k}/(λ_min + k×Δλ)^2, num'_{i,j,k}/(λ_min + k×Δλ)^2 )
where min(·) takes the smaller of the two;
3e) Jump according to the scale: when λ_min + k×Δλ < λ_max, let k = k + 1 and return to step 3b); otherwise, go to step 3f);
3f) Jump according to the pixel coordinates: if j ≠ J, let j = j + 1, k = 0 and return to step 3b); if i ≠ I and j = J, let i = i + 1, j = 1, k = 0 and return to step 3b); if i = I and j = J, go to step 4.
It should be noted that computing the SAR image edge strength is not limited to the method given in this example; any of the following prior-art methods can also be used:
(1) the ratio-of-averages (ROA) method, see R. Touzi, A. Lopès and P. Bousquet, "A statistical and geometrical edge detector for SAR images," IEEE Trans. Geosci. Remote Sensing, 1988, 26(6): 764-773;
(2) the likelihood-ratio (LR) method, see C. J. Oliver, D. Blacknell and R. G. White, "Optimum edge detection in SAR," IEE Proceedings - Radar, Sonar and Navigation, 1996, 143(1): 31-40;
(3) the ratio-of-exponentially-weighted-averages (ROEWA) method, see R. Fjørtoft, A. Lopès, P. Marthon and E. Cubero-Castan, "An Optimal Multiedge Detector for SAR Image Segmentation," IEEE Trans. Geosci. Remote Sensing, 1998, 36(3): 793-802.
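For illustration, a minimal NumPy sketch of steps 3b)-3d) for a single pixel and one window size lam (the function name and the square root over the window statistics in the lower/upper means are assumptions, and the window is assumed to lie fully inside the image):

```python
import numpy as np

def edge_strength(si, i, j, lam, enl, T):
    """Edge strength g_{i,j,k} of pixel (i, j) for a lam x lam detection window."""
    h = lam // 2
    win = si[i - h:i + h + 1, j - h:j + h + 1].astype(np.float64)
    eps = win.mean()                     # ε_{i,j,k}: mean pixel value in the window
    psi2 = (win ** 2).mean()             # ψ²_{i,j,k}: mean of squared pixel values
    s = np.sqrt(max(enl * psi2 / (enl + 1.0) - eps ** 2, 0.0))
    mu_lo, mu_hi = eps - s, eps + s      # lower and upper means, step 3b)
    T_ijk = mu_lo * T                    # local edge threshold, step 3c)
    num_hi = np.count_nonzero(win >= T_ijk)
    num_lo = win.size - num_hi
    # step 3d): contrast term times the smaller window fraction
    return (mu_hi / mu_lo - 1.0) * min(num_hi, num_lo) / float(lam * lam)
```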
Step 4. According to the edge strength g_{i,j,k}, calculate the diffusion coefficient div_{i,j,k} of each pixel at each scale.
Referring to Fig. 3, this step is implemented as follows:
4a) Set the initial row coordinate of pixel (i, j) to i = 1, the column coordinate to j = 1, and the scale index to k = 0;
4b) According to the edge strength g_{i,j,k}, calculate the diffusion coefficient div_{i,j,k} of pixel (i, j) at scale λ_min + k×Δλ:
div_{i,j,k} = 1 − e^{−3.315/(g_{i,j,k}/gt)^4},
where gt is a preset edge parameter whose value is chosen up to the maximum of g_{i,j,0} over all pixels, 1 ≤ i ≤ I, 1 ≤ j ≤ J;
4c) Jump according to the scale: when λ_min + k×Δλ < λ_max, let k = k + 1 and return to step 4b); otherwise, go to step 4d);
4d) Jump according to the pixel coordinates: if j ≠ J, let j = j + 1, k = 0 and return to step 4b); if i ≠ I and j = J, let i = i + 1, j = 1, k = 0 and return to step 4b); if i = I and j = J, go to step 5.
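The diffusivity in 4b) drops rapidly once the edge strength exceeds the contrast parameter gt, so diffusion is strong in homogeneous areas and stopped across edges. A small vectorized sketch (array handling and the treatment of the g = 0 limit are assumptions):

```python
import numpy as np

def diffusion_coefficient(g, gt):
    """div_{i,j,k} = 1 - exp(-3.315 / (g_{i,j,k}/gt)^4).

    div -> 1 where g is small (homogeneous areas, full diffusion)
    div -> 0 where g >> gt (strong edges, diffusion stopped)
    """
    ratio = np.asarray(g, dtype=np.float64) / gt
    div = np.ones_like(ratio)                 # limiting value for g = 0
    nz = ratio > 0
    div[nz] = 1.0 - np.exp(-3.315 / ratio[nz] ** 4)
    return div
```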
Step 5. According to the diffusion coefficient div_{i,j,k} of each pixel at each scale, calculate the row edge parameter matrix A_{m,k} and the column edge parameter matrix A'_{n,k} of each scale.
Referring to Fig. 4, this step is implemented as follows:
5a) Set the initial row coordinate of the image to m = 1, the column coordinate to n = 1, and the scale index to k = 0;
5b) Calculate the row edge parameter matrix A_{m,k} of the m-th row at scale λ_min + k×Δλ, a J × J tridiagonal matrix with entries
a_{i,i−1} = −Δλ × (div_{m,i,k} + div_{m,i−1,k}),
a_{i,i+1} = −Δλ × (div_{m,i,k} + div_{m,i+1,k}),
a_{i,i} = 1 + Δλ × (2×div_{m,i,k} + div_{m,i−1,k} + div_{m,i+1,k}), 1 ≤ i ≤ J;
5c) Jump according to the row number: if m ≠ I, let m = m + 1 and return to step 5b); if m = I, go to step 5d);
5d) Calculate the column edge parameter matrix A'_{n,k} of the n-th column at scale λ_min + k×Δλ, an I × I tridiagonal matrix with entries
a'_{j,j−1} = −Δλ × (div_{j,n,k} + div_{j−1,n,k}),
a'_{j,j+1} = −Δλ × (div_{j,n,k} + div_{j+1,n,k}),
a'_{j,j} = 1 + Δλ × (2×div_{j,n,k} + div_{j−1,n,k} + div_{j+1,n,k}), 1 ≤ j ≤ I;
5e) Jump according to the column number: if n ≠ J, let n = n + 1 and return to step 5d); if n = J, go to step 5f);
5f) Jump according to the scale: when λ_min + k×Δλ ≠ λ_max, let m = 1, n = 1, k = k + 1 and return to step 5b); when λ_min + k×Δλ = λ_max, go to step 6.
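A sketch of how the three diagonals of one row matrix A_{m,k} could be assembled (0-based indexing and the reflecting treatment of the two boundary entries are assumptions; the patent only lists the interior entries):

```python
import numpy as np

def row_edge_parameter_diagonals(div_k, m, dlam):
    """Three diagonals of the J x J tridiagonal matrix A_{m,k} for image row m.

    div_k : 2-D array (I x J) of diffusion coefficients at one scale
    m     : row index (0-based here; the patent counts rows from 1)
    dlam  : scale interval Δλ, used as the diffusion step
    """
    d = div_k[m, :].astype(np.float64)
    J = d.size
    lower = np.zeros(J); upper = np.zeros(J); main = np.ones(J)
    lower[1:]  = -dlam * (d[1:] + d[:-1])                    # a_{i,i-1}
    upper[:-1] = -dlam * (d[:-1] + d[1:])                    # a_{i,i+1}
    main[1:-1] = 1 + dlam * (2 * d[1:-1] + d[:-2] + d[2:])   # a_{i,i}, interior
    # boundary entries: neighbours outside the image treated as absent (assumption)
    main[0]  = 1 + dlam * (d[0] + d[1])
    main[-1] = 1 + dlam * (d[-1] + d[-2])
    return lower, main, upper
```

Because every A_{m,k} (and every A'_{n,k}) is tridiagonal, the implicit diffusion step can be solved row by row and column by column with the Thomas algorithm used in step 6.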
Step 6. According to the row edge parameter matrices A_{m,k} and column edge parameter matrices A'_{n,k} of the different scales, use the additive operator splitting (AOS) scheme to calculate the corresponding scale maps U_k and comparison maps U'_k.
Referring to Fig. 5, this step is implemented as follows:
6a) Set the initial row coordinate of the image to m = 1, the column coordinate to n = 1, and the scale index to k = 0;
6b) Set the initial scale map U_k to the image SI, the initial comparison map U'_k to the image SI, and the iteration counter cn = 1;
6c) Use the row edge parameter matrix A_{m,k} and the m-th row of the scale map U_k to calculate, by the Thomas algorithm, the m-th row of the row scale map U_k^r;
6d) Use the row edge parameter matrix A_{m,k+1} and the m-th row of the comparison map U'_k to calculate, by the Thomas algorithm, the m-th row of the row comparison map U'^r_k;
6e) Jump according to the row number: if m ≠ I, let m = m + 1 and return to step 6c); if m = I, go to step 6f);
6f) Use the column edge parameter matrix A'_{n,k} and the n-th column of the scale map U_k to calculate, by the Thomas algorithm, the n-th column of the column scale map U_k^c;
6g) Use the column edge parameter matrix A'_{n,k+1} and the n-th column of the comparison map U'_k to calculate, by the Thomas algorithm, the n-th column of the column comparison map U'^c_k;
6h) Jump according to the column number: if n ≠ J, let n = n + 1 and return to step 6f); if n = J, go to step 6i);
6i) Calculate the scale map U_k and comparison map U'_k at scale λ_min + k×Δλ:
U_k = (U_k^r + U_k^c) / 2
U'_k = (U'^r_k + U'^c_k) / 2;
6j) Jump according to the iteration counter: if cn < (λ_min + k×Δλ)/2, let m = 1, n = 1, cn = cn + 1 and return to step 6c); otherwise, go to step 6k);
6k) Jump according to the scale: when λ_min + k×Δλ ≠ λ_max − Δλ, let m = 1, n = 1, k = k + 1 and return to step 6b); when λ_min + k×Δλ = λ_max − Δλ, go to step 7.
The Thomas algorithm, proposed by the British-born mathematician Llewellyn Thomas, solves tridiagonal systems of equations by a simplified form of Gaussian elimination; for the specific algorithm see H. R. Schwarz, Numerische Mathematik, Stuttgart, Germany: Teubner, 1988, pp. 43-45.
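One AOS half-step thus amounts to solving a tridiagonal system per image row (and per column). A generic sketch of the Thomas solver, using the diagonal convention of the previous snippet:

```python
import numpy as np

def thomas_solve(lower, main, upper, rhs):
    """Solve the tridiagonal system A x = rhs with the Thomas algorithm.

    lower, main, upper : the three diagonals of A (lower[0] and upper[-1] unused)
    rhs                : right-hand side, e.g. one row of the current scale map
    """
    n = main.size
    c = np.empty(n); d = np.empty(n)
    c[0] = upper[0] / main[0]
    d[0] = rhs[0] / main[0]
    for i in range(1, n):                      # forward elimination
        denom = main[i] - lower[i] * c[i - 1]
        c[i] = upper[i] / denom
        d[i] = (rhs[i] - lower[i] * d[i - 1]) / denom
    x = np.empty(n)
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = d[i] - c[i] * x[i + 1]
    return x
```

In an AOS step, thomas_solve would be called once per image row with the diagonals of A_{m,k} and the current row of U_k, once per column with the diagonals of A'_{n,k}, and the two results averaged as in step 6i); each tridiagonal solve is linear in the number of unknowns, which is what makes the splitting scheme cheap.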
Step 7. According to the scale maps U_k and comparison maps U'_k, calculate the initial saliency matrix Y_t.
7a) Set the initial row coordinate of pixel (i, j) to i = 1 and the column coordinate to j = 1;
7b) For every integer k from 0 to (λ_max − λ_min)/Δλ − 1, calculate the scale saliency measure S_{i,j,k} of pixel (i, j) at scale λ_min + k×Δλ:
S_{i,j,k} = Σ_{x=0,…,255} p_{i,j,k}(x) log[ p_{i,j,k}(x) / p'_{i,j,k}(x) ] + Σ_{x=0,…,255} p'_{i,j,k}(x) log[ p'_{i,j,k}(x) / p_{i,j,k}(x) ]
where x is a pixel value ranging from 0 to 255, p_{i,j,k}(x) is the probability of pixel value x in the (λ_min + k×Δλ) × (λ_min + k×Δλ) square window centered on pixel (i, j) of the scale map U_k, and p'_{i,j,k}(x) is the probability of pixel value x in the (λ_min + k×Δλ) × (λ_min + k×Δλ) square window centered on pixel (i, j) of the comparison map U'_k;
7c) Find the maximum of the (λ_max − λ_min)/Δλ scale saliency measures obtained in 7b); if the scale R_{i,j} corresponding to this maximum is λ_min or λ_max − Δλ, then pixel (i, j) has no salient region and its saliency measure is not defined; otherwise pixel (i, j) has a salient region, its saliency measure S_{i,j} is the scale saliency measure at scale R_{i,j}, and the row vector (i, j, R_{i,j}, S_{i,j}) is added to the initial saliency matrix Y_t;
7d) Jump according to the pixel coordinates: if j ≠ J, let j = j + 1 and return to step 7b); if i ≠ I and j = J, let i = i + 1, j = 1 and return to step 7b); if i = I and j = J, go to step 8.
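The measure in 7b) is the symmetric Kullback-Leibler divergence between the grey-level histograms of the two windows. A sketch, assuming the maps are quantized to 8-bit grey levels and adding a small ε to empty histogram bins (an implementation choice not specified in the patent):

```python
import numpy as np

def scale_saliency_measure(scale_win, comp_win, eps=1e-12):
    """Symmetric KL divergence between the histograms of two square windows.

    scale_win : window of the scale map U_k centered on (i, j)
    comp_win  : same window of the comparison map U'_k
    """
    p = np.bincount(scale_win.astype(np.uint8).ravel(), minlength=256) / scale_win.size
    q = np.bincount(comp_win.astype(np.uint8).ravel(), minlength=256) / comp_win.size
    p = p + eps
    q = q + eps
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))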
Step 8. According to the initial saliency matrix Y_t, obtain the stable saliency matrix Y_s.
Referring to Fig. 6, this step is implemented as follows:
8a) Select the rows of the initial saliency matrix Y_t corresponding to the largest ε% of the saliency measures to build a new saliency matrix Y_t', 0 < ε ≤ 100, where ε is at most 20;
8b) Set the region saliency ratio sr, whose value is not less than 0.3, and initialize the stable saliency matrix Y_s as an empty matrix;
8c) Select the pixel with the largest saliency measure in the new saliency matrix Y_t' as the candidate point, construct the square window centered on the first two elements of its row with the third element as side length, and calculate the ratio sr' of the number of pixels in this square window that have a salient region to the total number of pixels in the window;
8d) Compare sr' with sr: if sr' < sr, remove the row of the candidate point from the new saliency matrix Y_t'; otherwise, add the row of the candidate point to the stable saliency matrix Y_s, and then remove from Y_t' the rows of the candidate point and of all pixels in the square window that have a salient region;
8e) Judge whether the new saliency matrix Y_t' is empty: if empty, stop and output the stable saliency matrix Y_s; otherwise, return to step 8c).
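A sketch of this iteration, under two assumptions the text does not fix: "pixels that have a salient region" is read as the rows still present in Y_t', and the square window is taken as the cR × cR neighbourhood of the candidate (0-based indices; the function and variable names are illustrative):

```python
import numpy as np

def stable_saliency_matrix(Yt, eps_percent, sr):
    """Iterative selection of stable salient regions (steps 8a)-8e)).

    Yt          : rows (i, j, R_ij, S_ij) of the initial saliency matrix
    eps_percent : ε, percentage of rows with the largest saliency measures kept
    sr          : region saliency ratio threshold
    """
    # 8a) keep the top ε% of rows by saliency measure S_ij
    order = np.argsort(Yt[:, 3])[::-1]
    keep = max(1, int(np.ceil(len(Yt) * eps_percent / 100.0)))
    Yt_new = Yt[order[:keep]].tolist()
    Ys = []                                     # 8b) stable saliency matrix, initially empty
    while Yt_new:                               # 8e) iterate until Y_t' is empty
        # 8c) candidate point: row with the largest saliency measure
        c = max(range(len(Yt_new)), key=lambda r: Yt_new[r][3])
        ci, cj, cR = (int(v) for v in Yt_new[c][:3])
        h = cR // 2
        in_win = [row for row in Yt_new
                  if abs(int(row[0]) - ci) <= h and abs(int(row[1]) - cj) <= h]
        sr_prime = len(in_win) / float(cR * cR)
        if sr_prime < sr:                       # 8d) reject: drop the candidate row only
            Yt_new.pop(c)
        else:                                   # 8d) accept: keep it and clear its window
            Ys.append(Yt_new[c])
            Yt_new = [row for row in Yt_new if row not in in_win]
    return np.asarray(Ys)
```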
Step 9. For every row of the stable saliency matrix Y_s, draw the corresponding square salient region in the SAR image, where the first two elements of the row are the row-column coordinates of the square center and the third element is the side length of the square.
The effect of the present invention can be further illustrated by the following simulations.
1. Experimental conditions
The simulation environment is MATLAB R2011b on an Intel(R) Core i5-3470 CPU @ 3.2 GHz running Windows 7 Professional.
2. Experimental contents and results:
Experiment 1. Detection windows of 9 × 9 and 17 × 17 are applied to a measured SAR image, and the additive operator splitting scheme is then used to obtain the scale maps at the corresponding scales. The results are shown in Fig. 7, where Fig. 7(a) is the measured SAR image, Fig. 7(b) is the scale map at scale 9, and Fig. 7(c) is the scale map at scale 17.
As can be seen from Fig. 7(b), because the edge scales of the vehicles and of the terrain are larger than the 9 × 9 detection window, when the edge strength obtained with this window is used for anisotropic diffusion, the homogeneous regions are blurred while the edge information of the vehicles and the terrain is well preserved.
As can be seen from Fig. 7(c), because the vehicle edge scale is smaller than the 17 × 17 detection window, when the edge strength of the vehicle edges obtained with this window is used for anisotropic diffusion, the vehicle edges become blurred, whereas the corresponding terrain edges are still larger than the detection window, so the terrain edge information is preserved.
Experiment 2. The method of the invention and the existing SM salient region detection algorithm (L. Itti, C. Koch and E. Niebur, "A Model of Saliency-Based Visual Attention for Rapid Scene Analysis," IEEE Trans. on Pattern Analysis and Machine Intelligence, 1998, 20(11): 1254-1259) are used to detect salient regions in a low-resolution measured SAR image containing vehicle targets.
Parameter settings: the homogeneous region R is the rectangle with row coordinates 58-225 and column coordinates 109-268; false-alarm probability p_fa = 0.1; maximum scale λ_max = 33, minimum scale λ_min = 5, scale interval Δλ = 4; ε = 15; region saliency ratio sr = 1.
The detection results are shown in Fig. 8, where Fig. 8(a) is the low-resolution measured SAR image containing vehicle targets, Fig. 8(b) is the salient region detection result of the SM algorithm on Fig. 8(a), and Fig. 8(c) is the salient region detection result of the method of the invention on Fig. 8(a).
As can be seen from Fig. 8(a), the vehicle targets are salient in the whole image.
As can be seen from Fig. 8(b), the SM algorithm detects only 6 of the 13 vehicle targets and falsely detects part of the homogeneous region as targets; moreover, although it can mark the vehicle target positions fairly accurately for this low-resolution image, it cannot accurately mark the target extents.
As can be seen from Fig. 8(c), the invention accurately detects the positions of all vehicle targets without false detections in the homogeneous regions, and the marked target extents agree with the actual sizes of the vehicle targets.
Experiment 3. The method of the invention and the existing SM salient region detection algorithm are used to detect salient regions in a high-resolution measured SAR image containing vehicle targets.
Parameter settings: the homogeneous region R is the rectangle with row coordinates 35-74 and column coordinates 180-209; false-alarm probability p_fa = 0.1; maximum scale λ_max = 57, minimum scale λ_min = 9, scale interval Δλ = 8; ε = 10; region saliency ratio sr = 0.8.
The detection results are shown in Fig. 9, where Fig. 9(a) is the high-resolution measured SAR image containing vehicle targets, Fig. 9(b) is the salient region detection result of the SM algorithm on Fig. 9(a), and Fig. 9(c) is the salient region detection result of the method of the invention on Fig. 9(a).
As can be seen from Fig. 9(a), the vehicle targets are salient in the whole image.
As can be seen from Fig. 9(b), although the SM algorithm detects all 13 vehicle targets, it is affected by terrain variation and falsely detects terrain-variation regions as targets; in addition, it cannot accurately obtain the positions of the vehicle targets, nor can it accurately mark the target extents.
As can be seen from Fig. 9(c), the invention is not affected by terrain variation and produces no false detections, accurately detects the positions of all vehicle targets, and accurately marks target extents that agree with the actual sizes of the vehicle targets.
Experiment 4. The method of the invention and the existing SM salient region detection algorithm are used to detect salient regions in a measured SAR image containing ship targets.
Parameter settings: the homogeneous region R is the rectangle with row coordinates 287-335 and column coordinates 226-292; false-alarm probability p_fa = 0.1; maximum scale λ_max = 31, minimum scale λ_min = 7, scale interval Δλ = 2; ε = 6.6; region saliency ratio sr = 0.31.
The detection results are shown in Fig. 10, where Fig. 10(a) is the measured SAR image containing ship targets, Fig. 10(b) is the salient region detection result of the SM algorithm on Fig. 10(a), and Fig. 10(c) is the salient region detection result of the method of the invention on Fig. 10(a).
As can be seen from Fig. 10(a), the ship targets are salient in the whole image.
As can be seen from Fig. 10(b), the SM algorithm cannot accurately mark the positions of the ship targets.
As can be seen from Fig. 10(c), for elongated ship targets the method not only accurately detects the positions of all ship targets but also uses the overlap of multiple salient regions to give target extents that agree with the actual sizes of the ship targets.

Claims (9)

1. A SAR image salient region detection method based on anisotropic diffusion space, comprising the following steps:
(1) inputting a SAR image SI of size I × J, and using a rectangular homogeneous region R on the image to calculate the equivalent number of looks ENL of the image;
(2) given a false-alarm probability p_fa, calculating the initial edge threshold T according to the equivalent number of looks ENL;
(3) setting the maximum scale λ_max, the minimum scale λ_min and the scale interval Δλ, and for every integer k from 0 to (λ_max − λ_min)/Δλ, at scale λ_min + k×Δλ, using a (λ_min + k×Δλ) × (λ_min + k×Δλ) detection window to calculate the edge strength g_{i,j,k} of each pixel (i, j), where i is the row of the pixel, j is the column of the pixel, 1 ≤ i ≤ I, 1 ≤ j ≤ J;
(4) according to the edge strength g_{i,j,k}, calculating the diffusion coefficient div_{i,j,k} of each pixel (i, j) at scale λ_min + k×Δλ;
(5) for every integer m from 1 to I and every integer n from 1 to J, calculating, from the diffusion coefficients at scale λ_min + k×Δλ, the row edge parameter matrix A_{m,k} of the m-th row and the column edge parameter matrix A'_{n,k} of the n-th column at scale λ_min + k×Δλ;
(6) for every integer k from 0 to (λ_max − λ_min)/Δλ − 1, using the row edge parameter matrices and column edge parameter matrices at scale λ_min + k×Δλ and the additive operator splitting scheme to calculate the scale map U_k and the comparison map U'_k at scale λ_min + k×Δλ;
(7) from the scale maps U_k and comparison maps U'_k obtained in (6), calculating the initial saliency matrix Y_t:
7a) calculating the scale saliency measure S_{i,j,k} of each pixel (i, j) at scale λ_min + k×Δλ;
7b) finding the maximum of the (λ_max − λ_min)/Δλ scale saliency measures obtained in 7a); if the scale R_{i,j} corresponding to this maximum is λ_min or λ_max − Δλ, pixel (i, j) has no salient region and its saliency measure is not defined; otherwise pixel (i, j) has a salient region, its saliency measure S_{i,j} is the scale saliency measure at scale R_{i,j}, and the row vector (i, j, R_{i,j}, S_{i,j}) is added to the initial saliency matrix Y_t;
(8) selecting the rows of the initial saliency matrix Y_t corresponding to the largest ε% of the saliency measures, 0 < ε ≤ 100, to build a new saliency matrix Y_t', then obtaining the stable saliency matrix Y_s by an iterative procedure; taking the first two elements of every row of Y_s as the row-column coordinates of a center and the third element as the side length in pixels of a square, and drawing the corresponding square salient regions in the SAR image.
2. The method according to claim 1, wherein the equivalent number of looks ENL of step (1) is calculated as follows:
ENL = [mean_{(m,n)∈R}(x_{m,n})]^2 / var_{(m,n)∈R}(x_{m,n}),
where mean(·) denotes the mean, var(·) denotes the variance, (m, n) ∈ R means that pixel (m, n) lies in the region R, and x_{m,n} denotes the pixel value of pixel (m, n).
3. The method according to claim 1, wherein the initial edge threshold T in step (2) is calculated as follows:
T = Qinv(1 − p_fa, ENL) / ENL,
where Qinv(·) is the inverse incomplete gamma function.
4. The method according to claim 1, wherein the edge strength g_{i,j,k} of each pixel (i, j) in step (3) is obtained as follows:
(3a) centered on pixel (i, j), using all the pixels in the (λ_min + k×Δλ) × (λ_min + k×Δλ) detection window, calculating the lower mean μ_{i,j,k} and upper mean μ'_{i,j,k} of the pixel at scale λ_min + k×Δλ:
μ_{i,j,k} = ε_{i,j,k} − √( ENL × ψ_{i,j,k}^2 / (ENL + 1) − ε_{i,j,k}^2 )
μ'_{i,j,k} = ε_{i,j,k} + √( ENL × ψ_{i,j,k}^2 / (ENL + 1) − ε_{i,j,k}^2 )
where ε_{i,j,k} is the mean pixel value of the pixels in the (λ_min + k×Δλ) × (λ_min + k×Δλ) detection window centered on pixel (i, j), and ψ_{i,j,k}^2 is the mean of the squared pixel values of the pixels in that window;
(3b) using the initial edge threshold T and the lower mean μ_{i,j,k}, calculating the edge threshold T_{i,j,k} of pixel (i, j) at scale λ_min + k×Δλ:
T_{i,j,k} = μ_{i,j,k} × T;
(3c) counting, in the (λ_min + k×Δλ) × (λ_min + k×Δλ) detection window centered on pixel (i, j), the number num_{i,j,k} of pixels greater than or equal to T_{i,j,k} and the number num'_{i,j,k} of pixels less than T_{i,j,k}, and calculating the edge strength g_{i,j,k} of pixel (i, j) at scale λ_min + k×Δλ:
g_{i,j,k} = (μ'_{i,j,k}/μ_{i,j,k} − 1) × min( num_{i,j,k}/(λ_min + k×Δλ)^2, num'_{i,j,k}/(λ_min + k×Δλ)^2 )
where min(·) takes the smaller of the two.
5. The method according to claim 1, wherein the diffusion coefficient div_{i,j,k} of each pixel (i, j) at scale λ_min + k×Δλ in step (4) is calculated as follows:
div_{i,j,k} = 1 − e^{−3.315/(g_{i,j,k}/gt)^4},
where gt is a preset edge parameter whose value is chosen up to the maximum of g_{i,j,0} over all pixels, 1 ≤ i ≤ I, 1 ≤ j ≤ J.
6. The method according to claim 1, wherein the row edge parameter matrix A_{m,k} of the m-th row and the column edge parameter matrix A'_{n,k} of the n-th column at scale λ_min + k×Δλ in step (5) are calculated as follows:
a_{i,i−1} = −Δλ × (div_{m,i,k} + div_{m,i−1,k}),
a_{i,i+1} = −Δλ × (div_{m,i,k} + div_{m,i+1,k}),
a_{i,i} = 1 + Δλ × (2×div_{m,i,k} + div_{m,i−1,k} + div_{m,i+1,k}), 1 ≤ i ≤ J,
a'_{j,j−1} = −Δλ × (div_{j,n,k} + div_{j−1,n,k}),
a'_{j,j+1} = −Δλ × (div_{j,n,k} + div_{j+1,n,k}),
a'_{j,j} = 1 + Δλ × (2×div_{j,n,k} + div_{j−1,n,k} + div_{j+1,n,k}), 1 ≤ j ≤ I.
7. The method according to claim 1, wherein the scale map U_k and the comparison map U'_k at scale λ_min + k×Δλ in step (6) are obtained as follows:
(6a) setting the initial scale map U_k to the image SI and the initial comparison map U'_k to the image SI;
(6b) using the row edge parameter matrix A_{m,k} and the m-th row of the scale map U_k, calculating by the Thomas algorithm the m-th row of the row scale map U_k^r, 1 ≤ m ≤ I;
(6c) using the row edge parameter matrix A_{m,k+1} and the m-th row of the comparison map U'_k, calculating by the Thomas algorithm the m-th row of the row comparison map U'^r_k;
(6d) using the column edge parameter matrix A'_{n,k} and the n-th column of the scale map U_k, calculating by the Thomas algorithm the n-th column of the column scale map U_k^c, 1 ≤ n ≤ J;
(6e) using the column edge parameter matrix A'_{n,k+1} and the n-th column of the comparison map U'_k, calculating by the Thomas algorithm the n-th column of the column comparison map U'^c_k;
(6f) calculating the scale map U_k and comparison map U'_k at scale λ_min + k×Δλ:
U_k = (U_k^r + U_k^c) / 2
U'_k = (U'^r_k + U'^c_k) / 2;
(6g) judging whether steps (6b) to (6f) have been calculated k times: if so, stop and output the scale map U_k and comparison map U'_k; otherwise, return to step (6b).
8. The method according to claim 1, wherein the scale saliency measure S_{i,j,k} of each pixel (i, j) at scale λ_min + k×Δλ in step 7a) is calculated as follows:
S_{i,j,k} = Σ_{x=0,…,255} p_{i,j,k}(x) log[ p_{i,j,k}(x) / p'_{i,j,k}(x) ] + Σ_{x=0,…,255} p'_{i,j,k}(x) log[ p'_{i,j,k}(x) / p_{i,j,k}(x) ]
where x is a pixel value ranging from 0 to 255, p_{i,j,k}(x) is the probability of pixel value x in the (λ_min + k×Δλ) × (λ_min + k×Δλ) square window centered on pixel (i, j) of the scale map U_k, and p'_{i,j,k}(x) is the probability of pixel value x in the (λ_min + k×Δλ) × (λ_min + k×Δλ) square window centered on pixel (i, j) of the comparison map U'_k.
9. The method according to claim 1, wherein the stable saliency matrix Y_s is obtained in step (8) by the following iterative procedure:
(8a) setting the region saliency ratio sr, and initializing the stable saliency matrix Y_s as an empty matrix;
(8b) selecting the pixel with the largest saliency measure in the new saliency matrix Y_t' as the candidate point, constructing the square window centered on the first two elements of its row with the third element as side length, and calculating the ratio sr' of the number of pixels in this square window that have a salient region to the total number of pixels in the window;
(8c) comparing sr' with sr: if sr' < sr, removing the row of the candidate point from the saliency matrix Y_t'; otherwise, adding the row of the candidate point to the stable saliency matrix Y_s, and then removing from the saliency matrix Y_t' the rows of the candidate point and of all pixels in the square window that have a salient region;
(8d) judging whether the saliency matrix Y_t' is empty: if empty, stop and output the stable saliency matrix Y_s; otherwise, return to step (8b).
CN201510254252.1A 2015-05-18 2015-05-18 SAR image salient region detection method based on Anisotropic diffusion space Active CN104899873B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510254252.1A CN104899873B (en) 2015-05-18 2015-05-18 SAR image salient region detection method based on Anisotropic diffusion space

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510254252.1A CN104899873B (en) 2015-05-18 2015-05-18 SAR image salient region detection method based on Anisotropic diffusion space

Publications (2)

Publication Number Publication Date
CN104899873A true CN104899873A (en) 2015-09-09
CN104899873B CN104899873B (en) 2017-10-24

Family

ID=54032518

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510254252.1A Active CN104899873B (en) 2015-05-18 2015-05-18 SAR image salient region detection method based on Anisotropic diffusion space

Country Status (1)

Country Link
CN (1) CN104899873B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107301643A (en) * 2017-06-06 2017-10-27 西安电子科技大学 Well-marked target detection method based on robust rarefaction representation Yu Laplce's regular terms

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103500453A (en) * 2013-10-13 2014-01-08 西安电子科技大学 SAR(synthetic aperture radar) image significance region detection method based on Gamma distribution and neighborhood information
US20150117783A1 (en) * 2013-10-24 2015-04-30 Adobe Systems Incorporated Iterative saliency map estimation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103500453A (en) * 2013-10-13 2014-01-08 西安电子科技大学 SAR(synthetic aperture radar) image significance region detection method based on Gamma distribution and neighborhood information
US20150117783A1 (en) * 2013-10-24 2015-04-30 Adobe Systems Incorporated Iterative saliency map estimation

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
QIANG ZHANG et al.: "Multiple-Scale Salient-Region Detection of SAR Image Based on Gamma Distribution and Local Intensity Variation", IEEE Geoscience and Remote Sensing Letters *
谢惠杰 et al.: "Scale-adaptive saliency detection method for SAR images", Computer Engineering and Applications *
贺良杰 et al.: "Saliency detection based on local contrast and global rarity", Application Research of Computers *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107301643A (en) * 2017-06-06 2017-10-27 西安电子科技大学 Well-marked target detection method based on robust rarefaction representation Yu Laplce's regular terms
CN107301643B (en) * 2017-06-06 2019-08-06 西安电子科技大学 Well-marked target detection method based on robust rarefaction representation Yu Laplce's regular terms

Also Published As

Publication number Publication date
CN104899873B (en) 2017-10-24

Similar Documents

Publication Publication Date Title
CN101975940B (en) Segmentation combination-based adaptive constant false alarm rate target detection method for SAR image
CN101980293B (en) Method for detecting MTF of hyperspectral remote sensing system based on edge image
CN105335966B (en) Multiscale morphology image division method based on local homogeney index
CN103971364B (en) Remote sensing image variation detecting method on basis of weighted Gabor wavelet characteristics and two-stage clusters
CN110210448B (en) Intelligent face skin aging degree identification and evaluation method
CN103605953A (en) Vehicle interest target detection method based on sliding window search
CN106934795A (en) The automatic testing method and Forecasting Methodology of a kind of glue into concrete beam cracks
CN103500453B (en) Based on the SAR image salient region detection method of Gamma distribution and neighborhood information
CN104134080A (en) Method and system for automatically detecting roadbed collapse and side slope collapse of road
CN102129573A (en) SAR (Synthetic Aperture Radar) image segmentation method based on dictionary learning and sparse representation
CN104112279B (en) A kind of object detection method and device
CN103207987A (en) Indicating value identification method of dial instrument
CN105389799B (en) SAR image object detection method based on sketch map and low-rank decomposition
CN112013921B (en) Method, device and system for acquiring water level information based on water level gauge measurement image
CN101551851A (en) Infrared image target identification method
CN106127205A (en) A kind of recognition methods of the digital instrument image being applicable to indoor track machine people
CN101770583B (en) Template matching method based on global features of scene
JP4946878B2 (en) Image identification apparatus and program
CN112183301B (en) Intelligent building floor identification method and device
CN104318559A (en) Quick feature point detecting method for video image matching
Yuan et al. Combining maps and street level images for building height and facade estimation
CN103268496A (en) Target identification method of SAR (synthetic aperture radar) images
CN102360503A (en) SAR (Specific Absorption Rate) image change detection method based on space approach degree and pixel similarity
CN104574312A (en) Method and device of calculating center of circle for target image
CN106291550A (en) The polarization SAR Ship Detection of core is returned based on local scattering mechanism difference

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230620

Address after: Room 124, Room 70102, Xinglinggu Podium Building, Entrepreneurship Research and Development Park, No. 69 Jinye Road, High tech Zone, Xi'an City, Shaanxi Province, 710076

Patentee after: Xi'an Huoyanwei Optoelectronic Technology Co.,Ltd.

Address before: 710071 No. 2 Taibai South Road, Shaanxi, Xi'an

Patentee before: XIDIAN University

TR01 Transfer of patent right