CN107944357A - Multi-source remote sensing image cloud detection method based on evidence-fusion adaptive thresholding - Google Patents


Info

Publication number
CN107944357A
Authority
CN
China
Prior art keywords
cloud
mrow
msub
msubsup
pixel
Prior art date
Legal status
Granted
Application number
CN201711113064.2A
Other languages
Chinese (zh)
Other versions
CN107944357B (en)
Inventor
方薇
乔延利
张冬英
易维宁
黄红莲
杜丽丽
Current Assignee
Hefei Institutes of Physical Science of CAS
Original Assignee
Hefei Institutes of Physical Science of CAS
Priority date
Filing date
Publication date
Application filed by Hefei Institutes of Physical Science of CAS filed Critical Hefei Institutes of Physical Science of CAS
Priority to CN201711113064.2A priority Critical patent/CN107944357B/en
Publication of CN107944357A publication Critical patent/CN107944357A/en
Application granted granted Critical
Publication of CN107944357B publication Critical patent/CN107944357B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • G06V20/13: Satellite images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/40: Extraction of image or video features
    • G06V10/50: Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to a multi-source remote sensing image cloud detection method based on evidence-fusion adaptive thresholding, which overcomes the prior-art difficulty of combining multi-data-source fusion with adaptive thresholds for cloud detection. The method comprises the following steps: selection and preprocessing of the data source images; initial distribution estimation for the two event classes "cloud" and "non-cloud"; initial selection of classification thresholds and determination of trust intervals; optimal classification based on a sequential decision tree and evidence fusion; and output of the cloud detection classification results. By operating a multiband optical sensor jointly with a multi-angle polarimeter, the present invention fuses the multiple data sources needed for "cloud" detection, effectively solves the accuracy problem of "cloud"/"non-cloud" classification, and improves classification precision in cloud edge detection.

Description

Multi-source remote sensing image cloud detection method based on evidence-fusion adaptive thresholding
Technical field
The present invention relates to the technical field of remote sensing image processing, and specifically to a multi-source remote sensing image cloud detection method based on evidence-fusion adaptive thresholding.
Background technology
In satellite imaging, occlusion by cloud is the principal cause of image quality degradation. Compared with ground objects, cloud usually has higher reflectivity and lower temperature, so it appears separable with a simple window threshold. However, the many cloud forms, such as cirrus, cumulus, water-droplet cloud and ice cloud, leave cloud without enough contrast against ground reflectance. Environmental change, the greenhouse effect and aerosols caused by human activity intensify variation in the reflective behaviour of cloud. Likewise, ground cover exhibits spectral distortion and ecological evolution, such as the phenomena of "same object, different spectra" and "different objects, same spectrum", so that cloud edges are all the more difficult to detect.
Progress in optics and electronics has produced ever newer remote sensing sensors, and remote sensing imagery is developing toward multi-sensor, multi-temporal, high-resolution data. Not only are the detection bands wider, but new imaging techniques such as polarization imaging are more sensitive to atmospheric aerosol and water content. Together with the development of big-data processing on computers, it has become possible to resolve complex targets from multiple sensor data sources.
Since vegetation reflectivity is relatively low in the red band, multispectral optical sensors detect cloud pixels over a land underlying surface using the 670 nm reflectance, increasing the contrast between cloud and ground objects; when the underlying surface is ocean, the 870 nm band works better. In general, cloud detection exploits the physical characteristics that "cloud" has higher emissivity and lower temperature than "non-cloud". Research shows that multi-angle polarization optical sensors are suited to obtaining information such as cloud particle size and shape; for example, the multi-angle polarization camera POLDER3, carried on the Earth observation satellite PARASOL, gives good cloud detection results. When the underlying surface is land, cloud detection uses the 490 nm band: the polarized reflectance of this band comes essentially from atmospheric molecular scattering, the surface reflectance (e.g. of vegetation and soil) is relatively low, and at scattering angles of 80° to 120° the polarization camera records very strong polarized reflectance from atmospheric aerosol, cirrus and cloud containing small water droplets; the atmospheric molecular optical thickness estimated from it is lower for cloud than for non-cloud. Polarimetry in the near-infrared 865 nm band at scattering angles near 140° has strong capability to separate "cumulus" from "clear". The development trend in cloud detection is therefore to obtain more information from multiple bands and multiple remote sensors, while the rapid development of information technology provides a variety of optimization techniques and methods for multi-source data processing and analysis.
Thresholding is the common method for two-class target discrimination in remote sensing imagery. Current thresholds are determined either from historical experience or from the information of the image itself; the former is called supervised and the latter unsupervised classification. The supervised method is fairly simple, but it can hardly take the information of the current image into account and thus cannot follow the spatio-temporal variation of image pixels; some authors set thresholds according to ground-object reflectance to achieve dynamic thresholds, but this needs a large amount of prior knowledge and database accumulation, and so-called adaptive thresholding often merely replaces a global threshold with multi-window local thresholds. The unsupervised method compensates for these defects but has difficulty selecting the threshold accurately; for this reason, some methods combine optical grey levels with texture and others combine optical with microwave sounding, the starting point being to obtain more cloud features and thereby improve the accuracy of threshold segmentation. These improvements increase computational complexity and are unfavourable to wide application. Other methods include unsupervised clustering such as K-means, which needs a large amount of computation and careful initial seed selection, and supervised classification such as Bayesian decision or seeking dual brightness thresholds with maximum between-class variance, which needs many training samples and prior knowledge. Deep-learning CNN detection is a hot spot of recent research, but its shortcomings are the enormous quantity of training samples required, the time consumed, and a classification whose physical meaning is not explicit enough. Since thresholding is relatively simple and has explicit physical meaning, it is applied on many occasions, and with the development of computer technology various improved methods are under study.
Summary of the invention
The purpose of the present invention is to overcome the prior-art difficulty of combining multi-data-source fusion with adaptive thresholds for cloud detection, by providing a multi-source remote sensing image cloud detection method based on evidence-fusion adaptive thresholding that solves the above problem.
To achieve this goal, the technical scheme of the present invention is as follows:
A multi-source remote sensing image cloud detection method based on evidence-fusion adaptive thresholding comprises the following steps:
11) Selection and preprocessing of the data source images: choose the first-class and second-class data source images and preprocess them;
12) Initial distribution estimation for the two event classes "cloud" and "non-cloud": use training samples to estimate the initial distribution of each of the two event classes and compute the histogram of the initial "cloud" or "non-cloud" distribution;
13) Initial selection of classification thresholds and determination of trust intervals: solve for the initial threshold of the two-class "cloud"/"non-cloud" classification of the remote sensing image using information divergence, and determine the trust interval in the threshold neighbourhood from the mean and standard deviation of the initial classification;
14) Optimal classification based on a sequential decision tree and evidence fusion: rough classification based on the initial-classification mean and standard deviation, then fine classification by D-S fusion of the two target data sources' threshold belief degrees, realizing optimal two-class classification;
15) Output of the cloud detection classification results: output the classified cloud detection image.
The selection and preprocessing of the data source images comprise the following steps:
21) Define and acquire the first data source: the reflectance image R obtained by the multispectral remote sensing sensor in the λ = 670 nm band, denoted F1(x, y) = R_670;
22) Define and acquire the second data source: the atmospheric molecular optical thickness τ image obtained by the multi-angle polarimeter at λ = 490 nm with scattering angle 80° < Θ < 120°, denoted F2(x, y) = τ_490;
23) Preprocess both classes of data source images to remove sun-glint regions.
The initial distribution estimation for the two event classes "cloud" and "non-cloud" comprises the following steps:
31) Take the first training sample F3(x, y), defined as the historical reflectance of the λ = 670 nm band, where (x, y) is the pixel with the same latitude and longitude as in F1 and the historical reflectance is taken over all clear days of the same month of the previous year at (x, y); this forms the initial "non-cloud" distribution, denoted p_670;
32) Take the second training sample F4(x, y), defined as the historical atmospheric molecular optical thickness of the multi-angle polarization λ = 490 nm band, where (x, y) is the pixel with the same latitude and longitude as in F1 and the history is taken over all clear days of the same month of the previous year at (x, y); this forms the initial "non-cloud" distribution, denoted q_490;
33) Estimate the initial "cloud" distributions from the training samples: take F5 = |F1 - F3| and F6 = |F2 - F4|, where F5 estimates the initial "cloud" distribution in the λ = 670 nm reflectance image, denoted q_670, and F6 estimates the initial "cloud" distribution in the multi-angle polarization λ = 490 nm atmospheric molecular optical thickness image, denoted p_490;
34) Choose a 32 × 32 pixel window in each of the images F1, F2, F3, F4, F5, F6. First compute the histograms of the initial "cloud" or "non-cloud" distributions of the four images F3, F5, F4, F6: divide the window pixel values (R or τ) of each image into n = 256 levels, i.e. the grey-level interval of the multispectral image is i = (Max R_670 - Min R_670)/256, and that of the polarization image is i = (Max τ_490 - Min τ_490)/256; then compute the window histograms of the four images, giving the frequency h(i) of each reflectance or atmospheric molecular optical thickness level.
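The per-window quantization and frequency count of step 34) can be sketched as follows (a minimal NumPy sketch; the function and variable names are my own, not from the patent):

```python
import numpy as np

def window_histogram(win, n_levels=256):
    """Quantize one 32 x 32 pixel window (reflectance R or optical
    thickness tau) into n_levels equal-width grey levels, with interval
    i = (Max - Min) / 256, and count the frequency h(i) of each level."""
    lo, hi = float(win.min()), float(win.max())
    step = (hi - lo) / n_levels
    if step == 0.0:                       # flat window: all pixels in level 0
        hist = np.zeros(n_levels, dtype=int)
        hist[0] = win.size
        return hist
    levels = np.minimum(((win - lo) / step).astype(int), n_levels - 1)
    return np.bincount(levels.ravel(), minlength=n_levels)
```

Dividing the resulting histogram by 32 × 32 then yields the probability distributions p_i and q_i used in the threshold-selection step.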
The initial selection of classification thresholds and determination of trust intervals comprise the following steps:
41) Solve the information divergence and obtain the initial threshold T:
411) For the first data source, use the F3 and F5 histograms to obtain the initial "non-cloud" and "cloud" distributions: the probability distribution of "non-cloud" is defined as P, P = {p_1, p_2, ..., p_i, ..., p_256}, where p_i = h(i)/(32 × 32); the probability distribution of "cloud" is defined as Q, Q = {q_1, q_2, ..., q_i, ..., q_256}, where q_i = h(i)/(32 × 32);
412) For the first data source, compute the two-class information divergence D_KL(P, Q) for each candidate threshold T = 1, 2, ..., 256, summing only over levels where p_i and q_i are both nonzero; select the T corresponding to the minimum information divergence, Min(D_KL(P, Q)), as the initial threshold for classifying the pixels of the first data source F1, denoted T1;
413) For the second data source, use the F4 and F6 histograms to obtain the initial "non-cloud" and "cloud" distributions: the probability distribution of "non-cloud" is defined as Q, Q = {q_1, q_2, ..., q_i, ..., q_256}, where q_i = h(i)/(32 × 32); the probability distribution of "cloud" is defined as P, P = {p_1, p_2, ..., p_i, ..., p_256}, where p_i = h(i)/(32 × 32);
414) For the second data source, compute the two-class information divergence D_KL(P, Q) for each candidate threshold T = 1, 2, ..., 256, summing only over levels where p_i and q_i are both nonzero; select the T corresponding to the minimum information divergence, Min(D_KL(P, Q)), as the initial threshold for classifying the window pixels of the second data source F2, denoted T2.
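The divergence expression itself did not survive in this text, so the sketch below makes an assumption: for each candidate threshold T, the divergence of the "cloud" distribution q below T and of the "non-cloud" distribution p above T is accumulated (over bins where both p_i and q_i are nonzero, as the text requires), and the T minimizing this criterion is kept. A minimal Python sketch for the first data source (names are mine):

```python
import numpy as np

def kl_divergence(p, q):
    """Sum of p_i * log(p_i / q_i) over bins where p_i and q_i are both
    nonzero, per the text's side condition."""
    mask = (p > 0) & (q > 0)
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def initial_threshold(p, q):
    """Scan T = 1..256 and return the T minimizing the divergence
    criterion D(T).  Assumption: D(T) charges the 'cloud' distribution q
    below T and the 'non-cloud' distribution p above T, so the minimum
    falls where the two class histograms separate."""
    best_t, best_d = 1, float("inf")
    for t in range(1, 257):
        d = kl_divergence(q[:t], p[:t]) + kl_divergence(p[t:], q[t:])
        if d < best_d:
            best_t, best_d = t, d
    return best_t
```

Run on two well-separated unimodal histograms, this places T1 between the "non-cloud" and "cloud" modes, which is the behaviour the initial classification needs.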
42) Define the trust interval of the threshold-T neighbourhood:
Let γ be the constant value T = γ; the region (α, β) around it is the trust interval of the target event relative to T, where for the first data source i ≥ β is the supporting-evidence interval of "cloud" and i ≤ α is the "non-cloud" evidence interval, and for the second data source i ≥ β is the "non-cloud" evidence interval and i ≤ α is the supporting-evidence interval of "cloud";
43) Obtain the α and β values of the first data source's trust interval:
Compute the window histogram of the first data source F1, use T1 to mark the window pixels of F1 initially as "non-cloud" or "cloud", and compute the mean and standard deviation of the two target classes of the window: the "cloud" class has mean μ1_cloud and standard deviation σ1_cloud, and the "non-cloud" class has mean μ1_clear and standard deviation σ1_clear;
44) Obtain the α and β values of the second data source's trust interval:
Compute the window histogram of the second data source F2, use T2 to mark the window pixels of F2 initially as "cloud" or "non-cloud", and compute the mean and standard deviation of the two target classes of the window: the "cloud" class has mean μ2_cloud and standard deviation σ2_cloud, and the "non-cloud" class has mean μ2_clear and standard deviation σ2_clear.
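For one window of the first (optical) data source, this step can be sketched as follows, assuming the trust-interval bounds are taken as class mean ± 2σ as the embodiment later describes (function and variable names are mine):

```python
import numpy as np

def trust_interval_bounds(win, t_init):
    """Split one window at the initial threshold t_init, then derive the
    trust-interval bounds from each class's mean and standard deviation.
    For the optical source, pixels above t_init are provisionally
    'cloud'.  Taking alpha = mu_clear + 2*sigma_clear and
    beta = mu_cloud - 2*sigma_cloud is an assumption consistent with the
    2-sigma rough-classification rule stated in the embodiment."""
    clear = win[win <= t_init]            # provisional "non-cloud" pixels
    cloud = win[win > t_init]             # provisional "cloud" pixels
    alpha = clear.mean() + 2.0 * clear.std()
    beta = cloud.mean() - 2.0 * cloud.std()
    return alpha, beta
```

Pixels below alpha are then confidently "non-cloud", pixels above beta confidently "cloud", and the band (alpha, beta) is the trust interval handed to the fine classification.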
The optimal classification based on the sequential decision tree and evidence fusion comprises the following steps:
51) Based on the α and β values of the first data source's trust interval and those of the second data source's trust interval, roughly classify the "cloud" and "non-cloud" targets sequentially:
511) Empty F3(x, y): F3(x, y) = { }; F3 stores the intermediate rough-classification result;
512) For the first data source F1, process as follows:
Take the threshold μ1_clear + 2σ1_clear (the initial "non-cloud" class mean plus twice its standard deviation); if F1(x, y) ≤ μ1_clear + 2σ1_clear, the pixel is judged "non-cloud" and assigned F3(x, y) = 0;
Take the threshold μ1_cloud - 2σ1_cloud; if F1(x, y) ≥ μ1_cloud - 2σ1_cloud, the pixel is judged "cloud" and assigned F3(x, y) = 1;
Otherwise, the pixel is judged "unknown" and assigned F3(x, y) = 254;
513) For the second data source F2, process as follows:
Locate the pixel coordinates with F3(x, y) = 0 or F3(x, y) = 254 and poll the pixel values at the same coordinates in the F2 image;
Take the threshold μ2_cloud + 2σ2_cloud (the initial "cloud" class mean plus twice its standard deviation); if F2(x, y) ≤ μ2_cloud + 2σ2_cloud, the pixel is judged "cloud" and assigned F3(x, y) = 1;
Take the threshold μ2_clear - 2σ2_clear; if F2(x, y) ≥ μ2_clear - 2σ2_clear, the pixel is judged "non-cloud" and assigned F3(x, y) = 0;
Otherwise, the pixel is judged "unknown" and assigned F3(x, y) = 254;
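Steps 511) to 513) can be sketched as a sequential rough classifier (a minimal sketch; the 2σ thresholds are folded into the interval bounds a1/b1 and a2/b2, and the names are my own):

```python
import numpy as np

CLEAR, CLOUD, UNKNOWN = 0, 1, 254     # the F3 label values used in the text

def rough_classify(f1, f2, a1, b1, a2, b2):
    """Sequential rough classification.  The first (optical) source
    labels its confident pixels; the second (polarimetric) source then
    re-examines the pixels left as 'non-cloud' or 'unknown'.  Note the
    polarimetric rule is reversed: low atmospheric molecular optical
    thickness supports 'cloud'."""
    f3 = np.full(f1.shape, UNKNOWN, dtype=np.int32)
    f3[f1 <= a1] = CLEAR               # low reflectance: "non-cloud"
    f3[f1 >= b1] = CLOUD               # high reflectance: "cloud"
    recheck = (f3 == CLEAR) | (f3 == UNKNOWN)
    f3[recheck & (f2 <= a2)] = CLOUD   # low optical thickness: "cloud"
    f3[recheck & (f2 >= b2)] = CLEAR   # high optical thickness: "non-cloud"
    return f3
```

Pixels still marked 254 after both passes are exactly those handed to the D-S fine classification.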
52) Locate the pixel coordinates with F3(x, y) = 254, poll the grey level ω ∈ (0, 255) of those coordinates in the F1 and F2 image histograms together with its frequencies h1(ω) and h2(ω), and compute the degree of belief that each such pixel belongs to the two target classes "cloud" and "non-cloud";
Fine-classify based on the joint degree of belief: use D-S evidence theory to obtain the joint confidence of the "cloud" or "non-cloud" classification from the first and second data sources, and reclassify each "unknown" pixel as "cloud" (set F3(x, y) = 1), "non-cloud" (set F3(x, y) = 0), or leave F3(x, y) = 254 unchanged;
53) After fine classification, let F1(x, y) = F3(x, y); pixels with F1(x, y) = 1 are "cloud" events, giving the classified window cloud-detection image F1;
54) Traverse F1, F2, F3, F4, F5, F6 with 32 × 32 pixel windows, repeating processing steps 34) to 53), to obtain the final classification image F1.
The fine classification based on the joint degree of belief comprises the following steps:
61) Compute the degree of belief CF1 of the first-data-source pixels at the coordinates with F3(x, y) = 254:
Set CF_clear(α) = 1, CF_clear(γ) = CF_cloud(γ) = 0.5 and CF_cloud(β) = 1;
For reflectance levels ω ∈ (α, γ), compute the degree of belief that the pixel belongs to "non-cloud", interpolating between CF_clear(α) = 1 and CF_clear(γ) = 0.5;
For reflectance levels ω ∈ (γ, β), compute the degree of belief that the pixel belongs to "cloud", interpolating between CF_cloud(γ) = 0.5 and CF_cloud(β) = 1;
62) Compute the degree of belief CF2 of the second-data-source pixels at the coordinates with F3(x, y) = 254:
Set CF_cloud(α) = 1, CF_clear(γ) = CF_cloud(γ) = 0.5 and CF_clear(β) = 1;
For atmospheric molecular optical thickness levels ω ∈ (α, γ), compute the degree of belief that the pixel belongs to "cloud";
For atmospheric molecular optical thickness levels ω ∈ (γ, β), compute the degree of belief that the pixel belongs to "non-cloud";
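The interpolation formulas of steps 61) and 62) were lost as images in this text; the anchor values CF(α), CF(γ) = 0.5 and CF(β) = 1, together with the belief-versus-reflectance graph of Fig. 2, suggest a piecewise-linear belief curve. A sketch under that assumption, expressed as the belief that a first-source pixel is "cloud" (the "non-cloud" belief on (α, γ) is its complement):

```python
def cloud_belief(omega, alpha, gamma, beta):
    """Piecewise-linear belief that grey level omega belongs to 'cloud'
    for the first (optical) data source: 0 at alpha (certain
    'non-cloud'), 0.5 at the initial threshold gamma, 1 at beta
    (certain 'cloud').  Linearity between the anchors is an assumption.
    For the second source the roles of alpha and beta are swapped."""
    if omega <= alpha:
        return 0.0
    if omega >= beta:
        return 1.0
    if omega <= gamma:
        return 0.5 * (omega - alpha) / (gamma - alpha)
    return 0.5 + 0.5 * (omega - gamma) / (beta - gamma)
```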
63) Let M_s(A_k) be the belief probability assignment of the s-th data source for the target event A_k (k = 1, 2, ..., K), where s ∈ {1, 2};
For the first data source F1(x, y) and the second data source F2(x, y), k ∈ {1, 2};
A1 represents "cloud" and A2 represents "non-cloud";
M1(A1) is the first data source's belief that the pixel is "cloud", and M2(A1) is the second data source's belief that it is "cloud";
M1(A2) is the first data source's belief that the pixel is "non-cloud", and M2(A2) is the second data source's belief that it is "non-cloud";
M1(A2) = 1 - M1(A1) and M2(A2) = 1 - M2(A1), where M(·) is the mass function;
64) From the beliefs obtained in steps 61) and 62) for the pixel at coordinate (x, y) in the first and second data sources, express the mass functions M1(·) and M2(·) as the belief functions of the two data sources:
M1(cloud) = P1, M1(non-cloud) = 1 - P1;
M2(cloud) = P2, M2(non-cloud) = 1 - P2;
65) By the Dempster combination rule, obtain the fused belief that the pixel at coordinate (x, y) is "cloud" or "non-cloud". The specific steps are as follows:
651) Compute the normalization factor:
Define B and C as the detection targets, each being "cloud" or "non-cloud"; then, over B ∩ C ≠ ∅,
K = Σ M1(B) M2(C) = M1(cloud) M2(cloud) + M1(non-cloud) M2(non-cloud) = P1 P2 + (1 - P1)(1 - P2);
652) Compute the fused belief assignments of "cloud" and "non-cloud":
M({cloud}) = M1(cloud) M2(cloud)/K = P1 P2/K, where B ∩ C = cloud;
M({non-cloud}) = M1(non-cloud) M2(non-cloud)/K = (1 - P1)(1 - P2)/K, where B ∩ C = non-cloud;
653) Fine-classify by the combined degree of belief:
If the two fused beliefs cannot be told apart, the pixel keeps F3(x, y) = 254 and remains "unknown";
Otherwise, if M({cloud}) > M({non-cloud}), the pixel is assigned F3(x, y) = 1 and is "cloud";
Otherwise, the pixel is assigned F3(x, y) = 0 and is "non-cloud".
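For the two-hypothesis frame {cloud, non-cloud}, steps 651) to 653) reduce to the closed form below (a sketch; the tie condition of step 653) is not spelled out in this text, so an explicit margin eps is assumed):

```python
def ds_fuse(p1, p2, eps=0.1):
    """Dempster combination of the two sources' 'cloud' beliefs p1, p2.
    K normalizes over the consistent pairs B = C; the pixel stays
    'unknown' (254) when the fused beliefs are within eps of each
    other, else the larger belief wins (1 = cloud, 0 = non-cloud)."""
    k = p1 * p2 + (1.0 - p1) * (1.0 - p2)        # normalization factor K
    m_cloud = p1 * p2 / k                        # M({cloud})
    m_clear = (1.0 - p1) * (1.0 - p2) / k        # M({non-cloud})
    if abs(m_cloud - m_clear) < eps:
        return 254
    return 1 if m_cloud > m_clear else 0
```

Note how agreement reinforces: two moderately confident sources with p1 = p2 = 0.8 fuse to M({cloud}) = 0.64/0.68 ≈ 0.94, which is the behaviour the fusion step relies on at ambiguous cloud edges.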
Beneficial effects
Compared with the prior art, the multi-source remote sensing image cloud detection method based on evidence-fusion adaptive thresholding of the present invention operates a multiband optical sensor jointly with a multi-angle polarimeter, fuses the multiple data sources needed for "cloud" detection, effectively solves the accuracy problem of "cloud"/"non-cloud" classification, and improves classification precision in cloud edge detection.
The present invention provides an adaptive threshold-setting method and a sequential decision process for multi-source remote sensing image cloud detection, thereby improving the intelligence and detection accuracy of cloud-detection classification. The threshold detection and sequential decision method of the present invention is dynamically adaptive and generalizes well, and can be applied to cloud-cover detection of spaceborne and airborne remote sensing data.
The present invention uses multi-source remote sensing imagery, obtains the initial threshold with a semi-supervised information divergence, and realizes optimal classification through sequential decisions combining statistics with a fusion method; its cloud detection has the advantages of explicit physical meaning and simple data processing and analysis.
Brief description of the drawings
Fig. 1 is the flow chart of the method of the present invention;
Fig. 2 is the graph of the relation between the first data source's "cloud" degree of belief and reflectance level in the present invention.
Embodiment
To give a better understanding of the structural features of the present invention and the effects achieved, a preferred embodiment is described in detail with reference to the accompanying drawings, as follows:
As shown in Fig. 1, a multi-source remote sensing image cloud detection method based on evidence-fusion adaptive thresholding of the present invention comprises the following steps:
First step: selection and preprocessing of the data source images. Choose the first-class and second-class data source images and preprocess them. Here the bands of the first and second data sources are chosen for the case where the underlying surface is land; if the underlying surface is ocean, other wavelengths must be selected, but the proposed cloud detection method applies equally, i.e. the method itself is independent of the bands and of the number of data sources, and the choice of wavelength does not limit the present invention. The specific steps are as follows:
(1) Define and acquire the first data source: the reflectance image R obtained by the multispectral remote sensing sensor in the λ = 670 nm band, denoted F1(x, y) = R_670;
(2) Define and acquire the second data source: the atmospheric molecular optical thickness τ image obtained by the multi-angle polarimeter at λ = 490 nm with scattering angle 80° < Θ < 120°, denoted F2(x, y) = τ_490;
(3) Remove sun-glint regions from both classes of data source images by existing methods.
Second step: initial distribution estimation for the two event classes "cloud" and "non-cloud".
Use training samples to estimate the initial distribution of each of the two event classes and compute the histogram of the initial "cloud" or "non-cloud" distribution. The specific steps are as follows:
(1) Take the first training sample F3(x, y), defined as the historical reflectance of the λ = 670 nm band, where (x, y) is the pixel with the same latitude and longitude as in F1 and the historical reflectance is taken over all clear days of the same month of the previous year at (x, y); this forms the initial "non-cloud" distribution, denoted p_670;
(2) Take the second training sample F4(x, y), defined as the historical atmospheric molecular optical thickness of the multi-angle polarization λ = 490 nm band, where (x, y) is the pixel with the same latitude and longitude as in F1 and the history is taken over all clear days of the same month of the previous year at (x, y); this forms the initial "non-cloud" distribution, denoted q_490;
(3) Estimate the initial "cloud" distributions from the training samples: take F5 = |F1 - F3| and F6 = |F2 - F4|, where F5 estimates the initial "cloud" distribution in the λ = 670 nm reflectance image, denoted q_670, and F6 estimates the initial "cloud" distribution in the multi-angle polarization λ = 490 nm atmospheric molecular optical thickness image, denoted p_490;
(4) Choose a 32 × 32 window in each of the images F1, F2, F3, F4, F5, F6 and traverse the images window by window in the conventional way. First compute the histograms of the initial "cloud" or "non-cloud" distributions of the four images F3, F5, F4, F6: divide the window pixel values (R or τ) into n = 256 levels, i.e. the grey-level interval of the multispectral image is i = (Max R_670 - Min R_670)/256 and that of the polarization image is i = (Max τ_490 - Min τ_490)/256; then compute the window histograms of the four images, giving the frequency h(i) of each reflectance or atmospheric molecular optical thickness level.
Steps (1)-(4) above fully embody the semi-supervised character of the method: on one hand, the mean over the clear days of the same month of the previous year for the same area serves as the sample of the "non-cloud" distribution, while the "cloud" distribution is estimated from the current remote sensing image; that is, initial threshold selection combines past experience (samples) with current content.
Third step: initial threshold selection and trust-interval determination. Solve for the initial threshold of the two-class "cloud"/"non-cloud" classification of the remote sensing image using information divergence, and determine the trust interval in the threshold neighbourhood from the mean and standard deviation of the initial classification. Through the semi-supervised method above, the "clear-sky" historical remote sensing images obtained in the first two steps from the multispectral optical sensor and the multi-angle polarimeter jointly produce the initial distributions of the "non-cloud" and "cloud" events; the two-class information divergence and its initial threshold are then computed, the information divergence supplying the initial threshold as the basis for the initial classification and for the threshold optimization of the next step. The specific steps are as follows:
(1) Solve the information divergence and obtain the initial threshold T.
A1. For the first data source, use the F3 and F5 histograms to obtain the initial "non-cloud" and "cloud" distributions: the probability distribution of "non-cloud" is defined as P, P = {p_1, p_2, ..., p_i, ..., p_256}, where p_i = h(i)/(32 × 32); the probability distribution of "cloud" is defined as Q, Q = {q_1, q_2, ..., q_i, ..., q_256}, where q_i = h(i)/(32 × 32);
A2. For the first data source, compute the two-class information divergence D_KL(P, Q) for each candidate threshold T = 1, 2, ..., 256, summing only over levels where p_i and q_i are both nonzero (P and Q are independent at this point, with no mutual information); select the T corresponding to the minimum information divergence, Min(D_KL(P, Q)), as the initial threshold for classifying the pixels of the first data source F1, denoted T1;
A3. For the second data source, use the F4 and F6 histograms to obtain the initial "non-cloud" and "cloud" distributions: the probability distribution of "non-cloud" is defined as Q, Q = {q_1, q_2, ..., q_i, ..., q_256}, where q_i = h(i)/(32 × 32); the probability distribution of "cloud" is defined as P, P = {p_1, p_2, ..., p_i, ..., p_256}, where p_i = h(i)/(32 × 32);
A4. For the second data source, compute the two-class information divergence D_KL(P, Q) for each candidate threshold T = 1, 2, ..., 256, summing only over levels where p_i and q_i are both nonzero; select the T corresponding to the minimum information divergence, Min(D_KL(P, Q)), as the initial threshold for classifying the window pixels of the second data source F2, denoted T2.
(2) Define the trust interval of the threshold-T neighbourhood.
Let γ be the constant value T = γ, i.e. the initial threshold. A pixel with F1(x, y) = γ or F2(x, y) = γ could be classified either as "cloud" or as "non-cloud", so the neighbourhood of γ is the confusion region of the classification, i.e. the trust interval; determining the boundary parameters of this region and the belief assignment within it improves classification precision.
Let the region (α, β) around T = γ be the trust interval of the target event relative to T. The confidence range of the first data source is shown in Fig. 2: levels i ≥ β form the supporting-evidence interval of "cloud" and i ≤ α the "non-cloud" evidence interval. The first data source comes from a multispectral optical sensor, whose "cloud" reflectance is higher than its "non-cloud" reflectance, while the second data source comes from a multi-angle polarimeter, whose "cloud" atmospheric molecular optical thickness is lower than that of "non-cloud". Correspondingly, for the second data source, i ≥ β is the "non-cloud" evidence interval and i ≤ α the supporting-evidence interval of "cloud".
(3) Obtain the α and β values of the first data source's trust interval.
Here, for the "cloud" and "non-cloud" pixels produced by the initial threshold, the class mean plus or minus two standard deviations (2σ) obtained from the initial classification is used as the rough-classification threshold to classify the first and second data sources sequentially; for a normally distributed image, the 2σ interval yields about 95.4% rough-classification precision.
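The 95.4% figure quoted above is the standard two-sigma coverage of a normal distribution; a quick empirical check (any synthetic normal sample will do, the parameters here are arbitrary):

```python
import numpy as np

# For normally distributed pixel values, mean +/- 2*sigma should contain
# roughly 95.4% of the pixels, which is the rough-classification
# precision claimed for the 2-sigma interval.
rng = np.random.default_rng(42)
pixels = rng.normal(loc=100.0, scale=15.0, size=200_000)
coverage = np.mean(np.abs(pixels - pixels.mean()) <= 2.0 * pixels.std())
```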
Calculate the window histogram of the first data source F1 and use T1 to give the window pixels of F1 an initial classification into "non-cloud" and "cloud". Then calculate the mean and standard deviation of the two classification targets "cloud" and "non-cloud" of the window:
the mean of "cloud" is mean^1_cloud and its standard deviation is σ^1_cloud;
the mean of "non-cloud" is mean^1_clear and its standard deviation is σ^1_clear;
the trust-interval bounds are then α = mean^1_clear + 2σ^1_clear, β = mean^1_cloud − 2σ^1_cloud.
(4) Obtain the trust-interval bounds α and β of the second data source.
Calculate the window histogram of the second data source F2 and use T2 to give the window pixels of F2 an initial classification into "cloud" and "non-cloud". Then calculate the mean and standard deviation of the two classification targets "cloud" and "non-cloud" of F2:
the mean of "cloud" is mean^2_cloud and its standard deviation is σ^2_cloud;
the mean of "non-cloud" is mean^2_clear and its standard deviation is σ^2_clear;
the trust-interval bounds are then α = mean^2_cloud + 2σ^2_cloud, β = mean^2_clear − 2σ^2_clear.
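By way of illustration, the mean ± 2σ rule for a window's trust-interval bounds might look like this (a sketch with illustrative names; `cloud_high=True` stands for the reflectance source, where "cloud" lies above the initial threshold, and `cloud_high=False` for the optical-thickness source, where it lies below):

```python
import numpy as np

def trust_interval(window, t0, cloud_high=True):
    """Compute the trust-interval bounds (alpha, beta) around the initial
    threshold t0 for one flattened data-source window, following the
    class-mean plus/minus two-standard-deviations rule."""
    if cloud_high:
        clear, cloud = window[window <= t0], window[window > t0]
        alpha = clear.mean() + 2 * clear.std()   # upper edge of "non-cloud"
        beta = cloud.mean() - 2 * cloud.std()    # lower edge of "cloud"
    else:
        cloud, clear = window[window <= t0], window[window > t0]
        alpha = cloud.mean() + 2 * cloud.std()   # upper edge of "cloud"
        beta = clear.mean() - 2 * clear.std()    # lower edge of "non-cloud"
    return alpha, beta
```

With two well-separated classes, α and β bracket the initial threshold, leaving only the narrow (α, β) band for the later fine classification.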
Here, the initially set γ value is optimised through the calculation of the trust-interval bounds α and β. The invention does not use γ directly as the final classification threshold; γ performs only the first classification, from which the means and standard deviations of the two event classes "cloud" and "non-cloud" are obtained, and the α and β of the trust interval then refine the γ-based coarse result, improving the confidence of the remote-sensing-image coarse classification. Because the fine classification is carried out only within the small interval (α, γ, β), the computational cost is very low.
The fourth step: optimised classification based on a sequential decision tree and evidence fusion. D-S theory is used to perform a fusion calculation on the threshold-neighbourhood degrees of belief of the two target data sources, achieving an optimised classification of the two targets.
The fuzzy regions of the two sensor classes are fused here with MASS values for fine classification. For the fusion of the two data sources, the degree-of-belief probabilities (MASS functions) are combined and solved with D-S evidence theory. For the pixels that fall inside the trust interval after coarse classification, the degree of belief is computed separately for the first and second data sources, the joint degree of belief is then obtained with evidence theory, and a further fine classification is carried out. The concrete steps are as follows:
(1) Based on the trust-interval bounds α and β obtained for the first data source and the trust-interval bounds α and β of the second data source, coarse-classify the "cloud" and "non-cloud" targets sequentially.
A1. Clear F3(x, y):
F3(x, y) = { }; F3 stores the intermediate coarse-classification result.
A2. For the first data source F1, process as follows:
take the threshold α^1 (the trust-interval bound α of F1(x, y));
if F1(x, y) ≤ α^1, the pixel is judged "non-cloud" and assigned F3(x, y) = 0;
take the threshold β^1 (the trust-interval bound β of F1(x, y)); if F1(x, y) ≥ β^1, the pixel is judged "cloud" and assigned F3(x, y) = 1;
if α^1 < F1(x, y) < β^1, the pixel is judged "unknown" and assigned F3(x, y) = 254.
A3. For the second data source F2, process as follows:
locate the pixel coordinates with F3(x, y) = 0 or F3(x, y) = 254 and poll the pixel values of F2 at the same coordinates;
take the threshold α^2 (the trust-interval bound α of F2(x, y));
if F2(x, y) ≤ α^2, the pixel is judged "cloud" and assigned F3(x, y) = 1;
take the threshold β^2 (the trust-interval bound β of F2(x, y));
if F2(x, y) ≥ β^2, the pixel is judged "non-cloud" and assigned F3(x, y) = 0;
if α^2 < F2(x, y) < β^2, the pixel is judged "unknown" and assigned F3(x, y) = 254.
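The sequential decisions A1–A3 above might be sketched as follows (illustrative code; `a1, b1` and `a2, b2` stand for the trust-interval bounds of the first and second data sources, and, as in the patent, the second source re-examines the pixels the first source left as "non-cloud" or "unknown"):

```python
import numpy as np

UNKNOWN = 254

def rough_classify(f1, f2, a1, b1, a2, b2):
    """Coarse classification of one window by sequential decision.
    f1: reflectance window ("cloud" is bright), trust interval (a1, b1);
    f2: optical-thickness window ("cloud" is thin), trust interval (a2, b2).
    Returns F3 with 1 = cloud, 0 = non-cloud, 254 = unknown."""
    f3 = np.full(f1.shape, UNKNOWN, dtype=np.uint8)
    # A2: confident decisions of the first data source
    f3[f1 <= a1] = 0                          # non-cloud
    f3[f1 >= b1] = 1                          # cloud
    # A3: second source re-polls the 0 and 254 pixels
    sel = (f3 == 0) | (f3 == UNKNOWN)
    f3[sel & (f2 <= a2)] = 1                  # thin molecular optical depth -> cloud
    f3[sel & (f2 >= b2)] = 0                  # thick -> non-cloud
    f3[sel & (f2 > a2) & (f2 < b2)] = UNKNOWN
    return f3
```

Pixels that remain 254 after both sources have voted are exactly those handed to the evidence-fusion fine classification of step A4.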
A4. Locate the pixel coordinates with F3(x, y) = 254, poll the grey levels ω ∈ (0, 255) of those coordinates on the F1 and F2 image histograms together with their frequencies h1(ω), h2(ω), and compute for each the degree of belief of belonging to the two classification targets. Then use D-S evidence theory to obtain the joint confidence of the "cloud" or "non-cloud" classification from the first and second data sources, and finely classify the "unknown" pixels as "cloud", "non-cloud" or "unknown", i.e. set F3(x, y) = 1, F3(x, y) = 0, or keep F3(x, y) = 254 unchanged.
(2) Fine classification based on the joint degree of belief: for the unknown pixels of F1 and F2, compute the degree of belief of belonging to each of the two classification targets, use D-S evidence theory to obtain the joint confidence of the "cloud" or "non-cloud" classification from the first and second data sources, and finely classify the "unknown" pixels.
The previous step has resolved the pixels lying outside the trust interval (α, β), but pixels in the range between α and β remain unclassified. Although this sub-interval is small, directly treating it as "non-cloud" or "cloud" in the conventional way cannot improve the classification accuracy at cloud edges; therefore the pixels that fall inside the trust interval after coarse classification must be further finely classified by comparing joint degree-of-belief probabilities. As Fig. 2 shows, the degree of belief over (α, β) is not a linear relationship, so the belief cannot simply be distributed in linear proportion to the distance from ω to α or β relative to the distance from α to γ or from β to γ. Instead, the histogram frequency distribution at ω, α, γ and β is consulted, i.e. the influence of ω on the probability of the classified event is introduced into the belief assignment, improving the accuracy of the fine classification.
The concrete steps are as follows:
A1. For the first data source, calculate the degree of belief CF1 of the pixels at coordinates where F3(x, y) = 254.
Set CF_clear(α) = 1, CF_clear(γ) = CF_cloud(γ) = 0.5, CF_cloud(β) = 1.
For reflectance levels ω ∈ (α, γ), calculate the degree of belief that the pixel belongs to "non-cloud":
CF^1_clear(ω) = |h(ω) − h(γ)| × (CF_clear(α) − CF_clear(γ)) / |h(α) − h(γ)|; CF^1_cloud = 1 − CF^1_clear.
For reflectance levels ω ∈ (γ, β), calculate the degree of belief that the pixel belongs to "cloud":
CF^1_cloud(ω) = |h(ω) − h(γ)| × (CF_cloud(β) − CF_cloud(γ)) / |h(β) − h(γ)|, CF^1_clear = 1 − CF^1_cloud.
A2. For the second data source, calculate the degree-of-belief probability CF2 of the pixels at coordinates where F3(x, y) = 254.
Set CF_cloud(α) = 1, CF_clear(γ) = CF_cloud(γ) = 0.5, CF_clear(β) = 1.
For atmospheric-molecule optical-thickness levels ω ∈ (α, γ), calculate the degree of belief that the pixel belongs to "cloud":
CF^2_cloud(ω) = |h(ω) − h(γ)| × (CF_cloud(α) − CF_cloud(γ)) / |h(α) − h(γ)|, CF^2_clear = 1 − CF^2_cloud.
For atmospheric-molecule optical-thickness levels ω ∈ (γ, β), calculate the degree of belief that the pixel belongs to "non-cloud":
CF^2_clear(ω) = |h(ω) − h(γ)| × (CF_clear(β) − CF_clear(γ)) / |h(β) − h(γ)|, CF^2_cloud = 1 − CF^2_clear.
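A small sketch of the histogram-weighted belief interpolation used in steps A1 and A2 (illustrative; `h` is the window histogram frequency, and the endpoint beliefs of 1 at the trust-interval end and 0.5 at γ follow the settings above):

```python
def belief(h, omega, end, gamma, cf_end=1.0, cf_gamma=0.5):
    """Degree of belief at grey level omega, interpolated between gamma
    (belief cf_gamma) and the trust-interval endpoint `end` (belief
    cf_end) in proportion to histogram-frequency differences rather
    than to linear grey-level distance:
        CF(omega) = |h(omega) - h(gamma)| * (cf_end - cf_gamma)
                    / |h(end) - h(gamma)|
    The complementary belief of the other class is 1 - CF(omega)."""
    return abs(h[omega] - h[gamma]) * (cf_end - cf_gamma) / abs(h[end] - h[gamma])
```

Weighting by histogram frequencies rather than by the linear distance of ω from α, γ or β is exactly the non-linear belief assignment the text motivates above.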
A3. Let Ms(Ak) be the credibility probability assignment of the s-th data source for target event Ak (k = 1, 2, …, K), where s ∈ [1, 2].
For the first data source F1(x, y) and the second data source F2(x, y), k ∈ [1, 2];
A1 denotes "cloud" and A2 denotes "non-cloud";
M1(A1) is the belief that the first data source assigns to "cloud", M2(A1) is the belief that the second data source assigns to "cloud";
M1(A2) is the belief that the first data source assigns to "non-cloud", M2(A2) is the belief that the second data source assigns to "non-cloud";
M1(A2) = 1 − M1(A1), M2(A2) = 1 − M2(A1), where M(·) is the Mass function.
A4. For the pixels judged "unknown" at coordinates where F3(x, y) = 254, the values of F1(x, y) and F2(x, y) necessarily lie at some ω inside the trust interval of the T neighbourhood. Steps A1 and A2 above have given the degrees of belief with which the pixel at a given coordinate (x, y) of the first and second data sources belongs to the two classification targets; let the Mass functions M1(·) and M2(·) express the degree-of-belief functions of the first and second data sources:
M1(cloud) = P1, M1(non-cloud) = 1 − P1;
M2(cloud) = P2, M2(non-cloud) = 1 − P2.
A5. According to the Dempster combination rule, derive the fused belief assignment that pixel F1(x, y) belongs to "cloud" or "non-cloud", as follows:
A51. Compute the normalisation factor.
Define B and C as the detection targets, each being "cloud" or "non-cloud"; then:
K = Σ M1(B)M2(C) = M1(cloud)M2(cloud) + M1(non-cloud)M2(non-cloud) = P1P2 + (1 − P1)(1 − P2),
where B ∩ C ≠ ∅.
A52. Compute the fused belief assignments of "cloud" and "non-cloud":
M({cloud}) = M1(cloud)M2(cloud)/K = P1P2/K, where B ∩ C = cloud;
M({non-cloud}) = M1(non-cloud)M2(non-cloud)/K = (1 − P1)(1 − P2)/K, where B ∩ C = non-cloud.
A53. Fine-classification threshold decision:
if |M({cloud}) − M({non-cloud})| < ε, then pixel F1(x, y) = 254, i.e. "unknown";
otherwise, if M({cloud}) > M({non-cloud}), then pixel F1(x, y) = 1, i.e. "cloud";
otherwise, pixel F1(x, y) = 0, i.e. "non-cloud".
ε is chosen manually, typically ε = 0.05, or determined according to the actual situation; when the difference falls below this value, the confidence gap between "cloud" and "non-cloud" is too small and the pixel is considered "unknown".
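The Dempster combination and ε-decision of steps A51–A53 can be sketched as follows (illustrative; `p1` and `p2` stand for the single-source "cloud" beliefs P1 and P2):

```python
def ds_fuse(p1, p2, eps=0.05):
    """Dempster-Shafer fusion of two single-source cloud beliefs.
    Masses: M_s(cloud) = p_s, M_s(non-cloud) = 1 - p_s for s = 1, 2.
    Returns 1 (cloud), 0 (non-cloud) or 254 (unknown) following the
    fine-classification rule with tolerance eps."""
    k = p1 * p2 + (1 - p1) * (1 - p2)        # normalisation factor K
    m_cloud = p1 * p2 / k                    # fused belief of "cloud"
    m_clear = (1 - p1) * (1 - p2) / k        # fused belief of "non-cloud"
    if abs(m_cloud - m_clear) < eps:
        return 254                           # beliefs too close: unknown
    return 1 if m_cloud > m_clear else 0
```

Note how two moderately confident sources reinforce each other: with p1 = 0.9 and p2 = 0.8 the fused cloud belief rises to 0.72/0.74 ≈ 0.97, which is the intended effect of the evidence fusion.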
(3) After fine classification, set F1(x, y) = F3(x, y).
After the above processing, the classification result is put back into F1, i.e. F1 stores the final classification result.
Pixels with F1(x, y) = 1 are "cloud" events, giving the window cloud-detection image F1 after coarse and fine classification.
(4) Traverse F1, F2, F3, F4, F5, F6 with 32 × 32 pixel windows in the conventional way, as in the prior art, repeating processing steps (4) of the second step above through (3) of the fourth step; the classification results are stored in F1, and the final classification image F1 is obtained.
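The traversal in (4) amounts to tiling the scene and applying the per-window pipeline; a sketch (the `classify_window` callback is a stand-in for steps (4) of the second step through (3) of the fourth step and is assumed to return the 0/1/254 label map of one window):

```python
import numpy as np

def process_by_windows(f1, f2, classify_window, win=32):
    """Traverse the scene in non-overlapping win x win windows (32 x 32
    in the patent) and collect the per-window classification results."""
    out = np.empty(f1.shape, dtype=np.uint8)
    rows, cols = f1.shape
    for y in range(0, rows, win):
        for x in range(0, cols, win):
            sl = (slice(y, y + win), slice(x, x + win))
            out[sl] = classify_window(f1[sl], f2[sl])
    return out
```

Windowing keeps the histograms, thresholds and trust intervals local, so the method adapts to scene-dependent brightness rather than using one global threshold.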
The fifth step: output the cloud-detection classification result, i.e. output the classified cloud-detection image.
The basic principles, main features and advantages of the present invention have been shown and described above. Those skilled in the art should appreciate that the present invention is not limited to the above embodiments, which, together with the description, merely illustrate its principles; various changes and improvements are possible without departing from the spirit and scope of the invention, for example introducing more optical bands or polarisation angles, to which the method shown in the present invention remains applicable. Such changes and improvements all fall within the scope of the claimed invention, whose protection scope is defined by the appended claims and their equivalents.

Claims (6)

1. A multi-source remote-sensing-image cloud detection method based on evidence-fusion adaptive thresholds, characterised by comprising the following steps:
11) Selection and preprocessing of the data-source images: select the first-class and second-class data-source images and preprocess them;
12) Initial-distribution estimation of the "cloud" and "non-cloud" events: use training samples to estimate the initial distributions of the two events and compute the histograms of the "cloud" or "non-cloud" initial distributions;
13) Primary selection of the classification threshold and determination of the trust interval: solve the initial classification threshold of the two remote-sensing targets "cloud" and "non-cloud" with the information divergence, and determine the trust interval of the threshold neighbourhood from the means and standard deviations of the preliminary classification;
14) Optimised classification based on a sequential decision tree and evidence fusion: coarse classification based on the means and standard deviations of the preliminary classification, and fine classification in which D-S theory performs a fusion calculation on the threshold degrees of belief of the two target data sources, achieving an optimised two-class classification;
15) Output of the cloud-detection classification result: output the classified cloud-detection image.
2. The multi-source remote-sensing-image cloud detection method based on evidence-fusion adaptive thresholds according to claim 1, characterised in that the selection and preprocessing of the data-source images comprises the following steps:
21) Define the first data source as the reflectance R image obtained from the multispectral remote-sensing image of the λ = 670 nm band, denoted F1(x, y) = R670;
22) Define the second data source as the atmospheric-molecule optical-thickness τ remote-sensing image obtained by the multi-angle polarimeter at λ = 490 nm with scattering angle 80° < Θ < 120°, denoted F2(x, y) = τ490;
23) Preprocess both data-source images to remove sun-glint regions.
3. The multi-source remote-sensing-image cloud detection method based on evidence-fusion adaptive thresholds according to claim 1, characterised in that the initial-distribution estimation of the "cloud" and "non-cloud" events comprises the following steps:
31) Take the first training sample F3, defined as the historical reflectance of the λ = 670 nm band,
where (x, y) are the same latitude-longitude coordinates as in F1; the historical reflectance covers all clear-sky days of the same month of the previous year at (x, y), forming the "non-cloud" event initial distribution, denoted p670;
32) Take the second training sample F4, defined as the historical atmospheric-molecule optical thickness of the multi-angle polarisation λ = 490 nm band,
where (x, y) are the same latitude-longitude coordinates as in F1; F4 covers all clear-sky days of the same month of the previous year at (x, y), forming the "non-cloud" event initial distribution, denoted q490;
33) Estimate the initial distribution of the "cloud" event from the training samples:
take F5 = |F1 − F3| and F6 = |F2 − F4|,
where F5 is the initial estimate of the "cloud" event distribution in the λ = 670 nm reflectance image, denoted q670,
and F6 is the initial estimate of the "cloud" event distribution in the multi-angle polarisation λ = 490 nm atmospheric-molecule optical-thickness image, denoted p490;
34) Choose a 32 × 32 pixel window in each of the images F1, F2, F3, F4, F5, F6;
compute the histograms of the initial "cloud" or "non-cloud" distributions of the four images F3, F5, F4, F6: divide the window pixel values of F3, F5, F4, F6 into n = 256 grades by R or τ, i.e. each grey-grade interval of the multispectral image is i = (MaxR670 − MinR670)/256 and each grey-grade interval of the polarisation image is i = (Maxτ490 − Minτ490)/256;
compute the window histograms of the four images, and define the frequency of each grade of image reflectance or atmospheric molecular optical thickness as h(i).
4. The multi-source remote-sensing-image cloud detection method based on evidence-fusion adaptive thresholds according to claim 1, characterised in that the primary selection of the classification threshold and the determination of the trust interval comprise the following steps:
41) Solve the information divergence and obtain the initial threshold T:
411) For the first data source, use the F3 and F5 histograms to obtain the initial distributions of "non-cloud" and "cloud";
define the probability distribution of "non-cloud" as P, P = {p_1, p_2, …, p_i, …, p_256}, where p_i = h(i)/(32 × 32);
define the probability distribution of "cloud" as Q, Q = {q_1, q_2, …, q_i, …, q_256}, where q_i = h(i)/(32 × 32);
412) Compute the two-class information divergence D_KL(P, Q) for the first data source by the following expression:
Do T = 1, 2, …, 256
D_KL(P, Q) = Σ_{i=1}^{T} p_i·log(p_i/q_i) + Σ_{i=T+1}^{256} q_i·log(q_i/p_i)
End, where p_i and q_i are non-zero;
select the T corresponding to the minimum information divergence Min(D_KL(P, Q)) as the initial classification threshold of the first data source F1 pixels, denoted T1;
413) For the second data source, use the F4 and F6 histograms to obtain the initial distributions of "non-cloud" and "cloud";
define the probability distribution of "non-cloud" as Q, Q = {q_1, q_2, …, q_i, …, q_256}, where q_i = h(i)/(32 × 32);
define the probability distribution of "cloud" as P, P = {p_1, p_2, …, p_i, …, p_256}, where p_i = h(i)/(32 × 32);
414) Compute the two-class information divergence D_KL(P, Q) for the second data source by the following expression:
Do T = 1, 2, …, 256
D_KL(P, Q) = Σ_{i=1}^{T} p_i·log(p_i/q_i) + Σ_{i=T+1}^{256} q_i·log(q_i/p_i)
End, where p_i and q_i are non-zero;
select the T corresponding to the minimum information divergence Min(D_KL(P, Q)) as the initial classification threshold of the window pixels of the second data source F2, denoted T2;
42) Define the trust interval of the threshold-T neighbourhood:
define γ as a constant, and let the neighbourhood (α, β) of T = γ be the trust interval of the target event relative to T,
where, for the first data source, i ≥ β is the supporting-evidence interval of "cloud" and i ≤ α is the "non-cloud" evidence interval;
for the second data source, i ≥ β is the "non-cloud" evidence interval and i ≤ α is the supporting-evidence interval of "cloud";
43) Obtain the trust-interval bounds α and β of the first data source:
compute the window histogram of the first data source F1 and use T1 to give the window pixels of F1 an initial classification into "non-cloud" and "cloud"; compute the mean and standard deviation of the two classification targets "cloud" and "non-cloud" of the window:
the mean of "cloud" is mean^1_cloud and its standard deviation is σ^1_cloud;
the mean of "non-cloud" is mean^1_clear and its standard deviation is σ^1_clear;
α = mean^1_clear + 2σ^1_clear, β = mean^1_cloud − 2σ^1_cloud;
44) Obtain the trust-interval bounds α and β of the second data source:
compute the window histogram of the second data source F2 and use T2 to give the window pixels of F2 an initial classification into "cloud" and "non-cloud"; compute the mean and standard deviation of the two classification targets "cloud" and "non-cloud" of F2:
the mean of "cloud" is mean^2_cloud and its standard deviation is σ^2_cloud;
the mean of "non-cloud" is mean^2_clear and its standard deviation is σ^2_clear;
α = mean^2_cloud + 2σ^2_cloud, β = mean^2_clear − 2σ^2_clear.
5. The multi-source remote-sensing-image cloud detection method based on evidence-fusion adaptive thresholds according to claim 1, characterised in that the optimised classification based on a sequential decision tree and evidence fusion comprises the following steps:
51) Based on the trust-interval bounds α and β obtained for the first data source and the trust-interval bounds α and β of the second data source, coarse-classify the "cloud" and "non-cloud" targets sequentially:
511) Clear F3(x, y):
F3(x, y) = { }; F3 stores the intermediate coarse-classification result;
512) For the first data source F1, process as follows:
take the threshold α^1; if F1(x, y) ≤ α^1, the pixel is judged "non-cloud" and assigned F3(x, y) = 0;
take the threshold β^1; if F1(x, y) ≥ β^1, the pixel is judged "cloud" and assigned F3(x, y) = 1;
if α^1 < F1(x, y) < β^1, the pixel is judged "unknown" and assigned F3(x, y) = 254;
513) For the second data source F2, process as follows:
locate the pixel coordinates with F3(x, y) = 0 or F3(x, y) = 254 and poll the pixel values of F2 at the same coordinates;
take the threshold α^2; if F2(x, y) ≤ α^2, the pixel is judged "cloud" and assigned F3(x, y) = 1;
take the threshold β^2; if F2(x, y) ≥ β^2, the pixel is judged "non-cloud" and assigned F3(x, y) = 0;
if α^2 < F2(x, y) < β^2, the pixel is judged "unknown" and assigned F3(x, y) = 254;
52) Locate the pixel coordinates with F3(x, y) = 254, poll the grey levels ω ∈ (0, 255) of those coordinates on the F1 and F2 image histograms together with their frequencies h1(ω), h2(ω), and compute for each the degree of belief of belonging to the two targets "cloud" and "non-cloud";
perform fine classification based on the joint degree of belief: use D-S evidence theory to obtain the joint confidence of the "cloud" or "non-cloud" classification from the first and second data sources, and finely classify the "unknown" pixels as "cloud", "non-cloud" or "unknown", i.e. set F3(x, y) = 1, F3(x, y) = 0, or keep F3(x, y) = 254 unchanged;
53) After fine classification, set F1(x, y) = F3(x, y);
pixels with F1(x, y) = 1 are "cloud" events, giving the classified window cloud-detection image F1;
54) Traverse F1, F2, F3, F4, F5, F6 with 32 × 32 pixel windows, repeating processing steps 34) to 53) above, and obtain the final classification image F1.
6. The multi-source remote-sensing-image cloud detection method based on evidence-fusion adaptive thresholds according to claim 5, characterised in that the fine classification based on the joint degree of belief comprises the following steps:
61) For the first data source, compute the degree of belief CF1 of the pixels at coordinates where F3(x, y) = 254:
set CF_clear(α) = 1, CF_clear(γ) = CF_cloud(γ) = 0.5, CF_cloud(β) = 1;
for reflectance levels ω ∈ (α, γ), compute the degree of belief that the pixel belongs to "non-cloud":
CF^1_clear(ω) = |h(ω) − h(γ)| × (CF_clear(α) − CF_clear(γ)) / |h(α) − h(γ)|; CF^1_cloud = 1 − CF^1_clear;
for reflectance levels ω ∈ (γ, β), compute the degree of belief that the pixel belongs to "cloud":
CF^1_cloud(ω) = |h(ω) − h(γ)| × (CF_cloud(β) − CF_cloud(γ)) / |h(β) − h(γ)|, CF^1_clear = 1 − CF^1_cloud;
62) For the second data source, compute the degree-of-belief probability CF2 of the pixels at coordinates where F3(x, y) = 254:
set CF_cloud(α) = 1, CF_clear(γ) = CF_cloud(γ) = 0.5, CF_clear(β) = 1;
for atmospheric-molecule optical-thickness levels ω ∈ (α, γ), compute the degree of belief that the pixel belongs to "cloud":
CF^2_cloud(ω) = |h(ω) − h(γ)| × (CF_cloud(α) − CF_cloud(γ)) / |h(α) − h(γ)|, CF^2_clear = 1 − CF^2_cloud;
for atmospheric-molecule optical-thickness levels ω ∈ (γ, β), compute the degree of belief that the pixel belongs to "non-cloud":
CF^2_clear(ω) = |h(ω) − h(γ)| × (CF_clear(β) − CF_clear(γ)) / |h(β) − h(γ)|, CF^2_cloud = 1 − CF^2_clear;
63) Let Ms(Ak) be the credibility probability assignment of the s-th data source for target event Ak (k = 1, 2, …, K), where s ∈ [1, 2];
for the first data source F1(x, y) and the second data source F2(x, y), k ∈ [1, 2];
A1 denotes "cloud" and A2 denotes "non-cloud";
M1(A1) is the belief that the first data source assigns to "cloud", M2(A1) is the belief that the second data source assigns to "cloud";
M1(A2) is the belief that the first data source assigns to "non-cloud", M2(A2) is the belief that the second data source assigns to "non-cloud";
M1(A2) = 1 − M1(A1), M2(A2) = 1 − M2(A1), where M(·) is the Mass function;
64) From the degrees of belief obtained in steps 61) and 62) with which the pixel at coordinates (x, y) of the first and second data sources belongs to the two classification targets, let the Mass functions M1(·) and M2(·) express the degree-of-belief functions of the first and second data sources:
M1(cloud) = P1, M1(non-cloud) = 1 − P1;
M2(cloud) = P2, M2(non-cloud) = 1 − P2;
65) According to the Dempster combination rule, derive the fused belief assignment that the pixel at coordinates (x, y) belongs to "cloud" or "non-cloud", as follows:
651) Compute the normalisation factor:
define B and C as the detection targets, each being "cloud" or "non-cloud"; then:
K = Σ M1(B)M2(C) = M1(cloud)M2(cloud) + M1(non-cloud)M2(non-cloud) = P1P2 + (1 − P1)(1 − P2),
where B ∩ C ≠ ∅;
652) Compute the fused belief assignments of "cloud" and "non-cloud":
M({cloud}) = M1(cloud)M2(cloud)/K = P1P2/K, where B ∩ C = cloud;
M({non-cloud}) = M1(non-cloud)M2(non-cloud)/K = (1 − P1)(1 − P2)/K, where B ∩ C = non-cloud;
653) Fine classification by the joint degree of belief:
if |M({cloud}) − M({non-cloud})| < ε, then pixel F3(x, y) = 254, i.e. "unknown";
otherwise, if M({cloud}) > M({non-cloud}), then pixel F3(x, y) = 1, i.e. "cloud";
otherwise, pixel F3(x, y) = 0, i.e. "non-cloud".
CN201711113064.2A 2017-11-13 2017-11-13 Multi-source remote sensing image cloud detection method based on evidence fusion adaptive threshold Active CN107944357B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711113064.2A CN107944357B (en) 2017-11-13 2017-11-13 Multi-source remote sensing image cloud detection method based on evidence fusion adaptive threshold

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711113064.2A CN107944357B (en) 2017-11-13 2017-11-13 Multi-source remote sensing image cloud detection method based on evidence fusion adaptive threshold

Publications (2)

Publication Number Publication Date
CN107944357A true CN107944357A (en) 2018-04-20
CN107944357B CN107944357B (en) 2020-02-14

Family

ID=61933846

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711113064.2A Active CN107944357B (en) 2017-11-13 2017-11-13 Multi-source remote sensing image cloud detection method based on evidence fusion adaptive threshold

Country Status (1)

Country Link
CN (1) CN107944357B (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140268094A1 (en) * 2013-03-15 2014-09-18 Digitalglobe, Inc. Using parallax in remote sensing to determine cloud feature height
CN105913402A (en) * 2016-05-20 2016-08-31 上海海洋大学 Multi-remote sensing image fusion denoising method based on DS evidence theory
CN106294705A * 2016-08-08 2017-01-04 长安大学 Batch remote sensing image preprocessing method


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108805861A (en) * 2018-04-28 2018-11-13 中国人民解放军国防科技大学 Remote sensing image cloud detection method based on deep learning
CN109001161A * 2018-05-07 2018-12-14 安徽师范大学 Polluted-cloud classification and recognition method based on polarization images
CN108627879A * 2018-05-10 2018-10-09 中南林业科技大学 Multi-source meteorological satellite cloud detection method
CN109272053A (en) * 2018-10-12 2019-01-25 国网湖南省电力有限公司 The data fusion method and system of polar-orbiting satellite monitoring aerosol optical depth
CN109272053B (en) * 2018-10-12 2021-11-02 国网湖南省电力有限公司 Data fusion method and system for monitoring optical thickness of aerosol by polar orbit satellite
CN109801233B (en) * 2018-12-27 2020-09-29 中国科学院西安光学精密机械研究所 Method for enhancing true color remote sensing image
CN109801233A * 2018-12-27 2019-05-24 中国科学院西安光学精密机械研究所 Enhancement method for true-color remote sensing images
CN111491131B (en) * 2019-01-29 2021-06-11 斯特拉德视觉公司 Method and apparatus for integrating object detection information detected by each object detector
CN111491131A (en) * 2019-01-29 2020-08-04 斯特拉德视觉公司 Method and apparatus for integrating object detection information detected by each object detector
CN111582037A (en) * 2020-04-10 2020-08-25 天津大学 Foundation cloud atlas cloud classification recognition system and method based on rough set theory
CN112526638A (en) * 2020-11-27 2021-03-19 中国气象局气象探测中心 Cloud boundary identification method based on multi-source observation data and related equipment
CN114022790A (en) * 2022-01-10 2022-02-08 成都国星宇航科技有限公司 Cloud layer detection and image compression method and device in remote sensing image and storage medium
CN115830471A (en) * 2023-01-04 2023-03-21 安徽大学 Multi-scale feature fusion and alignment domain self-adaptive cloud detection method
CN115830471B (en) * 2023-01-04 2023-06-13 安徽大学 Multi-scale feature fusion and alignment domain self-adaptive cloud detection method


Similar Documents

Publication Publication Date Title
CN107944357A (en) Multi-source Remote Sensing Images cloud detection method of optic based on evidence fusion adaptive threshold
Zhang et al. A feature difference convolutional neural network-based change detection method
Wang et al. A random forest classifier based on pixel comparison features for urban LiDAR data
CN110852245B (en) Precipitation particle classification method of double-polarization weather radar based on discrete attribute BNT
Peng et al. Object-based change detection from satellite imagery by segmentation optimization and multi-features fusion
Troya-Galvis et al. Unsupervised quantification of under- and over-segmentation for object-based remote sensing image analysis
Ok et al. 2-D delineation of individual citrus trees from UAV-based dense photogrammetric surface models
Nurmasari et al. Oil palm plantation detection in Indonesia using Sentinel-2 and Landsat-8 optical satellite imagery (case study: Rokan Hulu regency, Riau Province)
S Bhagat Use of remote sensing techniques for robust digital change detection of land: A review
Feng et al. Impervious surface extraction based on different methods from multiple spatial resolution images: a comprehensive comparison
Wieland et al. Object-based urban structure type pattern recognition from Landsat TM with a Support Vector Machine
Duan et al. Mapping the soil types combining multi-temporal remote sensing data with texture features
Norman et al. Spatio-statistical optimization of image segmentation process for building footprint extraction using very high-resolution WorldView 3 satellite data
Ayanlade Remote sensing approaches for land use and land surface temperature assessment: a review of methods
Ozdarici Ok et al. A segment-based approach to classify agricultural lands by using multi-temporal optical and microwave data
Tamim et al. Automatic detection of Moroccan coastal upwelling zones using sea surface temperature images
Li et al. A new combination classification of pixel- and object-based methods
Nik Effendi et al. Unlocking the potential of hyperspectral and LiDAR for above-ground biomass (AGB) and tree species classification in tropical forests
Rajendiran et al. Pixel level feature extraction and machine learning classification for water body extraction
Zhang et al. Crop type mapping with temporal sample migration
CN111882573B (en) Cultivated land block extraction method and system based on high-resolution image data
Chen et al. A supervoxel-based vegetation classification via decomposition and modelling of full-waveform airborne laser scanning data
Zheng et al. Extraction of impervious surface with Landsat based on machine learning in Chengdu urban, China
Jadhav et al. Segmentation analysis using particle swarm optimization-self organizing map algorithm and classification of remote sensing data for agriculture
Zhao et al. Improving object-oriented land use/cover classification from high resolution imagery by spectral similarity-based post-classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant