CN112131968A - Double-time-phase remote sensing image change detection method based on DCNN - Google Patents

Double-time-phase remote sensing image change detection method based on DCNN

Info

Publication number
CN112131968A
CN112131968A (application CN202010903557.1A)
Authority
CN
China
Prior art keywords
remote sensing
change detection
double
sensing image
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010903557.1A
Other languages
Chinese (zh)
Inventor
王鑫
吕安
张香梁
吕国芳
Current Assignee
Hohai University HHU
Original Assignee
Hohai University HHU
Priority date
Filing date
Publication date
Application filed by Hohai University HHU filed Critical Hohai University HHU
Priority to CN202010903557.1A priority Critical patent/CN112131968A/en
Publication of CN112131968A publication Critical patent/CN112131968A/en
Pending legal-status Critical Current

Classifications

    • G06N 3/045 — Computing arrangements based on biological models; neural networks; architecture; combinations of networks
    • G06F 18/213 — Pattern recognition; feature extraction, e.g. by transforming the feature space
    • G06F 18/23 — Pattern recognition; clustering techniques
    • G06V 20/13 — Scenes; terrestrial scenes; satellite images


Abstract

The invention discloses a DCNN-based double-time-phase remote sensing image change detection method. A double-time-phase remote sensing image data set is input into a deep convolutional neural network to generate double-time-phase feature maps; bilinear interpolation is applied to the feature maps so that their size matches that of the remote sensing images in the data set; the Euclidean distance between the interpolated feature maps is calculated, and a difference image is generated from it; the feature vector of each pixel block in the difference image is extracted, and a feature vector space is constructed from these vectors; the feature vector space is clustered, and a coarse change detection map is generated from the clustering result; finally, morphological filtering of the coarse change detection map yields the change detection map. The method effectively simplifies the change detection process for the remote sensing images to be detected and improves both the detection effect and the detection efficiency.

Description

Double-time-phase remote sensing image change detection method based on DCNN
Technical Field
The invention relates to the technical field of image processing, in particular to a double-time-phase remote sensing image change detection method based on DCNN.
Background
Change detection is a core process in many applications that use remote sensing images. It identifies changes on the earth's surface by processing two images of the same geographic area acquired at different times. Change detection has a wide range of uses, including land-use and land-cover change detection, risk assessment, and environmental surveys.
Various algorithms based on hand-crafted features have been proposed to solve the change detection problem, such as image quantification, principal component analysis, change vector analysis, expectation maximization, and Markov random fields. Computing these manually designed features requires careful selection of dimensions, scales, and orientations. Feature selection is therefore itself a difficulty in remote sensing image change detection.
The remote sensing image change detection method and device disclosed in publication No. CN108830828A filter a first and a second remote sensing image with a preset filter to obtain a first and a second filtered image, and then calculate a difference image from the two filtered images, which is used to identify the changes between the two time phases. Although this method addresses the low accuracy caused by noise when change is detected by conventional differencing, the generated difference image is still not highly accurate and is prone to error accumulation, so a good change detection effect cannot be achieved. Publication No. CN107992891A discloses a multispectral remote sensing image change detection method based on spectral vector analysis: two preprocessed multispectral remote sensing images of the same region at different times are input; the difference space constructed by change vector analysis is reduced in dimensionality by principal component analysis, and the first principal component is taken as a first difference map; the angle between the spectral vectors of the double-time-phase images yields a second difference map; the information entropies of the two difference maps are computed to derive fusion weights, and a better difference map is obtained by weighted summation; after spatial feature description, cluster analysis by spectral clustering gives the change detection result.
Although this method can, to a certain extent, suppress the interference of factors such as illumination and radiation on the change information, it places high demands on the preprocessing of the original remote sensing images and needs rather complicated preprocessing steps, because the difference space is constructed directly with principal component analysis. The traditional schemes therefore suffer from a complex detection process and a poor detection effect.
Disclosure of Invention
Aiming at the problems, the invention provides a double-time-phase remote sensing image change detection method based on DCNN.
In order to achieve the purpose of the invention, the invention provides a double time-phase remote sensing image change detection method based on DCNN, which comprises the following steps:
s10, constructing a double-time-phase remote sensing image data set comprising the remote sensing image to be detected;
s30, inputting the double-time-phase remote sensing image data set into a pre-constructed deep convolutional neural network to generate a double-time-phase characteristic diagram corresponding to each remote sensing image in the double-time-phase remote sensing image data set;
s40, carrying out bilinear interpolation on the double-time phase characteristic diagram to enable the size of the double-time phase characteristic diagram to be the same as that of the remote sensing image in the double-time phase remote sensing image data set;
s50, calculating Euclidean distance between the bi-temporal phase feature maps after bilinear interpolation, generating a difference image according to the Euclidean distance, extracting feature vectors of pixel blocks in the difference image, and constructing a feature vector space according to the feature vectors;
s60, clustering the feature vector space, and generating a coarse change detection graph according to the clustering result;
s70, morphologically filtering the coarse change detection map to generate a change detection map.
In one embodiment, before step S30, the method further includes:
s20, constructing a deep convolutional neural network of which the main body structure is based on VGG19, and loading weight parameters pre-trained on ImageNet as the weight parameters of the deep convolutional neural network so as to complete the construction of the deep convolutional neural network.
In one embodiment, calculating the Euclidean distance between the bilinear-interpolated double-time-phase feature maps comprises:

DI_i(x, y) = sqrt( Σ_z ( feature_i^T1(x, y, z) − feature_i^T2(x, y, z) )² ),  0 ≤ z < 512

where feature_i^T1(x, y, z) represents the pixel values of the T1-time feature map in the double-time-phase feature maps, feature_i^T2(x, y, z) represents the pixel values of the T2-time feature map, and DI_i(x, y) represents the Euclidean distance.
In one embodiment, clustering the feature vector space, and generating the coarse change detection graph according to the clustering result includes:
dividing the feature vector space into two clusters using k-means clustering with k = 2: the Euclidean distance from each pixel's feature vector to the mean feature vector of each cluster is calculated, and the pixel is assigned to the cluster at minimum distance, generating a coarse change detection map.
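A minimal NumPy sketch of this clustering step; the deterministic initialization and the convention of labelling the higher-magnitude cluster as "changed" are illustrative choices, not part of the description:

```python
import numpy as np

def kmeans_change_map(fvs, iters=20):
    """k-means with k = 2 over the per-pixel feature vectors: every pixel is
    assigned to the cluster whose mean feature vector is nearest in Euclidean
    distance. Labelling the higher-magnitude cluster as "changed" is a common
    convention for difference images (an assumption here)."""
    h, w, d = fvs.shape
    pts = fvs.reshape(-1, d)
    norms = np.linalg.norm(pts, axis=1)
    # deterministic init: the least / most extreme feature vectors
    centers = np.stack([pts[norms.argmin()], pts[norms.argmax()]]).astype(float)
    for _ in range(iters):
        dist = np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)          # nearest cluster mean per pixel
        for c in (0, 1):
            if np.any(labels == c):
                centers[c] = pts[labels == c].mean(axis=0)
    changed = int(np.linalg.norm(centers, axis=1).argmax())
    return (labels == changed).astype(np.uint8).reshape(h, w)

# Two well-separated groups: low-difference pixels (top) vs high (bottom)
fvs = np.zeros((4, 4, 2))
fvs[2:, :, :] = 10.0
cm = kmeans_change_map(fvs)
print(cm)  # top two rows 0 (unchanged), bottom two rows 1 (changed)
```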
In one embodiment, morphologically filtering the coarse change detection map to generate the change detection map comprises:
an erosion operation is performed on the coarse change detection map to filter out noise pixels in it, generating the change detection map.
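A minimal NumPy sketch of the erosion ("corrosion") operation; the 3×3 structuring element is an assumed size, since the description does not fix one:

```python
import numpy as np

def erode3x3(mask):
    """Binary erosion with a 3x3 structuring element: a pixel survives only if
    its whole 3x3 neighbourhood is 1, so isolated noise pixels are removed."""
    h, w = mask.shape
    m = np.pad(mask.astype(bool), 1, constant_values=False)
    out = np.ones((h, w), dtype=bool)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out &= m[1 + di:1 + di + h, 1 + dj:1 + dj + w]
    return out.astype(np.uint8)

coarse = np.zeros((5, 5), dtype=np.uint8)
coarse[0, 0] = 1        # isolated noise pixel
coarse[1:4, 1:4] = 1    # a solid 3x3 changed region
clean = erode3x3(coarse)
print(clean)  # only the centre of the 3x3 region survives
```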
According to the DCNN-based double-time-phase remote sensing image change detection method above, a double-time-phase remote sensing image data set including the remote sensing images to be detected is constructed; the data set is input into a pre-constructed deep convolutional neural network to generate the double-time-phase feature maps corresponding to the remote sensing images; bilinear interpolation makes the feature maps the same size as the remote sensing images in the data set; the Euclidean distance between the interpolated feature maps is calculated, and a difference image is generated from it; the feature vector of each pixel block in the difference image is extracted, and a feature vector space is constructed from these vectors; the feature vector space is clustered, and a coarse change detection map is generated from the clustering result; and morphological filtering of the coarse change detection map produces the change detection map. This realizes change detection for the remote sensing images to be detected, effectively simplifies the detection process, and improves both the detection effect and the detection efficiency.
Drawings
Fig. 1 is a flowchart of a DCNN-based dual-temporal remote sensing image change detection method according to an embodiment;
FIG. 2 is a frame diagram of a DCNN-based dual-temporal remote sensing image change detection according to an embodiment;
FIG. 3 is a deep neural network framework diagram of an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a flowchart of a DCNN-based dual-temporal remote sensing image change detection method according to an embodiment, including the following steps:
and S10, constructing a double-time-phase remote sensing image data set comprising the remote sensing image to be detected.
In this step, a double-time-phase remote sensing image data set can be constructed from the remote sensing images to be detected at two times (such as time T1 and time T2), and a corresponding sample label set is produced.
And S30, inputting the double-time-phase remote sensing image data set into a pre-constructed deep convolutional neural network to generate a double-time-phase characteristic diagram corresponding to each remote sensing image included in the double-time-phase remote sensing image data set.
And S40, carrying out bilinear interpolation on the double-time phase characteristic diagram to enable the size of the double-time phase characteristic diagram to be the same as that of the remote sensing image in the double-time phase remote sensing image data set.
S50, calculating Euclidean distance between the bilinear interpolated two time phase feature maps, generating a difference image according to the Euclidean distance, extracting feature vectors of each pixel block in the difference image, and constructing a feature vector space according to each feature vector.
And S60, clustering the feature vector space, and generating a coarse change detection graph according to the clustering result.
S70, morphologically filtering the coarse change detection map to generate a change detection map.
Steps S10 to S70 thus extract deep features directly from the double-time-phase images, build the difference image from those features, and obtain the change detection map by clustering and morphological filtering, which effectively simplifies the change detection process for the remote sensing images to be detected and improves both the detection effect and the detection efficiency.
In one embodiment, before step S30, the method further includes:
s20, constructing a deep convolutional neural network of which the main body structure is based on VGG19, and loading weight parameters pre-trained on ImageNet as the weight parameters of the deep convolutional neural network so as to complete the construction of the deep convolutional neural network.
ImageNet is a large-scale visual database containing a large number of annotated images.
Specifically, the specific structure of the deep convolutional neural network based on the VGG19 may include:
(2.1) in the first large layer, two convolutional layers with 3 × 3 kernels, 64 channels, stride 1, and a linear rectification (ReLU) activation function are defined, followed by a pooling layer using maximum pooling;
(2.2) in the second large layer, two convolutional layers with 3 × 3 kernels, 128 channels, stride 1, and a ReLU activation function are defined, followed by a max-pooling layer;
(2.3) in the third large layer, four convolutional layers with 3 × 3 kernels, 256 channels, stride 1, and a ReLU activation function are defined, followed by a max-pooling layer;
(2.4) in the fourth large layer, four convolutional layers with 3 × 3 kernels, 512 channels, stride 1, and a ReLU activation function are defined, followed by a max-pooling layer;
(2.5) in the fifth large layer, four convolutional layers with 3 × 3 kernels, 512 channels, stride 1, and a ReLU activation function are defined, followed by a max-pooling layer;
(2.6) the sixth large layer is a fully connected layer;
(2.7) the seventh large layer is a fully connected layer;
(2.8) the eighth large layer is a fully connected layer.
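As a concrete illustration of one such "large layer" (3 × 3 convolutions with ReLU followed by 2 × 2 max pooling), the following NumPy sketch uses random weights in place of the ImageNet-pretrained parameters; it is meant only to show the shape arithmetic, in particular that four pooling stages shrink a 256 × 256 input to 16 × 16:

```python
import numpy as np

def conv3x3_relu(x, w):
    """One 'same' 3x3 convolution + ReLU over an (H, W, Cin) array.
    w has shape (3, 3, Cin, Cout); plain loops keep the sketch readable."""
    h_, w_, _ = x.shape
    cout = w.shape[-1]
    xp = np.pad(x, ((1, 1), (1, 1), (0, 0)))
    out = np.zeros((h_, w_, cout))
    for i in range(h_):
        for j in range(w_):
            out[i, j] = np.tensordot(xp[i:i + 3, j:j + 3, :], w, axes=3)
    return np.maximum(out, 0.0)  # linear rectification (ReLU)

def maxpool2(x):
    """2x2 max pooling, stride 2: each pooling layer halves the spatial size."""
    h_, w_, c = x.shape
    return x.reshape(h_ // 2, 2, w_ // 2, 2, c).max(axis=(1, 3))

# One "large layer" on a small stand-in image (random weights, not ImageNet ones)
rng = np.random.default_rng(0)
img = rng.standard_normal((16, 16, 3))
feat = maxpool2(conv3x3_relu(img, rng.standard_normal((3, 3, 3, 8)) * 0.1))
print(feat.shape)     # spatial size halved by the pooling layer: (8, 8, 8)
print(256 // 2 ** 4)  # four pooling stages: 256 -> 16, matching h' = w' = 16
```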
In one example, the following process may be performed after the two-time phase remote sensing image dataset is input into the deep convolutional neural network:
let the ith image in the dataset be a three channel image, which is expressed as follows:
Image_i(x, y, z) = { (x, y, z) | 0 ≤ x < h, 0 ≤ y < w, 0 ≤ z < 3 }
wherein h represents the height of the image, w represents the width of the image, the image is sent into a deep convolutional neural network, and a feature map output by the fourth large layer of the network is extracted:
feature_i(x, y, z) = { (x, y, z) | 0 ≤ x < h′, 0 ≤ y < w′, 0 ≤ z < 512 }
wherein h 'represents the feature map height of the fourth large layer output of the deep convolutional neural network, and w' represents the feature map width of the fourth large layer output of the deep convolutional neural network.
In one embodiment, calculating the Euclidean distance between the bilinear-interpolated double-time-phase feature maps comprises:

DI_i(x, y) = sqrt( Σ_z ( feature_i^T1(x, y, z) − feature_i^T2(x, y, z) )² ),  0 ≤ z < 512

where feature_i^T1(x, y, z) represents the pixel values of the T1-time feature map in the double-time-phase feature maps, feature_i^T2(x, y, z) represents the pixel values of the T2-time feature map, and DI_i(x, y) represents the Euclidean distance.
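A minimal NumPy sketch of this difference-image computation (the per-pixel Euclidean distance taken across the channel axis of the two feature maps):

```python
import numpy as np

def difference_image(feat_t1, feat_t2):
    """Per-pixel Euclidean distance between two (H, W, C) feature maps:
    DI(x, y) = sqrt(sum_z (f_T1(x, y, z) - f_T2(x, y, z)) ** 2)."""
    return np.sqrt(((feat_t1 - feat_t2) ** 2).sum(axis=-1))

# Identical maps give distance 0; a uniform unit shift across C channels gives sqrt(C)
t1 = np.zeros((4, 4, 3))
t2 = np.ones((4, 4, 3))
di = difference_image(t1, t2)
print(di.shape)         # (4, 4): one distance per pixel
print(float(di[0, 0]))  # sqrt(3) ≈ 1.7320508075688772
```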
In one example, extracting feature vectors of respective pixel blocks in the difference image, and constructing a feature vector space according to the respective feature vectors may include:
(6.1) The difference image DI_i can be expressed as follows:

DI_i = { DI_i(x, y) | 0 ≤ x < n, 0 ≤ y < n }

where n is 256;
(6.2) in the above difference image DI_i, non-overlapping blocks of h × h pixels are generated and denoted x_d, where h is selected as a constant greater than 2:

x_d ∈ R^(h×h), d = 1, 2, …, (n/h)²;
(6.3) each x_d is flattened into a row vector x′_d, and all x′_d together form the vector set X_d, on which the feature vector space is established using a principal component analysis algorithm:

x′_d = [x_1, x_2, …, x_(h×h)]

where x′_d has h × h elements in total, and

X_d = [x′_1; x′_2; …]

where X_d has shape h² × h². All the features of X_d are centered to generate X′_d:

X′_d = X_d − mean(X_d)

The covariance matrix of X′_d is then solved:

C = [ cov(f_p, f_q) ]

where cov(f_p, f_q) = E{ [f_p − E(f_p)][f_q − E(f_q)] }. The eigenvalues λ and corresponding eigenvectors u of the covariance matrix C are computed from Cu = λu, the eigenvalues are arranged in descending order, and the eigenvectors of the leading eigenvalues are selected to form the feature vector space;

(6.4) the h × h block of pixels around each pixel of DI_i is projected into the feature vector space of (6.3), establishing the feature vector space FVS over the entire DI_i.
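The block-PCA construction of steps (6.1)–(6.4) can be sketched in NumPy as follows; the block size h = 4, the number k = 3 of retained eigenvectors, the reflect-padding at the borders, and the random stand-in data are illustrative assumptions, not values fixed by the description:

```python
import numpy as np

def pca_feature_space(di, h=4, k=3):
    """Learn a PCA basis from the non-overlapping h x h blocks of the
    difference image DI, then project the h x h neighbourhood of every
    pixel onto the leading k eigenvectors (steps (6.1)-(6.4))."""
    n = di.shape[0]
    # (6.2)-(6.3): flatten each non-overlapping block into a row of X_d
    X = np.array([di[i:i + h, j:j + h].ravel()
                  for i in range(0, n, h) for j in range(0, n, h)])
    mean = X.mean(axis=0)
    C = np.cov(X - mean, rowvar=False)           # covariance of centred features
    vals, vecs = np.linalg.eigh(C)               # eigh: C is symmetric
    basis = vecs[:, np.argsort(vals)[::-1][:k]]  # leading-k eigenvectors
    # (6.4): project every pixel's h x h neighbourhood (reflect-padded borders)
    dip = np.pad(di, h // 2, mode="reflect")
    fvs = np.zeros((n, n, k))
    for x in range(n):
        for y in range(n):
            fvs[x, y] = (dip[x:x + h, y:y + h].ravel() - mean) @ basis
    return fvs

rng = np.random.default_rng(1)
fvs = pca_feature_space(rng.random((8, 8)), h=4, k=3)
print(fvs.shape)  # one k-dimensional feature vector per pixel: (8, 8, 3)
```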
in one embodiment, clustering the feature vector space, and generating the coarse change detection graph according to the clustering result includes:
dividing the feature vector space into two clusters using k-means clustering with k = 2: the Euclidean distance from each pixel's feature vector to the mean feature vector of each cluster is calculated, and the pixel is assigned to the cluster at minimum distance, generating a coarse change detection map.
Further, morphologically filtering the coarse change detection map to generate the change detection map comprises:
an erosion operation is performed on the coarse change detection map to filter out noise pixels in it, generating the change detection map.
In one embodiment, the process of constructing a dual-temporal remote sensing image dataset including remote sensing images to be detected may include:
Construct a remote sensing image dataset Image = [Image_0, Image_1, …, Image_i] and make a corresponding sample label set Label = [Label_0, Label_1, …, Label_i], where i represents the maximum number of images contained in the constructed image dataset and the corresponding label set, Image_i represents the ith image in the constructed remote sensing image dataset, and Label_i represents the ith label image in the created sample label set. Each original image (such as a remote sensing image to be detected) has a label image corresponding to it. The label may be a binary image in which changed pixels are displayed as white and unchanged pixels as black.
In one embodiment, the bilinear interpolation of the bi-temporal feature map includes:
bilinear interpolation is performed according to the following expression:
f(x, y) = [ f(q11)(x2 − x)(y2 − y) + f(q21)(x − x1)(y2 − y) + f(q12)(x2 − x)(y − y1) + f(q22)(x − x1)(y − y1) ] / [ (x2 − x1)(y2 − y1) ]

The feature map after bilinear interpolation is:

feature_i(x, y, z) = { (x, y, z) | 0 ≤ x < h, 0 ≤ y < w, 0 ≤ z < 512 }

where h denotes the height of the original image and w its width, x and y are the abscissa and ordinate of the pixel to be interpolated in the feature map, q11, q12, q21, q22 are the four pixels nearest to that pixel, and x1, x2, y1, y2 are their abscissae and ordinates. The result of the expression is the pixel value of the pixel to be interpolated.
In the embodiment, the feature map of the original double-temporal image is acquired through the deep neural network, and due to the function of the pooling layer in the deep neural network, the size of the image is reduced by half every time the image passes through one pooling layer. In order to make the size of the output image consistent with that of the input image and to make the detection precision consistent, bilinear interpolation is respectively carried out on the obtained two-time phase feature maps, so that the size of the feature maps is enlarged to the size of the original image.
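A NumPy sketch of this bilinear upsampling (an align-corners style coordinate mapping is assumed, since the description does not fix one); in the embodiment, a 16 × 16 × 512 feature map would be resized back to 256 × 256 in this way:

```python
import numpy as np

def bilinear_resize(feat, out_h, out_w):
    """Bilinear upsampling of an (h', w', C) feature map to (out_h, out_w, C).
    Each output pixel mixes its four nearest source pixels q11, q21, q12, q22
    with weights given by the distances to x1, x2 and y1, y2."""
    h, w = feat.shape[:2]
    ys = np.linspace(0, h - 1, out_h)            # target row -> source coordinate
    xs = np.linspace(0, w - 1, out_w)            # target col -> source coordinate
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None, None]                # vertical interpolation weight
    wx = (xs - x0)[None, :, None]                # horizontal interpolation weight
    top = feat[y0][:, x0] * (1 - wx) + feat[y0][:, x1] * wx
    bot = feat[y1][:, x0] * (1 - wx) + feat[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

small = np.array([[0.0, 2.0], [4.0, 6.0]])[:, :, None]  # a 2x2, 1-channel map
big = bilinear_resize(small, 3, 3)
print(big[:, :, 0])  # corners kept; centre interpolated to 3.0
```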
In an embodiment, the framework of the DCNN-based dual-temporal remote sensing image change detection method may refer to fig. 2, and includes the following processes:
(1) Build a deep convolutional neural network whose main structure is VGG19, and select weight parameters pre-trained on ImageNet as its weight parameters. Construct a remote sensing image dataset Image = [Image_0, Image_1, …, Image_i] and make a corresponding sample label set Label = [Label_0, Label_1, …, Label_i], where i represents the maximum number of images contained in the constructed image dataset and the corresponding label set, Image_i represents the ith image in the constructed remote sensing image dataset, and Label_i represents the ith label image in the created sample label set. The constructed deep convolutional neural network is shown in FIG. 3, and the parameters of each layer are set as follows:
(a) in the first large layer, two convolution layers with convolution kernel size of 3 × 64, step size of 1 and linear rectification function as activation function and one pooling layer are defined separately, and the pooling method is selected as maximum pooling;
(b) in the second large layer, two convolution layers with convolution kernel size of 3 × 128, step size of 1 and linear rectification function as activation function and one pooling layer are defined, and the pooling method is selected as maximum pooling;
(c) in the third large layer, four convolution layers with convolution kernel size of 3 × 256, step size of 1 and linear rectification function as activation function and one pooling layer are defined, and the pooling method is selected as maximum pooling;
(d) in the fourth large layer, four convolution layers with convolution kernel size of 3 × 512, step size of 1 and linear rectification function as activation function and one pooling layer are defined, and the pooling method is selected as maximum pooling;
(e) in the fifth layer, four convolution layers with convolution kernel size of 3 × 512, step size of 1 and linear rectification function as activation function and one pooling layer are defined, and the pooling method is selected as maximum pooling;
(f) the sixth layer is a full connection layer;
(g) the seventh layer is a full connection layer;
(h) the eighth layer is a fully connected layer.
(2) And inputting the constructed remote sensing image data set into the constructed deep neural network to generate the characteristic diagram.
(2.1) in this embodiment, the size of the remote sensing image in the data set is 256 × 256, wherein the ith image is a three-channel image, which is expressed as follows:
Image_i(x, y, z) = { (x, y, z) | 0 ≤ x < h, 0 ≤ y < w, 0 ≤ z < 3 }
where h denotes the height of the image, w denotes the width of the image, both values being 256;
(2.2) the characteristic graph output by the deep convolutional neural network is as follows:
feature_i(x, y, z) = { (x, y, z) | 0 ≤ x < h′, 0 ≤ y < w′, 0 ≤ z < 512 }
where h′ represents the height and w′ the width of the feature map output by the deep convolutional neural network. In this embodiment, the feature map output by the fourth large layer is extracted, so the values of h′ and w′ are both 16;
(2.3) carrying out bilinear interpolation on the feature map output by the fourth large layer of the deep convolutional neural network according to the following expression:
f(x, y) = [ f(q11)(x2 − x)(y2 − y) + f(q21)(x − x1)(y2 − y) + f(q12)(x2 − x)(y − y1) + f(q22)(x − x1)(y − y1) ] / [ (x2 − x1)(y2 − y1) ]
the size of the image is enlarged to the size of an original input image, and the characteristic graph after interpolation is as follows:
feature_i(x, y, z) = { (x, y, z) | 0 ≤ x < h, 0 ≤ y < w, 0 ≤ z < 512 }
where h denotes the height of the original image and w denotes the width of the original image, with a value of 256.
(3) Calculating Euclidean distance on the characteristic diagram according to the following expression to generate a difference image:
DI_i(x, y) = sqrt( Σ_z ( feature_i^T1(x, y, z) − feature_i^T2(x, y, z) )² ),  0 ≤ z < 512

where feature_i^T1(x, y, z) represents the pixel values of the T1-time feature map in the double-time-phase feature maps, feature_i^T2(x, y, z) represents the pixel values of the T2-time feature map, and DI_i(x, y) represents the Euclidean distance.
(4) A feature vector space is constructed on the difference image with a principal component analysis algorithm; after the feature vector space is generated, the feature vectors are clustered with the k-means algorithm to produce a coarse change map, and an erosion operation on the coarse change map yields the final change detection map. The predicted maps of all images in the test set are compared with the label images corresponding to the original remote sensing images to obtain the detection accuracy over the whole test set.
The embodiment has the following beneficial effects:
(1) the original remote sensing image is directly subjected to feature extraction through the deep convolutional neural network, so that the traditional complex step of manually extracting features is avoided, and the extracted features are high in precision.
(2) The difference image is constructed through the features extracted by the deep neural network, so that the influence of noise in the original image can be eliminated, the error accumulation can be avoided, and the performance of change detection is improved.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
It should be noted that the terms "first/second/third" in the embodiments of the present application merely distinguish similar objects and do not denote a specific ordering; where permitted, the specific order or sequence may be interchanged, so that the embodiments described herein can be implemented in orders other than those illustrated or described.
The terms "comprising" and "having", and any variations thereof, in the embodiments of the present application are intended to cover non-exclusive inclusion. For example, a process, method, apparatus, product, or device that comprises a list of steps or modules is not limited to the listed steps or modules, but may also include other steps or modules not listed, or steps or modules inherent to such a process, method, product, or device.
The above-mentioned embodiments express only several implementations of the present application; their description is specific and detailed, but should not be construed as limiting the scope of the invention. A person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (5)

1. A double-time-phase remote sensing image change detection method based on DCNN, characterized by comprising the following steps:
S10, constructing a double-time-phase remote sensing image data set comprising the remote sensing images to be detected;
S30, inputting the double-time-phase remote sensing image data set into a pre-constructed deep convolutional neural network to generate a double-time-phase feature map for each remote sensing image in the data set;
S40, performing bilinear interpolation on the double-time-phase feature maps so that their size matches that of the remote sensing images in the double-time-phase remote sensing image data set;
S50, calculating the Euclidean distance between the bilinearly interpolated double-time-phase feature maps, generating a difference image from the Euclidean distances, extracting feature vectors of pixel blocks in the difference image, and constructing a feature vector space from these feature vectors;
S60, clustering the feature vector space and generating a coarse change detection map from the clustering result;
S70, performing morphological filtering on the coarse change detection map to generate the change detection map.
2. The DCNN-based double-time-phase remote sensing image change detection method according to claim 1, characterized in that, before step S30, the method further comprises:
S20, constructing a deep convolutional neural network whose main structure is based on VGG19, and loading weight parameters pre-trained on ImageNet as the weight parameters of the deep convolutional neural network, thereby completing its construction.
3. The DCNN-based double-time-phase remote sensing image change detection method according to claim 1, characterized in that calculating the Euclidean distance between the bilinearly interpolated double-time-phase feature maps comprises:
$$DI_i(x, y) = \sqrt{\left(F_{T_1}^{i}(x, y) - F_{T_2}^{i}(x, y)\right)^2}$$
in the formula, $F_{T_1}^{i}(x, y)$ represents the pixel value of the feature map at time T1 in the double-time-phase feature maps, $F_{T_2}^{i}(x, y)$ represents the pixel value of the feature map at time T2 in the double-time-phase feature maps, and $DI_i(x, y)$ represents the Euclidean distance.
4. The DCNN-based double-time-phase remote sensing image change detection method according to claim 1, characterized in that clustering the feature vector space and generating a coarse change detection map from the clustering result comprises:
dividing the feature vector space into two clusters using k-means clustering with k = 2, computing the Euclidean distance from each feature vector to the mean feature vector of each of the two clusters, and assigning each pixel to the cluster with the minimum distance to generate the coarse change detection map.
5. The DCNN-based double-time-phase remote sensing image change detection method according to claim 1, characterized in that performing morphological filtering on the coarse change detection map to generate the change detection map comprises:
performing an erosion operation on the coarse change detection map to filter out noise pixels and generate the change detection map.
CN202010903557.1A 2020-09-01 2020-09-01 Double-time-phase remote sensing image change detection method based on DCNN Pending CN112131968A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010903557.1A CN112131968A (en) 2020-09-01 2020-09-01 Double-time-phase remote sensing image change detection method based on DCNN


Publications (1)

Publication Number Publication Date
CN112131968A true CN112131968A (en) 2020-12-25

Family

ID=73847093

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010903557.1A Pending CN112131968A (en) 2020-09-01 2020-09-01 Double-time-phase remote sensing image change detection method based on DCNN

Country Status (1)

Country Link
CN (1) CN112131968A (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103971364A (en) * 2014-04-04 2014-08-06 西南交通大学 Remote sensing image variation detecting method on basis of weighted Gabor wavelet characteristics and two-stage clusters
CN108596108A (en) * 2018-04-26 2018-09-28 中国科学院电子学研究所 Method for detecting change of remote sensing image of taking photo by plane based on the study of triple semantic relation


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112784777A (en) * 2021-01-28 2021-05-11 西安电子科技大学 Unsupervised hyperspectral image change detection method based on antagonistic learning
CN112784777B (en) * 2021-01-28 2023-06-02 西安电子科技大学 Unsupervised hyperspectral image change detection method based on countermeasure learning
CN114120141A (en) * 2021-11-23 2022-03-01 深圳航天智慧城市***技术研究院有限公司 All-weather remote sensing monitoring automatic analysis method and system thereof


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201225