CN111222576B - High-resolution remote sensing image classification method - Google Patents


Info

Publication number: CN111222576B (application number CN202010018383.0A)
Authority: CN (China)
Prior art keywords: layer, sample image, image block, class, remote sensing
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Other versions: CN111222576A (Chinese)
Inventors: 石程, 吕志勇, 杨秀红, 尤珍臻, 都双丽, 石俊飞
Current and original assignee: Xian University of Technology (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Application filed by Xian University of Technology; published as CN111222576A, granted as CN111222576B

Classifications

    • G06F18/24137 — Pattern recognition; classification techniques based on distances to cluster centroids
    • G06F18/214 — Generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F18/24 — Classification techniques
    • G06V20/13 — Scenes; terrestrial scenes; satellite images

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a high-resolution remote sensing image classification method comprising the following steps: input an image; construct a training sample image block set; construct a 7-layer deep convolutional neural network and train it; calculate class-center features; calculate the difference features of the training sample image blocks; construct and train a distance measure learning network; construct a test sample image block set; calculate the depth features of the test sample image blocks; calculate the difference features of the test sample image blocks; calculate the distance between each test sample image block and each class-center feature; select test sample image blocks and add them to the training sample image block set; repeat steps 3-11 until a preset number of iterations is reached and output the depth feature G_q of each test sample image block b_q; train a softmax classifier; complete the classification. With this method, a large number of training samples can be obtained for high-resolution remote sensing image classification while achieving high classification accuracy.

Description

High-resolution remote sensing image classification method
Technical Field
The invention belongs to the technical field of image processing, and relates to a high-resolution remote sensing image classification method.
Background
One of the main tasks in high-resolution remote sensing image processing is the classification of ground-object targets. Classification is an analysis technique for describing the category of a target or ground object; its main task is to assign a category label to each pixel of a data volume and generate a thematic map, and it is one of the important ways to extract useful information from remote sensing images. A thematic map generated after classification clearly reflects the spatial distribution of ground features, helping people recognize and discover their patterns, so that the high-resolution remote sensing image gains real use value and can be effectively put into practical application.
Feature extraction is an important part of high-resolution remote sensing image classification and has a great influence on classification accuracy. The feature extraction methods currently in common use fall into three groups: methods based on spectral information, methods based on spatial information, and methods combining spatial and spectral information.
In feature extraction based on spectral information, the spectral response of a ground object is affected by many factors, such as solar illuminance, atmospheric transparency, and wind speed, which are generally difficult to measure accurately, so the measured spectral response curve may differ greatly from the true curve. The classification accuracy obtained by spectral-information-based feature extraction is therefore often not ideal.
Feature extraction based on spatial information relies on manual experience: the characteristics of the image must be known in advance so that an appropriate method can be selected, and a good classification result therefore requires good prior knowledge.
Methods combining spatial and spectral information improve classification accuracy by exploiting both kinds of information in the high-resolution remote sensing image. Although they alleviate, to a certain extent, the misclassification caused by using only spectral or only spatial information, they still require considerable prior knowledge to obtain better classification accuracy.
The neural network is an effective, active feature-learning method for extracting spatial-spectral features and requires no prior knowledge of the image; typical examples are the BP (back propagation) neural network, the wavelet neural network, and the ridgelet neural network. These, however, are shallow networks of only 3 layers. To better mine deeper image features, deep neural network models have been proposed, such as the self-encoding (autoencoder) deep network, the restricted Boltzmann machine deep network, and the deep convolutional network; these networks achieve higher classification accuracy and are therefore widely applied to high-resolution remote sensing image classification. However, current deep-neural-network classification methods improve only the network structure and do not consider the influence of the number of training samples on classification accuracy. In fact this influence is very important: increasing the number of training samples can greatly improve the classification accuracy of high-resolution remote sensing images. Yet training samples are mainly acquired by manual field survey, a labeling method limited by field conditions, personal experience, and so on, which greatly restricts the number of labeled samples; labeling large amounts of data is especially difficult to realize.
Disclosure of Invention
The invention aims to provide a high-resolution remote sensing image classification method that solves the problems of difficult training-sample acquisition and low classification accuracy for high-resolution remote sensing images in the prior art.
The technical scheme adopted by the invention is that the high-resolution remote sensing image classification method is implemented according to the following steps:
step 1, inputting a high-resolution remote sensing image and class information of each pixel in the image, and respectively randomly selecting a plurality of pixels of each class as training samples;
step 2, constructing training sample image blocks a_p and the training sample image block set S_1;
step 3, constructing a three-dimensional deep convolutional neural network, normalizing the training sample image blocks and using them as the input of the network, and training it to obtain the trained three-dimensional deep convolutional neural network and the depth feature F_p of each training sample;
step 4, calculating the class-center feature of each class (C_1, C_2, ..., C_h, ..., C_H) from the depth feature F_p and the class mark of each training sample image block a_p obtained in step 3, where H is the number of classes;
step 5, for the depth feature F_p of each training sample image block a_p, calculating the absolute value of its difference from each class-center feature to obtain the difference features d_p^h;
step 6, constructing a distance measure learning network with parameters α = (α_1, α_2, ..., α_Z), taking the difference features d_p^h obtained in step 5 as the input of the network, whose output is the distance value D_p^h between the depth feature F_p and each class center, and training the network to obtain the trained distance measure learning network, where Z is the dimension of each difference feature;
step 7, constructing test sample image blocks b_q and the test sample image block set S_2;
step 8, inputting the test sample image blocks b_q obtained in step 7 into the deep convolutional neural network trained in step 3 to obtain the depth feature G_q of each test sample image block b_q;
step 9, respectively calculating the absolute difference between the depth feature G_q of each test sample image block b_q and each class-center feature (C_1, C_2, ..., C_h, ..., C_H) to obtain the difference features d_q^h;
step 10, inputting the difference features obtained in step 9 into the distance measure learning network of step 6 to obtain the distance D_q^h between the test sample image block b_q and each class center, taking the class at which the distance is minimum as the predicted class of the test sample image block b_q, and the distance value at that position as the predicted distance of the test sample image block b_q;
step 11, according to the predicted classes of the test sample image blocks, selecting for each class several test sample image blocks with the minimum predicted distances, adding them to the training sample image block set S_1, and deleting them from the test sample image block set S_2;
step 12, repeating steps 3-11 until the preset number of iteration steps is reached, and outputting the depth feature G_q of each test sample image block b_q;
step 13, taking the depth feature F_p and class mark of each training sample image block a_p as the input of a softmax classifier, and training the softmax classifier to obtain the trained classifier;
step 14, taking the depth feature G_q of each test sample image block b_q as the input of the trained softmax classifier to obtain the class mark of each test sample image block and finish the classification.
The invention is also characterized in that:
step 2 specifically is to set the spectral dimension of the high-resolution remote sensing image input in step 1 as V, select a window with the size of 21 x 21 on each dimension of the image by taking each pixel as a center to obtain the spatial information of the pixel on the dimension, and combine the spatial information on all dimensions into a three-dimensional training sample image block a p ,p∈S 1 ,S 1 Is a set of training sample image blocks.
The three-dimensional deep convolutional neural network in step 3 comprises 7 layers: layer 1 is the input layer, layer 2 a convolutional layer, layer 3 a down-sampling layer, layer 4 a convolutional layer, layer 5 a down-sampling layer, layer 6 a fully connected layer, and layer 7 a softmax classifier;
layer 2 contains 20 filters;
layer 3 is a 2 × 2 max down-sampling layer;
layer 4 contains 40 filters;
layer 5 is a 2 × 2 max down-sampling layer;
layer 6 contains 100 node units.
The three-dimensional deep convolutional neural network in step 3 is trained as follows:
step 3.1, initializing the filters of the two convolutional layers with random initialization;
step 3.2, taking each normalized training sample image block a_p as the input of the input layer and obtaining the class mark of the sample by forward propagation;
step 3.3, taking the cross entropy between the class mark output by the network and the true class mark of the training sample as the cost function;
step 3.4, minimizing the cost function with the back-propagation algorithm to obtain the trained network, and outputting the depth feature F_p of layer 6.
The class-center feature C_h in step 4 is calculated as:

    C_h = (1/m) Σ_{a_p ∈ Ω_h} F_p        (1)

where C_h is the class-center feature of the h-th class, Ω_h is the set of h-th-class training sample image blocks, and m is the number of samples in Ω_h.
The difference feature d_p^h in step 5 is calculated as:

    d_p^h = |F_p − C_h|,  h = 1, 2, ..., H        (2)
the distance measure learning network in the step 6 comprises a first layer input layer and a second layer output layer, and the parameter of the distance measure learning network is alpha = (alpha) 12 ,...,α Z ) Wherein Z is a difference characteristic
Figure BDA0002359796880000058
Dimension (d) of (a).
The distance measure learning network in step 6 is trained as follows:
step 6.1, initializing the network parameters α = (α_1, α_2, ..., α_Z) with random initialization;
step 6.2, taking the difference feature d_p^h as the input of the input layer and obtaining the distance D_p^h output by the network via forward propagation;
step 6.3, defining the true value of the distance: if the depth feature F_p of training sample image block a_p in formula (2) belongs to the h-th class, the true distance of the difference feature d_p^h is set to 0; if F_p does not belong to the h-th class, the true distance of d_p^h is set to a fixed constant δ;
step 6.4, taking the root mean square error between the distance values obtained in step 6.2 and the true distances defined in step 6.3 as the cost function, and minimizing it with the back-propagation algorithm to obtain the trained distance measure learning network.
Step 7 is specifically: for the pixels of the high-resolution remote sensing image other than the training samples, select a 21 × 21 window centered on each pixel in each dimension of the image to obtain the spatial information of the sample in that dimension, and combine the spatial information of all dimensions into a three-dimensional test sample image block b_q, q ∈ S_2, where S_2 is the set of test samples.
The difference feature in step 9 is calculated as:

    d_q^h = |G_q − C_h|,  h = 1, 2, ..., H,  q ∈ S_2        (3)

where S_2 is the set of test samples.
The invention has the beneficial effects that:
(1) By combining a deep convolutional neural network with distance measure learning, the high-resolution remote sensing image classification method of the invention overcomes the failure of traditional classification methods to consider the influence of the number of samples on the classification result, and improves classification accuracy;
(2) The high-resolution remote sensing image classification method of the invention learns the distance parameters with an adaptive network, overcoming the failure of traditional distance measures to consider the characteristics of the data, and improves the accuracy of sample selection.
Drawings
FIG. 1 is a block flow diagram of a high resolution remote sensing image classification method of the present invention;
FIG. 2 is a diagram of an image used in an experiment and a real ground object classification chart of the high-resolution remote sensing image classification method of the invention;
FIG. 3 is a structural diagram of a deep convolutional neural network in the high resolution remote sensing image classification method of the present invention;
FIG. 4 is a structural diagram of a distance measure learning network of the high resolution remote sensing image classification method of the present invention;
FIG. 5 is a comparison of the classification results of FIG. 2 using a high resolution remote sensing image classification method of the present invention and a conventional classification method.
Detailed Description
The invention is described in detail below with reference to the drawings and specific embodiments.
The invention discloses a high-resolution remote sensing image classification method, which is implemented according to the following steps as shown in figure 1:
step 1, inputting an image;
inputting a high-resolution remote sensing image, as shown in fig. 2: fig. 2 (a) is the input high-resolution remote sensing image (originally in color, shown in black and white because the examination guidelines require black-and-white figures), and fig. 2 (b) is the class-mark image corresponding to fig. 2 (a); for each class in fig. 2 (a), 200 pixels are randomly selected as training samples;
step 2, constructing a training sample image block set;
let the spectral dimension of the high-resolution remote sensing image input in step 1 be V; for each dimension, select a 21 × 21 window to obtain the spatial information of the pixel in that dimension, and combine the spatial information of all dimensions into a three-dimensional training sample image block a_p, p ∈ S_1, where S_1 is the set of training sample image blocks;
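The block construction of step 2 can be sketched as follows. This is a minimal illustration with hypothetical names and toy sizes; border pixels are handled here by reflect padding, a detail the patent does not specify.

```python
import numpy as np

def extract_patch(image, row, col, win=21):
    """Extract a win x win x V block centered on pixel (row, col).

    `image` has shape (H, W, V); borders are handled by reflect
    padding, an assumption not stated in the patent.
    """
    r = win // 2
    padded = np.pad(image, ((r, r), (r, r), (0, 0)), mode="reflect")
    return padded[row:row + win, col:col + win, :]

# toy image: 50 x 60 pixels, V = 4 spectral dimensions
img = np.random.rand(50, 60, 4)
block = extract_patch(img, 0, 0)   # even a corner pixel yields a full block
print(block.shape)                 # (21, 21, 4)
```

The center of the returned block is always the chosen pixel itself, so each block carries the spatial context of exactly one sample.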
step 3, constructing a 7-layer deep convolution neural network and training the neural network;
construct the 7-layer deep convolutional neural network shown in fig. 3: layer 1 is the input layer, layer 2 a convolutional layer, layer 3 a down-sampling layer, layer 4 a convolutional layer, layer 5 a down-sampling layer, layer 6 a fully connected layer, and layer 7 a softmax classifier. The input of layer 1 is a normalized training sample image block a_p; layer 2 contains 20 filters of size 6 × 6; layer 3 applies 2 × 2 max down-sampling; layer 4 contains 40 filters of size 5 × 5; layer 5 applies 2 × 2 max down-sampling; layer 6 contains 100 node units; layer 7 is the softmax classification layer.
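Assuming stride-1 "valid" convolutions and non-overlapping pooling (details the patent does not state), the spatial sizes through the seven layers can be checked arithmetically:

```python
def conv_out(size, kernel):    # 'valid' convolution, stride 1
    return size - kernel + 1

def pool_out(size, window):    # non-overlapping max down-sampling
    return size // window

s = 21                  # input block is 21 x 21 (x V spectral dimensions)
s = conv_out(s, 6)      # layer 2: 20 filters of size 6 x 6   -> 16 x 16
s = pool_out(s, 2)      # layer 3: 2 x 2 max down-sampling    -> 8 x 8
s = conv_out(s, 5)      # layer 4: 40 filters of size 5 x 5   -> 4 x 4
s = pool_out(s, 2)      # layer 5: 2 x 2 max down-sampling    -> 2 x 2
flat = s * s * 40       # flattened size feeding the 100-unit layer 6
print(s, flat)          # 2 160
```

Under these assumptions, the fully connected layer 6 maps a 160-dimensional flattened vector to the 100-dimensional depth feature F_p.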
The 7-layer deep convolutional neural network is trained as follows:
step 3.1, initializing the filters of the two convolutional layers with random initialization;
step 3.2, taking each normalized training sample image block a_p as the input of the input layer and obtaining the class mark of the sample by forward propagation;
step 3.3, taking the cross entropy between the class mark output by the network and the true class mark of the training sample as the cost function;
step 3.4, minimizing the cost function with the back-propagation algorithm to obtain the trained network, and outputting the depth feature F_p of layer 6.
Step 4, calculating class center characteristics;
from the depth feature F_p and class mark of each training sample image block a_p obtained in step 3, obtain the class-center features (C_1, C_2, ..., C_h, ..., C_H), where H is the number of classes and the class-center feature C_h is calculated as:

    C_h = (1/m) Σ_{a_p ∈ Ω_h} F_p        (1)

where C_h is the class-center feature of the h-th class, Ω_h is the set of h-th-class training sample image blocks, and m is the number of samples in Ω_h;
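Equation (1), together with the difference features of step 5, amounts to a per-class mean followed by an absolute difference. The feature matrix and 0-based class marks below are illustrative assumptions:

```python
import numpy as np

Z, H = 100, 9                          # layer-6 feature dimension and class count
rng = np.random.default_rng(0)
F = rng.normal(size=(60, Z))           # depth features F_p of 60 training blocks
labels = np.arange(60) % H             # class marks (0-based), every class present

# equation (1): C_h is the mean depth feature over the h-th class set Omega_h
centers = np.stack([F[labels == h].mean(axis=0) for h in range(H)])

# equation (2): difference feature of a_p w.r.t. class h is |F_p - C_h|
p = 0
d_p = np.abs(F[p] - centers)           # shape (H, Z): one row per class
print(centers.shape, d_p.shape)        # (9, 100) (9, 100)
```

Each training block thus yields H difference vectors of dimension Z, which are the inputs of the distance measure learning network in step 6.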
step 5, calculating the difference characteristic of the training sample image block;
for the depth feature F_p of each training sample image block a_p, calculate the absolute difference from each class-center feature (C_1, C_2, ..., C_h, ..., C_H) to obtain the difference features d_p^h, calculated as:

    d_p^h = |F_p − C_h|,  h = 1, 2, ..., H        (2)
step 6, constructing a distance measure learning network and training the distance measure learning network;
construct the 2-layer distance measure learning network shown in fig. 4: layer 1 is the input layer and layer 2 the output layer. Layer 1 takes as input the difference features d_p^h obtained in step 5, and layer 2 outputs the distance D_p^h between the training sample image block and each class-center feature. The parameters of the network are α = (α_1, α_2, ..., α_Z), where Z is the dimension of the difference feature d_p^h;
the 2-layer distance measure learning network is trained as follows:
step 6.1, initializing the network parameters α = (α_1, α_2, ..., α_Z) with random initialization;
step 6.2, taking the difference feature d_p^h as the input of the input layer and obtaining the distance D_p^h output by the network via forward propagation;
step 6.3, defining the true value of the distance: if the depth feature F_p of training sample image block a_p in formula (2) belongs to the h-th class, the true distance of the difference feature d_p^h is set to 0; if F_p does not belong to the h-th class, the true distance of d_p^h is set to a fixed constant δ, whose value is set differently according to the image content;
step 6.4, taking the root mean square error between the distance values obtained in step 6.2 and the true distances defined in step 6.3 as the cost function, and minimizing it with the back-propagation algorithm to obtain the trained distance measure learning network;
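The patent describes the network only as a two-layer mapping with parameter vector α; one natural reading, assumed here, is a weighted sum D = Σ_z α_z · d_z trained against the 0/δ targets of step 6.3. The sketch below uses synthetic difference features, plain gradient descent on the squared error (whose minimizer coincides with that of the RMSE cost), and a hypothetical learning rate:

```python
import numpy as np

rng = np.random.default_rng(1)
Z, delta, n = 100, 1.0, 500               # feature dimension, constant δ, sample count
same = rng.integers(0, 2, size=n)         # 1: F_p belongs to the class, 0: it does not
# synthetic difference features: components are small for the sample's own class
d = rng.uniform(0.0, 0.2, size=(n, Z))
d[same == 0] += 0.8
target = np.where(same == 1, 0.0, delta)  # true distances as defined in step 6.3

alpha = rng.normal(scale=0.1, size=Z)     # random initialization (step 6.1)
lr = 0.01                                 # hypothetical learning rate, not given in the patent
for _ in range(2000):
    err = d @ alpha - target              # forward pass minus true distance
    alpha -= lr * 2.0 * (d.T @ err) / n   # gradient step on the mean squared error
rmse = np.sqrt(np.mean((d @ alpha - target) ** 2))
print(f"final RMSE = {rmse:.2f}")
```

After training, the learned distance is small for a sample's own class and close to δ otherwise, which is exactly the property the sample-selection step 11 relies on.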
step 7, constructing a test sample image block set;
for the pixels of the high-resolution remote sensing image other than the training samples, select a 21 × 21 window centered on each pixel in each dimension of the image to obtain the spatial information of the sample in that dimension, and combine the spatial information of all dimensions into a three-dimensional test sample image block b_q, q ∈ S_2, where S_2 is the set of test samples;
step 8, calculating the depth characteristics of the image blocks of the test samples;
input the test sample image blocks b_q obtained in step 7 into the network trained in step 3 and obtain the depth feature G_q of each test sample image block by forward propagation;
Step 9, calculating the difference characteristic of the image block of the test sample;
for the depth feature G_q of each test sample image block b_q, calculate the absolute difference from the class-center features (C_1, C_2, ..., C_h, ..., C_H) of step 4 to obtain the difference features d_q^h, calculated as:

    d_q^h = |G_q − C_h|,  h = 1, 2, ..., H,  q ∈ S_2        (3)
step 10, calculating the distance value between the image block of the test sample and the central feature of each class;
input the difference features d_q^h of the test sample image blocks obtained in step 9 into the distance measure learning network trained in step 6; with the trained network parameters α = (α_1, α_2, ..., α_Z), obtain by forward computation the distance D_q^h between the test sample image block b_q and each class center. Take the class at which D_q^h attains its minimum as the predicted class of the test sample image block b_q, and the corresponding distance value as its predicted distance;
step 11, selecting a test sample image block and adding the test sample image block into a training sample image block set;
group all test sample image blocks according to their predicted classes; for each group, select the 200 test sample image blocks with the smallest predicted distances, add them to the training sample image block set S_1, and delete them from the test sample image block set S_2;
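Steps 10 and 11 — predicting the class by the nearest learned distance and promoting the most confident test blocks into the training set — can be sketched with index bookkeeping alone (the distance matrix below is random and purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
H, n_test, k = 9, 3000, 200
# D[q, h]: learned distance of test block b_q to class center C_h (random here)
D = rng.uniform(size=(n_test, H))

pred_class = D.argmin(axis=1)             # step 10: class of the nearest center
pred_dist = D.min(axis=1)                 # step 10: predicted distance

train_idx, test_idx = [], set(range(n_test))
for h in range(H):                        # step 11: per class, the k most confident blocks
    members = np.flatnonzero(pred_class == h)
    best = members[np.argsort(pred_dist[members])[:k]]
    train_idx.extend(best.tolist())
    test_idx -= set(best.tolist())
print(len(train_idx), "moved to training,", len(test_idx), "left for testing")
```

Because every promoted block is removed from S_2, each outer iteration of steps 3-11 grows the training set monotonically, which is the self-training mechanism the patent relies on to enlarge the labeled sample pool.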
step 12, repeating steps 3-11 until the preset number of iteration steps is reached, and outputting the depth feature G_q of each test sample image block b_q;
Step 13, training a softmax classifier;
take the depth feature F_p and class mark of each training sample image block a_p from step 3 as the input of the softmax classifier and train the softmax classifier to obtain the trained classifier;
the softmax classifier is:

    c_{p,h} = exp(θ_h^T F_p) / Σ_{l=1}^{H} exp(θ_l^T F_p)        (4)

where c_{p,h} is the probability that the p-th sample belongs to the h-th class and θ_h is the parameter vector of the softmax classifier;
step 14, taking the depth feature G_q of each test sample image block b_q as the input of the trained softmax classifier to obtain the probability that b_q belongs to each class; the position of the maximum probability value is set as the class mark of the test sample image block b_q, completing the classification.
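A minimal sketch of the softmax prediction of steps 13-14, applying equation (4) to a test feature G_q; the randomly initialized (untrained) parameters θ are purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
Z, H = 100, 9
theta = rng.normal(size=(H, Z))          # classifier parameters θ_h, one row per class
G_q = rng.normal(size=Z)                 # depth feature of one test block b_q

logits = theta @ G_q                     # θ_h^T G_q for every class h
logits -= logits.max()                   # subtract the max for numerical stability
prob = np.exp(logits) / np.exp(logits).sum()   # equation (4) applied to G_q
label = int(prob.argmax())               # step 14: class mark = most probable class
print(label, float(prob[label]))
```

Subtracting the maximum logit before exponentiating leaves the probabilities unchanged while preventing floating-point overflow, a standard implementation detail not spelled out in the patent.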
The classification effect by the method of the present invention is illustrated by the following simulation experiment:
simulation conditions are as follows:
the hardware conditions of the simulation are: Windows XP SP1, Pentium(R) 4 CPU with a base clock of 2.4 GHz; the software platforms are Matlab R2016a and PyTorch;
the image selected for simulation is a high-resolution remote sensing image of Pavia University, containing 9 classes of ground objects, as shown in fig. 2 (a); fig. 2 (b) is the class-mark image corresponding to fig. 2 (a). For each class, 200 pixels are randomly selected as initial training samples;
the simulations compare the method of the invention with the existing EMAPs, 3D CNNs, Gabor-CNNs, MCNNs, and Perceptual Loss methods;
simulation content and results
Simulation 1, the present invention and five existing methods are used to perform classification simulation on fig. 2 (a), and the result is shown in fig. 5, in which:
FIG. 5 (a) is a graph showing the results of classification by the EMAPs method,
FIG. 5 (b) is a graph showing the classification result by the 3D CNNs method,
FIG. 5 (c) is a graph showing the classification result by the Gabor-CNNs method,
figure 5 (d) is a graph of the results of classification using the MCNNs method,
FIG. 5 (e) is a diagram showing the classification result by the Perceptual Loss method,
FIG. 5 (f) is a graph showing the results of classification using the method of the present invention.
Fig. 5 is originally in color but is shown in black and white because the examination guidelines require black-and-white figures. As can be seen from the classification result diagrams of figs. 5 (a)-5 (f), the classification method of the invention achieves better accuracy and classification results.
The above experimental results show that, compared with the prior art, the method has obvious advantages in resolving the conflict between the number of training samples and classification accuracy, and effectively improves the classification accuracy of high-resolution remote sensing images under limited training samples.

Claims (10)

1. A high-resolution remote sensing image classification method is characterized by being implemented according to the following steps:
step 1, inputting a high-resolution remote sensing image and class information of each pixel in the image, and respectively randomly selecting a plurality of pixels of each class as training samples;
step 2, constructing training sample image blocks a_p and the training sample image block set S_1;
step 3, constructing a three-dimensional deep convolutional neural network, normalizing the training sample image blocks and using them as the input of the network, and training it to obtain the trained three-dimensional deep convolutional neural network and the depth feature F_p of each training sample;
Step 4, obtaining training sample image blocks a according to the step 3 p Depth feature of (F) p And class mark for calculating class center feature (C) of each class 1 ,C 2 ,...,C h ,...,C H ) Wherein H is the number of categories;
step 5, training sample image block a p Depth feature of (F) p Respectively calculating the absolute value of the difference between the central feature of each class and the central feature of each class to obtain difference features
Figure FDA0002359796870000011
Step 6, constructing a distance measure learning network, wherein the network parameter is alpha = (alpha) 12 ,...,α Z ) Characterizing the difference obtained in step 5
Figure FDA0002359796870000012
The output of the network is the depth feature F as input to the network p Distance value from the center of each class
Figure FDA0002359796870000013
Training the network to obtain a trained distance measure learning network, wherein Z is the dimension of each difference characteristic;
step 7, constructing a test sample image block b q And testing the sample image block set S 2
Step 8, testing sample image block b obtained in step 7 q Inputting the test sample image block b into the deep convolutional neural network trained in the step 3 q Depth feature G of q
Step 9, respectively calculating the image blocks b of the test samples q Depth feature G of q And class-centered features (C) of each class 1 ,C 2 ,...,C h ,...,C H ) To obtain a difference characteristic
Figure FDA0002359796870000014
Step 10, inputting the difference features obtained in the step 9 into the distance measure learning network in the step 6 to obtain the image block b of the test sample q Distance from the center of each class
Figure FDA0002359796870000021
And will be distant from
Figure FDA0002359796870000022
The position corresponding to the minimum value is used as a test sample image block b q Using the distance value of the corresponding position as the image block b of the test sample q The predicted distance of (a);
step 11, according to the prediction categories of the image blocks of the test samples, selecting a plurality of tests with minimum prediction distances for each categoryAdding training sample image block set S into sample image block 1 And testing the sample image blocks from a set S of test sample image blocks 2 Deleting;
step 12, repeating the steps 3-11 until reaching the preset iteration step number, and outputting the test sample image block a q Depth feature G of q
Step 13, training sample image block a p Depth feature of (F) p The class mark is used as the input of the softmax classifier, and the softmax classifier is trained to obtain a trained classifier;
step 14, testing sample image block b q Depth feature G of q And (4) as the input of the trained softmax classifier, obtaining the class mark of each test sample image block and finishing classification.
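A minimal structural sketch of the iterative scheme of claim 1 (steps 3-12) is given below. The 3-D deep CNN and the metric network are replaced by toy stand-ins (a flattening feature extractor and a weighted sum); every function name here is illustrative, not from the patent text.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(blocks):
    # Stand-in for the trained 3-D deep CNN of step 3: any mapping from
    # an image block to a fixed-length depth feature suffices here.
    return blocks.reshape(len(blocks), -1).astype(float)

def class_centers(features, labels, n_classes):
    # Step 4: per-class mean of the depth features.
    return np.stack([features[labels == h].mean(axis=0) for h in range(n_classes)])

def difference_features(features, centers):
    # Steps 5/9: absolute difference to every class center,
    # shape (n_samples, n_classes, Z).
    return np.abs(features[:, None, :] - centers[None, :, :])

def metric_distances(diff, alpha):
    # Steps 6/10: the metric network collapses each Z-dimensional
    # difference feature into a scalar distance; a weighted sum is one
    # minimal assumed form.
    return diff @ alpha

# Toy data: 12 labelled "training blocks", 8 "test blocks", 3 classes.
train = rng.normal(size=(12, 2, 4, 4))
train_y = np.repeat([0, 1, 2], 4)
test = rng.normal(size=(8, 2, 4, 4))

F = extract_features(train)                      # depth features F_p
C = class_centers(F, train_y, 3)                 # class centers C_1..C_H
G = extract_features(test)                       # depth features G_q
D = metric_distances(difference_features(G, C), np.ones(F.shape[1]))

pred_class = D.argmin(axis=1)                    # step 10: nearest center
pred_dist = D.min(axis=1)
# Step 11 would move the lowest-distance test blocks of each predicted
# class into the training set S_1 before the next iteration.
print(pred_class.shape, D.shape)
```

In the real method the feature extractor is retrained each round on the enlarged S_1, which is what the repetition in step 12 buys.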
2. The method for classifying high-resolution remote sensing images according to claim 1, wherein step 2 specifically comprises: setting the spectral dimension of the high-resolution remote sensing image input in step 1 to V; for each training pixel, selecting a window of size 21 × 21 centered on the pixel in each dimension of the image to obtain the spatial information of the pixel in that dimension; and combining the spatial information in all dimensions into a three-dimensional training sample image block a_p, p ∈ S_1, where S_1 is the set of training sample image blocks.
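The patch construction of claim 2 can be sketched as follows. The border policy for pixels within 10 pixels of the image edge is not specified in the patent; reflect-padding is used here as one common assumption.

```python
import numpy as np

def extract_block(image, row, col, win=21):
    # image: (V, H, W) remote sensing image with V spectral dimensions.
    # Returns a three-dimensional block of shape (V, win, win) centered
    # on (row, col); reflect-padding handles border pixels (assumption).
    half = win // 2
    padded = np.pad(image, ((0, 0), (half, half), (half, half)), mode="reflect")
    return padded[:, row:row + win, col:col + win]

V, H, W = 4, 50, 60
img = np.random.rand(V, H, W)
block = extract_block(img, 0, 0)   # a corner pixel still yields a full block
print(block.shape)                 # (4, 21, 21)
```

For interior pixels the block is exactly the 21 × 21 neighbourhood of the pixel on every band.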
3. The method for classifying high-resolution remote sensing images according to claim 1, wherein the three-dimensional deep convolutional neural network in step 3 comprises 7 layers: the 1st layer is an input layer, the 2nd layer is a convolutional layer, the 3rd layer is a downsampling layer, the 4th layer is a convolutional layer, the 5th layer is a downsampling layer, the 6th layer is a fully connected layer, and the 7th layer is a softmax classifier;
the 2nd layer comprises 20 filters;
the 3rd layer is a 2 × 2 maximum downsampling (max-pooling) layer;
the 4th layer comprises 40 filters;
the 5th layer is a 2 × 2 maximum downsampling (max-pooling) layer;
the 6th layer comprises 100 node units.
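As a back-of-envelope check on the 7-layer architecture of claim 3, the spatial shapes can be walked through on a 21 × 21 input window. The patent does not state the convolution kernel size; 3 × 3 "valid" convolutions are assumed purely for illustration.

```python
def conv_out(n, k):
    # Output length of a 'valid' convolution (assumed kernel size k).
    return n - k + 1

def pool_out(n):
    # 2x2 max downsampling with stride 2 (floor on odd sizes).
    return n // 2

s = 21                 # layer 1 (input): 21x21 window per spectral dimension
s = conv_out(s, 3)     # layer 2: conv, 20 filters -> 19x19 maps
s = pool_out(s)        # layer 3: 2x2 max downsampling -> 9x9
s = conv_out(s, 3)     # layer 4: conv, 40 filters -> 7x7 maps
s = pool_out(s)        # layer 5: 2x2 max downsampling -> 3x3
flat = s * s * 40      # flattened input to layer 6 (100 fully connected units)
print(s, flat)
```

Under these assumed kernel sizes the fully connected layer would see a 360-dimensional input per spectral slice; other kernel choices change only this bookkeeping, not the layer order.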
4. The method for classifying high-resolution remote sensing images according to claim 3, wherein training the three-dimensional deep convolutional neural network in step 3 is specifically implemented according to the following steps:
step 3.1, initializing the filters of the two convolutional layers with random initialization;
step 3.2, taking each normalized training sample image block a_p as the input of the input layer, and obtaining the class label of the sample through forward propagation;
step 3.3, taking the cross entropy between the class labels output by the network and the true class labels of the training samples as the cost function;
step 3.4, minimizing the cost function with a back-propagation algorithm to obtain the trained network, and outputting the depth feature F_p of the 6th layer.
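The cost function of steps 3.2-3.4 is the standard softmax cross entropy; a small numpy sketch of just that cost (the back-propagation through the 3-D CNN itself is outside this fragment):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(logits, labels):
    # Mean negative log-probability of the true class labels.
    p = softmax(logits)
    return -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

logits = np.array([[4.0, 0.1, 0.2],    # confident, correct
                   [0.3, 3.0, 0.1]])   # confident, correct
labels = np.array([0, 1])
loss = cross_entropy(logits, labels)
print(loss)    # small, since both predictions match the labels
```

Minimizing this loss over the filter weights is what step 3.4's back-propagation performs.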
5. The method for classifying high-resolution remote sensing images according to claim 1, wherein the class center feature C_h in step 4 is calculated as:

C_h = (1/m) Σ_{p∈Ω_h} F_p      (1)

where C_h is the class center feature of the h-th class, Ω_h is the set of h-th-class training sample image blocks, and m is the number of samples in Ω_h.
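The class center of claim 5 is simply the mean depth feature over the h-th class's training blocks, e.g.:

```python
import numpy as np

# Four depth features F_p (rows), two classes with two samples each.
F = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0],
              [7.0, 8.0]])
labels = np.array([0, 0, 1, 1])

# C_h = (1/m) * sum of F_p over Omega_h, i.e. a per-class mean.
C = np.stack([F[labels == h].mean(axis=0) for h in (0, 1)])
print(C)   # class-0 mean, then class-1 mean
```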
6. The method for classifying high-resolution remote sensing images according to claim 1, wherein the difference feature d_p^h in step 5 is calculated as:

d_p^h = |F_p − C_h|, h = 1, 2, ..., H      (2)
7. The method for classifying high-resolution remote sensing images according to claim 1, wherein the distance metric learning network in step 6 comprises two layers, the first layer being an input layer and the second layer being an output layer, and the parameter of the distance metric learning network is α = (α_1, α_2, ..., α_Z), where Z is the dimension of the difference feature d_p^h.
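Claim 7's network maps a Z-dimensional difference feature to a scalar distance with parameters α = (α_1, ..., α_Z). Reading α as per-dimension weights, one consistent minimal form is a weighted sum; this is an assumption, since the patent fixes only the two-layer shape and the parameter vector.

```python
import numpy as np

def metric_distance(diff, alpha):
    # One scalar distance per Z-dimensional difference feature (assumed
    # linear form: d = sum_z alpha_z * diff_z).
    return diff @ alpha

alpha = np.array([0.5, 1.0, 0.0, 2.0])              # Z = 4 learned weights
diff = np.abs(np.array([[1.0, -2.0, 3.0, 0.5]]))    # |F_p - C_h|
d = metric_distance(diff, alpha)
print(d)
```

A zero weight (α_3 here) simply ignores that feature dimension, which is the sense in which the learned α acts as a metric.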
8. The method for classifying high-resolution remote sensing images according to claim 7, wherein training the distance metric learning network in step 6 is specifically implemented according to the following steps:
step 6.1, initializing the network parameters α = (α_1, α_2, ..., α_Z) with random initialization;
step 6.2, taking the difference features d_p^h as the input of the input layer, and obtaining the distances D_p^h output by the network through forward propagation;
step 6.3, defining the true distance values: when the depth feature F_p of the training sample image block a_p in formula (2) belongs to the h-th class, the true distance value of the difference feature d_p^h is set to 0; when F_p does not belong to the h-th class, the true distance value of d_p^h is set to a fixed constant δ;
step 6.4, taking the root mean square error between the distance values obtained in step 6.2 and the true distance values defined in step 6.3 as the cost function, and minimizing the cost function with a back-propagation algorithm to obtain the trained distance metric learning network.
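The training loop of claim 8 can be sketched with plain gradient descent on the squared-error cost: same-class difference features are regressed toward distance 0, all others toward the constant δ. The linear form d = diff · α is the same illustrative assumption as above; learning rate, iteration count, and data are arbitrary toy choices.

```python
import numpy as np

rng = np.random.default_rng(1)
Z, n, delta, lr = 5, 200, 1.0, 0.05

diff = np.abs(rng.normal(size=(n, Z)))      # difference features |F_p - C_h|
same = rng.random(n) < 0.5                  # does F_p belong to class h?
target = np.where(same, 0.0, delta)         # step 6.3: true distance values

alpha = rng.normal(size=Z)                  # step 6.1: random initialization
mse0 = np.mean((diff @ alpha - target) ** 2)

for _ in range(500):
    d = diff @ alpha                        # step 6.2: forward propagation
    grad = 2 * diff.T @ (d - target) / n    # gradient of the squared error
    alpha -= lr * grad                      # step 6.4: minimize the cost

mse1 = np.mean((diff @ alpha - target) ** 2)
print(mse1 < mse0)                          # cost decreased during training
```

Minimizing the mean square error and the root mean square error yields the same optimum, which is why the sketch drops the square root.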
9. The method for classifying high-resolution remote sensing images according to claim 1, wherein step 7 specifically comprises: for each pixel of the high-resolution remote sensing image other than the training samples, selecting a window of size 21 × 21 centered on the pixel in each dimension of the image to obtain the spatial information of the sample in that dimension, and combining the spatial information in all dimensions into a three-dimensional test sample image block b_q, q ∈ S_2, where S_2 is the set of test samples.
10. The method for classifying high-resolution remote sensing images according to claim 1, wherein the difference feature in step 9 is calculated as:

d_q^h = |G_q − C_h|, q ∈ S_2, h = 1, 2, ..., H

where S_2 is the set of test samples.
CN202010018383.0A 2020-01-08 2020-01-08 High-resolution remote sensing image classification method Active CN111222576B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010018383.0A CN111222576B (en) 2020-01-08 2020-01-08 High-resolution remote sensing image classification method


Publications (2)

Publication Number Publication Date
CN111222576A CN111222576A (en) 2020-06-02
CN111222576B true CN111222576B (en) 2023-03-24

Family

ID=70831069


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112257603B (en) * 2020-10-23 2022-06-17 深圳大学 Hyperspectral image classification method and related equipment
CN112766371B (en) * 2021-01-19 2023-01-24 西安理工大学 High-resolution remote sensing image supervision and classification method based on few training samples

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105069468A (en) * 2015-07-28 2015-11-18 西安电子科技大学 Hyper-spectral image classification method based on ridgelet and depth convolution network
CN107451616A (en) * 2017-08-01 2017-12-08 西安电子科技大学 Multi-spectral remote sensing image terrain classification method based on the semi-supervised transfer learning of depth
WO2018214195A1 (en) * 2017-05-25 2018-11-29 中国矿业大学 Remote sensing imaging bridge detection method based on convolutional neural network
CN109766858A (en) * 2019-01-16 2019-05-17 中国人民解放军国防科技大学 Three-dimensional convolution neural network hyperspectral image classification method combined with bilateral filtering


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BPNN-based remote sensing image classification combined with ERDAS; Liu Wei; Ordnance Industry Automation; 2006-08-15 (No. 04); full text *
Detection method for lesion regions in lung CT images; Han Guanghui et al.; Acta Automatica Sinica; 2017-12-15 (No. 12); full text *


Similar Documents

Publication Publication Date Title
CN108985238B (en) Impervious surface extraction method and system combining deep learning and semantic probability
US11055574B2 (en) Feature fusion and dense connection-based method for infrared plane object detection
CN108564109B (en) Remote sensing image target detection method based on deep learning
CN110378196B (en) Road visual detection method combining laser point cloud data
CN108961235B (en) Defective insulator identification method based on YOLOv3 network and particle filter algorithm
CN105678332B (en) Converter steelmaking end point judgment method and system based on flame image CNN recognition modeling
CN110929607B (en) Remote sensing identification method and system for urban building construction progress
CN108564606B (en) Heterogeneous image block matching method based on image conversion
CN110599537A (en) Mask R-CNN-based unmanned aerial vehicle image building area calculation method and system
CN110188774B (en) Eddy current scanning image classification and identification method based on deep learning
CN107423760A (en) Based on pre-segmentation and the deep learning object detection method returned
CN110969088A (en) Remote sensing image change detection method based on significance detection and depth twin neural network
CN108765475B (en) Building three-dimensional point cloud registration method based on deep learning
CN109002848B (en) Weak and small target detection method based on feature mapping neural network
CN111291675B (en) Deep learning-based hyperspectral ancient painting detection and identification method
CN112434745A (en) Occlusion target detection and identification method based on multi-source cognitive fusion
CN111783884B (en) Unsupervised hyperspectral image classification method based on deep learning
CN109284779A (en) Object detecting method based on the full convolutional network of depth
CN111222576B (en) High-resolution remote sensing image classification method
CN111161224A (en) Casting internal defect grading evaluation system and method based on deep learning
CN110598564A (en) OpenStreetMap-based high-spatial-resolution remote sensing image transfer learning classification method
CN108256557B (en) Hyperspectral image classification method combining deep learning and neighborhood integration
CN110852369A (en) Hyperspectral image classification method combining 3D/2D convolutional network and adaptive spectrum unmixing
CN114048810A (en) Hyperspectral image classification method based on multilevel feature extraction network
CN111738052A (en) Multi-feature fusion hyperspectral remote sensing ground object classification method based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant