CN113869136A - Semi-supervised polarimetric SAR image classification method based on multi-branch network - Google Patents


Info

Publication number: CN113869136A
Application number: CN202111032840.2A (application filed by Xidian University)
Granted publication: CN113869136B
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 李明, 辛欣悦, 张鹏, 吴艳, 徐大治, 郑佳, 胡欣
Assignee: Xidian University
Legal status: Active (granted)

Classifications

    • G06F18/241 — Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 — Classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06N3/045 — Neural network architectures: combinations of networks
    • G06N3/047 — Neural network architectures: probabilistic or stochastic networks
    • G06N3/048 — Neural network architectures: activation functions
    • G06N3/08 — Neural networks: learning methods


Abstract

The invention provides a semi-supervised polarimetric SAR image classification method based on a multi-branch network, comprising the following steps: constructing a test sample set, a labeled training sample set and an unlabeled training sample set; constructing a semi-supervised polarimetric SAR image classification model H based on a multi-branch network; iteratively training the classification model H; and obtaining the classification result of the PolSAR image. During training, the two networks in the advanced feature extraction module extract advanced features of the labeled and unlabeled training samples respectively, the three classification modules then classify the labeled training samples in different ways, and the MFB fusion module fuses each first advanced feature of a labeled training sample with the second advanced feature at the corresponding position. This effectively alleviates the overfitting and redundancy problems of the prior art and improves the accuracy of polarimetric SAR image classification.

Description

Semi-supervised polarimetric SAR image classification method based on multi-branch network
Technical Field
The invention belongs to the technical field of image processing, relates to a polarimetric SAR image classification method, and particularly to a semi-supervised polarimetric SAR image classification method based on a multi-branch network. The method can be used for agricultural development, ocean monitoring, urban planning, geological exploration and the like.
Background
Synthetic Aperture Radar (SAR) is insensitive to weather and illumination conditions; compared with optical remote sensing, SAR is unaffected by factors such as weather and cloud cover and can acquire remote sensing data around the clock. Polarimetric synthetic aperture radar (PolSAR) alternately transmits and receives radar signals in horizontal and vertical polarization modes, so it obtains more complete and richer target information and describes a target more comprehensively. PolSAR image classification aims to divide a PolSAR image into different ground-object categories according to the feature differences among classification units, and plays an important role in agricultural development, ocean monitoring, urban planning, geological exploration and other fields.
Traditional PolSAR image classification algorithms require designing a specific algorithm for a specific target based on extensive experience and strong domain expertise; they are time-consuming and hard to generalize. In recent years, deep-learning-based PolSAR image classification methods have realized data-driven classification: they autonomously learn and extract classification-relevant features from the data, without manual feature selection, classifier design, or strong domain expertise. According to whether prior knowledge is needed during classification, deep-learning-based PolSAR image classification can be divided into three categories: supervised, unsupervised and semi-supervised classification. Semi-supervised classification combines supervised and unsupervised classification: it can use labeled and unlabeled data simultaneously, and achieves higher classification accuracy while reducing the workload of acquiring prior knowledge. For example, self-training and co-training algorithms first train a classifier on the labeled data, then classify the unlabeled data with that classifier, select highly reliable unlabeled samples together with their predicted labels and add them to the training set to enlarge it, and finally retrain to obtain a new classifier.
For example, the patent application with publication number CN112966779A, entitled "A PolSAR image semi-supervised classification method", discloses a semi-supervised PolSAR image classification method. Based on a small number of labeled training samples, it classifies the PolSAR image with a Wishart classifier, an SVM classifier and a CV-CNN model, performs majority voting on the classification results to generate a strong data set and a weak data set, uses the strong data set as pseudo-labels, reclassifies the weak data set with the three classifiers, integrates the three results by majority voting, and finally combines the strong data set with the reclassification results to obtain the final classification result. The co-training algorithm adopted by this method exploits the advantages of each classifier and obtains relatively high classification accuracy, but it still suffers from the overfitting caused by training the network with a small amount of labeled data, and from the redundancy caused by the majority-voting multi-classifier fusion, so its PolSAR image classification accuracy remains low.
Disclosure of Invention
The object of the invention is to provide a semi-supervised polarimetric SAR image classification method based on a multi-branch network that addresses the above defects of the prior art and solves the technical problem of low classification accuracy.
In order to achieve the purpose, the technical scheme adopted by the invention comprises the following steps:
(1) Obtain a test sample set D_test, a labeled training sample set D_l and an unlabeled training sample set D_u:

(1a) Obtain a ground-object class set L = {L_c | 1 ≤ c ≤ C} containing C classes and S PolSAR images P = {P_s | 1 ≤ s ≤ S}. Divide each PolSAR image P_s into image blocks, obtaining the image-block set P' = {P_s' | 1 ≤ s ≤ S} with P_s' = {p_s^v | 1 ≤ v ≤ V_s}, and then obtain the feature f_s^v of each image block p_s^v. Here C ≥ 2, L_c denotes the c-th ground-object class, S ≥ 100, P_s denotes the s-th PolSAR image, P_s' denotes the subset of V_s image blocks corresponding to P_s, p_s^v denotes the v-th image block in P_s', and V_s ≥ 1000.

(1b) Randomly select N PolSAR images from the PolSAR image set P as the test set P_test = {P_s1 | 1 ≤ s1 ≤ N}, and take the feature f_s1^v of each image block p_s1^v in each image-block set P_s1' as a test sample, obtaining the test sample set D_test corresponding to P_test. Then take the remaining S−N PolSAR images in P as the training set P_train = {P_s2 | 1 ≤ s2 ≤ S−N}, and cluster each image block p_s2^v of each image-block set P_s2' to obtain its cluster label ŷ_s2^v, where ŷ_s2^v ∈ {1, …, B} and B is the number of clusters.

(1c) Randomly select N_l PolSAR images from the training set P_train as P_l^train = {P_s21 | 1 ≤ s21 ≤ N_l}, and assign each image block p_s21^v of each image-block set P_s21' its true ground-object label y_s21^v. Each labeled training sample then consists of the feature f_s21^v, the ground-object label y_s21^v and the cluster label ŷ_s21^v, giving the labeled training sample set D_l corresponding to P_l^train.

(1d) For each of the remaining S−N−N_l PolSAR images P_u^train = {P_s22 | 1 ≤ s22 ≤ S−N−N_l} in P_train, each image block p_s22^v of each image-block set P_s22' contributes an unlabeled training sample consisting of its feature f_s22^v and its cluster label ŷ_s22^v, giving the unlabeled training sample set D_u corresponding to P_u^train.
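The split of steps (1b)-(1d) can be sketched in plain Python using the embodiment's values given later in the detailed description (S = 200 images, V_s = 1500 blocks per image, N = 40 test images, N_l = 48 labeled training images); representing a sample as an (image, block) index pair is purely illustrative:

```python
import random

def build_sample_sets(num_images, blocks_per_image, n_test, n_labeled, seed=0):
    """Split image indices into a test set, a labeled training set and an
    unlabeled training set, mirroring steps (1b)-(1d)."""
    rng = random.Random(seed)
    images = list(range(num_images))
    rng.shuffle(images)
    test_imgs = images[:n_test]                       # N test images
    labeled_imgs = images[n_test:n_test + n_labeled]  # N_l labeled training images
    unlabeled_imgs = images[n_test + n_labeled:]      # S - N - N_l unlabeled images
    # every image contributes one sample per image block
    d_test = [(s, v) for s in test_imgs for v in range(blocks_per_image)]
    d_l = [(s, v) for s in labeled_imgs for v in range(blocks_per_image)]
    d_u = [(s, v) for s in unlabeled_imgs for v in range(blocks_per_image)]
    return d_test, d_l, d_u

d_test, d_l, d_u = build_sample_sets(num_images=200, blocks_per_image=1500,
                                     n_test=40, n_labeled=48)
print(len(d_test), len(d_l), len(d_u))  # 60000 72000 168000
```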
(2) Constructing a semi-supervised polarimetric SAR image classification model H based on a multi-branch network:
constructing a semi-supervised polarimetric SAR image classification model H comprising an advanced feature extraction module and a multi-branch processing module which are sequentially cascaded, wherein:
the advanced feature extraction module comprises a first network and a second network which are arranged in parallel, wherein the first network and the second network respectively comprise a plurality of parameter-shared convolution layers, a plurality of batch normalization layers and a plurality of activation layers, the first network further comprises a first full connection layer, and the second network further comprises a second full connection layer with different parameters from the first full connection layer;
the multi-branch processing module comprises an MFB fusion module, a first classification module, a second classification module and a third classification module which are arranged in parallel, and the output end of the MFB fusion module is cascaded with the third classification module; the MFB fusion module comprises a third full-connection layer, a matrix dot product module and a pooling layer which are sequentially cascaded; the first classification module comprises a fourth full connection layer and a Softmax activation layer which are cascaded; the second classification module comprises a fifth full connection layer and a Softmax activation layer which are cascaded, and the third classification module comprises a sixth full connection layer and a Softmax activation layer which are cascaded;
(3) Perform iterative training on the semi-supervised polarimetric SAR image classification model H based on the multi-branch network:

(3a) Initialize the iteration number i and the maximum iteration number I with I ≥ 200; let H_i denote the image classification model of the i-th iteration and ω_i its weight parameters, and set i = 1, H_i = H.

(3b) Select, with replacement, M_l labeled training samples from the labeled training sample set D_l and M_u unlabeled training samples from the unlabeled training sample set D_u as the input of the semi-supervised polarimetric SAR image classification model H_i. The first network in the advanced feature extraction module performs advanced feature extraction on each labeled and each unlabeled training sample, obtaining the first advanced feature set F1_l of the labeled training samples and the advanced feature set F1_u of the unlabeled training samples; meanwhile, the second network performs advanced feature extraction on each labeled training sample, obtaining the second advanced feature set F2_l of the labeled training samples. Here f1_m1 and f2_m1 denote the first advanced feature output by the first network and the second advanced feature output by the second network for the m1-th labeled training sample, f1_m2 denotes the advanced feature output by the first network for the m2-th unlabeled training sample, 30 ≤ M_l ≤ N_l × V_s, and 50 ≤ M_u ≤ (S − N − N_l) × V_s.

(3c) The first classification module classifies the first advanced feature set F1_l of the labeled training samples and the advanced feature set F1_u of the unlabeled training samples, obtaining the first predicted label set Y1_l corresponding to F1_l and the second predicted label set Y1_u corresponding to F1_u; meanwhile, the second classification module classifies the second advanced feature set F2_l of the labeled training samples, obtaining the third predicted label set Y2_l corresponding to F2_l. The MFB fusion module performs MFB fusion on each first advanced feature f1_m1 in F1_l and the second advanced feature f2_m1 at the corresponding position in F2_l, and the third classification module classifies each fusion result, obtaining a fourth predicted label y3_m1; the fourth predicted label set corresponding to the fusion results of F1_l and F2_l is Y3_l.
(3d) Using the cross-entropy loss function, compute the first loss value Loss1_i of H_i from the first predicted label y1_m1 and the cluster label ŷ_m1 of each labeled training sample; the second loss value Loss2_i of H_i from the second predicted label y1_m2 and the cluster label ŷ_m2 of each unlabeled training sample; the third loss value Loss3_i of H_i from the third predicted label y2_m1 and the ground-object label y_m1 of each labeled training sample; and the fourth loss value Loss4_i of H_i from the fourth predicted label y3_m1 and the ground-object label y_m1 of each labeled training sample. Take their sum as the total loss value Loss_i of H_i:

Loss_i = Loss1_i + Loss2_i + Loss3_i + Loss4_i.
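A minimal stdlib sketch of step (3d): each branch contributes a cross-entropy term and the total loss is their sum; the softmax outputs and labels below are toy values, not data from the patent:

```python
import math

def cross_entropy(pred_probs, labels):
    """Mean cross-entropy between softmax outputs and integer class labels."""
    return -sum(math.log(p[l]) for p, l in zip(pred_probs, labels)) / len(labels)

# toy softmax outputs of the four branches and their supervision signals
y1_l = [[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]]; cluster_l = [0, 1]   # branch 1, labeled
y1_u = [[0.6, 0.3, 0.1]];                  cluster_u = [0]      # branch 1, unlabeled
y2_l = [[0.5, 0.4, 0.1], [0.2, 0.7, 0.1]]; ground_l  = [0, 1]   # branch 2, labeled
y3_l = [[0.9, 0.05, 0.05], [0.1, 0.85, 0.05]]                   # branch 3, MFB-fused

# Loss_i = Loss1_i + Loss2_i + Loss3_i + Loss4_i
total = (cross_entropy(y1_l, cluster_l) + cross_entropy(y1_u, cluster_u)
         + cross_entropy(y2_l, ground_l) + cross_entropy(y3_l, ground_l))
print(round(total, 3))
```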
(3e) Compute the partial derivative ∂Loss_i/∂ω_i of Loss_i with respect to the weight parameters ω_i, and update ω_i by back-propagating ∂Loss_i/∂ω_i through H_i with the gradient descent method.

(3f) Judge whether i ≥ I holds; if so, the trained multi-branch-network-based semi-supervised polarimetric SAR image classification model H* is obtained; otherwise, let i = i + 1 and return to step (3b).
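Step (3e) is a standard gradient-descent update of the weight parameters, repeated until the stopping condition of step (3f). The sketch below applies the same update rule to a toy quadratic loss; the learning rate and loss function are illustrative assumptions:

```python
def gradient_step(weights, grads, lr=0.01):
    """One gradient-descent update: omega_i <- omega_i - lr * dLoss/domega_i."""
    return [w - lr * g for w, g in zip(weights, grads)]

# toy quadratic loss Loss(w) = sum(w^2), whose gradient is 2w
w = [1.0, -2.0]
for _ in range(200):   # iterate until the maximum iteration count (here I = 200)
    w = gradient_step(w, [2.0 * wi for wi in w], lr=0.05)
print(w)  # both weights shrink toward 0
```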
(4) Obtain the classification result of the PolSAR image:

(4a) Take each test sample f_s1^v in the test sample set D_test as the input of the trained multi-branch-network-based semi-supervised polarimetric SAR image classification model H*; the first network and the second network in the advanced feature extraction module each perform advanced feature extraction on f_s1^v, obtaining its first advanced feature f1_s1^v and second advanced feature f2_s1^v.

(4b) The MFB fusion module performs MFB fusion on the first advanced feature f1_s1^v and the second advanced feature f2_s1^v of the test sample, and the third classification module classifies the fusion result, obtaining the predicted label y_s1^v of the test sample; the V_s predicted labels of the image blocks of a test image then constitute the classification result of the corresponding PolSAR image P_s1.
Compared with the prior art, the invention has the following advantages:
1. The multi-branch processing module of the constructed polarimetric SAR image classification model contains a first, a second and a third classification module. During training, the two networks in the advanced feature extraction module extract advanced features from the labeled and unlabeled training samples respectively, and the three classification modules then classify them in different ways. This avoids the overfitting of the classification model caused in the prior art by using only a small amount of labeled data, and effectively improves the classification accuracy of polarimetric SAR images.
2. The constructed polarimetric SAR image classification model also contains an MFB fusion module. During training, the MFB fusion module fuses each first advanced feature of a labeled training sample with the second advanced feature at the corresponding position, and the fusion result is then classified by the third classification module, which avoids the redundancy of majority-voting multi-classifier fusion.
Drawings
FIG. 1 is a flow chart of an implementation of the present invention;
FIG. 2 is a schematic diagram of the overall structure of a polarized SAR image classification model constructed by the present invention;
FIG. 3 is a schematic diagram of an advanced feature extraction module employed in an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a multi-branch processing module constructed by the present invention.
Detailed Description
The invention is described in further detail below with reference to the following figures and specific examples:
referring to fig. 1, the present invention includes the steps of:
Step 1) Obtain a test sample set D_test, a labeled training sample set D_l and an unlabeled training sample set D_u:

Step 1a) Obtain a ground-object class set L = {L_c | 1 ≤ c ≤ C} containing C classes and S PolSAR images P = {P_s | 1 ≤ s ≤ S}. Divide each PolSAR image P_s into image blocks, obtaining the image-block set P' = {P_s' | 1 ≤ s ≤ S} with P_s' = {p_s^v | 1 ≤ v ≤ V_s}, and then obtain the feature f_s^v of each image block p_s^v. Here C ≥ 2, L_c denotes the c-th ground-object class, S ≥ 100, P_s denotes the s-th PolSAR image, P_s' denotes the subset of V_s image blocks corresponding to P_s, p_s^v denotes the v-th image block in P_s', and V_s ≥ 1000. In this embodiment, S = 200, C = 8 and V_s = 1500.
The feature f_s^v of each image block p_s^v is obtained as follows: obtain the horizontal polarization component S_hh, the vertical polarization component S_vv and the cross polarization component S_hv of each image block p_s^v, i.e. its scattering matrix

S = [ S_hh  S_hv ; S_vh  S_vv ],

and perform Pauli decomposition on S to obtain the feature of each image block:

f_s^v = (1/√2) · [ S_hh + S_vv,  S_hh − S_vv,  2·S_hv ]^T,

where [·]^T denotes the transpose operation.
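The Pauli feature of one image block follows directly from its scattering components; a short sketch with toy complex inputs:

```python
import math

def pauli_feature(s_hh, s_vv, s_hv):
    """Pauli decomposition of a scattering matrix [S_hh S_hv; S_vh S_vv]
    into the 3-D feature vector (1/sqrt(2)) [S_hh+S_vv, S_hh-S_vv, 2*S_hv]^T."""
    c = 1.0 / math.sqrt(2.0)
    return [c * (s_hh + s_vv), c * (s_hh - s_vv), c * 2.0 * s_hv]

# toy complex scattering components for one image block
k = pauli_feature(complex(1.0, 0.5), complex(0.5, -0.5), complex(0.1, 0.1))
print(len(k))  # 3
```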
Step 1b) Randomly select N PolSAR images from the PolSAR image set P as the test set P_test = {P_s1 | 1 ≤ s1 ≤ N}, and take the feature f_s1^v of each image block p_s1^v in each image-block set P_s1' as a test sample, obtaining the test sample set D_test corresponding to P_test. Then take the remaining S−N PolSAR images in P as the training set P_train = {P_s2 | 1 ≤ s2 ≤ S−N}, and cluster each image block p_s2^v of each image-block set P_s2' to obtain its cluster label ŷ_s2^v ∈ {1, …, B}, where B is the number of clusters. In this embodiment, N = 40 and B = 10.

Since a sufficiently large training set helps avoid overfitting of the network, in this embodiment the test sample set and the training sample set contain 20% and 80% of the total number of samples, respectively.

Each image block p_s2^v of each image-block set P_s2' is clustered as follows:
Step 1b1) Obtain the horizontal polarization component S_hh, the vertical polarization component S_vv and the cross polarization component S_hv of each image block p_s2^v, i.e. its scattering matrix S, and perform Pauli decomposition on S to obtain the three-dimensional Pauli feature vector

k_s2^v = (1/√2) · [ S_hh + S_vv,  S_hh − S_vv,  2·S_hv ]^T;

then construct the coherence matrix T_s2^v corresponding to p_s2^v from k_s2^v and its conjugate transpose:

T_s2^v = k_s2^v · (k_s2^v)^H,

where [·]^H denotes the conjugate-transpose operation and [·]^* denotes the conjugate operation.
Step 1b2) Initialize the iteration number iter and the maximum iteration number Iter with Iter ≥ 10; randomly divide all image blocks of each image-block set P_s2' into B disjoint subsets {Q_b | 1 ≤ b ≤ B}, each subset Q_b corresponding to a label b, and let iter = 1. In this embodiment, Iter = 20.
Step 1b3) Compute the mean coherence matrix Σ_b of each subset Q_b and the Wishart distance between Σ_b and the coherence matrix T_s2^v of each image block p_s2^v:

Σ_b = (1/K_b) · Σ_k T_k,

d(T_s2^v, Σ_b) = ln|Σ_b| + Tr(Σ_b^(−1) · T_s2^v),

where T_k denotes the coherence matrix of the k-th image block p_k in subset Q_b, K_b denotes the number of image blocks in Q_b, Σ(·) denotes the summation operation, Tr(·) denotes the trace operation, and [·]^(−1) denotes the matrix inversion operation.
Step 1b4) Assign each image block p_s2^v to the subset with the smallest Wishart distance, and set its cluster label ŷ_s2^v to the label of the subset to which it now belongs; judge whether iter ≥ Iter holds, and if so, the cluster label ŷ_s2^v of each image block is obtained; otherwise, let iter = iter + 1 and return to step 1b3).
Since the coherence matrix of PolSAR data follows the complex Wishart distribution, clustering with the Wishart distance is well suited to the PolSAR image classification scenario.
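The clustering loop of steps 1b2)-1b4) can be sketched with 1×1 "coherence matrices" (positive scalars), a simplifying assumption under which |Σ| = σ, Tr(Σ⁻¹T) = t/σ, and the Wishart distance reduces to ln σ + t/σ; the data values and cluster count below are toy inputs:

```python
import math
import random

def wishart_distance(t, sigma):
    """Wishart distance d(T, Sigma) = ln|Sigma| + Tr(Sigma^-1 T),
    written here for the scalar (1x1 coherence matrix) case."""
    return math.log(sigma) + t / sigma

def wishart_cluster(values, n_clusters, n_iter=20, seed=0):
    """Iterative Wishart clustering as in steps 1b2)-1b4): random initial
    partition, then alternate mean update and nearest-centre reassignment."""
    rng = random.Random(seed)
    labels = [rng.randrange(n_clusters) for _ in values]  # random initial subsets
    for _ in range(n_iter):
        sigmas = []
        for b in range(n_clusters):
            # mean "coherence" of subset b (fall back to 1.0 if it is empty)
            members = [v for v, l in zip(values, labels) if l == b]
            sigmas.append(sum(members) / len(members) if members else 1.0)
        # reassign each block to the subset at the smallest Wishart distance
        labels = [min(range(n_clusters), key=lambda b: wishart_distance(v, sigmas[b]))
                  for v in values]
    return labels

# two well-separated groups of "coherence" values cluster apart
vals = [0.1, 0.12, 0.11, 5.0, 5.2, 4.9]
labels = wishart_cluster(vals, n_clusters=2)
print(labels[:3], labels[3:])
```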
Step 1c) Randomly select N_l PolSAR images from the training set P_train as P_l^train = {P_s21 | 1 ≤ s21 ≤ N_l}, and assign each image block p_s21^v of each image-block set P_s21' its true ground-object label y_s21^v. Each labeled training sample then consists of the feature f_s21^v, the ground-object label y_s21^v and the cluster label ŷ_s21^v, giving the labeled training sample set D_l corresponding to P_l^train. In this embodiment, N_l = 48.

In practical application scenarios, assigning true ground-object labels to training samples is time-consuming and labor-intensive; therefore, in this embodiment, the labeled training sample set and the unlabeled training sample set contain 30% and 70% of the total number of training samples, respectively.
Step 1d) For each of the S−N−N_l remaining PolSAR images P_u^train = {P_s22 | 1 ≤ s22 ≤ S−N−N_l} in the training set P_train, each image block p_s22^v of each image-block set P_s22' contributes an unlabeled training sample consisting of its feature f_s22^v and its cluster label ŷ_s22^v, giving the unlabeled training sample set D_u corresponding to P_u^train.
Step 2) constructing a semi-supervised polarimetric SAR image classification model H based on a multi-branch network:
constructing a semi-supervised polarimetric SAR image classification model H comprising an advanced feature extraction module and a multi-branch processing module which are sequentially cascaded, wherein the structure of the semi-supervised polarimetric SAR image classification model H is shown in FIG. 2;
Referring to fig. 3, the advanced feature extraction module includes a first network and a second network arranged in parallel; the first and second networks each include a plurality of parameter-shared convolutional layers, a plurality of batch normalization layers and a plurality of activation layers, the first network further includes a first fully connected layer, and the second network further includes a second fully connected layer whose parameters differ from those of the first fully connected layer. In this embodiment, the first and second networks each contain 2 convolutional layers: the first convolutional layer has 3×3 convolution kernels, a convolution stride of 1 and 16 kernels, and the second convolutional layer has 5×5 convolution kernels, a convolution stride of 1 and 32 kernels. Each network contains 2 batch normalization layers and 2 ReLU activation layers; the first fully connected layer has 64 neurons and the second fully connected layer has 32 neurons. The specific structure of the first network is: first convolutional layer → first batch normalization layer → first ReLU activation layer → second convolutional layer → second batch normalization layer → second ReLU activation layer → first fully connected layer. The second network has the same basic structure as the first network, with the first fully connected layer replaced by the second fully connected layer.
Referring to fig. 4, the multi-branch processing module includes an MFB fusion module and a first, second, and third classification module arranged in parallel, with the output of the MFB fusion module cascaded into the third classification module. The MFB fusion module comprises a third fully-connected layer, a matrix dot-product module, and a pooling layer cascaded in sequence; the first classification module comprises a cascaded fourth fully-connected layer and Softmax activation layer; the second classification module comprises a cascaded fifth fully-connected layer and Softmax activation layer; and the third classification module comprises a cascaded sixth fully-connected layer and Softmax activation layer. In this embodiment, the third fully-connected layer in the MFB fusion module has 128 neurons; the fourth fully-connected layer in the first classification module has as many neurons as the number of clusters B used for the image blocks in step (1b); and the fifth and sixth fully-connected layers in the second and third classification modules each have as many neurons as the number of ground-object classes C.
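As a sanity check on the layer settings above, a small pure-Python sketch tallies the convolutional parameter counts of the first network (the 3-channel input, taken from the three-dimensional Pauli feature, and the per-kernel bias terms are assumptions, not stated in the patent):

```python
def conv_params(k, c_in, c_out):
    """Parameters of a conv layer with k x k kernels: weights plus one bias per kernel."""
    return k * k * c_in * c_out + c_out

# first network: conv1 has 16 kernels of 3x3, conv2 has 32 kernels of 5x5
c1 = conv_params(3, 3, 16)     # assumes a 3-channel (Pauli) input
c2 = conv_params(5, 16, 32)    # conv2 sees the 16 feature maps of conv1
total_conv = c1 + c2
print(c1, c2, total_conv)
```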
Step 3) carrying out iterative training on a semi-supervised polarimetric SAR image classification model H based on a multi-branch network:
step 3a) initialize the iteration counter i and the maximum iteration number I, with I ≥ 200; denote the image classification model at the ith iteration by Hi and its weight parameters by ωi; let i = 1 and Hi = H; in this embodiment, I = 300;
step 3b) randomly select, with replacement, Ml labeled training samples from the labeled training sample set and Mu unlabeled training samples from the unlabeled training sample set, and use them as the input of the semi-supervised polarimetric SAR image classification model Hi. The first network in the high-level feature extraction module extracts high-level features from each labeled and each unlabeled training sample, yielding the first high-level feature set F1l of the labeled training samples and the high-level feature set F1u of the unlabeled training samples; meanwhile, the second network extracts high-level features from each labeled training sample, yielding the second high-level feature set F2l of the labeled training samples. For the m1-th labeled training sample, its first and second high-level features are the outputs of the first and second network, respectively, and for the m2-th unlabeled training sample its high-level feature is the output of the first network, where 30 ≤ Ml ≤ Nl×Vs and 50 ≤ Mu ≤ (S-N-Nl)×Vs; in this embodiment, Ml = 50 and Mu = 60.
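The with-replacement mini-batch sampling of step 3b) can be sketched in plain Python (the set sizes Ml = 50 and Mu = 60 follow the embodiment; the pool contents are placeholders, not real samples):

```python
import random

def draw_minibatch(labeled, unlabeled, m_l=50, m_u=60, seed=None):
    """Draw m_l labeled and m_u unlabeled training samples,
    each selected uniformly at random WITH replacement."""
    rng = random.Random(seed)
    batch_l = rng.choices(labeled, k=m_l)    # choices() samples with replacement
    batch_u = rng.choices(unlabeled, k=m_u)
    return batch_l, batch_u

# toy pools standing in for the labeled / unlabeled training sample sets
labeled_pool = [f"L{i}" for i in range(200)]
unlabeled_pool = [f"U{i}" for i in range(500)]
bl, bu = draw_minibatch(labeled_pool, unlabeled_pool, seed=0)
print(len(bl), len(bu))  # 50 60
```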
Step 3c) the first classification module classifies the first high-level feature set F1l of the labeled training samples and the high-level feature set F1u of the unlabeled training samples, yielding the first prediction label set corresponding to F1l and the second prediction label set corresponding to F1u; meanwhile, the second classification module classifies the second high-level feature set F2l of the labeled training samples, yielding the third prediction label set corresponding to F2l. The MFB fusion module fuses each first high-level feature in F1l with the second high-level feature at the corresponding position in F2l, and the third classification module classifies each fusion result, yielding the fourth prediction label set corresponding to the fusion of F1l and F2l. Because the first, second, and third classification modules classify in different ways, the over-fitting problem that arises in the prior art when a classification model is trained with only a small amount of labeled data is avoided, and the classification accuracy of the PolSAR image is effectively improved.
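Each classification module above is a fully-connected layer followed by a Softmax activation layer; a minimal pure-Python sketch of such a head follows (the weights, bias, feature values, and class count C = 3 are illustrative assumptions, not the trained parameters):

```python
import math

def dense(x, w, b):
    """Fully-connected layer: one output per row of the weight matrix."""
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi for row, bi in zip(w, b)]

def softmax(z):
    """Softmax activation, shifted by max(z) for numerical stability."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

# a 4-dim "high-level feature" classified into C = 3 classes
feature = [0.5, -1.0, 2.0, 0.25]
w = [[0.1] * 4, [0.2] * 4, [-0.1] * 4]   # 3x4 weight matrix (illustrative)
b = [0.0, 0.1, -0.1]
probs = softmax(dense(feature, w, b))
pred = max(range(len(probs)), key=probs.__getitem__)  # predicted class index
```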
The MFB fusion of each first high-level feature in F1l with the second high-level feature at the corresponding position in F2l is implemented as follows: the third fully-connected layer of the MFB module converts the dimensions of the first high-level feature and the second high-level feature so that both have the same dimension; the matrix dot-product module computes the dot product of the two converted features; and the pooling layer pools the dot-product result to obtain the fusion result. This MFB fusion avoids the redundancy problem of the majority-voting multi-classifier fusion algorithms adopted in the prior art, and further improves the classification accuracy of the PolSAR image.
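A pure-Python sketch of this fusion step: both features are projected to a common dimension by a fully-connected layer, multiplied element-wise (reading the "matrix dot product" as the Hadamard product typical of MFB-style fusion, which is my assumption), and sum-pooled. The projection matrices, dimensions, and pooling window size k are illustrative:

```python
def project(x, w):
    """Fully-connected dimension conversion (bias omitted for brevity)."""
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

def mfb_fuse(f1, f2, w1, w2, k=2):
    """Project f1 and f2 to the same dimension, take the element-wise
    product, then sum-pool over non-overlapping windows of size k."""
    p1, p2 = project(f1, w1), project(f2, w2)
    prod = [a * b for a, b in zip(p1, p2)]
    return [sum(prod[i:i + k]) for i in range(0, len(prod), k)]

f1 = [1.0, 2.0, 3.0]    # first high-level feature (dim 3)
f2 = [0.5, -0.5]        # second high-level feature (dim 2)
w1 = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]]   # converts dim 3 -> 4
w2 = [[1, 0], [0, 1], [1, 1], [1, -1]]              # converts dim 2 -> 4
fused = mfb_fuse(f1, f2, w1, w2)   # common dim 4, pooled down to dim 2
```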
Step 3d) using the cross-entropy loss function, compute the first loss value of Hi from each labeled training sample's first prediction label and cluster label; the second loss value from each unlabeled training sample's second prediction label and cluster label; the third loss value from each labeled training sample's third prediction label and ground-object label; and the fourth loss value from each labeled training sample's fourth prediction label and ground-object label. The sum of the four loss values is taken as the total loss value Lossi of Hi. Each of the four loss values is the cross-entropy between the predicted label distribution ŷ and the corresponding reference label y, i.e., of the form −Σ y · ln(ŷ), where Σ(·) denotes the summation operation and ln(·) denotes the logarithm operation with base the natural constant e.
Step 3e) compute the partial derivative ∂Lossi/∂ωi of Lossi with respect to the weight parameters ωi, and update ωi by back-propagating ∂Lossi/∂ωi through Hi with the gradient descent method. The update formula is:

ωi' = ωi − η · ∂Lossi/∂ωi

where ωi' denotes the updated ωi and η denotes the learning rate; in this embodiment, the learning rate is η = 0.001.
Step 3f) judge whether i ≥ I; if so, the trained semi-supervised polarimetric SAR image classification model H* based on the multi-branch network is obtained; otherwise, let i = i + 1 and return to step 3b);
step 4), obtaining a classification result of the PolSAR image:
(4a) each test sample in the test sample set Dtest is used as the input of the trained semi-supervised polarimetric SAR image classification model H* based on the multi-branch network; the first network and the second network in the high-level feature extraction module each extract high-level features from the test sample, yielding its first high-level feature and second high-level feature.
(4b) the MFB fusion module fuses the first and second high-level features of each test sample, and the third classification module classifies the fusion result to obtain the test sample's prediction label; the Vs prediction labels so obtained constitute the classification result of the corresponding PolSAR image Ps1.
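Assembling the per-block prediction labels into the image-level classification result of step 4) can be sketched as follows (the third-classification-module outputs are placeholders; Vs = 4 and C = 3 are toy sizes):

```python
def image_classification(block_probs):
    """Take the arg-max class of each image block's predicted distribution;
    the list of Vs labels is the image's classification result."""
    return [max(range(len(p)), key=p.__getitem__) for p in block_probs]

# toy predicted distributions for Vs = 4 image blocks over C = 3 classes
probs = [[0.8, 0.1, 0.1], [0.2, 0.7, 0.1], [0.1, 0.2, 0.7], [0.6, 0.3, 0.1]]
labels = image_classification(probs)   # [0, 1, 2, 0]
```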

Claims (7)

1. A semi-supervised polarimetric SAR image classification method based on a multi-branch network is characterized by comprising the following steps:
(1) obtaining a test sample set Dtest, a labeled training sample set, and an unlabeled training sample set:
(1a) obtaining a set of C ground-object classes L = {Lc | 1 ≤ c ≤ C} and S PolSAR images P = {Ps | 1 ≤ s ≤ S}; dividing each PolSAR image Ps into image blocks to obtain the image block set P' = {Ps' | 1 ≤ s ≤ S}, and then obtaining the feature of each image block; wherein C ≥ 2, Lc represents the c-th ground-object class, S ≥ 100, Ps represents the s-th PolSAR image, Ps' represents the subset of Vs image blocks corresponding to Ps, and Vs ≥ 1000;
(1b) randomly selecting N PolSAR images from the PolSAR image set P as the test set Ptest = {Ps1 | 1 ≤ s1 ≤ N}, and taking the feature of each image block in the image block set Ps1' of each PolSAR image Ps1 as a test sample, obtaining the test sample set Dtest corresponding to Ptest; then taking the remaining S-N PolSAR images in P as the training set Ptrain = {Ps2 | 1 ≤ s2 ≤ S-N}, and clustering the image blocks in the image block set Ps2' of each PolSAR image Ps2 to obtain the cluster label of each image block, wherein B is the number of clusters;
(1c) randomly selecting Nl PolSAR images from the training set Ptrain, and labeling each image block in the image block set Ps21' of each selected PolSAR image Ps21 with its true ground-object class; forming a labeled training sample from each such image block's feature, ground-object label, and cluster label, thereby obtaining the labeled training sample set;
(1d) for each of the remaining S-N-Nl PolSAR images Ps22 in the training set Ptrain, forming an unlabeled training sample from the feature and cluster label of each image block in the image block set Ps22', thereby obtaining the unlabeled training sample set;
(2) Constructing a semi-supervised polarimetric SAR image classification model H based on a multi-branch network:
constructing a semi-supervised polarimetric SAR image classification model H comprising a high-level feature extraction module and a multi-branch processing module cascaded in sequence, wherein:
the high-level feature extraction module comprises a first network and a second network arranged in parallel, wherein the first network and the second network each comprise several parameter-shared convolutional layers, batch normalization layers, and activation layers; the first network further comprises a first fully-connected layer, and the second network further comprises a second fully-connected layer whose parameters differ from those of the first fully-connected layer;
the multi-branch processing module comprises an MFB fusion module and a first, second, and third classification module arranged in parallel, with the output of the MFB fusion module cascaded into the third classification module; the MFB fusion module comprises a third fully-connected layer, a matrix dot-product module, and a pooling layer cascaded in sequence; the first classification module comprises a cascaded fourth fully-connected layer and Softmax activation layer; the second classification module comprises a cascaded fifth fully-connected layer and Softmax activation layer; and the third classification module comprises a cascaded sixth fully-connected layer and Softmax activation layer;
(3) performing iterative training on a semi-supervised polarimetric SAR image classification model H based on a multi-branch network:
(3a) initialize the iteration counter i and the maximum iteration number I, with I ≥ 200; denote the image classification model at the ith iteration by Hi and its weight parameters by ωi; let i = 1 and Hi = H;
(3b) randomly select, with replacement, Ml labeled training samples from the labeled training sample set and Mu unlabeled training samples from the unlabeled training sample set, and use them as the input of the semi-supervised polarimetric SAR image classification model Hi; the first network in the high-level feature extraction module extracts high-level features from each labeled and each unlabeled training sample, yielding the first high-level feature set F1l of the labeled training samples and the high-level feature set F1u of the unlabeled training samples; meanwhile, the second network extracts high-level features from each labeled training sample, yielding the second high-level feature set F2l of the labeled training samples; for the m1-th labeled training sample, its first and second high-level features are the outputs of the first and second network, respectively, and for the m2-th unlabeled training sample its high-level feature is the output of the first network, where 30 ≤ Ml ≤ Nl×Vs and 50 ≤ Mu ≤ (S-N-Nl)×Vs;
(3c) the first classification module classifies the first high-level feature set F1l of the labeled training samples and the high-level feature set F1u of the unlabeled training samples, yielding the first prediction label set corresponding to F1l and the second prediction label set corresponding to F1u; meanwhile, the second classification module classifies the second high-level feature set F2l of the labeled training samples, yielding the third prediction label set corresponding to F2l; the MFB fusion module fuses each first high-level feature in F1l with the second high-level feature at the corresponding position in F2l, and the third classification module classifies each fusion result, yielding the fourth prediction label set corresponding to the fusion of F1l and F2l;
(3d) using the cross-entropy loss function, compute the first loss value of Hi from each labeled training sample's first prediction label and cluster label; the second loss value from each unlabeled training sample's second prediction label and cluster label; the third loss value from each labeled training sample's third prediction label and ground-object label; and the fourth loss value from each labeled training sample's fourth prediction label and ground-object label; the sum of the four loss values is taken as the total loss value Lossi of Hi;
(3e) compute the partial derivative ∂Lossi/∂ωi of Lossi with respect to the weight parameters ωi, and update ωi by back-propagating ∂Lossi/∂ωi through Hi with the gradient descent method;
(3f) judging whether i ≥ I; if so, a trained semi-supervised polarimetric SAR image classification model H* based on the multi-branch network is obtained; otherwise, let i = i + 1 and execute step (3b);
(4) obtaining a classification result of the PolSAR image:
(4a) each test sample in the test sample set Dtest is used as the input of the trained semi-supervised polarimetric SAR image classification model H* based on the multi-branch network; the first network and the second network in the high-level feature extraction module each extract high-level features from the test sample, yielding its first high-level feature and second high-level feature;
(4b) the MFB fusion module fuses the first and second high-level features of each test sample, and the third classification module classifies the fusion result to obtain the test sample's prediction label; the Vs prediction labels so obtained constitute the classification result of the corresponding PolSAR image Ps1.
2. The semi-supervised polarimetric SAR image classification method based on a multi-branch network according to claim 1, wherein obtaining the feature of each image block in step (1a) is implemented as follows:
obtain the horizontal polarization component SHH, vertical polarization component SVV, and cross polarization component SHV of each image block, i.e., its scattering matrix S, and perform Pauli decomposition on S to obtain the feature of each image block, namely the standard three-dimensional Pauli vector (1/√2)[SHH + SVV, SHH − SVV, 2SHV]T; wherein [·]T represents the transpose operation.
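The Pauli decomposition of claim 2 maps the scattering components to the standard Pauli vector k = (1/√2)[SHH + SVV, SHH − SVV, 2SHV]T; a sketch using Python complex numbers (the scattering component values are illustrative):

```python
import math

def pauli_vector(s_hh, s_vv, s_hv):
    """Standard Pauli decomposition of the scattering matrix:
    k = (1/sqrt(2)) * [S_HH + S_VV, S_HH - S_VV, 2 * S_HV]^T."""
    c = 1.0 / math.sqrt(2.0)
    return [c * (s_hh + s_vv), c * (s_hh - s_vv), c * 2.0 * s_hv]

# illustrative complex scattering components of one image-block pixel
k = pauli_vector(1 + 1j, 0.5 - 0.5j, 0.2 + 0.1j)
# the Pauli vector preserves the total power |S_HH|^2 + |S_VV|^2 + 2|S_HV|^2
span = sum(abs(v) ** 2 for v in k)
```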
3. The semi-supervised polarimetric SAR image classification method based on a multi-branch network according to claim 1, wherein the clustering of the image blocks in each image block set Ps2' in step (1b) is implemented as follows:
(1b1) obtain the horizontal polarization component SHH, vertical polarization component SVV, and cross polarization component SHV of each image block, i.e., its scattering matrix S; perform Pauli decomposition on S to obtain the three-dimensional Pauli feature vector k, and construct the corresponding coherency matrix T = k · kH from k and its conjugate transpose; wherein [·]H represents the conjugate transpose operation and [·]* represents the conjugate operation;
(1b2) initialize the iteration counter iter and the maximum iteration number Iter, with Iter ≥ 10; randomly divide all image blocks in each image block set Ps2' into B disjoint subsets, each subset corresponding to a cluster mark; let iter = 1;
(1b3) compute the mean coherency matrix Σb of each subset as the average of the coherency matrices of its Kb image blocks, and compute the Wishart distance between Σb and the coherency matrix T of each image block as ln|Σb| + Tr(Σb⁻¹T); wherein Kb represents the number of image blocks in the b-th subset, Σ(·) represents the summation operation, Tr(·) represents the trace operation, and [·]⁻¹ represents the inversion operation;
(1b4) assign each image block to the subset with the smallest Wishart distance, and set its cluster label to the mark of the subset to which it belongs; judge whether iter ≥ Iter; if so, the cluster label of each image block is obtained; otherwise, let iter = iter + 1 and execute step (1b3).
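The iterative Wishart-style clustering of claim 3 can be sketched with 1×1 coherency "matrices" (positive scalars t), for which the Wishart distance ln|Σb| + Tr(Σb⁻¹T) reduces to ln σb + t/σb; the scalar simplification and the toy data are assumptions for illustration:

```python
import math
import random

def wishart_cluster(ts, n_clusters=2, iters=10, seed=0):
    """Iterative Wishart-style clustering of scalar 'coherency matrices':
    repeat { compute cluster means; reassign each t to argmin ln(s) + t/s }."""
    rng = random.Random(seed)
    labels = [rng.randrange(n_clusters) for _ in ts]   # random initial partition
    for _ in range(iters):
        means = []
        for b in range(n_clusters):
            members = [t for t, l in zip(ts, labels) if l == b]
            means.append(sum(members) / len(members) if members else 1.0)
        labels = [min(range(n_clusters),
                      key=lambda b: math.log(means[b]) + t / means[b])
                  for t in ts]
    return labels

data = [0.1, 0.12, 0.11, 5.0, 4.8, 5.2]   # two well-separated groups
labels = wishart_cluster(data)
```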
4. The semi-supervised polarimetric SAR image classification method based on a multi-branch network according to claim 1, wherein the high-level feature extraction module and the multi-branch processing module in step (2) satisfy:
the first and second networks of the high-level feature extraction module each contain 2 convolutional layers, the first convolutional layer having 16 kernels of size 3 × 3 with stride 1 and the second convolutional layer having 32 kernels of size 5 × 5 with stride 1; each network contains 2 batch normalization layers and 2 ReLU activation layers; the first fully-connected layer has 64 neurons and the second fully-connected layer has 32 neurons; the specific structure of the first network is: first convolutional layer → first batch normalization layer → first ReLU activation layer → second convolutional layer → second batch normalization layer → second ReLU activation layer → first fully-connected layer; the second network has the same structure, except that the first fully-connected layer is replaced by the second fully-connected layer;
the third fully-connected layer in the MFB fusion module of the multi-branch processing module has 128 neurons; the fourth fully-connected layer in the first classification module has as many neurons as the number of clusters B used for the image blocks in step (1b); and the fifth and sixth fully-connected layers in the second and third classification modules each have as many neurons as the number of ground-object classes C.
5. The semi-supervised polarimetric SAR image classification method based on a multi-branch network according to claim 1, wherein the MFB fusion module in step (3c) fuses each first high-level feature in F1l with the second high-level feature at the corresponding position in F2l as follows:
the third fully-connected layer of the MFB module converts the dimensions of the first high-level feature and the second high-level feature so that both have the same dimension; the matrix dot-product module computes the dot product of the two converted features; and the pooling layer pools the dot-product result to obtain the fusion result.
6. The semi-supervised polarimetric SAR image classification method based on a multi-branch network according to claim 1, wherein the first, second, third, and fourth loss values in step (3d) are each computed as the cross-entropy −Σ y · ln(ŷ) between the predicted label distribution ŷ and the corresponding reference label y; wherein Σ(·) represents the summation operation, and ln(·) represents the logarithm operation with base the natural constant e.
7. The semi-supervised polarimetric SAR image classification method based on a multi-branch network according to claim 1, wherein the weight parameter ωi in step (3e) is updated by the formula
ωi' = ωi − η · ∂Lossi/∂ωi
wherein ωi' denotes the updated ωi, η denotes the learning rate, and ∂/∂ωi denotes the derivative operation.
CN202111032840.2A 2021-09-03 2021-09-03 Semi-supervised polarization SAR image classification method based on multi-branch network Active CN113869136B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111032840.2A CN113869136B (en) 2021-09-03 2021-09-03 Semi-supervised polarization SAR image classification method based on multi-branch network

Publications (2)

Publication Number Publication Date
CN113869136A true CN113869136A (en) 2021-12-31
CN113869136B CN113869136B (en) 2024-07-02

Family

ID=78989516

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111032840.2A Active CN113869136B (en) 2021-09-03 2021-09-03 Semi-supervised polarization SAR image classification method based on multi-branch network

Country Status (1)

Country Link
CN (1) CN113869136B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114998840A (en) * 2022-07-18 2022-09-02 成都东方天呈智能科技有限公司 Mouse target detection method based on deep cascade supervised learning

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107563422A (en) * 2017-08-23 2018-01-09 西安电子科技大学 A kind of polarization SAR sorting technique based on semi-supervised convolutional neural networks
CN111695466A (en) * 2020-06-01 2020-09-22 西安电子科技大学 Semi-supervised polarization SAR terrain classification method based on feature mixup

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Junshu; Jiang Nan; Zhang Guoming; Hu Bin; Li Yang; Lyu Heng: "DE-self-training semi-supervised classification algorithm for hyperspectral remote sensing images", Transactions of the Chinese Society for Agricultural Machinery, no. 05, 25 May 2015 (2015-05-25) *


Also Published As

Publication number Publication date
CN113869136B (en) 2024-07-02

Similar Documents

Publication Publication Date Title
CN108388927B (en) Small sample polarization SAR terrain classification method based on deep convolution twin network
CN110309868A (en) In conjunction with the hyperspectral image classification method of unsupervised learning
CN112052754B (en) Polarization SAR image ground object classification method based on self-supervision characterization learning
CN110689086A (en) Semi-supervised high-resolution remote sensing image scene classification method based on generating countermeasure network
CN107491734B (en) Semi-supervised polarimetric SAR image classification method based on multi-core fusion and space Wishart LapSVM
CN108764308A (en) Pedestrian re-identification method based on convolution cycle network
CN112347970B (en) Remote sensing image ground object identification method based on graph convolution neural network
CN105989336B (en) Scene recognition method based on deconvolution deep network learning with weight
Chen et al. Agricultural remote sensing image cultivated land extraction technology based on deep learning
CN112949189A (en) Modeling method for multi-factor induced landslide prediction based on deep learning
Mussina et al. Multi-modal data fusion using deep neural network for condition monitoring of high voltage insulator
CN111738052B (en) Multi-feature fusion hyperspectral remote sensing ground object classification method based on deep learning
CN112149612A (en) Marine organism recognition system and recognition method based on deep neural network
CN116152678A (en) Marine disaster-bearing body identification method based on twin neural network under small sample condition
CN114973019A (en) Deep learning-based geospatial information change detection classification method and system
CN114818931A (en) Fruit image classification method based on small sample element learning
CN113869136B (en) Semi-supervised polarization SAR image classification method based on multi-branch network
CN114329031A (en) Fine-grained bird image retrieval method based on graph neural network and deep hash
Prasetiyo et al. Differential augmentation data for vehicle classification using convolutional neural network
Liu et al. Integration transformer for ground-based cloud image segmentation
CN110222793B (en) Online semi-supervised classification method and system based on multi-view active learning
CN106960225A (en) A kind of sparse image classification method supervised based on low-rank
Machairas et al. Application of dynamic image analysis to sand particle classification using deep learning
Chen et al. Mapping urban form and land use with deep learning techniques: a case study of Dongguan City, China
CN115205693A (en) Multi-feature ensemble learning dual-polarization SAR image enteromorpha extracting method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant