CN109766835A - SAR target recognition method based on a multi-parameter optimized generative adversarial network - Google Patents


Info

Publication number
CN109766835A
Authority
CN
China
Prior art keywords
network
arbiter
generator
sample
parameter
Prior art date
Legal status: Granted
Application number
CN201910026176.7A
Other languages
Chinese (zh)
Other versions
CN109766835B (en)
Inventor
杜兰 (Du Lan)
郭昱辰 (Guo Yuchen)
何浩男 (He Haonan)
陈健 (Chen Jian)
Current Assignee
Xidian University
Xian Cetc Xidian University Radar Technology Collaborative Innovation Research Institute Co Ltd
Original Assignee
Xidian University
Xian Cetc Xidian University Radar Technology Collaborative Innovation Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Xidian University and Xian Cetc Xidian University Radar Technology Collaborative Innovation Research Institute Co Ltd
Priority to CN201910026176.7A priority Critical patent/CN109766835B/en
Publication of CN109766835A publication Critical patent/CN109766835A/en
Application granted granted Critical
Publication of CN109766835B publication Critical patent/CN109766835B/en
Current legal status: Active


Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a synthetic aperture radar (SAR) target recognition method based on a generative adversarial network (GAN) with multi-parameter optimization. It mainly addresses two problems of the prior art: the low recognition rate of classifiers trained only on the original data, and the lack of any guarantee that the trained classifier parameters are an optimal solution. The method is implemented as follows: generate an initial training set and a test set, and expand the initial training samples to form the final training set; set the structure of the GAN and the number of parameter groups; train the GAN by cross-training multiple groups of network parameters, training the discriminator on the training samples together with the pseudo samples produced by the generator; then recognize the target model with the trained groups of discriminators, summing and averaging the results of the discriminator groups to obtain the recognition result. The invention improves the accuracy of SAR target recognition and can be used to recognize stationary SAR targets.

Description

SAR target recognition method based on a multi-parameter optimized generative adversarial network
Technical field
The invention belongs to the field of communication technology, and further relates to a synthetic aperture radar (SAR) target model recognition method that can be used to identify the model of stationary targets in SAR imagery.
Background technique
Synthetic aperture radar (SAR) operates in all weather and around the clock, with high resolution and strong penetration, and has become an important means of earth observation and military reconnaissance; automatic target recognition in SAR images has accordingly received increasingly wide attention. At present, most SAR target recognition methods use only the original training data when training the classifier, and most classifier designs based on deep models can reach only a locally optimal solution.
The patent application of the University of Electronic Science and Technology of China, "SAR image recognition method" (application number CN201210201460.1, publication number CN102737253A), proposes a SAR target recognition method based on sparse representation. Using sparse representation theory, target data are expressed as a linear combination of training samples; an optimization problem is solved to obtain approximately non-negative sparse coefficients with discriminative power, and the class of a sample is then determined by comparing the sums of the coefficients of each class. The method uses the degree of similarity between the target data and the training samples as the basis of classification, so as to reflect the true class of the target data. Its shortcoming is that only the original training data are used to train the classification model.
The patent application of Xidian University, "SAR target recognition method based on CNN" (application number CN201510165886.X, publication number CN104732243A), proposes a CNN-based SAR target recognition method. It is implemented as follows: apply multiple random translations to each training image and add the expanded data to the training set; build a convolutional neural network (CNN); input the expanded training set into the CNN to train the network model; expand each test sample with multiple translations; input the test set into the trained CNN model to obtain the recognition rate. The shortcoming of this method is that deep learning inevitably falls into locally optimal solutions: the trained model is not guaranteed to be optimal, and the results obtained under different prior settings and initializations are unstable.
Summary of the invention
In view of the deficiencies of the prior art described above, the object of the invention is to propose a SAR target recognition method based on a multi-parameter optimized generative adversarial network, so as to stabilize recognition performance and improve the recognition rate.
The technical idea of the invention is: first, a generative model is used to produce sample images similar to the training set, increasing the data and information available when training the classifier; second, when training the generative adversarial model, multiple groups of parameters are trained jointly, and the average result over the parameter groups is taken as the final prediction, which avoids the model falling into a locally optimal solution and improves the stability and accuracy of recognition. The implementation includes the following steps:
(1) Generate the training set and the test set:
(1a) From every category of the SAR image collection, arbitrarily take at least 200 images to form the initial training set; all remaining samples form the test set;
(1b) Expand every image of the initial training set by translation, rotation and flipping to obtain an expanded training set; the initial training set and the expanded training set together constitute the final training set;
(2) Set the structure and the number of parameter groups of the generative adversarial network:
In TensorFlow, set the number of layers and the number of convolution kernels per layer of the generator and the discriminator of the network, and set the number of groups of network parameters according to the required precision;
(3) Train the generative adversarial network:
(3a) Fix the parameters of the discriminator, randomly generate a group of noise vectors, and input them into the generator to obtain a group of generated pseudo samples; then input the pseudo samples into the discriminator and update the generator parameters by minimizing the generator's objective function;
(3b) Fix the parameters of the generator, randomly generate a group of noise vectors, and input them into the generator to obtain a group of generated pseudo samples; then input the pseudo samples together with the training data into the discriminator and update the discriminator parameters by maximizing the discriminator's objective function;
(3c) Check whether the objective functions of the generator and the discriminator have converged: if not, return to step (3a); if they have, stop training to obtain the trained generative adversarial network;
(4) Recognize the target model with the trained generative adversarial network:
(4a) Input every sample of the test set into the discriminator corresponding to each group of trained parameters, obtaining the output vector y_m of each discriminator;
(4b) Sum the output vectors y_m of the discriminators and average them; the model class corresponding to the largest entry of the average vector is the model recognition result for the test sample.
Compared with the prior art, the invention has the following advantages:
First, when training the classification network, i.e., the discriminator of the generative adversarial network, the invention uses not only the training samples but also the pseudo samples produced by the generator. This overcomes the low recognition rate of prior-art classifiers trained only on the original samples: the classifier is trained with more information and more fully, its classification ability is stronger, and the accuracy of SAR target recognition is improved.
Second, because the generative adversarial network is trained by cross-training multiple groups of network parameters, and the averaged result over the groups is taken as the final target model recognition result, the invention overcomes the prior-art problems that classifiers inevitably fall into locally optimal solutions and that the trained network parameters are not guaranteed to be optimal; the method is robust to different initializations and improves the target recognition rate on SAR images.
Detailed description of the invention
Fig. 1 is the implementation flowchart of the invention;
Fig. 2 shows examples of the SAR images used in the invention;
Fig. 3 shows pseudo samples generated by the generator of the invention.
Specific embodiment
The invention is further described below with reference to the accompanying drawings.
Referring to Fig. 1, this example is implemented as follows.
Step 1: generate the training set and the test set.
From every category of the SAR image collection, arbitrarily take at least 200 images to form the initial training set; all remaining samples form the test set.
Shift every image of the initial training set up, down, left and right by 30 pixels, obtaining a 4-fold translation expansion;
Rotate every image of the initial training set clockwise by 45°, 90°, 135°, 180°, 225°, 270° and 315°, obtaining a 7-fold rotation expansion;
Flip every image of the initial training set left-right and top-bottom, obtaining a 2-fold flipping expansion;
The initial training set, the translation expansion, the rotation expansion and the flipping expansion together constitute the final training set.
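The expansion of Step 1 can be sketched as follows. This is an illustrative NumPy sketch, not the patent's implementation: the 64×64 image size and the zero-filled translation border are assumptions, and only the 90° rotation multiples are generated here, since the 45° steps of the method would need an interpolating rotate such as scipy.ndimage.rotate.

```python
import numpy as np

def translate(img, dy, dx):
    """Shift an image by (dy, dx) pixels, zero-filling the exposed border."""
    out = np.zeros_like(img)
    h, w = img.shape
    ys = slice(max(dy, 0), min(h + dy, h))
    xs = slice(max(dx, 0), min(w + dx, w))
    ys_src = slice(max(-dy, 0), min(h - dy, h))
    xs_src = slice(max(-dx, 0), min(w - dx, w))
    out[ys, xs] = img[ys_src, xs_src]
    return out

def augment(img, shift=30):
    """Return translated, rotated and flipped copies of one image."""
    samples = []
    # 4 translations: up, down, left, right by `shift` pixels
    for dy, dx in [(-shift, 0), (shift, 0), (0, -shift), (0, shift)]:
        samples.append(translate(img, dy, dx))
    # rotations: the method uses 45-degree steps (7 extra images); this
    # sketch covers only the lossless 90-degree multiples via np.rot90
    for k in (1, 2, 3):
        samples.append(np.rot90(img, k))
    # 2 flips: left-right and top-bottom
    samples.append(np.fliplr(img))
    samples.append(np.flipud(img))
    return samples

img = np.arange(64 * 64, dtype=np.float32).reshape(64, 64)
aug = augment(img)
```

Each original image thus yields several expanded samples that are appended to the initial training set.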
Step 2: set the structure and the number of parameter groups of the generative adversarial network.
An existing generative adversarial network consists of two deep network models: a generator and a discriminator.
In TensorFlow, this example sets the number of layers and the number of convolution kernels per layer of the generator and the discriminator, and sets the number of groups of network parameters according to the required precision: the generator network parameters are set to N groups and the discriminator network parameters to M groups. The precision of the generator network is proportional to N (the larger N, the higher the precision) and the precision of the discriminator network is proportional to M (the larger M, the higher the precision), with N > 0 and M > 0.
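The multi-group setup can be sketched as follows. This is a toy illustration: single linear layers stand in for the convolutional generator and discriminator, and all sizes (noise dimension, sample dimension, K, N, M) are assumptions of the sketch — the patent only specifies that layer and kernel counts are configured in TensorFlow.

```python
import numpy as np

# Hypothetical sizes -- assumptions of this sketch, not the patent's values
NOISE_DIM, SAMPLE_DIM, K = 100, 4096, 3   # K real classes; class K+1 = fake
N, M = 2, 2                               # N generator groups, M discriminator groups

def init_params(in_dim, out_dim, seed):
    """One parameter group: a single linear layer with its own random seed."""
    rng = np.random.default_rng(seed)
    return {"W": rng.normal(0, 0.02, (in_dim, out_dim)),
            "b": np.zeros(out_dim)}

# N groups of generator parameters and M groups of discriminator parameters,
# each group starting from a different random initialization
gen_params = [init_params(NOISE_DIM, SAMPLE_DIM, seed=n) for n in range(N)]
dis_params = [init_params(SAMPLE_DIM, K + 1, seed=100 + m) for m in range(M)]
```

The point of the distinct seeds is that each group explores the loss surface from a different starting point, which is what the later averaging over groups exploits.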
Step 3: train the generative adversarial network.
3.1) Train the generator network parameters:
Fix the parameters of the discriminator, randomly generate a group of noise vectors, and input them into the generator to obtain a group of generated pseudo samples; then input the pseudo samples into the discriminator and update the generator parameters by minimizing the generator's objective function.
The generator's objective function is expressed as follows:

min_G L_G = E_{z~p_z(z)} [ log p(y = K+1 | G(z), D) ]

where G denotes the generator network, D the discriminator network, z the noise with prior distribution p_z(z), E(·) the expectation, G(z) the pseudo sample output by the generator network G for input noise z, K the total number of classes of the training set, K+1 the class label assigned to pseudo samples, and p(y = K+1 | G(z), D) the value of the (K+1)-th dimension of the output vector of the discriminator network D for input G(z).
For the N groups of generator network parameters, the objective function of the corresponding generators is as follows:

min_{θ_{G_n}} L_{G_n} = (1/M) Σ_{m=1}^{M} E_{z~p_z(z)} [ log p(y = K+1 | G_n(z), D_m) ]

where θ_{G_n} denotes the parameters of the n-th generator network, G_n the generator network formed by the parameters θ_{G_n}, and D_m the discriminator networks formed by the M groups of discriminator network parameters, n = 1, 2, …, N, m = 1, 2, …, M.
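Under the same toy assumptions as before (single linear layers with softmax outputs standing in for the real networks), the multi-group generator objective L_{G_n} can be evaluated as:

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def generator_loss(gen, dis_list, z, fake_class=-1):
    """L_{G_n}: average over the M discriminators of the mean over z of
    log p(y = K+1 | G_n(z), D_m); the generator is updated to MINIMIZE it."""
    x_fake = z @ gen["W"] + gen["b"]          # pseudo samples G_n(z)
    terms = [np.log(softmax(x_fake @ d["W"] + d["b"])[:, fake_class]).mean()
             for d in dis_list]
    return np.mean(terms)

# Toy dimensions (assumptions): 8-d noise, 16-d samples, K = 3, M = 2
rng = np.random.default_rng(0)
gen = {"W": rng.normal(0, 0.1, (8, 16)), "b": np.zeros(16)}
dis_list = [{"W": rng.normal(0, 0.1, (16, 4)), "b": np.zeros(4)}
            for _ in range(2)]
loss = generator_loss(gen, dis_list, rng.normal(size=(32, 8)))
```

Driving this quantity down makes every discriminator in the group less likely to assign the pseudo samples to the fake class K+1.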
3.2) Train the discriminator network parameters:
Fix the parameters of the generator, randomly generate a group of noise vectors, and input them into the generator to obtain a group of generated pseudo samples; then input the pseudo samples together with the training data into the discriminator and update the discriminator parameters by maximizing the discriminator's objective function.
The discriminator's objective function is expressed as follows:

max_D L_D = E_{(x,y)~p_data(x,y)} [ log p(y = l | x, D) ] + E_{z~p_z(z)} [ log p(y = K+1 | G(z), D) ]

where G denotes the generator network, D the discriminator network, x a real sample with label y = l, l = 1, 2, …, K, K the total number of classes of the training set, p_data(x, y) the joint distribution of samples and labels, p(y = l | x, D) the value of the l-th dimension of the discriminator output for real input x, z the noise with prior distribution p_z(z), E(·) the expectation, G(z) the pseudo sample output by the generator network G for noise z, K+1 the class label assigned to pseudo samples, and p(y = K+1 | G(z), D) the value of the (K+1)-th dimension of the output vector of D for input G(z).
For the M groups of discriminator network parameters, the objective function of the corresponding discriminators is as follows:

max_{θ_{D_m}} L_{D_m} = E_{(x,y)~p_data(x,y)} [ log p(y = l | x, D_m) ] + (1/N) Σ_{n=1}^{N} E_{z~p_z(z)} [ log p(y = K+1 | G_n(z), D_m) ]

where θ_{D_m} denotes the parameters of the m-th discriminator network, G_n the generator networks formed by the N groups of generator network parameters, and D_m the discriminator network formed by the parameters θ_{D_m}, n = 1, 2, …, N, m = 1, 2, …, M.
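A matching sketch of the multi-group discriminator objective L_{D_m}, under the same toy-network assumptions (linear layers, softmax outputs, invented sizes):

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def discriminator_loss(dis, gen_list, x, y, z, K):
    """L_{D_m}: log-likelihood of real samples under their true labels plus
    the average over the N generators of log p(y = K+1 | G_n(z), D_m);
    the discriminator is updated to MAXIMIZE it."""
    probs = softmax(x @ dis["W"] + dis["b"])             # shape (batch, K+1)
    real_term = np.log(probs[np.arange(len(y)), y]).mean()
    fake_terms = [
        np.log(softmax((z @ g["W"] + g["b"]) @ dis["W"] + dis["b"])[:, K]).mean()
        for g in gen_list]
    return real_term + np.mean(fake_terms)

# Toy setup (assumed sizes): K = 3 real classes, index K = 3 is the fake class
rng = np.random.default_rng(1)
K = 3
dis = {"W": rng.normal(0, 0.1, (16, K + 1)), "b": np.zeros(K + 1)}
gen_list = [{"W": rng.normal(0, 0.1, (8, 16)), "b": np.zeros(16)}
            for _ in range(2)]
x = rng.normal(size=(32, 16))          # stand-ins for real training samples
y = rng.integers(0, K, size=32)        # their class labels in 0..K-1
loss = discriminator_loss(dis, gen_list, x, y, rng.normal(size=(32, 8)), K)
```

Note the cross-training structure: each discriminator D_m sees pseudo samples from all N generators, just as each generator is scored by all M discriminators.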
3.3) Check whether the objective functions of the generator and the discriminator have converged: if not, return to step 3.1); if they have, stop training to obtain the trained generative adversarial network.
Step 4: recognize the target model with the trained generative adversarial network.
4.1) Input every sample of the test set into the discriminator corresponding to each group of trained parameters, obtaining the output vector y_m of each discriminator.
4.2) Sum the output vectors y_m of the discriminators and average them; the model class corresponding to the largest entry of the average vector is the model recognition result for the test sample, expressed as follows:

ȳ = (1/M) Σ_{m=1}^{M} y_m ,  c = findmax(ȳ)

where y_m denotes the K-dimensional output vector of the m-th discriminator, each dimension representing the probability that the discriminator assigns the test sample to that model class; ȳ is the average vector obtained by summing the y_m and averaging, m = 1, 2, …, M, with M the number of discriminator parameter groups; findmax(·) returns the dimension in which a vector attains its maximum; and c, the dimension of the maximum of ȳ, is the model recognition result for the test sample.
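The averaging rule of step 4.2) can be sketched directly; the three output vectors below are made up for illustration.

```python
import numpy as np

def ensemble_predict(outputs):
    """Sum the M discriminator output vectors y_m and average them:
    y_bar = (1/M) * sum_m y_m ; the recognized class is findmax(y_bar)."""
    y_bar = np.mean(outputs, axis=0)
    return int(np.argmax(y_bar)), y_bar

# Hypothetical outputs of M = 3 discriminators for one test sample, K = 3
y_ms = [np.array([0.2, 0.7, 0.1]),
        np.array([0.3, 0.4, 0.3]),
        np.array([0.1, 0.6, 0.3])]
cls, y_bar = ensemble_predict(y_ms)    # cls is the recognized model class
```

Averaging over differently initialized discriminators is what smooths out any single group that settled in a poor local optimum.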
The effect of the invention is further described below with reference to simulation experiments.
1. Simulation conditions.
The hardware platform of the simulations is an Intel Xeon CPU with a base frequency of 2.20 GHz, 128 GB of memory and an NVIDIA GTX 1080Ti graphics card; the operating system is Ubuntu 16.04 LTS, and the software used is Python 2.7 with TensorFlow.
The existing methods used for comparison are: SVM, a target recognition method based on a linear support vector machine classifier; AE, a target recognition method based on an autoencoder; and RBM, a target recognition method based on a restricted Boltzmann machine.
2. Simulation content.
Simulation 1: using the method of the invention, the network parameters are trained on measured data from the MSTAR (Moving and Stationary Target Acquisition and Recognition) data set, and pseudo samples are generated with the generators formed by two groups of trained parameters; the results are shown in Fig. 3, where:
Fig. 3(a) is the pseudo sample image generated, after inputting a group of noise, by the generator formed by the first group of trained parameters;
Fig. 3(b) is the pseudo sample image generated, after inputting a group of noise, by the generator formed by the second group of trained parameters.
Simulation 2: the method of the invention and the three existing methods perform target model recognition on measured data from the MSTAR data set, yielding the recognition results of all methods on the test set. To evaluate the simulation results, the recognition rate of every method on the test samples is computed with the following formula:

Accuracy = (T / Q) × 100%

where Accuracy denotes the recognition rate on the test samples, T the number of correctly recognized test samples, and Q the total number of test samples. The larger the Accuracy value, the better the recognition performance.
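The evaluation formula is a one-liner; T and Q here are illustrative values, not the counts of the actual simulation.

```python
def recognition_rate(num_correct, num_total):
    """Accuracy = T / Q * 100%, the score used to compare the methods."""
    return 100.0 * num_correct / num_total
```

For example, 9547 correct recognitions out of 10000 test samples would give a recognition rate of 95.47%.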
The recognition rates of the methods used in the above simulations are shown in Table 1.
Table 1. Recognition rates of different methods on the MSTAR test set

Method              Proposed    SVM       AE        RBM
Recognition rate    95.47%      88.64%    86.81%    87.84%
3. Analysis of simulation results.
The reference standard for the analysis of Simulation 1 is the SAR imagery shown in Fig. 2, where:
Fig. 2(a) is a measured image of a BMP2 armored vehicle randomly selected from the MSTAR data set;
Fig. 2(b) is a measured image of a BTR70 armored vehicle randomly selected from the MSTAR data set;
Fig. 2(c) is a measured image of a T72 main battle tank randomly selected from the MSTAR data set.
Comparing Fig. 3(a) with Figs. 2(a), 2(b) and 2(c) shows that the pseudo sample image generated by the generator formed by the first group of trained parameters, after inputting a group of noise, is very close to the real MSTAR samples;
Comparing Fig. 3(b) with Figs. 2(a), 2(b) and 2(c) shows that the pseudo sample image generated by the generator formed by the second group of trained parameters, after inputting a group of noise, is very close to the real MSTAR samples.
These comparisons show that adding the pseudo samples of Figs. 3(a) and 3(b) to the training of the discriminator increases the useful information available to the discriminator.
Analysing Simulation 2, Table 1 shows that the recognition rate of the invention reaches 95.47%, the highest among the compared methods.

Claims (6)

1. A SAR target recognition method based on a multi-parameter optimized generative adversarial network, characterized by comprising:
(1) generating the training set and the test set:
(1a) from every category of the SAR image collection, arbitrarily taking at least 200 images to form the initial training set, all remaining samples forming the test set;
(1b) expanding every image of the initial training set by translation, rotation and flipping to obtain an expanded training set, the initial training set and the expanded training set together constituting the final training set;
(2) setting the structure and the number of parameter groups of the generative adversarial network:
in TensorFlow, setting the number of layers and the number of convolution kernels per layer of the generator and the discriminator of the network, and setting the number of groups of network parameters according to the required precision;
(3) training the generative adversarial network:
(3a) fixing the parameters of the discriminator, randomly generating a group of noise vectors, and inputting them into the generator to obtain a group of generated pseudo samples; then inputting the pseudo samples into the discriminator and updating the generator parameters by minimizing the generator's objective function;
(3b) fixing the parameters of the generator, randomly generating a group of noise vectors, and inputting them into the generator to obtain a group of generated pseudo samples; then inputting the pseudo samples together with the training data into the discriminator and updating the discriminator parameters by maximizing the discriminator's objective function;
(3c) checking whether the objective functions of the generator and the discriminator have converged: if not, returning to step (3a); if they have, stopping the training to obtain the trained generative adversarial network;
(4) recognizing the target model with the trained generative adversarial network:
(4a) inputting every sample of the test set into the discriminator corresponding to each group of trained parameters, obtaining the output vector y_m of each discriminator;
(4b) summing the output vectors y_m of the discriminators and averaging them, the model class corresponding to the largest entry of the average vector being the model recognition result for the test sample.
2. The method according to claim 1, characterized in that the data expansion of every image of the initial training set by translation, rotation and flipping in (1b) is implemented as follows:
(1b1) shifting every image of the initial training set up, down, left and right by 30 pixels, obtaining a 4-fold translation expansion;
(1b2) rotating every image of the initial training set clockwise by 45°, 90°, 135°, 180°, 225°, 270° and 315°, obtaining a 7-fold rotation expansion;
(1b3) flipping every image of the initial training set left-right and top-bottom, obtaining a 2-fold flipping expansion.
3. The method according to claim 1, characterized in that setting the number of groups of network parameters in (2) means setting the generator network parameters to N groups and the discriminator network parameters to M groups; the precision of the generator network is proportional to N and the precision of the discriminator network is proportional to M, N > 0, M > 0.
4. The method according to claim 1, characterized in that the generator's objective function in (3a) is expressed as follows:

min_G L_G = E_{z~p_z(z)} [ log p(y = K+1 | G(z), D) ]

where G denotes the generator network, D the discriminator network, z the noise with prior distribution p_z(z), E(·) the expectation, G(z) the pseudo sample output by the generator network G for input noise z, K the total number of classes of the training set, K+1 the class label assigned to pseudo samples, and p(y = K+1 | G(z), D) the value of the (K+1)-th dimension of the output vector of the discriminator network D for input G(z);
for the N groups of generator network parameters, the objective function of the corresponding generators is as follows:

min_{θ_{G_n}} L_{G_n} = (1/M) Σ_{m=1}^{M} E_{z~p_z(z)} [ log p(y = K+1 | G_n(z), D_m) ]

where θ_{G_n} denotes the parameters of the n-th generator network, G_n the generator network formed by the parameters θ_{G_n}, and D_m the discriminator networks formed by the M groups of discriminator network parameters, n = 1, 2, …, N, m = 1, 2, …, M.
5. The method according to claim 1, characterized in that the discriminator's objective function in (3b) is expressed as follows:

max_D L_D = E_{(x,y)~p_data(x,y)} [ log p(y = l | x, D) ] + E_{z~p_z(z)} [ log p(y = K+1 | G(z), D) ]

where G denotes the generator network, D the discriminator network, x a real sample with label y = l, l = 1, 2, …, K, K the total number of classes of the training set, p_data(x, y) the joint distribution of samples and labels, p(y = l | x, D) the value of the l-th dimension of the discriminator output for real input x, z the noise with prior distribution p_z(z), E(·) the expectation, G(z) the pseudo sample output by the generator network G for noise z, K+1 the class label assigned to pseudo samples, and p(y = K+1 | G(z), D) the value of the (K+1)-th dimension of the output vector of D for input G(z);
for the M groups of discriminator network parameters, the objective function of the corresponding discriminators is as follows:

max_{θ_{D_m}} L_{D_m} = E_{(x,y)~p_data(x,y)} [ log p(y = l | x, D_m) ] + (1/N) Σ_{n=1}^{N} E_{z~p_z(z)} [ log p(y = K+1 | G_n(z), D_m) ]

where θ_{D_m} denotes the parameters of the m-th discriminator network, G_n the generator networks formed by the N groups of generator network parameters, and D_m the discriminator network formed by the parameters θ_{D_m}, n = 1, 2, …, N, m = 1, 2, …, M.
6. The method according to claim 1, characterized in that the model recognition result of the test sample obtained in (4b) is expressed as follows:

ȳ = (1/M) Σ_{m=1}^{M} y_m ,  c = findmax(ȳ)

where y_m denotes the output vector of the m-th discriminator, a K-dimensional vector in which each dimension represents the probability that the discriminator assigns the test sample to that model class; ȳ is the average vector obtained by summing the y_m and averaging, m = 1, 2, …, M, M being the number of discriminator parameter groups; findmax(·) returns the dimension in which a vector attains its maximum; and c, the dimension of the maximum of ȳ, is the model recognition result for the test sample.
CN201910026176.7A 2019-01-11 2019-01-11 SAR target recognition method for generating countermeasure network based on multi-parameter optimization Active CN109766835B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910026176.7A CN109766835B (en) 2019-01-11 2019-01-11 SAR target recognition method for generating countermeasure network based on multi-parameter optimization

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910026176.7A CN109766835B (en) 2019-01-11 2019-01-11 SAR target recognition method for generating countermeasure network based on multi-parameter optimization

Publications (2)

Publication Number Publication Date
CN109766835A true CN109766835A (en) 2019-05-17
CN109766835B CN109766835B (en) 2023-04-18

Family

ID=66453973

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910026176.7A Active CN109766835B (en) 2019-01-11 2019-01-11 SAR target recognition method for generating countermeasure network based on multi-parameter optimization

Country Status (1)

Country Link
CN (1) CN109766835B (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110297218A (en) * 2019-07-09 2019-10-01 哈尔滨工程大学 Radar signal unknown-modulation detection method based on a generative adversarial network
CN110401488A (en) * 2019-07-12 2019-11-01 北京邮电大学 A kind of demodulation method and device
CN110516525A (en) * 2019-07-01 2019-11-29 杭州电子科技大学 SAR image target recognition method based on GAN and SVM
CN110555811A (en) * 2019-07-02 2019-12-10 五邑大学 SAR image data enhancement method and device and storage medium
CN110609477A (en) * 2019-09-27 2019-12-24 东北大学 Electric power system transient stability discrimination system and method based on deep learning
CN111126503A (en) * 2019-12-27 2020-05-08 北京同邦卓益科技有限公司 Training sample generation method and device
CN111398955A (en) * 2020-03-13 2020-07-10 中国科学院电子学研究所苏州研究院 SAR image sidelobe removal method based on a generative adversarial neural network
WO2021000903A1 (en) * 2019-07-02 2021-01-07 五邑大学 End-to-end sar image recognition method and apparatus, and storage medium
CN112766381A (en) * 2021-01-22 2021-05-07 西安电子科技大学 Attribute-guided SAR image generation method under limited sample
CN112949820A (en) * 2021-01-27 2021-06-11 西安电子科技大学 Cognitive anti-interference target detection method based on generation of countermeasure network
CN113537031A (en) * 2021-07-12 2021-10-22 电子科技大学 Radar image target recognition method based on a multi-discriminator conditional generative adversarial network
CN113723182A (en) * 2021-07-21 2021-11-30 西安电子科技大学 SAR image ship detection method under limited training sample condition
CN115277189A (en) * 2022-07-27 2022-11-01 中国人民解放军海军航空大学 Unsupervised intrusion flow detection and identification method based on generative countermeasure network

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107368852A (en) * 2017-07-13 2017-11-21 西安电子科技大学 Polarimetric SAR image classification method based on non-subsampled contourlet DCGAN
CN107563428A (en) * 2017-08-25 2018-01-09 西安电子科技大学 Polarimetric SAR image classification method based on a generative adversarial network
CN108564115A (en) * 2018-03-30 2018-09-21 西安电子科技大学 Semi-supervised polarimetric SAR terrain classification method based on a fully convolutional GAN
CN108764173A (en) * 2018-05-31 2018-11-06 西安电子科技大学 Hyperspectral image classification method based on a multi-class generative adversarial network
US20180322366A1 (en) * 2017-05-02 2018-11-08 General Electric Company Neural network training image generation system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180322366A1 (en) * 2017-05-02 2018-11-08 General Electric Company Neural network training image generation system
CN107368852A (en) * 2017-07-13 2017-11-21 西安电子科技大学 Polarimetric SAR image classification method based on non-subsampled contourlet DCGAN
CN107563428A (en) * 2017-08-25 2018-01-09 西安电子科技大学 Polarimetric SAR image classification method based on generative adversarial network
CN108564115A (en) * 2018-03-30 2018-09-21 西安电子科技大学 Semi-supervised polarimetric SAR terrain classification method based on fully convolutional GAN
CN108764173A (en) * 2018-05-31 2018-11-06 西安电子科技大学 Hyperspectral image classification method based on multi-class generative adversarial network

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110516525A (en) * 2019-07-01 2019-11-29 杭州电子科技大学 SAR image target recognition method based on GAN and SVM
CN110516525B (en) * 2019-07-01 2021-10-08 杭州电子科技大学 SAR image target recognition method based on GAN and SVM
CN110555811A (en) * 2019-07-02 2019-12-10 五邑大学 SAR image data enhancement method and device and storage medium
WO2021000903A1 (en) * 2019-07-02 2021-01-07 五邑大学 End-to-end sar image recognition method and apparatus, and storage medium
CN110297218A (en) * 2019-07-09 2019-10-01 哈尔滨工程大学 Method for detecting unknown modulation mode of radar signal based on generative adversarial network
CN110297218B (en) * 2019-07-09 2022-07-15 哈尔滨工程大学 Method for detecting unknown modulation mode of radar signal based on generative adversarial network
CN110401488A (en) * 2019-07-12 2019-11-01 北京邮电大学 Demodulation method and device
CN110401488B (en) * 2019-07-12 2021-02-05 北京邮电大学 Demodulation method and device
CN110609477B (en) * 2019-09-27 2021-06-29 东北大学 Electric power system transient stability discrimination system and method based on deep learning
CN110609477A (en) * 2019-09-27 2019-12-24 东北大学 Electric power system transient stability discrimination system and method based on deep learning
CN111126503A (en) * 2019-12-27 2020-05-08 北京同邦卓益科技有限公司 Training sample generation method and device
CN111126503B (en) * 2019-12-27 2023-09-26 北京同邦卓益科技有限公司 Training sample generation method and device
CN111398955A (en) * 2020-03-13 2020-07-10 中国科学院电子学研究所苏州研究院 SAR image sidelobe removal method based on generative adversarial neural network
CN112766381A (en) * 2021-01-22 2021-05-07 西安电子科技大学 Attribute-guided SAR image generation method under limited sample
CN112766381B (en) * 2021-01-22 2023-01-24 西安电子科技大学 Attribute-guided SAR image generation method under limited sample
CN112949820A (en) * 2021-01-27 2021-06-11 西安电子科技大学 Cognitive anti-jamming target detection method based on generative adversarial network
CN112949820B (en) * 2021-01-27 2024-02-02 西安电子科技大学 Cognitive anti-jamming target detection method based on generative adversarial network
CN113537031A (en) * 2021-07-12 2021-10-22 电子科技大学 Radar image target recognition method based on multi-discriminator conditional generative adversarial network
CN113537031B (en) * 2021-07-12 2023-04-07 电子科技大学 Radar image target recognition method based on multi-discriminator conditional generative adversarial network
CN113723182A (en) * 2021-07-21 2021-11-30 西安电子科技大学 SAR image ship detection method under limited training sample condition
CN115277189A (en) * 2022-07-27 2022-11-01 中国人民解放军海军航空大学 Unsupervised intrusion traffic detection and identification method based on generative adversarial network
CN115277189B (en) * 2022-07-27 2023-08-15 中国人民解放军海军航空大学 Unsupervised intrusion traffic detection and identification method based on generative adversarial network

Also Published As

Publication number Publication date
CN109766835B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN109766835A (en) SAR target recognition method based on multi-parameter optimization generative adversarial network
CN111259930B (en) General target detection method of self-adaptive attention guidance mechanism
CN105809198B (en) SAR image target recognition method based on depth confidence network
CN106683048B (en) Image super-resolution method and device
CN105809693B (en) SAR image registration method based on deep neural network
CN103886342B (en) Hyperspectral image classification method based on spectral and neighborhood information dictionary learning
CN105931253B (en) Image segmentation method combined with semi-supervised learning
CN110276377A (en) Adversarial sample generation method based on Bayesian optimization
CN109800631A (en) Fluorescence-encoded microbead image detection method based on mask region convolutional neural networks
CN107229918A (en) SAR image target detection method based on fully convolutional neural networks
CN109284786B (en) SAR image terrain classification method based on distribution and structure matching generative adversarial network
CN104239902B (en) Hyperspectral image classification method based on non-local similarity and sparse coding
CN103984966A (en) SAR image target recognition method based on sparse representation
CN106485651B (en) Fast robust scale-invariant image matching method
CN108960330A (en) Remote sensing image semantic generation method based on fast region convolutional neural networks
CN109242028A (en) SAR image classification method based on 2D-PCA and convolutional neural networks
CN109389080A (en) Hyperspectral image classification method based on semi-supervised WGAN-GP
CN106023257A (en) Target tracking method based on rotor UAV platform
CN110647802A (en) Remote sensing image ship target detection method based on deep learning
CN105913090B (en) SAR image objective classification method based on SDAE-SVM
CN108764310A (en) SAR target recognition method based on multi-scale multi-feature deep forest
CN109165658B (en) Strong negative sample underwater target detection method based on fast-RCNN
CN107491734A (en) Semi-supervised polarimetric SAR image classification method based on multi-kernel fusion and spatial Wishart LapSVM
CN111753873A (en) Image detection method and device
CN109344845A (en) Feature matching method based on Triplet deep neural network structure

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant