CN112990371A - Unsupervised night image classification method based on feature amplification - Google Patents
- Publication number
- CN112990371A (application CN202110459160.2A)
- Authority
- CN
- China
- Prior art keywords
- feature
- image
- classification
- night
- class
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/088—Non-supervised learning, e.g. competitive learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
Abstract
The invention belongs to the technical field of computer vision recognition and relates to an unsupervised night image classification method based on feature amplification. A classification network is trained on a public dataset with daytime image classification labels; feature vectors of input images are extracted through the classification network, and the feature mean and covariance matrix of each class are computed. Unlabeled night images are then input into the classification network to obtain pseudo labels, and the feature mean and covariance matrix of each class of the night images are computed in feature space according to the pseudo labels. The covariance matrices obtained from daytime and nighttime images of the same class are weighted-averaged to obtain a final covariance matrix, and feature sampling is performed according to each night-image class mean and the weighted-average covariance matrix. Finally, the classification network is retrained with the sampled feature values together with the original feature values. By learning the feature distribution of labeled daytime images, the method amplifies the night data at the feature level, realizing unsupervised classification of night images.
Description
Technical Field
The invention belongs to the technical field of computer vision recognition, and particularly relates to an unsupervised night image classification method based on feature amplification.
Background
Image classification is the most classical task in the field of computer vision recognition, underlies many other vision problems, and has huge practical value and application prospects. Image classification is essentially a pattern classification problem whose goal is to assign different images to different classes with a minimum of classification error. With the success of convolutional neural networks (CNNs), deep learning has proven to be an effective solution to the image classification problem.
Currently available large public datasets for image classification mainly include ImageNet, COCO, Pascal VOC, etc.; however, these datasets basically consist of images acquired in daytime environments. Research shows that daytime and nighttime images exhibit an obvious domain gap, and a neural network trained on a daytime dataset often suffers a sharp performance drop when processing nighttime data. To address this problem, there are two main approaches:
1. and (4) domain adaptation. In the migration learning, when the data distribution of the source domain and the target domain is different, but the two tasks are the same, the special migration learning is called as domain adaptation. Domain adaptation is mainly achieved by finding a feature space to match the distribution of the source domain and the target domain in the shared space. The distribution of data at daytime and night is matched by learning the shared space, and the classification performance of the data at night can be effectively improved. The current technical means is mainly realized by generating a countermeasure network, but the problems of unstable training, long training time and the like exist.
2. Data amplification. Because there is no large nighttime image classification dataset, nighttime data amplification can be performed without supervision, for example by using a GAN to generate a corresponding low-light image from a daytime image, i.e., data amplification at the image level. However, nighttime images generated this way do not match the real data distribution, and a domain gap from real nighttime data remains.
Therefore, how to effectively amplify the nighttime dataset so that the features extracted by the trained model better fit the real nighttime data distribution is an urgent problem for unsupervised nighttime image classification.
Disclosure of Invention
To solve the lack of nighttime classification datasets in the prior art, the invention provides an unsupervised night image classification method based on feature amplification. First, a daytime image classifier is trained to obtain the feature distribution of each daytime class. Under the assumption that each dimension of the feature vector follows a Gaussian distribution, nighttime features are sampled using the mean and covariance of the daytime feature distribution, realizing nighttime data amplification at the feature level. Finally, the model is retrained with the original data and the sampled data together, improving nighttime image classification performance. The specific technical scheme is as follows:
an unsupervised night image classification method based on feature augmentation comprises the following steps:
step 1: constructing datasets: downloading the open source night image classification dataset Exclusive Dark (ExDark), selecting part of its images to construct an unsupervised night image dataset A, and using the remaining images as a night image classification performance verification set B; randomly selecting images corresponding to the ExDark classes from the Pascal VOC public dataset as a daytime image classification dataset T;
step 2: training a classification network to extract image features, and obtaining a mean value and a covariance matrix of the features of each category of the image: training a classification network by adopting a daytime image classification data set T, extracting a characteristic vector of an input image through the classification network, and calculating a characteristic mean value and a covariance matrix of each class;
step 3: inputting the night image dataset A into the classification network to obtain pseudo labels for the input images;
step 4: counting the feature mean and covariance matrix of each class of the night images according to the pseudo labels;
step 5: carrying out a weighted average of the covariance matrices obtained from daytime and nighttime images of the same class to obtain a fused covariance matrix;
step 6: performing feature sampling according to the feature mean of each night-image class and the fused covariance matrix;
step 7: retraining the classification network with the feature samples generated by sampling and the original samples.
Further, the step 2 specifically includes:
step 2.1: selecting the ResNet50 deep residual network as the classification network, and pre-training it on the ImageNet dataset;
step 2.2: training the classification network with the daytime image classification dataset T: modifying the output number of the last classification layer of the ResNet50 network to the number of classes in T, setting the learning rate of the last layer to 0.001 and that of the pre-trained layers to 0.0001, performing model optimization with SGD (stochastic gradient descent) with a batch size of 32, training 50 epochs in total, and adopting the cross-entropy loss function $\mathcal{L}_{CE}$, calculated as:

$$\mathcal{L}_{CE} = -\frac{1}{n} \sum_{i=1}^{n} y_i \log(p_i)$$

where $n$ represents the total number of samples, $y_i$ the label of the $i$-th sample, and $p_i$ the predicted probability value of the $i$-th sample;
step 2.3: extracting the feature vectors of the daytime image classification dataset T through the trained ResNet50 network, i.e., removing the last fully-connected layer of the classification network and extracting the feature vectors output by the second-to-last layer, then performing two-dimensional feature-space visualization analysis with the t-SNE algorithm to further verify the effectiveness of the classification network;
step 2.4: for each class, respectively calculating the corresponding feature mean $\mu_i$ and covariance matrix $\Sigma_i$; the feature mean of class $i$ is expressed as:

$$\mu_i = \frac{1}{n_i} \sum_{j=1}^{n_i} x_j^{(i)}$$

wherein $x_j^{(i)}$ represents the $j$-th input image feature point belonging to class $i$ and $n_i$ represents the total number of images belonging to class $i$; the feature distribution covariance matrix of class $i$ is expressed as:

$$\Sigma_i = \frac{1}{n_i - 1} \sum_{j=1}^{n_i} \left(x_j^{(i)} - \mu_i\right)\left(x_j^{(i)} - \mu_i\right)^{\top}$$
further, the step 3 specifically includes:
step 3.1: inputting the night image dataset A into the trained classification network, and calculating the Euclidean distance between the feature point corresponding to each input image and the mean of each class's feature points computed in step 2.4, i.e., the feature center $\mu_i$; the Euclidean distance $d_i$ between the feature point $f$ of each image and the $i$-th class feature center $\mu_i$ is expressed as:

$$d_i = \lVert f - \mu_i \rVert_2$$

step 3.2: for each input image, the nearest class feature center is found according to the distances calculated in step 3.1; if the distance between the feature point $f$ and its nearest class feature center $\mu_i$ is less than a hyper-parameter $\tau$, $f$ is assigned to the class of $\mu_i$; otherwise the feature point $f$ is judged to be noise and discarded, resulting in the feature set:

$$S = \left\{ (f, i) \;\middle|\; i = \arg\min_k \lVert f - \mu_k \rVert_2,\; \lVert f - \mu_i \rVert_2 < \tau \right\}$$
further, the step 4 specifically includes:
according to the obtained feature set $S$, counting the feature mean $\tilde{\mu}_i$ and covariance matrix $\tilde{\Sigma}_i$ of each night-image class; the feature mean of class $i$ is expressed as:

$$\tilde{\mu}_i = \frac{1}{m_i} \sum_{j=1}^{m_i} s_j^{(i)}$$

wherein $s_j^{(i)}$ represents the $j$-th feature vector belonging to class $i$ in the feature set $S$ and $m_i$ represents the total number of feature vectors belonging to class $i$; the feature distribution covariance matrix of class $i$ is expressed as:

$$\tilde{\Sigma}_i = \frac{1}{m_i - 1} \sum_{j=1}^{m_i} \left(s_j^{(i)} - \tilde{\mu}_i\right)\left(s_j^{(i)} - \tilde{\mu}_i\right)^{\top}$$
further, the step 5 specifically includes:
carrying out a weighted average of the acquired daytime covariance matrix $\Sigma_i$ and nighttime covariance matrix $\tilde{\Sigma}_i$ to obtain the fused covariance matrix $\Sigma'_i$, expressed as:

$$\Sigma'_i = \rho\, \Sigma_i + (1 - \rho)\, \tilde{\Sigma}_i$$

where $\rho$ is the weight value harmonizing the ratio between the distributions produced during the day and at night.
Further, the step 6 specifically includes:
according to the obtained feature mean value of each class of night imagesAnd a final weighted covariance matrixCharacteristic sampling is performed from the following Gaussian distributionIn which random generation belongs to class iA sample:
Further, the step 7 specifically includes:
the feature samples $Z$ generated by sampling, the original real daytime feature data $x$, and the night feature data $y$ with pseudo labels retrain the classification network together; the loss function $\mathcal{L}$ is expressed as:

$$\mathcal{L} = -\frac{1}{N} \sum_{v} \log p\left(c_v \mid v; \theta\right)$$

wherein $v$ denotes any feature vector in the training samples (drawn from $Z \cup x \cup y$, $N$ in total), $c_v$ denotes its corresponding label, and $\theta$ denotes the classification model parameters.
The invention has the beneficial effects that:
1. feature distribution migration is performed on the night data using real labeled daytime image classification data, so that the night features obtained by sampling are closer to the real data distribution;
2. night data are amplified at the feature level, which has lower dimensionality than the image level and facilitates feature distribution statistics and migration;
3. the added cost falls mainly in the training stage while the inference stage is unaffected, so nighttime image classification performance can be effectively improved without sacrificing inference speed.
Drawings
FIG. 1 is a schematic diagram of the distance between the feature space of an input image and the center of each class of features;
fig. 2 is a flow chart of the method of the present invention.
Detailed Description
In order to make the objects, technical solutions and technical effects of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and examples.
As shown in fig. 1-2, an unsupervised night image classification method based on feature augmentation of the present invention includes the following steps:
step 1: constructing the datasets: 11 classes of the open source Exclusive Dark (ExDark) dataset are adopted, namely bicycles, boats, bottles, buses, cars, cats, chairs, dogs, motorcycles, people, and tables; for these 11 classes, 800 corresponding images are respectively selected from the Pascal VOC public dataset as the daytime image classification dataset T. In addition, the ExDark dataset is divided into two parts: 400 images are respectively selected from the 11 classes to construct the unsupervised night image dataset A; the remaining images serve as the night image classification performance verification set B to evaluate the effectiveness of the algorithm.
Step 2: training a classification network to extract image features to obtain a mean value and a covariance matrix of the features of each category of the image, and specifically comprising the following steps:
step 2.1, classification network pre-training:
selecting the ResNet50 deep residual network as the classification network, and pre-training it on the ImageNet dataset so that the network has prior knowledge, converges faster, and avoids overfitting;
step 2.2, fine adjustment of the classification network:
changing the last classification layer of the ResNet50 model trained in step 2.1 from 1000 outputs to 11, and fine-tuning it with the daytime image classification dataset T constructed in step 1; the learning rate of the last layer of the classification network is set to 0.001 and that of the pre-trained layers to 0.0001, model optimization is performed with SGD (stochastic gradient descent) with a batch size of 32, 50 epochs are trained in total, and the cross-entropy loss function $\mathcal{L}_{CE}$ is adopted, calculated as:

$$\mathcal{L}_{CE} = -\frac{1}{n} \sum_{i=1}^{n} y_i \log(p_i)$$

where $n$ represents the total number of samples, $y_i$ the label of the $i$-th sample, and $p_i$ the predicted probability value of the $i$-th sample.
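The cross-entropy computation above can be sketched in a few lines. This is an illustrative stand-in, not the patent's implementation; the multi-class form with integer labels and the function name are assumptions:

```python
import numpy as np

def cross_entropy_loss(probs, labels):
    """Mean cross-entropy over n samples.

    probs:  (n, C) predicted class probabilities (rows sum to 1)
    labels: (n,)   integer class labels in [0, C)
    """
    n = labels.shape[0]
    # Pick the predicted probability of each sample's true class
    p_true = probs[np.arange(n), labels]
    # Clip to avoid log(0)
    return float(-np.mean(np.log(np.clip(p_true, 1e-12, None))))

# Toy check: confident correct predictions give a small loss
probs = np.array([[0.9, 0.1], [0.2, 0.8]])
labels = np.array([0, 1])
loss = cross_entropy_loss(probs, labels)
```

In training this would be evaluated on each mini-batch of 32 softmax outputs and backpropagated through the network.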
Step 2.3, image feature extraction:
extracting the feature vectors of the daytime image classification dataset T with the classification network trained in step 2.2, i.e., removing the last fully-connected layer of the classification network and extracting the 2048-dimensional feature vectors output by the second-to-last layer; performing two-dimensional feature-space visualization analysis with the t-SNE algorithm; if features of the same class cluster together and features of different classes are well separated, the classification network is adequately trained and has good feature extraction and feature discrimination ability; otherwise, the classification network needs further training until the expected classification effect is achieved;
step 2.4, counting the characteristic distribution of the daytime data set:
each feature extracted in step 2.3 is a 2048-dimensional feature vector, and each dimension of the feature vectors of the same class is regarded as a Gaussian distribution, so new feature vectors can be sampled according to the mean and variance; for the 11 classes, the corresponding feature mean $\mu_i$ and covariance matrix $\Sigma_i$ are respectively calculated; the feature mean of class $i$ is expressed as:

$$\mu_i = \frac{1}{n_i} \sum_{j=1}^{n_i} x_j^{(i)}$$

wherein $x_j^{(i)}$ represents the $j$-th input image feature point belonging to class $i$ and $n_i$ represents the total number of images belonging to class $i$; the feature distribution covariance matrix of class $i$ is expressed as:

$$\Sigma_i = \frac{1}{n_i - 1} \sum_{j=1}^{n_i} \left(x_j^{(i)} - \mu_i\right)\left(x_j^{(i)} - \mu_i\right)^{\top}$$
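The per-class statistics of step 2.4 can be sketched as follows. This is a minimal NumPy illustration with our own function and variable names; `np.cov` uses the unbiased n−1 normalization, which is an assumption about the patent's convention:

```python
import numpy as np

def class_statistics(features, labels, num_classes):
    """Per-class feature mean and covariance.

    features: (N, D) feature vectors (2048-d in the embodiment)
    labels:   (N,)   integer class ids
    Returns lists of (D,) means and (D, D) covariance matrices.
    """
    means, covs = [], []
    for i in range(num_classes):
        f = features[labels == i]          # the n_i features of class i
        means.append(f.mean(axis=0))
        # rowvar=False: rows are observations, columns are feature dims
        covs.append(np.cov(f, rowvar=False))
    return means, covs

# Toy 2-D example standing in for the 2048-d ResNet features
rng = np.random.default_rng(0)
feats = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
labs = np.array([0] * 50 + [1] * 50)
means, covs = class_statistics(feats, labs, 2)
```

The same call would be run once over the daytime features here and again over the pseudo-labeled night features in step 4.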
Step 3: inputting the night image dataset A into the classification network to obtain pseudo labels for the input images, specifically comprising the following steps:
step 3.1, inputting the dataset A into the classification network trained in step 2, and calculating the Euclidean distance between the feature point corresponding to each input image and each of the 11 class feature centers $\mu_i$ obtained in step 2.4; the Euclidean distance $d_i$ between the feature point $f$ of each image and the $i$-th class feature center $\mu_i$ is expressed as:

$$d_i = \lVert f - \mu_i \rVert_2$$
As shown in fig. 1, the rectangle represents the feature point of an input image, and the triangle and circle represent two different class feature centers; the figure is simplified to 2 classes, while there are actually 11.
Step 3.2, for each input image, finding the nearest class feature center according to the distance calculated in step 3.1, if the feature point isWith nearest class feature centerDistance less than a hyper-parameterThen will beIs set toThe category to which it belongs; otherwise, judging the characteristic pointFor noise, it is discarded, resulting in a feature set:
Step 4: counting the feature mean and covariance matrix of each night-image class according to the pseudo labels, specifically:
according to the feature set $S$ obtained in step 3.2, the feature mean $\tilde{\mu}_i$ and covariance matrix $\tilde{\Sigma}_i$ of each night-image class are counted; the feature mean of class $i$ is expressed as:

$$\tilde{\mu}_i = \frac{1}{m_i} \sum_{j=1}^{m_i} s_j^{(i)}$$

wherein $s_j^{(i)}$ represents the $j$-th feature vector belonging to class $i$ in the feature set $S$ and $m_i$ represents the total number of feature vectors belonging to class $i$; the feature distribution covariance matrix of class $i$ is expressed as:

$$\tilde{\Sigma}_i = \frac{1}{m_i - 1} \sum_{j=1}^{m_i} \left(s_j^{(i)} - \tilde{\mu}_i\right)\left(s_j^{(i)} - \tilde{\mu}_i\right)^{\top}$$
Step 5: carrying out a weighted average of the covariance matrices obtained from daytime and nighttime images of the same class to obtain the fused covariance matrix, specifically: the daytime covariance matrix $\Sigma_i$ acquired in step 2.4 and the nighttime covariance matrix $\tilde{\Sigma}_i$ acquired in step 4 are weighted-averaged to obtain the fused covariance matrix $\Sigma'_i$, expressed as:

$$\Sigma'_i = \rho\, \Sigma_i + (1 - \rho)\, \tilde{\Sigma}_i$$

$\rho$ is the weight value harmonizing the ratio between the distributions produced during the day and at night; in this example performance is best when $\rho = 0.8$, indicating that fusing the daytime data distribution can effectively improve the diversity of the night data distribution and thereby the night image classification performance.
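The weighted average of step 5 reduces to one line of array arithmetic. Whether ρ weights the daytime or the nighttime covariance is not unambiguous in the text, so the convention below (ρ on the daytime term, with the reported best value 0.8 as default) is an assumption:

```python
import numpy as np

def fuse_covariance(cov_day, cov_night, rho=0.8):
    """Weighted average of day/night covariance matrices for one class.

    Assumption: rho weights the daytime covariance; the embodiment
    reports rho = 0.8 performing best on the ExDark verification set.
    """
    return rho * np.asarray(cov_day) + (1.0 - rho) * np.asarray(cov_night)

# Toy 2x2 matrices standing in for the 2048x2048 class covariances
cov_day = np.eye(2) * 4.0
cov_night = np.eye(2) * 1.0
fused = fuse_covariance(cov_day, cov_night, rho=0.8)
```

A convex combination of two positive semi-definite matrices stays positive semi-definite, so the fused matrix remains a valid covariance for the Gaussian sampling of step 6.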
Step 6: performing feature sampling according to the feature mean of each night-image class and the fused covariance matrix, specifically: according to the night-image feature mean $\tilde{\mu}_i$ obtained in step 4 and the weighted covariance matrix $\Sigma'_i$ obtained in step 5, feature sampling is performed; taking class $i$ as an example, samples belonging to class $i$ are randomly generated from the Gaussian distribution

$$z \sim \mathcal{N}\!\left(\tilde{\mu}_i,\, \Sigma'_i\right)$$

The number of samples generated for each class is uniformly set by a hyper-parameter $k$; in this example $k = 400$. Because the night and day data have a domain gap, only the night-image feature points are considered when calculating the feature mean; meanwhile, the distribution diversity of the daytime data can increase the sample richness of the night images, so the daytime covariance distribution is fused in.
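The feature sampling of step 6 draws from a multivariate Gaussian per class; a sketch using NumPy's generator, where the seed is arbitrary and the per-class count of 400 mirrors the embodiment's hyper-parameter:

```python
import numpy as np

def sample_features(mean_night, cov_fused, n_samples=400, seed=0):
    """Draw synthetic night features for one class from
    N(mean_night, cov_fused)."""
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(mean_night, cov_fused, size=n_samples)

# Toy 2-D class distribution standing in for a 2048-d one
mu = np.array([1.0, -1.0])
cov = np.array([[1.0, 0.2], [0.2, 1.0]])
z = sample_features(mu, cov, n_samples=400)
```

The sampled batch is centered on the night-class mean while its spread reflects the day/night-fused covariance, which is exactly the amplification effect the step describes.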
Step 7: retraining the classification network with the feature samples generated by sampling and the original samples, specifically: the feature samples $Z$ generated by sampling, the original real daytime feature data $x$, and the night feature data $y$ with pseudo labels retrain the classification network together; the loss function $\mathcal{L}$ is expressed as:

$$\mathcal{L} = -\frac{1}{N} \sum_{v} \log p\left(c_v \mid v; \theta\right)$$

wherein $v$ denotes any feature vector in the training samples (drawn from $Z \cup x \cup y$, $N$ in total), $c_v$ denotes its corresponding label, and $\theta$ denotes the classification model parameters.
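As a toy stand-in for the retraining of step 7 (the patent retrains the network's classifier on the combined real and sampled feature vectors), the sketch below trains a minimal softmax classifier on feature vectors with plain gradient descent; the linear model, all names, and the toy data are illustrative assumptions:

```python
import numpy as np

def train_softmax(features, labels, num_classes, lr=0.1, epochs=200, seed=0):
    """Softmax classifier on feature vectors, a linear stand-in for
    retraining the network's final classification layer on the combined
    set of day features, pseudo-labeled night features, and samples Z."""
    rng = np.random.default_rng(seed)
    n, d = features.shape
    W = rng.normal(0, 0.01, (d, num_classes))
    b = np.zeros(num_classes)
    onehot = np.eye(num_classes)[labels]
    for _ in range(epochs):
        logits = features @ W + b
        logits = logits - logits.max(axis=1, keepdims=True)  # stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        grad = (p - onehot) / n              # dL/dlogits for mean CE loss
        W -= lr * features.T @ grad
        b -= lr * grad.sum(axis=0)
    return W, b

# Two well-separated toy classes stand in for real + sampled features
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(6, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
W, b = train_softmax(X, y, 2)
acc = float(((X @ W + b).argmax(axis=1) == y).mean())
```

The gradient `(p - onehot) / n` is the derivative of the mean cross-entropy loss $\mathcal{L}$ with respect to the logits, so this loop minimizes the same objective as the retraining formula above, just for a linear model.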
The invention trains the classification network with the labeled daytime dataset and migrates its feature diversity to the nighttime data, making up for the scarcity of night data. A classification network trained with daytime data only achieves 58.74% classification performance on the ExDark verification set B; with the feature amplification method of the invention, classification performance on verification set B reaches 69.22%, an improvement of 10.48 percentage points over the baseline, greatly improving the classification performance on night data and fully demonstrating the practical benefit and application value of the method.
Claims (7)
1. An unsupervised night image classification method based on feature augmentation is characterized by comprising the following steps of:
step 1: constructing datasets: downloading the open source night image classification dataset Exclusive Dark (ExDark), selecting part of its images to construct an unsupervised night image dataset A, and using the remaining images as a night image classification performance verification set B; randomly selecting images corresponding to the ExDark classes from the Pascal VOC public dataset as a daytime image classification dataset T;
step 2: training a classification network to extract image features, and obtaining a mean value and a covariance matrix of the features of each category of the image: training a classification network by adopting a daytime image classification data set T, extracting a characteristic vector of an input image through the classification network, and calculating a characteristic mean value and a covariance matrix of each class of the image;
step 3: inputting the night image dataset A into the classification network to obtain pseudo labels for the input images;
step 4: counting the feature mean and covariance matrix of each class of the night images according to the pseudo labels;
step 5: carrying out a weighted average of the covariance matrices obtained from daytime and nighttime images of the same class to obtain a fused covariance matrix;
step 6: performing feature sampling according to the feature mean of each night-image class and the fused covariance matrix;
step 7: retraining the classification network with the feature samples generated by sampling and the original samples.
2. The unsupervised nighttime image classification method based on feature augmentation as claimed in claim 1, wherein the step 2 specifically comprises:
step 2.1: selecting the ResNet50 deep residual network as the classification network, and pre-training it on the ImageNet dataset;
step 2.2: training the classification network with the daytime image classification dataset T: modifying the output number of the last classification layer of the ResNet50 network to the number of classes in T, setting the learning rate of the last layer to 0.001 and that of the pre-trained layers to 0.0001, performing model optimization with SGD with a batch size of 32, training 50 epochs in total, and adopting the cross-entropy loss function $\mathcal{L}_{CE}$, calculated as:

$$\mathcal{L}_{CE} = -\frac{1}{n} \sum_{i=1}^{n} y_i \log(p_i)$$

where $n$ represents the total number of samples, $y_i$ the label of the $i$-th sample, and $p_i$ the predicted probability value of the $i$-th sample;
step 2.3: extracting the feature vectors of the daytime image classification dataset T through the trained ResNet50 network, and performing two-dimensional feature-space visualization analysis on the feature vectors using the t-SNE algorithm;
step 2.4: for each class, respectively calculating the corresponding feature mean $\mu_i$ and covariance matrix $\Sigma_i$; the feature mean of class $i$ is expressed as:

$$\mu_i = \frac{1}{n_i} \sum_{j=1}^{n_i} x_j^{(i)}$$

wherein $x_j^{(i)}$ represents the $j$-th input image feature point belonging to class $i$ and $n_i$ represents the total number of images belonging to class $i$; the feature distribution covariance matrix of class $i$ is expressed as:

$$\Sigma_i = \frac{1}{n_i - 1} \sum_{j=1}^{n_i} \left(x_j^{(i)} - \mu_i\right)\left(x_j^{(i)} - \mu_i\right)^{\top}$$
3. the unsupervised nighttime image classification method based on feature augmentation as claimed in claim 2, wherein the step 3 specifically comprises:
step 3.1: inputting the night image dataset A into the trained classification network, and calculating the Euclidean distance between the feature point corresponding to each input image and the mean of each class's feature points computed in step 2.4, i.e., the feature center $\mu_i$; the Euclidean distance $d_i$ between the feature point $f$ of each image and the $i$-th class feature center $\mu_i$ is expressed as:

$$d_i = \lVert f - \mu_i \rVert_2$$

step 3.2: for each input image, the nearest class feature center is found according to the distances calculated in step 3.1; if the distance between the feature point $f$ and its nearest class feature center $\mu_i$ is less than a hyper-parameter $\tau$, $f$ is assigned to the class of $\mu_i$; otherwise the feature point $f$ is judged to be noise and discarded, resulting in the feature set:

$$S = \left\{ (f, i) \;\middle|\; i = \arg\min_k \lVert f - \mu_k \rVert_2,\; \lVert f - \mu_i \rVert_2 < \tau \right\}$$
4. the unsupervised nighttime image classification method based on feature augmentation as claimed in claim 3, wherein the step 4 is specifically:
according to the obtained feature set $S$, counting the feature mean $\tilde{\mu}_i$ and covariance matrix $\tilde{\Sigma}_i$ of each night-image class; the feature mean of class $i$ is expressed as:

$$\tilde{\mu}_i = \frac{1}{m_i} \sum_{j=1}^{m_i} s_j^{(i)}$$

wherein $s_j^{(i)}$ represents the $j$-th feature vector belonging to class $i$ in the feature set $S$ and $m_i$ represents the total number of feature vectors belonging to class $i$; the feature distribution covariance matrix of class $i$ is expressed as:

$$\tilde{\Sigma}_i = \frac{1}{m_i - 1} \sum_{j=1}^{m_i} \left(s_j^{(i)} - \tilde{\mu}_i\right)\left(s_j^{(i)} - \tilde{\mu}_i\right)^{\top}$$
5. the unsupervised nighttime image classification method based on feature augmentation as claimed in claim 4, wherein the step 5 is specifically:
carrying out a weighted average of the acquired daytime covariance matrix $\Sigma_i$ and nighttime covariance matrix $\tilde{\Sigma}_i$ to obtain the fused covariance matrix $\Sigma'_i$, expressed as:

$$\Sigma'_i = \rho\, \Sigma_i + (1 - \rho)\, \tilde{\Sigma}_i$$

where $\rho$ is a weight value.
6. The unsupervised nighttime image classification method based on feature augmentation as claimed in claim 5, wherein the step 6 is specifically:
according to the obtained feature mean $\tilde{\mu}_i$ of each night-image class and the final weighted covariance matrix $\Sigma'_i$, performing feature sampling: samples belonging to class $i$ are randomly generated from the Gaussian distribution

$$z \sim \mathcal{N}\!\left(\tilde{\mu}_i,\, \Sigma'_i\right)$$
7. The unsupervised nighttime image classification method based on feature augmentation according to claim 6, wherein the step 7 is specifically:
retraining the classification network with the feature samples $Z$ generated by sampling, the original real daytime feature data $x$, and the night feature data $y$ with pseudo labels; the loss function $\mathcal{L}$ is expressed as:

$$\mathcal{L} = -\frac{1}{N} \sum_{v} \log p\left(c_v \mid v; \theta\right)$$

where $v$ denotes any feature vector in the training samples (drawn from $Z \cup x \cup y$, $N$ in total), $c_v$ its corresponding label, and $\theta$ the classification model parameters.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110459160.2A CN112990371B (en) | 2021-04-27 | 2021-04-27 | Unsupervised night image classification method based on feature amplification |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112990371A | 2021-06-18 |
CN112990371B | 2021-09-10 |
Family
ID=76340379
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110459160.2A Active CN112990371B (en) | 2021-04-27 | 2021-04-27 | Unsupervised night image classification method based on feature amplification |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112990371B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105608433A (en) * | 2015-12-23 | 2016-05-25 | 北京化工大学 | Nuclear coordinated expression-based hyperspectral image classification method |
CN108764281A (en) * | 2018-04-18 | 2018-11-06 | 华南理工大学 | A kind of image classification method learning across task depth network based on semi-supervised step certainly |
CN110348399A (en) * | 2019-07-15 | 2019-10-18 | 中国人民解放军国防科技大学 | EO-1 hyperion intelligent method for classifying based on prototype study mechanism and multidimensional residual error network |
CN111814871A (en) * | 2020-06-13 | 2020-10-23 | 浙江大学 | Image classification method based on reliable weight optimal transmission |
CN112016392A (en) * | 2020-07-17 | 2020-12-01 | 浙江理工大学 | Hyperspectral image-based small sample detection method for soybean pest damage degree |
CN112434723A (en) * | 2020-07-23 | 2021-03-02 | 之江实验室 | Day/night image classification and object detection method based on attention network |
Non-Patent Citations (4)
Title |
---|
Hang Gao et al.: "Low-shot Learning via Covariance-Preserving Adversarial Augmentation Networks", https://arxiv.org/abs/1810.11730 * |
Xu Yifeng et al.: "Learning to See in Extremely Low-Light Environments with Small Data", https://www.proquest.com/openview/ac68d120e78fcc554a08f3c465b3300e/1?pq-origsite=gscholar&cbl=2032404 * |
Yuxuan Xiao et al.: "Making of Night Vision: Object Detection Under Low-Illumination", IEEE Access * |
Liu Xiaobao et al.: "Fine surface defects of workpieces based on super-resolution feature fusion", https://kns.cnki.net/kcms/detail/11.5946.TP.20210129.1755.004.html * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113657561A (en) * | 2021-10-20 | 2021-11-16 | 之江实验室 | Semi-supervised night image classification method based on multi-task decoupling learning |
CN113657561B (en) * | 2021-10-20 | 2022-03-18 | 之江实验室 | Semi-supervised night image classification method based on multi-task decoupling learning |
CN113989597A (en) * | 2021-12-28 | 2022-01-28 | 中科视语(北京)科技有限公司 | Vehicle weight recognition method and device, electronic equipment and storage medium |
CN113989597B (en) * | 2021-12-28 | 2022-04-05 | 中科视语(北京)科技有限公司 | Vehicle weight recognition method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN112990371B (en) | 2021-09-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112381116B (en) | Self-supervised image classification method based on contrastive learning | |
CN113378632B (en) | Pseudo-label optimization-based unsupervised domain adaptive pedestrian re-identification method | |
CN111126386B (en) | Sequence domain adaptation method based on adversarial learning in scene text recognition | |
CN108564129B (en) | Trajectory data classification method based on generative adversarial networks | |
EP3767536A1 (en) | Latent code for unsupervised domain adaptation | |
CN109768985A (en) | Intrusion detection method based on traffic visualization and machine learning algorithms | |
CN108229550B (en) | Cloud picture classification method based on multi-granularity cascade forest network | |
CN107194418B (en) | Rice aphid detection method based on adversarial feature learning | |
CN109218223B (en) | Robust network traffic classification method and system based on active learning | |
CN112990371B (en) | Unsupervised night image classification method based on feature amplification | |
CN113408605A (en) | Hyperspectral image semi-supervised classification method based on small sample learning | |
CN114120041B (en) | Small-sample classification method based on dual-adversarial variational autoencoders | |
CN105469080B (en) | Facial expression recognition method | |
CN110460605A (en) | Abnormal network traffic detection method based on autoencoders | |
CN105320967A (en) | Multi-label AdaBoost ensemble method based on label correlation | |
CN111488917A (en) | Garbage image fine-grained classification method based on incremental learning | |
CN112669161B (en) | Financial risk control system based on blockchain, public opinion and core algorithms | |
CN113541834B (en) | Abnormal signal semi-supervised classification method and system and data processing terminal | |
CN105306296A (en) | Data filter processing method based on LTE (Long Term Evolution) signaling | |
CN114006870A (en) | Network traffic identification method based on a self-supervised convolutional subspace clustering network | |
CN114926680A (en) | Malicious software classification method and system based on AlexNet network model | |
CN114818963B (en) | Small sample detection method based on cross-image feature fusion | |
CN114255371A (en) | Small sample image classification method based on component supervision network | |
Zhang et al. | Multi-weather classification using evolutionary algorithm on EfficientNet | |
CN116647844A (en) | Vehicle-mounted network intrusion detection method based on stacking integration algorithm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||