CN110210616A - A zero-shot learning classification algorithm with category-representation-guided feature selection - Google Patents

A zero-shot learning classification algorithm with category-representation-guided feature selection

Info

Publication number
CN110210616A
Authority
CN
China
Prior art keywords
category
sample
feature
training
feature selection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910330421.3A
Other languages
Chinese (zh)
Inventor
廖昌粟
苏荔
黄庆明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Chinese Academy of Sciences
Original Assignee
University of Chinese Academy of Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Chinese Academy of Sciences
Priority to CN201910330421.3A
Publication of CN110210616A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a zero-shot learning classification algorithm with category-representation-guided feature selection. The invention first uses a neural-network autoencoder structure to train, simultaneously, the mapping from training-sample visual features to category semantics and its inverse mapping, computing a loss between samples and categories in the category semantic space. It then defines an additional feature selection layer: for each batch used in network training, the corresponding category representations are mapped back into the sample visual-feature space, the difference between them and the sample representations is taken, the mean of all the differences is computed, and the result is passed through a sigmoid function to obtain a feature-saliency distribution, against which the values of the feature selection layer are penalized by a loss. The model is trained using these two loss functions. The feature selection layer proposed by the present invention, together with its training method, can select features that benefit zero-shot learning classification, thereby achieving better zero-shot classification performance.

Description

A zero-shot learning classification algorithm with category-representation-guided feature selection
Technical field
The present invention relates to the technical field of artificial intelligence, and more particularly to a zero-shot learning classification algorithm with category-representation-guided feature selection.
Background art
As is well known, the successful application of deep learning to a variety of machine learning tasks has attracted more and more researchers to use deep learning to solve problems in daily life. However, the outstanding performance of deep-learning-based models tends to rely on massive amounts of labeled data, and in many applications where data are limited (such as hazardous-event detection) it is difficult to reach a usable level. Zero-shot learning methods are therefore needed, which recognize test-set categories through transferable information while using only data from the training-set categories. Most existing zero-shot learning classification algorithms use deep neural networks to extract features from samples and directly establish a mapping between samples and category representations, without purposefully selecting sample features, so the final classification performance is poor.
Summary of the invention
To solve the above technical problems, the present invention provides an algorithm capable of performing feature selection, which supplies better sample features for the sample classification of zero-shot learning and thus achieves better zero-shot classification performance: a zero-shot learning classification algorithm with category-representation-guided feature selection.
The zero-shot learning classification algorithm with category-representation-guided feature selection of the present invention mainly comprises the following steps:
Step a:
Extract features from the sample data using a deep neural network to obtain training data and test data;
Step b:
Train a neural network model in the form of a basic autoencoder, establishing the connection between samples and their corresponding categories and realizing classification;
Step c:
In the model of step b, before the training samples are mapped into the semantic embedding space, element-wise multiply them by a feature selection layer mask to perform feature selection, then map them into the semantic embedding space and compute a loss against the matrix S_s composed of the training category semantic representations;
Step d:
Map the category semantic representations corresponding to the training data back into the sample feature space using the model trained in step b, and operate on them together with the training data to obtain the target mask_o; form a mask loss that guides the generation of the feature selection layer mask in step c, and train the model with all the losses combined;
Step e:
At test time, pass the test samples through the mask layer and map them into the category semantic space to obtain the corresponding categories.
In the zero-shot learning classification algorithm with category-representation-guided feature selection of the present invention, in step c, the dimension of the feature selection layer mask is consistent with that of a single sample x, namely R^{1×l}. If the training samples input at each step have dimension R^{b×l}, the mask is replicated b times to form a new tensor mask_x ∈ R^{b×l}, which is element-wise multiplied with the training samples to obtain the feature-selected samples; these are mapped into the semantic embedding space using the model learned in step b, and a loss is computed against the matrix S_s composed of the training category semantic representations.
In the zero-shot learning classification algorithm with category-representation-guided feature selection of the present invention, in step d, for the data input into the model at each training step, the corresponding category semantic representations are mapped back into the sample feature space using the model learned in step b; the difference between them and the samples is taken, the mean is taken along dimension 0, and the result is passed through a sigmoid function to obtain a feature-saliency distribution mask_o. A loss is then computed between mask_o and the feature selection layer mask, and the model is trained in combination with the loss in step c.
Compared with the prior art, the beneficial effects of the invention are as follows. The invention first uses a neural-network autoencoder structure to train, simultaneously, the mapping from training-sample visual features to category semantics and its inverse mapping, computing a loss between samples and categories in the category semantic space. It defines an additional feature selection layer: for each batch used in network training, the corresponding category representations are mapped back into the sample visual-feature space, the difference between them and the sample representations is taken, the mean of all the differences is computed, and the result is passed through a sigmoid function to obtain a feature-saliency distribution, against which the values of the feature selection layer are penalized by a loss. Training with all the losses combined yields the final model: an algorithm capable of performing feature selection, which supplies better sample features for the sample classification of zero-shot learning and achieves better zero-shot classification performance.
Description of the drawings
Fig. 1 is the flow chart of the present invention.
Detailed description of the embodiments
The embodiments of the present invention are described in further detail below with reference to specific examples. The following examples serve to illustrate the present invention but are not intended to limit its scope.
The zero-shot learning classification algorithm with category-representation-guided feature selection of the present invention mainly comprises the following steps:
Step a:
Extract features from the sample data using a deep convolutional neural network to obtain training data and test data; the training set has n_1 samples, the test set has n_2 samples, and the dimension of each sample is 1 × l.
This part extracts features from the picture sample data using a deep convolutional neural network, which is currently the generally acknowledged best approach to feature extraction from image data and yields very effective sample features.
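As an illustration only, the following is a minimal PyTorch sketch of this feature-extraction step; the choice of a pretrained GoogLeNet backbone from torchvision (giving l = 1024), the preprocessing pipeline, and the function name extract_features are assumptions of the sketch, not prescribed by the invention.

import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Pretrained GoogLeNet with its classifier removed: the 1024-dimensional
# average-pooled activations serve as the sample features (so l = 1024).
backbone = models.googlenet(weights=models.GoogLeNet_Weights.IMAGENET1K_V1)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(image_paths):
    # Returns an (n, 1024) matrix; each row is one 1 x l sample feature.
    batch = torch.stack([preprocess(Image.open(p).convert("RGB"))
                         for p in image_paths])
    return backbone(batch)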
Step b:
Train a neural network model in the form of a basic autoencoder. Let the mapping from the sample space to the category semantic space be FC_e, and the inverse mapping be FC_d. The loss function (1) for training the autoencoder is then:
Loss = entropy_loss(FC_e(X) · S_s^T, Y_s) + MSE(FC_d(FC_e(X)), X)    (1)
where X denotes the input training samples; S_s ∈ R^{b×m} denotes the set of training category semantic representations, with b training categories in total, each category being a 1 × m matrix; Y_s denotes the category labels corresponding to the training-set samples, in one-hot form; T denotes matrix transposition; entropy_loss(·) denotes the cross-entropy loss; and MSE(·) denotes the mean-squared-error loss.
The detailed procedure behind formula (1) is: after the samples are mapped into the category semantic space, classification training is performed with the cross-entropy loss; the samples are then mapped back into the sample space, and a mean-squared-error loss is computed against the original samples. These steps train a neural network model in the form of a basic autoencoder.
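For concreteness, a minimal PyTorch sketch of this basic autoencoder follows; using a single linear layer for each of FC_e and FC_d, and integer class indices rather than one-hot labels for the cross-entropy term, are assumptions of the sketch.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticAutoencoder(nn.Module):
    # Maps sample features (dimension l) into the category semantic
    # space (dimension m) via FC_e and back via FC_d, as in step b.
    def __init__(self, l, m):
        super().__init__()
        self.fc_e = nn.Linear(l, m)   # sample space -> semantic space
        self.fc_d = nn.Linear(m, l)   # semantic space -> sample space

    def forward(self, x):
        z = self.fc_e(x)
        return z, self.fc_d(z)

def autoencoder_loss(model, x, y, S_s):
    # Loss (1): cross-entropy over the compatibility scores z @ S_s^T,
    # plus a mean-squared reconstruction loss.
    # x: (b, l) samples, y: (b,) class indices, S_s: (num_cls, m).
    z, x_rec = model(x)
    logits = z @ S_s.t()
    return F.cross_entropy(logits, y) + F.mse_loss(x_rec, x)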
Step c:
In the model of step b, before the training samples are mapped into the semantic embedding space, element-wise multiply them by a feature selection layer mask to first perform feature selection; then map them into the semantic embedding space, multiply the result by the matrix composed of the correspondingly normalized training category semantic representations, and compute a cross-entropy loss between the result and the corresponding labels Y_s. Specifically, the dimension of the feature selection layer mask is consistent with that of a single sample x, namely a tensor in R^{1×l}. If the training samples input at each step have dimension R^{b×l}, the mask is replicated b times to form a new tensor mask_x ∈ R^{b×l}, which is element-wise multiplied with the training samples to obtain the feature-selected samples; these are mapped into the semantic embedding space using the model learned in step b, multiplied by the correspondingly normalized category semantic representations, and a cross-entropy loss is computed between the result and the corresponding labels Y_s.
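A minimal PyTorch sketch of this feature selection layer and the step-c loss, building on the SemanticAutoencoder sketch above; initializing the mask to ones and the name S_norm for the normalized semantic matrix are assumptions. Broadcasting the 1 × l mask over the batch is equivalent to replicating it into mask_x.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureSelectionLayer(nn.Module):
    # The learnable 1 x l mask of step c; broadcasting over the batch
    # is equivalent to replicating it b times into mask_x in R^{b x l}.
    def __init__(self, l):
        super().__init__()
        self.mask = nn.Parameter(torch.ones(1, l))

    def forward(self, x):
        return self.mask * x   # element-wise product with each sample

def step_c_loss(fs_layer, model, x, y, S_norm):
    # Cross-entropy of the masked samples against the normalized
    # training-category semantic matrix, as described in step c.
    x_sel = fs_layer(x)                 # feature-selected samples
    z, _ = model(x_sel)                 # map into the semantic space
    logits = z @ S_norm.t()
    return F.cross_entropy(logits, y)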
Step d:
Map the category semantic representations corresponding to the training data back into the sample feature space using the model trained in step b, and operate on them together with the training data to obtain the target mask_o; form a mask loss that guides the generation of the feature selection layer mask in step c, and train the model by combining the cross-entropy loss of step c with the mask loss. In detail, for the data input into the model at each training step, the corresponding category semantic representations are mapped back into the sample feature space using the model learned in step b; the difference between them and the samples is taken, the mean is taken along dimension 0, and the result is passed through a sigmoid function to obtain a feature-saliency distribution mask_o. A mean-squared-error loss is then computed between mask_o and the feature selection layer mask, and the model is trained by combining it with the cross-entropy loss of step c. The objective function is shown in formula (2):
Loss = entropy_loss(FC_e(mask ⊙ X) · S̃_s^T, Y_s) + MSE(mask, mask_o)    (2)
where ⊙ denotes the element-wise product, S̃_s denotes the matrix of normalized training category semantic representations, and mask_o = sigmoid(mean_0(FC_d(S_B) − X_B)), with S_B and X_B the category semantic representations and samples of the current batch.
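A sketch of the target-mask computation and the combined objective (2), reusing the sketches above; treating mask_o as a fixed (non-backpropagated) target and summing the two losses without weights are assumptions, since the invention specifies neither.

import torch
import torch.nn.functional as F

def target_mask(model, s_batch, x_batch):
    # Step d: map the batch's category semantics back to the sample
    # feature space with FC_d, subtract the samples, average along
    # dimension 0, and apply a sigmoid to obtain mask_o.
    with torch.no_grad():               # treated as a fixed target here
        x_back = model.fc_d(s_batch)    # (b, l)
    return torch.sigmoid((x_back - x_batch).mean(dim=0, keepdim=True))

def total_loss(fs_layer, model, x, y, s_batch, S_norm):
    # Objective (2): step-c cross-entropy plus the MSE mask loss.
    mask_o = target_mask(model, s_batch, x)
    ce = step_c_loss(fs_layer, model, x, y, S_norm)
    return ce + F.mse_loss(fs_layer.mask, mask_o)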
Step e:
At test time, pass the test samples through the mask layer and map them into the category semantic space to obtain the corresponding predicted category representations. For each test-set sample, find the nearest-neighbor category semantic representation in that space; the category to which it belongs is taken as the sample's predicted category.
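A minimal sketch of this nearest-neighbor prediction step, again reusing the sketches above; Euclidean distance is an assumption, as the invention does not name the distance metric.

import torch

@torch.no_grad()
def classify(fs_layer, model, x_test, S_test):
    # Step e: mask the test samples, map them into the semantic space,
    # and assign each to the nearest test-category semantic vector.
    # S_test: (num_test_cls, m) semantics of the unseen test categories.
    z, _ = model(fs_layer(x_test))      # (n, m) predicted semantics
    dists = torch.cdist(z, S_test)      # (n, num_test_cls) distances
    return dists.argmin(dim=1)          # index of the nearest category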
Experiments and results
The present invention is evaluated on the CUB-200-2011 Birds (CUB) dataset [1], which consists entirely of bird pictures: 200 classes in total, of which 150 classes form the training set and 50 classes form the test set; the category semantics are 312-dimensional, and there are 11,788 pictures.
Features are extracted from the dataset's picture samples using GoogLeNet [2]. An accuracy of 60.7% is finally obtained on the test set, an increase of 2.6 percentage points over the 58.1% achieved by the basic autoencoder model without the mask layer.
The relevant literature is listed below:
[1] Catherine Wah, Steve Branson, Peter Welinder, Pietro Perona, and Serge Belongie, "The Caltech-UCSD Birds-200-2011 dataset," California Institute of Technology, 2011.
[2] Christian Szegedy, Wei Liu, Yangqing Jia, Pierre Sermanet, Scott Reed, Dragomir Anguelov, Dumitru Erhan, Vincent Vanhoucke, and Andrew Rabinovich, "Going deeper with convolutions," in CVPR, 2015, pp. 1–9.
The above is only a preferred embodiment of the present invention. It should be noted that those of ordinary skill in the art may make several improvements and modifications without departing from the technical principles of the invention, and such improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (3)

1. A zero-shot learning classification algorithm with category-representation-guided feature selection, characterized in that it mainly comprises the following steps:
Step a:
extract features from the sample data using a deep neural network to obtain training data and test data;
Step b:
train a neural network model in the form of a basic autoencoder, establishing the connection between samples and their corresponding categories and realizing classification;
Step c:
in the model of step b, before the training samples are mapped into the semantic embedding space, element-wise multiply them by a feature selection layer mask to perform feature selection, then map them into the semantic embedding space and compute a loss against the matrix S_s composed of the training category semantic representations;
Step d:
map the category semantic representations corresponding to the training data back into the sample feature space using the model trained in step b, and operate on them together with the training data to obtain the target mask_o; form a mask loss that guides the generation of the feature selection layer mask in step c, and train the model with all the losses combined;
Step e:
at test time, pass the test samples through the mask layer and map them into the category semantic space to obtain the corresponding categories.
2. The zero-shot learning classification algorithm with category-representation-guided feature selection according to claim 1, characterized in that, in step c, the dimension of the feature selection layer mask is consistent with that of a single sample x, namely R^{1×l}; if the training samples input at each step have dimension R^{b×l}, the mask is replicated b times to form a new tensor mask_x ∈ R^{b×l}, which is element-wise multiplied with the training samples to obtain the feature-selected samples; these are mapped into the semantic embedding space using the model learned in step b, and a loss is computed against the matrix S_s composed of the training category semantic representations.
3. The zero-shot learning classification algorithm with category-representation-guided feature selection according to claim 1, characterized in that, in step d, for the data input into the model at each training step, the corresponding category semantic representations are mapped back into the sample feature space using the model learned in step b; the difference between them and the samples is taken, the mean is taken along dimension 0, and the result is passed through a sigmoid function to obtain a feature-saliency distribution mask_o; a loss is then computed between mask_o and the feature selection layer mask, and the model is trained in combination with the loss in step c.
CN201910330421.3A 2019-04-23 2019-04-23 A zero-shot learning classification algorithm with category-representation-guided feature selection Pending CN110210616A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910330421.3A CN110210616A (en) 2019-04-23 2019-04-23 A zero-shot learning classification algorithm with category-representation-guided feature selection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910330421.3A CN110210616A (en) 2019-04-23 2019-04-23 A zero-shot learning classification algorithm with category-representation-guided feature selection

Publications (1)

Publication Number Publication Date
CN110210616A true CN110210616A (en) 2019-09-06

Family

ID=67786321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910330421.3A Pending CN110210616A (en) 2019-04-23 2019-04-23 A zero-shot learning classification algorithm with category-representation-guided feature selection

Country Status (1)

Country Link
CN (1) CN110210616A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112465035A (en) * 2020-11-30 2021-03-09 上海寻梦信息技术有限公司 Logistics distribution task allocation method, system, equipment and storage medium
CN113762005A (en) * 2020-11-09 2021-12-07 北京沃东天骏信息技术有限公司 Method, device, equipment and medium for training feature selection model and classifying objects



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20190906)