CN103336974B - A flower classification method based on locality-constrained sparse representation - Google Patents

A flower classification method based on locality-constrained sparse representation

Info

Publication number
CN103336974B
CN103336974B
Authority
CN
China
Prior art keywords
flowers
feature
classification
test data
pictures
Prior art date
Legal status
Active
Application number
CN201310250693.5A
Other languages
Chinese (zh)
Other versions
CN103336974A (en)
Inventor
郭礼华
Current Assignee
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201310250693.5A priority Critical patent/CN103336974B/en
Publication of CN103336974A publication Critical patent/CN103336974A/en
Application granted granted Critical
Publication of CN103336974B publication Critical patent/CN103336974B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a flower classification method based on locality-constrained sparse representation. First, a flower image database is collected and a flower popular-science knowledge base is built; then multiple image features are extracted from each flower picture. A linear representation of the test data features in terms of the training data features is then established using sparse coding theory. When modeling this linear representation, the method not only minimizes the error between the test image features and their linear representation over the training image feature set, but also adds a weight constraint that captures the local feature-similarity structure between the test data features and the training data features; the model is solved efficiently with a kernel-extended stochastic gradient descent method, completing the flower classification learning process. Finally, the features extracted from the flower image to be recognized are substituted into the flower category decision formula to obtain the recognition result, and the textual description corresponding to that flower category is retrieved from the flower popular-science knowledge base. The invention has the advantage of high recognition performance.

Description

A flower classification method based on locality-constrained sparse representation
Technical field
The present invention relates to the fields of pattern recognition and artificial intelligence, and in particular to a flower classification method based on locality-constrained sparse representation.
Background technology
Flower classification means using a computer to extract features from flower image information, after which the computer sorts and understands the flower category in a way that mirrors human understanding and reasoning, and can then provide the user with popular-science knowledge about the flower; it belongs to the field of automatic object recognition by computer. At present there are relatively few patents on flower classification, and a small number of papers have been published in academia. For example, the paper (Yuning Chai, Victor Lempitsky, Andrew Zisserman. BiCoS: A Bi-level Co-Segmentation Method for Image Classification. ICCV, 2011) uses an image segmentation method to divide the image into foreground and background, then extracts the color distribution and super-pixel information of the image and performs recognition with an effective inference algorithm. The paper (Nilsback, M-E. and Zisserman, A. Automated flower classification over a large number of classes. Proceedings of the Indian Conference on Computer Vision, Graphics and Image Processing, 2008) extracts three kinds of features, the color histogram, SIFT features and HOG features, and then classifies the flower category with an SVM classifier. The paper (Nilsback, M-E. and Zisserman, A. A Visual Vocabulary for Flower Classification. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2006) uses BOW features to classify flower images.
Summary of the invention
In order to overcome the above-mentioned shortcomings and deficiencies of the prior art, the object of the present invention is to provide a flower classification method based on locality-constrained sparse representation with high recognition performance.
The object of the present invention is achieved through the following technical solution: a flower classification method based on locality-constrained sparse representation, comprising the following steps:
(1) Collect a flower image database and build a flower popular-science knowledge base: using the names of S common flower categories defined in an existing plant-and-flower wiki, a web search engine is used to retrieve the textual description and pictures corresponding to each flower category; the pictures form the flower image database and the textual descriptions are stored in the flower popular-science knowledge base;
(2) Perform feature extraction on all pictures in the flower image database obtained in step (1); the number of features extracted from each picture is m;
(3) Flower classification learning process:
(3-1) Select p pictures from the flower image database and take their features as the test data set Y = {y^k}, where k = 1...m indexes the m features extracted from each picture, p < N, and N is the number of pictures in the flower image database; the features of the remaining N - p pictures in the flower image database form the training data set X_j^k, where j = 1, 2, ..., S indexes the flower category;
(3-2) Express the test data feature y^k linearly in terms of the training data features X_j^k as y^k ≈ Σ_{j=1}^{S} X_j^k w_j^k, where w_j^k is the weight coefficient with which the training data set represents the test data set, and its value is greater than 0;
(3-3) With the goal of minimizing the linear-representation error between the training data set and the test data set, while also adding a weight constraint on the local feature-similarity structure between the test data features and the training data features, establish the following optimization learning criterion:
$$\min_{w_j^k}\ \frac{1}{2}\sum_{k=1}^{m}\Big\|y^k-\sum_{j=1}^{S}X_j^k w_j^k\Big\|_2^2+\lambda\sum_{j=1}^{S}\Big\|D_j^k\,\Theta\,w_j^k\Big\|^2 \qquad (1)$$
where Θ denotes the element-wise (dot) product; λ is the constraint-term weight, a constant that balances the linear-representation error against the weight-coefficient constraint; and D_j^k is the Euclidean distance between the test data feature y^k and the training data features X_j^k;
(3-4) The weight coefficients w_j^k are iteratively updated with stochastic gradient descent (SGD); the update equation is:
$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-X_j^k y^k+X_j^k X_j^k w_j^{k,t}+\lambda\big\|y^k-X_j^k\big\|_2^2\cdot w_j^{k,t} \qquad (2)$$
where t is the iteration index of the SGD process and η is the learning rate of the stochastic gradient descent iterations;
(3-5) A nonlinear function φ is used to nonlinearly map the training and test data features into a high-dimensional reproducing kernel Hilbert space (RKHS), i.e. φ(x_i)^T φ(x_j) = g(x_i, x_j), where g(x_i, x_j) is a χ² kernel function and x_i, x_j are data features; formula (2) is then transformed into:
$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-h^k+G^k w_j^{k,t}+\lambda\big(P^k-2h^k+G^k\big)\cdot w_j^{k,t} \qquad (3)$$
where h^k = φ(X_j^k)^T φ(y^k) is the kernel (dot product) of the training data feature X_j^k with the test data feature y^k, G^k = φ(X_j^k)^T φ(X_j^k) is the kernel of the training data feature with itself, and P^k = φ(y^k)^T φ(y^k) is the kernel of the test data feature y^k with itself; after multiple iterations of formula (3), the optimal representation weight coefficients w_j^k are obtained;
(4) Flower recognition: the user photographs the flower to be recognized and features Z^k are extracted from the flower image, where k = 1...m indexes the m features extracted from each picture;
Based on Z^k, the flower category is identified with the flower category decision formula, and the textual description corresponding to that flower category is retrieved from the flower popular-science knowledge base;
where the flower category decision formula is:
where j* denotes the category j whose training data give the minimum linear-representation error for the test data; through this minimum selection, j is the recognized flower category.
In step (3-4), η decreases as the iteration count t increases, η = 1/(1 + 100t).
The expression of the χ² kernel function described in step (3-5) is exp(-χ²(x_i, x_j)/μ), where χ² is the symmetric chi-squared distance and μ is the mean χ² distance over the current training data set.
The use in step (1) of the names of S common flower categories defined in an existing plant-and-flower wiki, with a web search engine used to retrieve the textual description and pictures corresponding to each flower category, is specifically: using the names of 313 common herbaceous flower categories defined in the existing plant-and-flower wiki, a web search engine is used to retrieve the textual description and pictures corresponding to each flower category, with 100 pictures downloaded for each flower category.
The features extracted from each picture include the color histogram, SIFT features (scale-invariant feature transform), HOG features (histogram of oriented gradients), BOW features (bag-of-words, dictionary-bag features), SSIM features (structural similarity features), and GB features (geometric blur features).
Compared with the prior art, the present invention has the following advantages and beneficial effects:
(1) In the flower classification learning process, the optimization learning criterion not only considers the principle of minimizing the current linear-representation error, but also adds a constraint term, modeled on the locally similar feature structure of the images, to the learning criterion. Thus, when an image is linearly represented, the system preferentially selects the most similar image features for the linear representation, which keeps the linear coefficients sparse while also improving the recognition performance of the system.
(2) The framework of the present invention can seamlessly incorporate more image features, which makes future system upgrades convenient, and additional discriminative image features can further improve the recognition performance of the system.
(3) The recognition method of the present invention can readily be applied in a practical flower popular-science system; because the recognition model of the present invention has high recognition performance, the robustness and stability of the flower popular-science system are ensured.
Brief description of the drawings
Fig. 1 is the flowchart of the flower classification method based on locality-constrained sparse representation of the present embodiment.
Fig. 2 is the overall flowchart of the stochastic gradient descent method of the present embodiment.
Detailed description of the invention
The present invention is described in further detail below in conjunction with an embodiment, but the embodiments of the present invention are not limited thereto.
Embodiment
As shown in Fig. 1, the flower classification method based on locality-constrained sparse representation of the present embodiment comprises the following steps:
(1) Collect a flower image database and build a flower popular-science knowledge base: using the names of 313 common herbaceous flower categories defined in an existing plant-and-flower wiki, web search engines such as *** and Baidu are used to retrieve the textual description and pictures corresponding to each flower category, with 100 pictures downloaded for each flower category. The pictures form the flower image database; the textual descriptions are stored in the flower popular-science knowledge base.
(2) Perform feature extraction on all pictures in the flower image database obtained in step (1); 6 features are extracted from each picture, including the color histogram, SIFT features (implementation details in Lowe, D.G., Distinctive Image Features from Scale-Invariant Keypoints, International Journal of Computer Vision, 60, 2, pp. 91-110, 2004), HOG features (implementation details in N. Dalal and B. Triggs, Histograms of oriented gradients for human detection, Computer Vision and Pattern Recognition, pp. 887-893, IEEE, 2005), BOW features (implementation details in S. Lazebnik, C. Schmid, and J. Ponce, Beyond bags of features: Spatial pyramid matching for recognizing natural scene categories, Computer Vision and Pattern Recognition, pp. 2169-2178, IEEE, 2006), SSIM features (implementation details in E. Shechtman and M. Irani, Matching local self-similarities across images and videos, Proc. CVPR, 2007), and GB features (implementation details in A.C. Berg, T.L. Berg, and J. Malik, Shape matching and object recognition using low distortion correspondences, Proc. CVPR, 2005). An illustrative feature-extraction sketch is given below.
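The following Python sketch is provided for illustration only and is not part of the claimed method; it assumes OpenCV and scikit-image as feature libraries, the helper name extract_features is invented here, and the BOW, SSIM and GB stages are omitted.

    # Illustrative sketch of per-picture feature extraction; library choices are assumptions.
    import cv2
    import numpy as np
    from skimage.feature import hog

    def extract_features(image_bgr):
        """Return a dict holding a subset of the 6 per-picture features."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)

        # Color histogram: 32 bins per BGR channel, L1-normalized.
        hists = [cv2.calcHist([image_bgr], [c], None, [32], [0, 256]).ravel()
                 for c in range(3)]
        color_hist = np.concatenate(hists)
        color_hist /= color_hist.sum() + 1e-12

        # HOG descriptor on a fixed-size grayscale version of the picture.
        hog_vec = hog(cv2.resize(gray, (128, 128)), orientations=9,
                      pixels_per_cell=(16, 16), cells_per_block=(2, 2))

        # SIFT keypoint descriptors (OpenCV >= 4.4); BOW/SSIM/GB features would be
        # built from local descriptors in a similar way.
        _, sift_desc = cv2.SIFT_create().detectAndCompute(gray, None)

        return {"color_hist": color_hist, "hog": hog_vec, "sift": sift_desc}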
(3) Flower classification learning process:
(3-1) Select 30 pictures from the flower image database and take their features as the test data set Y = {y^k}, where k = 1...6 indexes the 6 image features: color histogram, SIFT, HOG, BOW, SSIM and GB features; the features of the remaining 31270 pictures in the flower image database form the training data set X_j^k, where j = 1...313 indexes the flower category;
(3-2) Express the test data feature y^k linearly in terms of the training data features X_j^k as y^k ≈ Σ_{j=1}^{S} X_j^k w_j^k, where w_j^k is the weight coefficient with which the training data set represents the test data set, and its value is greater than 0; a minimal sketch of this representation is given below.
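Purely as a hedged illustration of the linear representation in step (3-2), the NumPy sketch below reconstructs a test feature from per-class training matrices with non-negative weights; the array layout and the names reconstruct, X and w are assumptions made for clarity.

    import numpy as np

    def reconstruct(y_k, X, w):
        """y_k: test feature vector; X: list of per-class training matrices
        (feature_dim x n_j); w: list of per-class weight vectors."""
        approx = np.zeros_like(y_k)
        for X_j, w_j in zip(X, w):
            approx += X_j @ np.maximum(w_j, 0.0)    # weights kept non-negative (text requires > 0)
        error = np.linalg.norm(y_k - approx) ** 2   # squared representation error
        return approx, error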
(3-3) With the goal of minimizing the linear-representation error between the training data set and the test data set, while also adding a weight constraint on the local feature-similarity structure between the test data features and the training data features, establish the following optimization learning criterion:
$$\min_{w_j^k}\ \frac{1}{2}\sum_{k=1}^{m}\Big\|y^k-\sum_{j=1}^{S}X_j^k w_j^k\Big\|_2^2+\lambda\sum_{j=1}^{S}\Big\|D_j^k\,\Theta\,w_j^k\Big\|^2 \qquad (1)$$
where Θ denotes the element-wise (dot) product; λ is the constraint-term weight, a constant that balances the linear-representation error against the weight-coefficient constraint (set to 0.01 in the present embodiment); D_j^k describes the locality between the test data feature y^k and the training data features similar to it, and is specifically defined as the Euclidean distance between the test data feature y^k and the training data features X_j^k. Under this optimization learning criterion, similar test features are guaranteed to select similar training features for the linear representation. To keep the final linear-representation coefficients sparse, weight coefficients w_j^k below a threshold (0.005 in the present embodiment) are typically set to 0. A sketch of how criterion (1) could be evaluated is given below.
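As a sketch only, criterion (1) could be evaluated as follows; the nested-list layout of Y, X and W, the summation of both terms over the m features, and the default lam=0.01 are assumptions that merely echo this embodiment.

    import numpy as np

    def objective(Y, X, W, lam=0.01):
        """Criterion (1). Y[k]: test feature k; X[j][k]: class-j training matrix
        for feature k (feature_dim x n_j); W[j][k]: its weight vector."""
        total = 0.0
        for k, y_k in enumerate(Y):
            resid = y_k.copy()
            locality = 0.0
            for j in range(len(X)):
                X_jk, w_jk = X[j][k], W[j][k]
                resid -= X_jk @ w_jk
                # D_j^k: Euclidean distance from y_k to every training column.
                D_jk = np.linalg.norm(X_jk - y_k[:, None], axis=0)
                locality += np.sum((D_jk * w_jk) ** 2)   # ||D_j^k Θ w_j^k||^2
            total += 0.5 * np.dot(resid, resid) + lam * locality
        return total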
(3-4) The weight coefficients w_j^k are iteratively updated with the stochastic gradient descent method (abbreviated SGD); the update equation is:
$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-X_j^k y^k+X_j^k X_j^k w_j^{k,t}+\lambda\big\|y^k-X_j^k\big\|_2^2\cdot w_j^{k,t} \qquad (2)$$
where t is the iteration index of the SGD process and η is the learning rate of the stochastic gradient descent iterations; to ensure that the learning criterion converges effectively, the present embodiment sets the learning rate η to decrease as the iteration count t increases, η = 1/(1 + 100t).
The main flow of the stochastic gradient descent method is shown in Fig. 2: first initialize the weight coefficients w_j^k, then randomly select a subset of the training data set X_j^k and compute the kernel functions G^k and h^k between that training subset and the test data; next, for i = 1, ..., n, perform n iterations of the weight-coefficient update using formula (3). After the iterations are complete, compute the value of the optimization objective function of criterion (1); if it is smaller than the value from the previous iteration round, the training data set is randomized again and the next iteration cycle is carried out; otherwise the iterations have found the optimal weight coefficients, the iterations terminate, and the optimal linear-representation weight coefficients are output; a sketch of this loop is given below.
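The Fig. 2 loop could be sketched as follows using the linear (non-kernel) gradient of formula (2) for simplicity; the kernelized update of formula (3) is sketched after step (3-5). The subset size, the stopping test and the helper name sgd_fit are assumptions, and transposes are added so that the array shapes are consistent.

    import numpy as np

    def sgd_fit(y_k, X_jk, lam=0.01, n_inner=50, eta0=1.0, max_rounds=20, seed=0):
        """Fit the weights for one class j and one feature k (sketch of Fig. 2)."""
        rng = np.random.default_rng(seed)
        n = X_jk.shape[1]
        w = np.full(n, 1.0 / n)                         # initialize weight coefficients
        best, t = np.inf, 0
        for _ in range(max_rounds):
            cols = rng.choice(n, size=min(64, n), replace=False)
            X_sub = X_jk[:, cols]                       # randomly selected training subset
            for _ in range(n_inner):                    # n inner weight updates
                eta = eta0 / (1.0 + 100.0 * t)          # decaying learning rate
                grad = (-X_sub.T @ y_k + X_sub.T @ (X_sub @ w[cols])
                        + lam * np.sum((y_k[:, None] - X_sub) ** 2, axis=0) * w[cols])
                w[cols] = w[cols] - eta * grad
                t += 1
            D = np.linalg.norm(X_jk - y_k[:, None], axis=0)
            obj = 0.5 * np.linalg.norm(y_k - X_jk @ w) ** 2 + lam * np.sum((D * w) ** 2)
            if obj >= best:                             # objective no longer decreasing
                break
            best = obj
        w[w < 0.005] = 0.0                              # sparsify small weights
        return w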
(3-5) A nonlinear function φ is used to nonlinearly map the training and test data features into a high-dimensional reproducing kernel Hilbert space (RKHS), i.e. φ(x_i)^T φ(x_j) = g(x_i, x_j), where g(x_i, x_j) is a χ² kernel function and x_i, x_j are data features; the expression of the χ² kernel function is exp(-χ²(x_i, x_j)/μ), where χ² is the symmetric chi-squared distance and μ is the mean χ² distance over the current training data set.
Formula (2) is transformed into:
$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-h^k+G^k w_j^{k,t}+\lambda\big(P^k-2h^k+G^k\big)\cdot w_j^{k,t} \qquad (3)$$
where h^k = φ(X_j^k)^T φ(y^k) is the kernel (dot product) of the training data feature X_j^k with the test data feature y^k, G^k = φ(X_j^k)^T φ(X_j^k) is the kernel of the training data feature with itself, and P^k = φ(y^k)^T φ(y^k) is the kernel of the test data feature y^k with itself; after multiple iterations of formula (3), the optimal representation weight coefficients w_j^k are obtained. A sketch of the kernelized update is given below.
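The sketch below illustrates one possible reading of the kernelized update (3): the χ² kernel gives h^k, G^k and P^k as defined above, and the bracket P^k - 2h^k + G^k is interpreted per training sample (using the diagonal of G^k, so that it equals the squared RKHS distance to each training feature). This interpretation and the helper names are assumptions.

    import numpy as np

    def chi2_kernel(A, B, mu):
        """exp(-chi2(a, b)/mu) for columns a of A (dim x nA) and b of B (dim x nB)."""
        num = (A[:, :, None] - B[:, None, :]) ** 2
        den = A[:, :, None] + B[:, None, :] + 1e-12
        return np.exp(-(num / den).sum(axis=0) / mu)

    def kernel_sgd_step(w, h_k, G_k, P_k, lam, eta):
        """One update of formula (3) in the RKHS (per-sample reading of the bracket)."""
        locality = P_k - 2.0 * h_k + np.diag(G_k)   # squared kernel-space distances
        grad = -h_k + G_k @ w + lam * locality * w
        return w - eta * grad

    # Usage sketch: X_jk (feature_dim x n) training columns, y_k test feature,
    # mu = mean chi-squared distance over the training set (as in the text).
    # G_k = chi2_kernel(X_jk, X_jk, mu)
    # h_k = chi2_kernel(X_jk, y_k[:, None], mu).ravel()
    # P_k = 1.0   # exp(0), the kernel of y_k with itself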
(4) Flower recognition: the user photographs the flower to be recognized and features Z^k are extracted from the flower image, where k = 1...6 indexes the color histogram, SIFT, HOG, BOW, SSIM and GB features respectively;
Based on Z^k, the flower category is identified with the flower category decision formula, and the textual description corresponding to that flower category is retrieved from the flower popular-science knowledge base;
where the flower category decision formula is:
where j* denotes the category j whose training data give the minimum linear-representation error for the test data; through this minimum selection, j is the recognized flower category. A sketch of this minimum-error selection is given below.
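Since the decision formula itself is not reproduced in this text, the sketch below only illustrates the minimum-error selection described above, namely choosing the class whose training features linearly represent the query features with the smallest total error; the function name classify and the nested-list layout are assumptions.

    import numpy as np

    def classify(Z, X, W):
        """Z[k]: feature k of the query image; X[j][k]: class-j training matrix for
        feature k; W[j][k]: learned weight vector. Returns the class index j* with
        the minimum summed reconstruction error over the m features."""
        errors = []
        for j in range(len(X)):
            err_j = sum(np.linalg.norm(Z[k] - X[j][k] @ W[j][k]) ** 2
                        for k in range(len(Z)))
            errors.append(err_j)
        return int(np.argmin(errors))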
The above-described embodiment is a preferred embodiment of the present invention, but the embodiments of the present invention are not limited by the described embodiment. Any change, modification, substitution, combination or simplification made without departing from the spirit and principle of the present invention shall be an equivalent substitution and shall fall within the protection scope of the present invention.

Claims (5)

1. A flower classification method based on locality-constrained sparse representation, characterized in that it comprises the following steps:
(1) Collect a flower image database and build a flower popular-science knowledge base: using the names of S common flower categories defined in an existing plant-and-flower wiki, a web search engine is used to retrieve the textual description and pictures corresponding to each flower category; the pictures form the flower image database and the textual descriptions are stored in the flower popular-science knowledge base; S is a natural number;
(2) Perform feature extraction on all pictures in the flower image database obtained in step (1); the number of features extracted from each picture is m; m is a natural number;
(3) Flower classification learning process:
(3-1) Select p pictures from the flower image database and take the features of the p pictures as the test data set Y; the test data feature y^k of each picture, with k = 1...m, represents the m features extracted from each picture; p < N, where N is the number of pictures in the flower image database;
The features of the remaining N - p pictures in the flower image database form the training data set X; the training data feature X_j^k of each picture, where j = 1, 2, ..., S represents the flower category and k = 1...m represents the m features extracted from each picture;
(3-2) Express the test data feature y^k linearly in terms of the training data features X_j^k as y^k ≈ Σ_{j=1}^{S} X_j^k w_j^k, where w_j^k is the weight coefficient with which the training data set represents the test data set, and its value is greater than 0;
(3-3) With the goal of minimizing the linear-representation error between the training data set and the test data set, while also adding a weight constraint on the local feature-similarity structure between the test data features and the training data features, establish the following optimization learning criterion:
$$\min_{w_j^k}\ \frac{1}{2}\sum_{k=1}^{m}\Big\|y^k-\sum_{j=1}^{S}X_j^k w_j^k\Big\|_2^2+\lambda\sum_{j=1}^{S}\Big\|D_j^k\,\Theta\,w_j^k\Big\|^2 \qquad (1)$$
where Θ denotes the element-wise (dot) product; λ is the constraint-term weight, a constant that balances the linear-representation error against the weight-coefficient constraint; and D_j^k is the Euclidean distance between the test data feature y^k and the training data features X_j^k;
(3-4) The weight coefficients w_j^k are iteratively updated with stochastic gradient descent (SGD); the update equation is:
$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-X_j^k y^k+X_j^k X_j^k w_j^{k,t}+\lambda\big\|y^k-X_j^k\big\|_2^2\cdot w_j^{k,t} \qquad (2)$$
where t is the iteration index of the SGD process and η is the learning rate of the stochastic gradient descent iterations;
(3-5) A nonlinear function φ is used to nonlinearly map the training and test data features into a high-dimensional reproducing kernel Hilbert space, i.e. φ(x_i)^T φ(x_j) = g(x_i, x_j), where g(x_i, x_j) is a χ² kernel function and x_i, x_j are data features; formula (2) is transformed into:
$$w_j^{k,t+1}=w_j^{k,t}-\eta\nabla_w,\qquad \nabla_w=-h^k+G^k w_j^{k,t}+\lambda\big(P^k-2h^k+G^k\big)\cdot w_j^{k,t} \qquad (3)$$
where h^k = φ(X_j^k)^T φ(y^k) is the kernel (dot product) of the training data feature X_j^k with the test data feature y^k, G^k = φ(X_j^k)^T φ(X_j^k) is the kernel of the training data feature with itself, and P^k = φ(y^k)^T φ(y^k) is the kernel of the test data feature y^k with itself; after multiple iterations of formula (3), the optimal representation weight coefficients w_j^k are obtained;
(4) Flower recognition: the user photographs the flower to be recognized and features Z^k are extracted from the flower image, where k = 1...m represents the m features extracted from each picture;
Based on Z^k, the flower category is identified with the flower category decision formula, and the textual description corresponding to that flower category is retrieved from the flower popular-science knowledge base;
where the flower category decision formula is:
where j* denotes the category j whose training data give the minimum linear-representation error for the test data; through this minimum selection, j is the recognized flower category.
2. The flower classification method based on locality-constrained sparse representation according to claim 1, characterized in that in step (3-4), η decreases as the iteration count t increases, η = 1/(1 + 100t).
3. The flower classification method based on locality-constrained sparse representation according to claim 1, characterized in that the expression of the χ² kernel function described in step (3-5) is exp(-χ²(x_i, x_j)/μ), where χ² is the symmetric chi-squared distance and μ is the mean χ² distance over the current training data set.
4. The flower classification method based on locality-constrained sparse representation according to claim 1, characterized in that the use in step (1) of the names of S common flower categories defined in an existing plant-and-flower wiki, with a web search engine used to retrieve the textual description and pictures corresponding to each flower category, is specifically: using the names of 313 common herbaceous flower categories defined in the existing plant-and-flower wiki, a web search engine is used to retrieve the textual description and pictures corresponding to each flower category, with 100 pictures downloaded for each flower category.
5. The flower classification method based on locality-constrained sparse representation according to claim 1, characterized in that the features extracted from each picture include the color histogram, SIFT features, HOG features, BOW features, SSIM features and GB features.
CN201310250693.5A 2013-06-21 2013-06-21 A flower classification method based on locality-constrained sparse representation Active CN103336974B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310250693.5A CN103336974B (en) 2013-06-21 2013-06-21 A flower classification method based on locality-constrained sparse representation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310250693.5A CN103336974B (en) 2013-06-21 2013-06-21 A flower classification method based on locality-constrained sparse representation

Publications (2)

Publication Number Publication Date
CN103336974A CN103336974A (en) 2013-10-02
CN103336974B true CN103336974B (en) 2016-12-28

Family

ID=49245131

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310250693.5A Active CN103336974B (en) 2013-06-21 2013-06-21 A flower classification method based on locality-constrained sparse representation

Country Status (1)

Country Link
CN (1) CN103336974B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106941586A (en) * 2016-01-05 2017-07-11 腾讯科技(深圳)有限公司 The method and apparatus for shooting photo
CN105844535A (en) * 2016-04-19 2016-08-10 柳州名品科技有限公司 Agricultural vegetable greenhouse intelligent management platform having self-learning function
CN107153844A (en) * 2017-05-12 2017-09-12 上海斐讯数据通信技术有限公司 The accessory system being improved to flowers identifying system and the method being improved
CN110297930A (en) * 2019-06-14 2019-10-01 韶关市启之信息技术有限公司 A kind of colored language methods of exhibiting and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3918143B2 (en) * 2000-12-28 2007-05-23 独立行政法人科学技術振興機構 Plant recognition system
CN101826161A (en) * 2010-04-09 2010-09-08 中国科学院自动化研究所 Method for identifying target based on local neighbor sparse representation
CN102902961A (en) * 2012-09-21 2013-01-30 武汉大学 Face super-resolution processing method based on K neighbor sparse coding average value constraint

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Locality-constrained Linear Coding for Image Classification; Jinjun Wang, Jianchao Yang, et al.; IEEE Conference on Computer Vision and Pattern Recognition; 2010-06-18; pp. 3360-3367 *
Automatic Recognition of Blooming Flowers; Takeshi Saitoh, Kimiya Aoki, et al.; Proceedings of the 17th International Conference on Pattern Recognition, IEEE; 2004-08-26; Vol. 1; pp. 27-30 *
Research on Flower Species Recognition Technology Based on Digital Images; Pei Yong; China Masters' Theses Full-text Database, Information Science and Technology; 2011-10-15 (No. 10); full text *

Also Published As

Publication number Publication date
CN103336974A (en) 2013-10-02

Similar Documents

Publication Publication Date Title
CN110414377B (en) Remote sensing image scene classification method based on scale attention network
Oquab et al. Is object localization for free?-weakly-supervised learning with convolutional neural networks
Lim et al. Sketch tokens: A learned mid-level representation for contour and object detection
Eigen et al. Nonparametric image parsing using adaptive neighbor sets
Negrel et al. Evaluation of second-order visual features for land-use classification
CN101551809B (en) Search method of SAR images classified based on Gauss hybrid model
Strecha et al. LDAHash: Improved matching with smaller descriptors
Reddy Mopuri et al. Object level deep feature pooling for compact image representation
Cevikalp et al. Semi-supervised dimensionality reduction using pairwise equivalence constraints
CN106570521B (en) Multilingual scene character recognition method and recognition system
CN109063112B (en) Rapid image retrieval method, model and model construction method based on multitask learning deep semantic hash
CN111460980B (en) Multi-scale detection method for small-target pedestrian based on multi-semantic feature fusion
Prasad et al. Classifying computer generated charts
Zhao et al. Semantic parts based top-down pyramid for action recognition
CN103544504B (en) Scene character recognition method based on multi-scale map matching core
Kontschieder et al. Context-sensitive decision forests for object detection
CN103336974B (en) A kind of flowers classification discrimination method based on local restriction sparse representation
Peng et al. Deep boosting: joint feature selection and analysis dictionary learning in hierarchy
CN104036021A (en) Method for semantically annotating images on basis of hybrid generative and discriminative learning models
Chen et al. Learning to focus: cascaded feature matching network for few-shot image recognition
CN105389588A (en) Multi-semantic-codebook-based image feature representation method
CN102930258A (en) Face image recognition method
Afkham et al. Joint visual vocabulary for animal classification
Hsiao et al. Learning sparse representation for leaf image recognition
CN105718858A (en) Pedestrian recognition method based on positive-negative generalized max-pooling

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant