CN113486884A - Clothing retrieval method based on dense network and multiple similar losses - Google Patents

Clothing retrieval method based on dense network and multiple similar losses

Info

Publication number
CN113486884A
CN113486884A (application number CN202110659071.2A)
Authority
CN
China
Prior art keywords
dense network
retrieval method
dimension reduction
feature vector
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110659071.2A
Other languages
Chinese (zh)
Inventor
Xu Feifei (徐菲菲)
Tian Yu (田宇)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai University of Electric Power
Original Assignee
Shanghai University of Electric Power
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai University of Electric Power filed Critical Shanghai University of Electric Power
Priority to CN202110659071.2A priority Critical patent/CN113486884A/en
Publication of CN113486884A publication Critical patent/CN113486884A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a clothing retrieval method based on dense network and multiple similar losses, and belongs to the field of artificial intelligence. Because the invention is built on a dense network and introduces the multi-similarity loss function, the data stream is encoded into specific feature maps for comparison and the closest clothing style is found most effectively. The invention therefore achieves higher accuracy in clothing retrieval, saves manpower and material resources, remains applicable when processing huge data sets, and has broad application prospects.

Description

Clothing retrieval method based on dense network and multiple similar losses
Technical Field
The invention relates to the field of artificial intelligence, in particular to a clothing retrieval method based on dense networks and multiple similar losses.
Background
Clothing is one of people's basic daily concerns. With the rapid development of e-commerce platforms, more and more people shop online, and clothing retrieval technology helps them pick out favorite garments from vast numbers of clothing pictures. This is a challenging direction with great practical value, and it has motivated many computer vision researchers to explore it.
Earlier image retrieval was mainly text-based: retrieval matching was performed against the textual annotations of images. Its drawbacks are obvious: precision is limited and a large amount of manual labeling is required. While acceptable for small data sets, large data sets demand enormous manpower and material resources, so research now focuses on content-based image retrieval. Content-based image retrieval extracts features from the pixel values of an image, encodes them into a vector that represents the picture's semantics in an embedding space, and finds similar pictures in that space.
The handling of image retrieval tasks has shifted from once-popular traditional algorithms such as SIFT to the now-dominant convolutional neural network approaches, that is, from hand-crafted features to features extracted automatically by a computer. Image retrieval can also be viewed as a metric learning process. Current deep metric learning develops in two main directions: one designs new network structures, such as Siamese networks and triplet networks; the other improves the loss function, e.g., contrastive loss, center loss, and multi-similarity loss. These methods essentially push and pull the different feature vectors embedded in the space to different degrees and in different directions.
In apparel retrieval data sets, clothing may be wrinkled or occluded, and both problems affect the retrieval results. When a garment is creased, the feature values extracted from the training sample differ from those of the same garment laid flat, which is highly disruptive at test time. In human perception, garments of the same style but different colors are still regarded as the same style, whereas a model identifying pictures relies mainly on RGB color. A large number of same-style samples could correct the model's output, but in practice, although the whole data set may be large, the number of same-style samples is often small, so the model's output cannot be corrected. In addition, because the training set is too large, it is trained in batches; however, the usual batching scheme only trains on randomly sampled subsets of the data set, in which repeated data with the same attributes can appear.
Disclosure of Invention
The present invention has been made to solve the above-mentioned problems, and an object of the present invention is to provide a clothing retrieval method based on a dense network and multiple similar losses.
The invention provides a clothing retrieval method based on dense network and multiple similar losses, which is characterized by comprising the following steps: step S1, taking the pixel matrix of a picture as the image input, and taking the coordinate values of several regions of interest of the picture as the region-of-interest input; step S2, extracting feature vectors from the image input through the dense network to obtain a first feature vector, a second feature vector and a classification result; step S3, passing the first feature vector and the region-of-interest coordinate values through a region pooling layer to obtain local features; step S4, passing the second feature vector through a first dimension-reduction layer to obtain dimension-reduced global features; step S5, passing the local features through a second dimension-reduction layer to obtain dimension-reduced local features; step S6, performing dimension splicing on the dimension-reduced global features and the dimension-reduced local features, and obtaining the final global-local joint vector through a tiling operation and fully connected calculation; and step S7, training with the cross entropy loss function as the loss function of the second feature vector and the multi-similarity loss function as the loss function of the joint vector until the loss value is reduced to the minimum, and outputting the retrieval result.
In the clothing retrieval method based on dense network and multiple similar losses, the invention also has the following characteristics: in step S1, the coordinate values of the regions of interest are obtained by the K-means method.
In the clothing retrieval method based on dense network and multiple similar losses, the invention also has the following characteristics: wherein the number of regions of interest is 3 to 5.
In the clothing retrieval method based on dense network and multiple similar losses, the invention also has the following characteristics: the first dimension-reduction layer and the second dimension-reduction layer each comprise 3 convolutional layers, 3 normalization layers and 3 activation function layers.
In the clothing retrieval method based on dense network and multiple similar losses, the invention also has the following characteristics: the convolution kernels of the 3 convolutional layers are 1x1, 3x3 and 1x1 respectively.
In the clothing retrieval method based on dense network and multiple similar losses, the invention also has the following characteristics: wherein the activation function of the activation function layers is the ReLU function.
In the clothing retrieval method based on dense network and multiple similar losses, the invention also has the following characteristics: the specific formula of the cross entropy loss function is as follows:
$$L = -\frac{1}{n}\sum_{x}\left[\, y \ln a + (1-y)\ln(1-a) \,\right]$$
wherein x is a sample, y is a real label, a is a predicted value, and n is the number of data.
In the clothing retrieval method based on dense network and multiple similar losses, the invention also has the following characteristics: wherein, the specific formula of the multiple similarity loss function is as follows:
$$L_{MS} = \frac{1}{m}\sum_{i=1}^{m}\left\{ \frac{1}{\alpha}\log\left[ 1+\sum_{k\in\mathcal{P}_i} e^{-\alpha\left(S_{ik}-\lambda\right)} \right] + \frac{1}{\beta}\log\left[ 1+\sum_{k\in\mathcal{N}_i} e^{\beta\left(S_{ik}-\lambda\right)} \right] \right\}$$
wherein m is the number of data; α, β and λ are all adjustable threshold hyperparameters; S_ik is the similarity between samples i and k; P_i and N_i are the positive and negative sample sets of sample i; and L_MS is the value of the multi-similarity loss function.
In the clothing retrieval method based on dense network and multiple similar losses, the invention also has the following characteristics: wherein the dense network comprises 1 convolutional layer, 1 pooling layer, 4 dense blocks, 3 transition layers and 1 classification layer.
Action and Effect of the invention
The clothing retrieval method based on the dense network and multiple similar losses comprises the following steps: step S1, taking the pixel matrix of a picture as the image input, and taking the coordinate values of several regions of interest of the picture as the region-of-interest input; step S2, extracting feature vectors from the image input through the dense network to obtain a first feature vector, a second feature vector and a classification result; step S3, passing the first feature vector and the region-of-interest coordinate values through a region pooling layer to obtain local features; step S4, passing the second feature vector through the first dimension-reduction layer to obtain dimension-reduced global features; step S5, passing the local features through the second dimension-reduction layer to obtain dimension-reduced local features; step S6, performing dimension splicing on the dimension-reduced global features and the dimension-reduced local features, and obtaining the final global-local joint vector through a tiling operation and fully connected calculation; and step S7, training with the cross entropy loss function as the loss function of the second feature vector and the multi-similarity loss function as the loss function of the joint vector until the loss value is reduced to the minimum, and outputting the retrieval result. Because the invention is built on a dense network and introduces the multi-similarity loss function, the data stream is encoded into specific feature maps for comparison and the closest clothing style is found most effectively; the invention therefore achieves higher accuracy in clothing retrieval, saves manpower and material resources, remains applicable when processing huge data sets, and has broad application prospects.
Drawings
FIG. 1 is a flow diagram of a method for clothing retrieval based on dense networks and multiple similar losses in an embodiment of the present invention;
FIG. 2 is an overall framework diagram of a dense network and multiple similar loss based apparel retrieval in an embodiment of the present invention;
FIG. 3 is a structural framework diagram of a dense network in an embodiment of the invention;
FIG. 4 is a structural framework diagram of a dimension reduction layer in an embodiment of the invention;
FIG. 5 is a partial data set of a training set in an embodiment of the present invention;
FIG. 6 is a comparison graph of the accuracy of different algorithms in an embodiment of the present invention.
Detailed Description
In order to make the technical means, creative features, objectives and effects of the present invention easy to understand, the clothing retrieval method based on dense network and multiple similar losses is described in detail below with reference to the embodiments and the accompanying drawings.
< example >
FIG. 1 is a flow diagram of a method for clothing retrieval based on dense networks and multiple similar losses in an embodiment of the present invention.
As shown in fig. 1, the clothing retrieval method based on dense network and multiple similar losses provided by this embodiment includes the following steps:
Step S1, taking the pixel matrix of the picture as the image input, and taking the coordinate values of several regions of interest of the picture as the region-of-interest input, specifically as follows:
inspiration was obtained from YoLo using the Kmeans method, preprocessing the training set and finding multiple regions of interest.
In this embodiment, there are 4 regions of interest.
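As an illustration of this preprocessing step, the following is a minimal sketch of how the 4 region-of-interest coordinates could be obtained by K-means over the training set's annotated bounding boxes, in the spirit of YOLO's anchor clustering; the normalized (x, y, w, h) box format and the random placeholder data are assumptions, not the patent's actual data pipeline.

```python
# Minimal sketch: cluster annotated bounding boxes with K-means to obtain
# 4 representative regions of interest (YOLO-style anchor clustering).
# Assumption: each box is a normalized (x, y, w, h) row.
import numpy as np
from sklearn.cluster import KMeans

boxes = np.random.rand(1000, 4)  # placeholder for the training-set annotations

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(boxes)
rois = kmeans.cluster_centers_   # 4 representative (x, y, w, h) regions
print(rois)
```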
Step S2, extracting feature vectors from the image input through the dense network to obtain a first feature vector, a second feature vector and a classification result. Fig. 3 is a structural framework diagram of the dense network in an embodiment of the invention.
As shown in fig. 3, the dense network includes 1 two-dimensional convolutional layer, 1 two-dimensional pooling layer, 4 dense blocks, 3 transition layers, and 1 classification layer.
Wherein, the output-1 is the first feature vector, the output-2 is the second feature vector, and the output-3 is the classification result.
In this embodiment, the first feature vector and the second feature vector are output from a transition layer and a dense block, respectively: according to the feature pyramid idea, different scales detect details of different sizes, so different feature details are obtained from the convolution outputs of several layers. The classification result flows directly out of the model.
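Such a multi-output backbone can be sketched as follows; this is an assumption-laden illustration built on torchvision's densenet121, where the exact tap points for output-1 (after a transition layer) and output-2 (after the last dense block) are inferred from the description rather than specified by the patent.

```python
# Sketch: a DenseNet backbone exposing three outputs: an intermediate feature
# map (output-1), the final feature map (output-2), the classification result
# (output-3).
import torch
import torch.nn as nn
from torchvision.models import densenet121

class MultiOutputDenseNet(nn.Module):
    def __init__(self, num_classes=50):
        super().__init__()
        backbone = densenet121(weights=None).features
        # stem (conv + pool) plus dense blocks 1-2 with their transition layers
        self.stage1 = nn.Sequential(*list(backbone.children())[:8])
        # dense blocks 3-4, transition layer 3 and the final batch norm
        self.stage2 = nn.Sequential(*list(backbone.children())[8:])
        self.classifier = nn.Linear(1024, num_classes)

    def forward(self, x):
        feat1 = self.stage1(x)              # output-1: first feature vector
        feat2 = self.stage2(feat1)          # output-2: second feature vector
        pooled = torch.relu(feat2).mean(dim=(2, 3))
        logits = self.classifier(pooled)    # output-3: classification result
        return feat1, feat2, logits

x = torch.randn(2, 3, 224, 224)
f1, f2, logits = MultiOutputDenseNet()(x)
print(f1.shape, f2.shape, logits.shape)  # (2,256,28,28) (2,1024,7,7) (2,50)
```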
Step S3, passing the first feature vector and the region-of-interest coordinate values through a region pooling layer to obtain local features.
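A minimal sketch of this region pooling step, using torchvision's roi_pool; the 7x7 output size, the spatial scale and the example boxes are illustrative assumptions.

```python
# Sketch: pool local features from the first feature vector at the
# region-of-interest coordinates (step S3).
import torch
from torchvision.ops import roi_pool

feat1 = torch.randn(1, 256, 28, 28)      # first feature vector (output-1)
# One row per region of interest: (batch_index, x1, y1, x2, y2) in feature-map pixels
rois = torch.tensor([[0.,  2.,  2., 14., 14.],
                     [0., 10.,  4., 24., 20.],
                     [0.,  4., 12., 20., 26.],
                     [0., 14., 14., 26., 26.]])
local_feats = roi_pool(feat1, rois, output_size=(7, 7), spatial_scale=1.0)
print(local_feats.shape)                 # torch.Size([4, 256, 7, 7])
```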
Step S4, passing the second feature vector through the first dimension-reduction layer to obtain the dimension-reduced global features.
Step S5, passing the local features through the second dimension-reduction layer to obtain the dimension-reduced local features.
FIG. 4 is a structural framework diagram of a dimension reduction layer in an embodiment of the invention.
As shown in fig. 4, each of the first dimension-reduction layer and the second dimension-reduction layer includes 3 convolutional layers, 3 normalization layers, and 3 activation function layers. The convolution kernels of the 3 convolutional layers are 1x1, 3x3 and 1x1 respectively, and the activation function of the activation function layers is the ReLU function.
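The layer structure of fig. 4 maps directly onto a small PyTorch module; the sketch below follows the stated 1x1 / 3x3 / 1x1 convolution, normalization and ReLU pattern, while the channel widths are illustrative assumptions.

```python
# Sketch: a dimension-reduction layer: three convolutions (1x1, 3x3, 1x1),
# each followed by batch normalization and ReLU.
import torch.nn as nn

def reduction_layer(in_ch, mid_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, mid_ch, kernel_size=1),
        nn.BatchNorm2d(mid_ch), nn.ReLU(inplace=True),
        nn.Conv2d(mid_ch, mid_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(mid_ch), nn.ReLU(inplace=True),
        nn.Conv2d(mid_ch, out_ch, kernel_size=1),
        nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

reduce_global = reduction_layer(1024, 256, 128)  # first dimension-reduction layer
reduce_local = reduction_layer(256, 128, 128)    # second dimension-reduction layer
```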
And step S6, performing dimension splicing on the dimension reduction global features and the dimension reduction local features, and performing tiling operation and full-connection calculation to obtain a final global and local joint vector.
FIG. 2 is an overall framework diagram of a dense network and multiple similar loss based apparel retrieval in an embodiment of the invention.
Wherein, the output-1 is the first feature vector, and the output-2 is the final joint vector.
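Step S6 then reduces to a concatenation, a flatten and a fully connected layer; in the sketch below the matching feature shapes (the global features repeated once per region of interest) and the 512-dimensional joint vector are assumptions.

```python
# Sketch: dimension splicing + tiling (flatten) + fully connected calculation.
import torch
import torch.nn as nn

g = torch.randn(4, 128, 7, 7)        # dimension-reduced global features
l = torch.randn(4, 128, 7, 7)        # dimension-reduced local features
joint = torch.cat([g, l], dim=1)     # dimension splicing -> (4, 256, 7, 7)
joint = joint.flatten(start_dim=1)   # tiling operation   -> (4, 256*7*7)
fc = nn.Linear(256 * 7 * 7, 512)
embedding = fc(joint)                # final global-local joint vector
```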
And step S7, the second feature vector takes the cross entropy loss function as a loss function, the joint vector takes the multi-similarity loss function as a loss function until the loss value is reduced to the minimum, and the retrieval result is output.
In this embodiment, the relative optimum of the model cannot be found when the multi-similarity loss is used alone. Therefore, the IAPM established in this embodiment is trained jointly with multiple loss functions: the cross entropy loss function for image classification is first used as the main function with the multi-similarity loss function as an auxiliary function, and once the total loss has dropped to a certain level, the weight of the multi-similarity loss function is increased so that the feature vectors are pulled into place correctly.
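A minimal sketch of this weighting schedule follows; the initial auxiliary weight, the switch threshold and the raised weight are illustrative assumptions, since the description only states that the multi-similarity weight is increased once the total loss has fallen far enough.

```python
# Sketch: joint training with cross entropy as the main loss and the
# multi-similarity loss as an auxiliary whose weight is raised once the
# total loss is low enough.
class JointLoss:
    def __init__(self, ms_weight=0.1, threshold=1.0):
        self.ms_weight = ms_weight
        self.threshold = threshold

    def __call__(self, ce_loss, ms_loss):
        loss = ce_loss + self.ms_weight * ms_loss
        if loss.item() < self.threshold:
            self.ms_weight = 1.0  # raise the auxiliary weight
        return loss
```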
The specific formula of the cross entropy loss function is as follows:
$$L = -\frac{1}{n}\sum_{x}\left[\, y \ln a + (1-y)\ln(1-a) \,\right]$$
wherein x is a sample, y is a real label, a is a predicted value, and n is the number of data.
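As a concrete check of the formula, a minimal numeric sketch (the sample values are made up):

```python
# Sketch: the cross entropy loss exactly as written above (binary form).
import torch

def cross_entropy(a, y):
    # a: predicted values in (0, 1); y: real labels in {0, 1}; n = a.numel()
    return -(y * torch.log(a) + (1 - y) * torch.log(1 - a)).mean()

a = torch.tensor([0.9, 0.2, 0.7])
y = torch.tensor([1.0, 0.0, 1.0])
print(cross_entropy(a, y))  # ≈ 0.2284
```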
The specific formula for the multiple similarity loss function is as follows:
$$L_{MS} = \frac{1}{m}\sum_{i=1}^{m}\left\{ \frac{1}{\alpha}\log\left[ 1+\sum_{k\in\mathcal{P}_i} e^{-\alpha\left(S_{ik}-\lambda\right)} \right] + \frac{1}{\beta}\log\left[ 1+\sum_{k\in\mathcal{N}_i} e^{\beta\left(S_{ik}-\lambda\right)} \right] \right\}$$
wherein m is the number of data; α, β and λ are all adjustable threshold hyperparameters; S_ik is the similarity between samples i and k; P_i and N_i are the positive and negative sample sets of sample i; and L_MS is the value of the multi-similarity loss function.
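For reference, a minimal sketch of this loss as reconstructed above (it matches Wang et al.'s multi-similarity formulation); the hyperparameter values are illustrative assumptions, and the pair-mining step of the published method is omitted for brevity.

```python
# Sketch: multi-similarity loss over a batch of embeddings and labels.
import torch
import torch.nn.functional as F

def multi_similarity_loss(emb, labels, alpha=2.0, beta=50.0, lam=0.5):
    emb = F.normalize(emb, dim=1)
    sim = emb @ emb.t()                 # S_ik: pairwise cosine similarities
    m = emb.size(0)
    loss = emb.new_zeros(())
    for i in range(m):
        pos = labels == labels[i]
        pos[i] = False                  # P_i: same style, excluding the sample itself
        neg = labels != labels[i]       # N_i: different styles
        loss = loss + torch.log1p(torch.exp(-alpha * (sim[i][pos] - lam)).sum()) / alpha
        loss = loss + torch.log1p(torch.exp(beta * (sim[i][neg] - lam)).sum()) / beta
    return loss / m

emb = torch.randn(8, 512)
labels = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
print(multi_similarity_loss(emb, labels))
```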
In this embodiment, in the testing stage, in order to adapt the model's weights to the distribution of the data set, the feature maps above the DenseNet classification layer (output-2 and the second feature vector in fig. 3) are taken directly, the cosine similarity between the feature maps is calculated, and the n clothing pictures closest to the query (i.e., with the smallest cosine distance) are selected by comparison as the final retrieval result.
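A minimal sketch of this retrieval step; the feature dimension and gallery size are assumptions.

```python
# Sketch: rank gallery pictures by cosine similarity to the query feature
# and return the n closest ones.
import torch
import torch.nn.functional as F

def retrieve(query_feat, gallery_feats, n=5):
    q = F.normalize(query_feat, dim=0)
    g = F.normalize(gallery_feats, dim=1)
    dist = 1.0 - g @ q                   # cosine distance: smaller = more similar
    return torch.topk(dist, k=n, largest=False).indices

gallery = torch.randn(1000, 512)         # intercepted second feature vectors
query = torch.randn(512)
print(retrieve(query, gallery, n=5))     # indices of the 5 closest garments
```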
In this embodiment, data augmentation and model adaptation are used to correct the model output. The model relies mainly on RGB information to resolve styles, and data augmentation is applied at a ratio of about 1:1 to expand the color gamut of the samples; the result is shown in fig. 5.
FIG. 5 is a partial data set of a training set in an embodiment of the present invention.
In this embodiment, in the training stage, the algorithm adopts a pre-training method: a dense network model with ImageNet weights is pre-trained on the training set, using the cross entropy loss function and stochastic gradient descent, with the learning rate set to 0.01 and the learning attenuation rate to 1e-6. When the target expected level is reached, training is stopped and the pre-trained weights are obtained; when it is not reached, the model is put into the IAPM to continue training until the target expected level is reached and the training is finished.
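The pre-training configuration can be sketched as below; mapping the stated learning attenuation rate of 1e-6 onto SGD's weight_decay, the momentum value and the number of clothing classes are all assumptions.

```python
# Sketch of the pre-training setup: ImageNet weights, cross entropy,
# SGD with learning rate 0.01.
import torch.nn as nn
import torch.optim as optim
from torchvision.models import densenet121, DenseNet121_Weights

model = densenet121(weights=DenseNet121_Weights.IMAGENET1K_V1)
model.classifier = nn.Linear(model.classifier.in_features, 50)  # clothing classes

criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.01,
                      momentum=0.9, weight_decay=1e-6)
```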
In this embodiment, the multi-similarity loss function is used to train the data set, and each batch is constructed by structured random sampling: N similar garments and M different garments are selected at random, giving a batch size of M × N. The top-k accuracy of different algorithms is then computed: a retrieval generally returns the k results most similar to the query picture, and if at least one truly correct result appears among the k results, the retrieval is counted as successful; the accuracy is the number of successful retrievals divided by the total number of retrievals.
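Both the structured sampling and the top-k metric are easy to make concrete; in this sketch the data structures (a style-to-indices dictionary, ranked result lists) are assumptions.

```python
# Sketch: structured random sampling (M styles x N garments per style,
# batch size M*N) and top-k retrieval accuracy.
import random

def sample_batch(by_style, M=4, N=4):
    """by_style: dict mapping style id -> list of image indices."""
    styles = random.sample(list(by_style), M)
    return [idx for s in styles for idx in random.sample(by_style[s], N)]

def top_k_accuracy(results, ground_truth, k=5):
    """results: ranked index lists; ground_truth: sets of correct indices."""
    hits = sum(1 for r, gt in zip(results, ground_truth) if set(r[:k]) & gt)
    return hits / len(results)
```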
FIG. 6 is a comparison graph of the accuracy of different algorithms in an embodiment of the present invention. As shown in fig. 6, the abscissa Top-k is the number k of retrieved garment pictures closest to the query (unit: pictures), and the ordinate is the accuracy (unit: %).
Effects and effects of the embodiments
The clothing retrieval method based on the dense network and multiple similar losses according to this embodiment comprises the following steps: step S1, taking the pixel matrix of a picture as the image input, and taking the coordinate values of several regions of interest of the picture as the region-of-interest input; step S2, extracting feature vectors from the image input through the dense network to obtain a first feature vector, a second feature vector and a classification result; step S3, passing the first feature vector and the region-of-interest coordinate values through a region pooling layer to obtain local features; step S4, passing the second feature vector through the first dimension-reduction layer to obtain dimension-reduced global features; step S5, passing the local features through the second dimension-reduction layer to obtain dimension-reduced local features; step S6, performing dimension splicing on the dimension-reduced global features and the dimension-reduced local features, and obtaining the final joint vector through a tiling operation and fully connected calculation; and step S7, training with the cross entropy loss function as the loss function of the second feature vector and the multi-similarity loss function as the loss function of the joint vector until the loss value is reduced to the minimum, and outputting the retrieval result. Because this embodiment is built on a dense network and introduces the multi-similarity loss function, the data stream is encoded into specific feature maps for comparison and the closest clothing style is found most effectively; the method therefore achieves higher accuracy in clothing retrieval, saves manpower and material resources, remains applicable when processing huge data sets, and has broad application prospects.
The above embodiments are preferred examples of the present invention, and are not intended to limit the scope of the present invention.

Claims (9)

1. A clothing retrieval method based on dense network and multiple similar losses, characterized by comprising the following steps:
step S1, taking the pixel matrix of a picture as the image input, and taking the coordinate values of several regions of interest of the picture as the region-of-interest input;
step S2, extracting feature vectors from the image input through the dense network to obtain a first feature vector, a second feature vector and a classification result;
step S3, passing the first feature vector and the region-of-interest coordinate values through a region pooling layer to obtain local features;
step S4, passing the second feature vector through a first dimension-reduction layer to obtain dimension-reduced global features;
step S5, passing the local features through a second dimension-reduction layer to obtain dimension-reduced local features;
step S6, performing dimension splicing on the dimension-reduced global features and the dimension-reduced local features, and obtaining the final global-local joint vector through a tiling operation and fully connected calculation;
and step S7, training with the cross entropy loss function as the loss function of the second feature vector and the multi-similarity loss function as the loss function of the joint vector until the loss value is reduced to the minimum, and outputting a retrieval result.
2. The dense network and multiple similarity loss based apparel retrieval method of claim 1, wherein:
in step S1, the coordinate values of the region of interest are obtained by a Kmeans method.
3. The dense network and multiple similarity loss based apparel retrieval method of claim 1, wherein:
wherein the number of regions of interest is 3 to 5.
4. The dense network and multiple similarity loss based apparel retrieval method of claim 1, wherein:
the first dimension reduction layer and the second dimension reduction layer respectively comprise 3 convolution layers, 3 normalization layers and 3 activation function layers.
5. The dense network and multiple similarity loss based apparel retrieval method of claim 4, wherein:
the convolution kernels of the 3 convolutional layers are 1x1, 3x3 and 1x1 respectively.
6. The dense network and multiple similarity loss based apparel retrieval method of claim 4, wherein:
wherein the activation function of the activation function layers is the ReLU function.
7. The dense network and multiple similarity loss based apparel retrieval method of claim 1, wherein:
wherein, the specific formula of the cross entropy loss function is as follows:
$$L = -\frac{1}{n}\sum_{x}\left[\, y \ln a + (1-y)\ln(1-a) \,\right]$$
wherein x is a sample, y is a real label, a is a predicted value, and n is the number of data.
8. The dense network and multiple similarity loss based apparel retrieval method of claim 1, wherein:
wherein, the specific formula of the multiple similarity loss function is as follows:
$$L_{MS} = \frac{1}{m}\sum_{i=1}^{m}\left\{ \frac{1}{\alpha}\log\left[ 1+\sum_{k\in\mathcal{P}_i} e^{-\alpha\left(S_{ik}-\lambda\right)} \right] + \frac{1}{\beta}\log\left[ 1+\sum_{k\in\mathcal{N}_i} e^{\beta\left(S_{ik}-\lambda\right)} \right] \right\}$$
wherein m is the number of data; α, β and λ are all adjustable threshold hyperparameters; S_ik is the similarity between samples i and k; P_i and N_i are the positive and negative sample sets of sample i; and L_MS is the value of the multi-similarity loss function.
9. The dense network and multiple similarity loss based apparel retrieval method of claim 1, wherein:
wherein the dense network comprises 1 convolutional layer, 1 pooling layer, 4 dense blocks, 3 transition layers, and 1 classification layer.
CN202110659071.2A 2021-06-15 2021-06-15 Clothing retrieval method based on dense network and multiple similar losses Pending CN113486884A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110659071.2A CN113486884A (en) 2021-06-15 2021-06-15 Clothing retrieval method based on dense network and multiple similar losses

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110659071.2A CN113486884A (en) 2021-06-15 2021-06-15 Clothing retrieval method based on dense network and multiple similar losses

Publications (1)

Publication Number Publication Date
CN113486884A true CN113486884A (en) 2021-10-08

Family

ID=77935290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110659071.2A Pending CN113486884A (en) 2021-06-15 2021-06-15 Clothing retrieval method based on dense network and multiple similar losses

Country Status (1)

Country Link
CN (1) CN113486884A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108829848A (en) * 2018-06-20 2018-11-16 华中科技大学 A kind of image search method and system
CN110825899A (en) * 2019-09-18 2020-02-21 武汉纺织大学 Clothing image retrieval method integrating color features and residual network depth features
US20200242422A1 (en) * 2019-01-29 2020-07-30 Boe Technology Group Co., Ltd. Method and electronic device for retrieving an image and computer readable storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108829848A (en) * 2018-06-20 2018-11-16 华中科技大学 A kind of image search method and system
US20200242422A1 (en) * 2019-01-29 2020-07-30 Boe Technology Group Co., Ltd. Method and electronic device for retrieving an image and computer readable storage medium
CN110825899A (en) * 2019-09-18 2020-02-21 武汉纺织大学 Clothing image retrieval method integrating color features and residual network depth features

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
XU FEIFEI ET AL: "A Clothing Retrieval Model Based on DenseNet and Multi-Similarity Loss", IPO *
HE FUGUI: "Python Deep Learning: Logic, Algorithms and Programming in Practice" (《Python深度学习 逻辑、算法与编程实战》), 30 September 2020 *
LIU SHUCHUN ET AL: "Deep Practice of OCR: Text Recognition Based on Deep Learning" (《深度实践OCR 基于深度学习的文字识别》), 31 May 2020 *
CHEN YUNJI: "Intelligent Computing Systems" (《智能计算系统》), 29 February 2020 *

Similar Documents

Publication Publication Date Title
CN110443143B (en) Multi-branch convolutional neural network fused remote sensing image scene classification method
CN107256246B (en) printed fabric image retrieval method based on convolutional neural network
CN106919951B (en) Weak supervision bilinear deep learning method based on click and vision fusion
CN112348036A (en) Self-adaptive target detection method based on lightweight residual learning and deconvolution cascade
CN111881714A (en) Unsupervised cross-domain pedestrian re-identification method
CN107392919B (en) Adaptive genetic algorithm-based gray threshold acquisition method and image segmentation method
CN107169417B (en) RGBD image collaborative saliency detection method based on multi-core enhancement and saliency fusion
CN110633708A (en) Deep network significance detection method based on global model and local optimization
CN109446985B (en) Multi-angle plant identification method based on vector neural network
CN105469111B (en) The object classification method of small sample set based on improved MFA and transfer learning
CN109446922B (en) Real-time robust face detection method
CN110211127B (en) Image partition method based on bicoherence network
CN104504406B (en) A kind of approximate multiimage matching process rapidly and efficiently
CN109344898A (en) Convolutional neural networks image classification method based on sparse coding pre-training
CN106203448A (en) A kind of scene classification method based on Nonlinear Scale Space Theory
CN111488797B (en) Pedestrian re-identification method
CN105844299B (en) A kind of image classification method based on bag of words
Wangli et al. Foxtail Millet ear detection approach based on YOLOv4 and adaptive anchor box adjustment
CN113486884A (en) Clothing retrieval method based on dense network and multiple similar losses
CN115294424A (en) Sample data enhancement method based on generation countermeasure network
Li et al. Criminal investigation image classification based on spatial cnn features and elm
CN116246158A (en) Self-supervision pre-training method suitable for remote sensing target detection task
CN111046861B (en) Method for identifying infrared image, method for constructing identification model and application
CN114049675A (en) Facial expression recognition method based on light-weight two-channel neural network
Wang et al. Apple automatic classification method based on improved VGG11

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20211008