CN110399917A - Image classification method based on hyperparameter-optimized CNN - Google Patents

Image classification method based on hyperparameter-optimized CNN

Info

Publication number
CN110399917A
Authority
CN
China
Prior art keywords
cnn
particle
hyperparameter
image classification
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910671268.0A
Other languages
Chinese (zh)
Other versions
CN110399917B (en)
Inventor
付俊
王思淼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northeastern University China
Original Assignee
Northeastern University China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeastern University China filed Critical Northeastern University China
Priority to CN201910671268.0A priority Critical patent/CN110399917B/en
Publication of CN110399917A publication Critical patent/CN110399917A/en
Application granted granted Critical
Publication of CN110399917B publication Critical patent/CN110399917B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24147Distances to closest patterns, e.g. nearest neighbour classification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • G06N20/20Ensemble learning
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides an image classification method based on hyperparameter-optimized CNN, and belongs to the technical field of image recognition. According to the structural characteristics of the CNN architecture, the method selects the structural parameters of convolutional layer C1 and pooling layer P1 as the hyperparameters of the invention, and limits their value range to (Xl, Xu). It then optimizes the CNN hyperparameters with a periodic-mutation PSO algorithm that combines a global mutation operator and a local mutation operator, which prevents the hyperparameter search from stalling in a local optimum as the traditional PSO algorithm does, and thereby achieves more competitive image classification performance than the traditional PSO algorithm. The method markedly improves the efficiency and reduces the cost of deep-learning CNN hyperparameter optimization, exploits the image classification potential of the CNN architecture to the greatest extent, and saves hardware resources and computational cost when the CNN performs image classification, so it has practical value in engineering applications.

Description

Image classification method based on hyperparameter-optimized CNN
Technical field
The present invention relates to the technical field of image recognition, and in particular to an image classification method based on hyperparameter-optimized CNN.
Background technique
Image classification technology has matured considerably, and CNN architectures tailored to different classification scenarios keep emerging, but complex CNN structures often consume substantial hardware resources and computational cost. Before a CNN is trained for image classification, some of its internal parameters must be set in advance; these parameters are called hyperparameters. Choosing an optimal set of hyperparameters can maximize the image classification performance of the CNN without changing its structure. Selecting suitable hyperparameters so that the classification performance of a CNN architecture is fully exploited is therefore particularly important in engineering practice.
Research on hyperparameter optimization for image classification has already produced some results, and early work focused on applying hyperparameter optimization methods from machine learning to CNNs. Hyperparameter optimization methods fall broadly into model-free optimization and model-based optimization: the state of the art of the former includes simple grid search and random search, while the latter includes heuristic algorithms based on particle swarms as well as Bayesian optimization based on Gaussian processes (GP). Heuristic algorithms deserve particular attention for CNN hyperparameter optimization; among them, particle swarm optimization, owing to its simplicity and versatility, has proven highly effective on many tasks in many domains and has great potential for large-scale parallelism. Hyperparameter optimization based on the particle swarm algorithm is far more search-efficient than other hyperparameter optimization algorithms such as grid search and random search, shortens the search time, and alleviates the low efficiency and long run time of traditional hyperparameter optimization. However, the particle swarm algorithm is prone to becoming trapped in local optima, so the hyperparameter search may return a locally optimal group of hyperparameters rather than the globally optimal one. To some extent this prevents the search from finding the group of hyperparameters that makes the CNN perform best, so the CNN cannot reach its optimal image classification result.
Summary of the invention
In view of the above shortcomings of the prior art, the present invention provides an image classification method based on hyperparameter-optimized CNN.
The technical solution adopted by the present invention is as follows:
An image classification method based on hyperparameter-optimized CNN comprises the following steps:
Step 1: Pre-process the image data set to be classified and divide it proportionally into a training set T, a test set C and a validation set V, where the validation set V is drawn from the training set T and satisfies |T| = 9|V|;
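For illustration only (this sketch is not part of the patent text), the split of step 1 can be written in a few lines of Python; the array names, the test fraction and the random seed are assumptions, and only the 9 : 1 relation between T and V comes from the method.

```python
import numpy as np

def split_dataset(images, labels, test_fraction=1/7, seed=0):
    """Split a labelled image set into training set T, test set C and
    validation set V, with V drawn out of T so that |T| = 9|V|."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(images))
    n_test = int(len(images) * test_fraction)
    test_idx, rest = idx[:n_test], idx[n_test:]

    n_val = len(rest) // 10                       # removing V from T leaves |T| = 9|V|
    val_idx, train_idx = rest[:n_val], rest[n_val:]

    T = (images[train_idx], labels[train_idx])    # training set T
    V = (images[val_idx], labels[val_idx])        # validation set V (used by the PSO)
    C = (images[test_idx], labels[test_idx])      # test set C
    return T, V, C
```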
Step 2: Build the CNN architecture, consisting of convolutional layer C1, pooling layer P1, convolutional layer C2 and pooling layer P2, terminated by a Softmax activation. According to the structural characteristics of the CNN architecture, the structural parameters of convolutional layer C1 and pooling layer P1 are selected as the hyperparameters of the invention, and the value range of this group of hyperparameters is determined to be (Xl, Xu).
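A minimal Keras sketch of this C1-P1-C2-P2-Softmax architecture is shown below for orientation. Which structural parameters of C1 and P1 form the hyperparameter vector is specified in Table 1 of the patent; the particular choice here (filter count and kernel size of C1, pool size and stride of P1), the ReLU activations and the fixed settings of C2/P2 are illustrative assumptions, not the patent's definition.

```python
import tensorflow as tf

def build_cnn(c1_filters, c1_kernel, p1_pool, p1_stride,
              input_shape=(28, 28, 1), num_classes=10):
    """C1-P1-C2-P2-Softmax architecture of step 2 (illustrative sketch)."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=input_shape),
        tf.keras.layers.Conv2D(int(c1_filters), int(c1_kernel),
                               padding="same", activation="relu"),         # convolutional layer C1 (tuned)
        tf.keras.layers.MaxPooling2D(pool_size=int(p1_pool),
                                     strides=int(p1_stride)),              # pooling layer P1 (tuned)
        tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),  # convolutional layer C2 (fixed here)
        tf.keras.layers.MaxPooling2D(pool_size=2),                         # pooling layer P2 (fixed here)
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),          # Softmax output
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```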
Step 3: Optimize the hyperparameters of the CNN architecture on the validation set with the periodic-mutation PSO algorithm to obtain a group of hyperparameter values Xi(g);
Step 3-1: Initialize the particle velocities, the fitness function values, the individual best positions Pi and the global best position Pg; set the iteration counter g to 0 and the iteration precision δ ≥ 0; the dimension of the particle search space is D and the number of particles is N;
The current position vector of each particle in the swarm is Xi = (xi,1, xi,2, ..., xi,D), i = 1, 2, ..., N; the current velocity vector is Vi = (vi,1, vi,2, ..., vi,D), i = 1, 2, ..., N; the individual best position vector is Pi = (pi,1, pi,2, ..., pi,D), i = 1, 2, ..., N; and the global best position vector is Pg = (pg,1, pg,2, ..., pg,D);
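A possible initialization of the swarm of step 3-1, written in Python/NumPy purely as a sketch; the variable names, the random generator and the assumption that fitness is to be maximized are not taken from the patent.

```python
import numpy as np

def init_swarm(N, D, x_lower, x_upper, v_lower, v_upper, seed=0):
    """Step 3-1: initialize positions Xi in (Xl, Xu), velocities Vi in (Vl, Vu),
    the individual bests Pi and the global best Pg."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(x_lower, x_upper, size=(N, D))   # current positions Xi
    V = rng.uniform(v_lower, v_upper, size=(N, D))   # current velocities Vi
    P_i, P_i_fit = X.copy(), np.full(N, -np.inf)     # individual best positions Pi and their fitness
    P_g, P_g_fit = X[0].copy(), -np.inf              # global best position Pg and its fitness
    return X, V, P_i, P_i_fit, P_g, P_g_fit, rng
```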
Step 3-2: In the g-th iteration, each particle updates its own velocity and position:
Vi(g+1) = ωVi(g) + c1r1(Pi(g) - Xi(g)) + c2r2(Pg(g) - Xi(g))   (1)
Xi(g+1) = Xi(g) + Vi(g)   (2)
where ω is the inertia factor, c1 and c2 are learning factors (constants), and r1 and r2 are random numbers in the range (0, 1);
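A NumPy sketch of the update in formulas (1) and (2); it follows the common PSO convention of advancing the position with the freshly updated velocity, and the array shapes and names are assumptions.

```python
import numpy as np

def pso_update(X, V, P_i, P_g, omega, c1, c2, rng):
    """One velocity/position update (formulas (1) and (2)) for all N particles.
    X, V : (N, D) current positions and velocities
    P_i  : (N, D) individual best positions Pi
    P_g  : (D,)   global best position Pg"""
    N, D = X.shape
    r1 = rng.random((N, D))        # random numbers in (0, 1)
    r2 = rng.random((N, D))
    V = omega * V + c1 * r1 * (P_i - X) + c2 * r2 * (P_g - X)   # formula (1)
    X = X + V                                                   # formula (2)
    return X, V
```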
Step 3-3: Apply the global mutation operator to change the positions of all particles in the entire swarm, or apply the local mutation operator to change the position of the elite particle in the swarm:
The global mutation operator changes the position of every particle according to
Xi(g) = Xi(g)[1 + A1(0.5 - r3)δ], i = 1, 2, ..., N   (3)
and the local mutation operator applies an analogous perturbation, with amplitude factor A2 and random number r4, to the elite particle, generating q new candidate particles;
where A1 and A2 are user-defined amplitude factors (constants), r3 and r4 are random numbers in the range (0, 1), the elite particle is the best particle in the current swarm, q is the number of new particles generated by the local mutation operator, f1 is the global mutation frequency, and f2 is the local mutation frequency;
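A sketch of the two operators in NumPy. Formula (3) is transcribed as it appears in the claims (including the factor δ); the local operator is not written out in the patent text, so the version below, which perturbs the elite particle multiplicatively with amplitude A2 and random numbers r4 by analogy with (3), is an assumption.

```python
import numpy as np

def global_mutation(X, A1, delta, rng):
    """Global mutation operator, formula (3): perturb every particle position."""
    r3 = rng.random(X.shape)                      # random numbers in (0, 1)
    return X * (1.0 + A1 * (0.5 - r3) * delta)

def local_mutation(x_elite, A2, q, rng):
    """Local mutation operator: generate q new candidate particles around the
    elite particle (form assumed by analogy with formula (3))."""
    r4 = rng.random((q, x_elite.shape[0]))
    return x_elite * (1.0 + A2 * (0.5 - r4))
```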
Step 3-4: Check whether the particle velocities and positions exceed their bounds; if a value exceeds a bound, replace it with the bound it exceeds. The specific rule is as follows:
If Vi(g) ≤ Vl, then Vi(g) = Vl; if Vi(g) ≥ Vu, then Vi(g) = Vu; if Xi(g) ≤ Xl, then Xi(g) = Xl; if Xi(g) ≥ Xu, then Xi(g) = Xu;
where (Vl, Vu) is the velocity range of the particles and (Xl, Xu) is the position range of the particles;
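Element-wise clipping implements exactly this rule; a one-function NumPy sketch:

```python
import numpy as np

def enforce_bounds(X, V, x_lower, x_upper, v_lower, v_upper):
    """Step 3-4: replace any velocity or position component that leaves its
    range with the bound it exceeds."""
    V = np.clip(V, v_lower, v_upper)   # velocity range (Vl, Vu)
    X = np.clip(X, x_lower, x_upper)   # position range (Xl, Xu)
    return X, V
```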
Step 3-5: The optimal position obtained by executing steps 3-1 to 3-4 is the required hyperparameter value Xi(g).
Step 4: Input the hyperparameters Xi(g) into the CNN, and train the resulting CNN with the training set obtained in step 1;
Step 5: Input the test set obtained in step 1 into the trained CNN to obtain the classification results of the test set C;
Step 6: Judge whether the iteration has reached a termination condition;
Step 6-1: Compute the fitness function value of each particle of the periodic-mutation PSO:
where CNN(Xi(g)) is the accuracy of the classification results obtained in step 5, and Xi(g) is the hyperparameter value obtained in step 3;
Step 6-2: By comparing the fitness function values of the particles in the current iteration obtained in step 6-1, update each particle's individual historical best position Pi(g) and the swarm's best position Pg(g), and obtain the best particle of the current iteration, Xmin(g);
Step 6-3: Judge whether the increase in the fitness value of the best particle is smaller than the threshold ε, whether the update of the best particle position in the swarm is smaller than the prescribed minimum step size, and whether the iteration counter g has reached the maximum number of iterations gmax; if any one of these termination conditions is satisfied, terminate the iteration.
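Steps 6-1 to 6-3 can be condensed into a short helper. The patent does not reproduce the fitness formula itself, so the sketch below simply assumes a fitness derived from the accuracy CNN(Xi(g)) in which larger is better, and it checks the three termination conditions once per generation (the embodiment additionally requires the first two to hold over five successive generations).

```python
import numpy as np

def update_bests_and_check(fitness, X, P_i, P_i_fit, P_g, P_g_fit,
                           g, eps, min_step, g_max):
    """Steps 6-1 to 6-3: update Pi(g) and Pg(g) from the current fitness
    values and test the termination conditions."""
    improved = fitness > P_i_fit                   # update individual historical bests Pi(g)
    P_i[improved] = X[improved]
    P_i_fit[improved] = fitness[improved]

    best = int(np.argmax(fitness))                 # best particle of this iteration (Xmin(g) in the text)
    gain = float(fitness[best]) - P_g_fit          # increase of the best fitness value
    step = float(np.linalg.norm(X[best] - P_g))    # movement of the best position
    if gain > 0:                                   # update the swarm best Pg(g)
        P_g, P_g_fit = X[best].copy(), float(fitness[best])

    # Skip the ε / minimum-step tests on the very first generation.
    stop = (g > 1 and (gain < eps or step < min_step)) or g >= g_max
    return P_i, P_i_fit, P_g, P_g_fit, X[best], stop
```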
Step 7: If no termination condition is reached, execute steps 3 to 6 and continue iterating;
Step 8: If a termination condition is reached, obtain the final optimal hyperparameters, denoted Xmin(g);
Step 9: Substitute the final optimal hyperparameters Xmin(g) into the CNN and classify the images of the entire data set to obtain the classification results.
The beneficial effects of adopting the above technical solution are as follows: the image classification method based on hyperparameter-optimized CNN provided by the invention effectively alleviates the tendency of the particle swarm algorithm to become trapped in local optima, and thereby improves the convergence speed and convergence accuracy of the algorithm. Because the search performance of the hyperparameter optimization method is better, the problem of unsatisfactory CNN classification accuracy caused by poorly chosen hyperparameters is avoided to a certain extent, the image classification accuracy is improved, and the image-processing performance of the CNN is exploited to the greatest extent.
Detailed description of the invention
Fig. 1 is a flowchart of the image classification method based on hyperparameter-optimized CNN of the present invention;
Fig. 2 shows a fragment of the MNIST handwritten digit recognition data set used for image classification in the first embodiment of the invention;
Fig. 3 shows how the classification performance on the MNIST data set, before and after the improvement of the hyperparameter optimization method, varies with the number of iterations in the first embodiment of the invention;
Fig. 4 shows a fragment of the cifar-10 object recognition data set used for image classification in the second embodiment of the invention;
Fig. 5 shows how the classification performance on the cifar-10 data set, before and after the improvement of the hyperparameter optimization method, varies with the number of iterations in the second embodiment of the invention.
Specific embodiment
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings and embodiments. The following embodiments are intended to illustrate the invention, not to limit its scope.
Embodiment 1
As shown in Figure 1, the method for the present embodiment is as described below.
Step 1: The MNIST handwritten digit recognition benchmark data set is selected as the data set to be classified; a fragment of the data set is shown in Fig. 2. The data set contains 70000 grayscale images of 28 × 28 pixels, covering 10 classes with 7000 images per class. 60000 images of the data set are randomly selected as the training set, the remaining 10000 images serve as the test set, and 6000 images are randomly drawn from the training set as the validation set.
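One way to obtain this split in practice is sketched below; the patent does not state which tools were used, so the use of tf.keras.datasets, the scaling and the random seed are assumptions.

```python
import numpy as np
import tensorflow as tf

# MNIST: 70,000 grayscale 28x28 images in 10 classes,
# pre-split into 60,000 training and 10,000 test images.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., np.newaxis] / 255.0   # add channel axis, scale to [0, 1]
x_test = x_test[..., np.newaxis] / 255.0

# Randomly draw 6,000 training images as the validation set V on which
# the periodic-mutation PSO evaluates candidate hyperparameters.
rng = np.random.default_rng(0)
val_idx = rng.choice(len(x_train), size=6000, replace=False)
x_val, y_val = x_train[val_idx], y_train[val_idx]
```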
Step 2: Build the CNN architecture, consisting of convolutional layer C1, pooling layer P1, convolutional layer C2 and pooling layer P2, terminated by a Softmax activation. According to the structural characteristics of the CNN architecture, the structural parameters of convolutional layer C1 and pooling layer P1 are selected as the hyperparameters of the invention, and the value range of this group of hyperparameters is determined to be (Xl, Xu), as shown in Table 1.
Table 1: Parameters of the convolutional layer and the max-pooling layer and their permitted ranges
Step 3: Optimize the hyperparameters of the CNN architecture on the validation set with the periodic-mutation PSO algorithm to obtain a group of hyperparameter values Xi(g);
Step 3-1: Initialize the particle velocities, the fitness function values, the individual best positions Pi and the global best position Pg; set the iteration counter g to 0 and the iteration precision δ ≥ 0; the dimension of the particle search space is D = 4 and the number of particles is N = 10;
The current position vector of each particle in the swarm is Xi = (xi,1, xi,2, ..., xi,4), i = 1, 2, ..., 10; the current velocity vector is Vi = (vi,1, vi,2, ..., vi,4), i = 1, 2, ..., 10; the individual best position vector is Pi = (pi,1, pi,2, ..., pi,4), i = 1, 2, ..., 10; and the global best position vector is Pg = (pg,1, pg,2, ..., pg,4);
Step 3-2: In the g-th iteration, each particle updates its own velocity and position:
Vi(g+1) = ωVi(g) + c1r1(Pi(g) - Xi(g)) + c2r2(Pg(g) - Xi(g))   (1)
Xi(g+1) = Xi(g) + Vi(g)   (2)
where ω = c1 = c2 = 0.5, and r1 and r2 are random numbers in the range (0, 1);
Step 3-3: Apply the global mutation operator to change the positions of all particles in the entire swarm, or apply the local mutation operator to change the position of the elite particle in the swarm:
The global mutation operator changes the position of every particle according to
Xi(g) = Xi(g)[1 + A1(0.5 - r3)δ], i = 1, 2, ..., N   (3)
and the local mutation operator applies an analogous perturbation, with amplitude factor A2 and random number r4, to the elite particle, generating q new candidate particles;
where A1 = A2 = 1, r3 and r4 are random numbers in the range (0, 1), the number of new particles generated by the local mutation operator is q = 3, the global mutation frequency is f1 = 10, and the local mutation frequency is f2 = 2;
Step 3-4: Check whether the particle velocities and positions exceed their bounds; if a value exceeds a bound, replace it with the bound it exceeds. The specific rule is as follows:
If Vi(g) ≤ Vl, then Vi(g) = Vl; if Vi(g) ≥ Vu, then Vi(g) = Vu; if Xi(g) ≤ Xl, then Xi(g) = Xl; if Xi(g) ≥ Xu, then Xi(g) = Xu;
where the velocity range (Vl, Vu) of the particles is (-2, 2), and the position range (Xl, Xu) of the particles takes the values listed in Table 1;
Step 3-5: The optimal position obtained by executing steps 3-1 to 3-4 is the required hyperparameter value Xi(g).
Step 4: Input the hyperparameters Xi(g) into the CNN, and train the resulting CNN with the training set obtained in step 1;
Step 5: Input the test set obtained in step 1 into the trained CNN to obtain the classification results of the images;
Step 6: Judge whether the iteration has reached a termination condition;
Step 6-1: Compute the fitness function value of each particle of the periodic-mutation PSO and plot the fitness function curve, as shown in Fig. 3:
where CNN(Xi(g)) is the accuracy of the classification results obtained in step 5, and Xi(g) is the hyperparameter value obtained in step 3;
Step 6-2: By comparing the fitness function values of the particles in the current iteration obtained in step 6-1, update each particle's individual historical best position Pi(g) and the swarm's best position Pg(g), and obtain the best particle of the current iteration, Xmin(g);
Step 6-3: Judge whether the increase in the best particle's fitness value over five successive generations is smaller than the threshold ε = 0.0001, whether the update of the best particle position in the swarm over five successive generations is smaller than the minimum step size, and whether the iteration counter g has reached the maximum number of iterations, 20; if any one of these termination conditions is satisfied, terminate the iteration.
Step 7: If no termination condition is reached, execute steps 3 to 6;
Step 8: If a termination condition is reached, obtain the final optimal hyperparameters, denoted Xmin(g);
Step 9: Substitute the final optimal hyperparameters Xmin(g) into the CNN and classify the images of the entire data set to obtain the classification results.
Embodiment 2
As shown in Figure 1, the method for the present embodiment is as described below.
Step 1: The cifar-10 object recognition benchmark data set is selected as the data set to be classified; a fragment of the data set is shown in Fig. 4. The data set contains 60000 color images of 32 × 32 pixels, covering 10 classes with 6000 images per class. 50000 images of the data set are randomly selected as the training set, the remaining 10000 images serve as the test set, and 5000 images are randomly drawn from the training set as the validation set.
Steps 2 to 9 are the same as in Embodiment 1. The resulting classification performance on the cifar-10 data set, before and after the improvement of the hyperparameter optimization method, varies with the number of iterations as shown in Fig. 5.
A comparison of the image classification accuracy on the MNIST handwritten digit recognition data set and the cifar-10 object recognition data set, before and after the improvement of the hyperparameter optimization method, is given in Table 2:
Table 2: Accuracy of CNN image classification on the different data sets before and after hyperparameter optimization
The results show that, without changing the classification CNN architecture, the method of the present invention improves the image classification accuracy to a certain extent, exploits the image classification potential of the CNN architecture to the greatest extent, and saves hardware resources and computational cost when the CNN performs image classification; it therefore has practical value in engineering applications.

Claims (5)

1. An image classification method based on hyperparameter-optimized CNN, characterized by comprising the following steps:
Step 1: Pre-process the image data set to be classified and divide it proportionally into a training set T, a test set C and a validation set V;
Step 2: Build the CNN architecture and select the hyperparameters and their value ranges according to the structural characteristics of the CNN architecture;
Step 3: Optimize the hyperparameters of the CNN architecture on the validation set with the periodic-mutation PSO algorithm to obtain a group of hyperparameter values Xi(g);
Step 4: Input the hyperparameters Xi(g) into the CNN, and train the resulting CNN with the training set obtained in step 1;
Step 5: Input the test set C obtained in step 1 into the trained CNN to obtain the classification results of the test set C;
Step 6: Judge whether the iteration has reached a termination condition;
Step 7: If no termination condition is reached, execute steps 3 to 6 and continue iterating;
Step 8: If a termination condition is reached, obtain the final optimal hyperparameters, denoted Xmin(g);
Step 9: Substitute the final optimal hyperparameters Xmin(g) into the CNN and classify the images of the entire data set to obtain the classification results.
2. The image classification method based on hyperparameter-optimized CNN according to claim 1, characterized in that in step 1 the validation set V is drawn from the training set T and satisfies |T| = 9|V|.
3. The image classification method based on hyperparameter-optimized CNN according to claim 1, characterized in that the CNN architecture built in step 2 consists of convolutional layer C1, pooling layer P1, convolutional layer C2 and pooling layer P2, terminated by a Softmax activation; the structural parameters of convolutional layer C1 and pooling layer P1 are selected as the hyperparameters of the invention according to the structural characteristics of the CNN architecture, and the value range of this group of hyperparameters is determined to be (Xl, Xu).
4. The image classification method based on hyperparameter-optimized CNN according to claim 1, characterized in that the process of optimizing the hyperparameters of the CNN architecture on the validation set with the periodic-mutation PSO algorithm in step 3 is as follows:
Step 3-1: Initialize the particle velocities, the fitness function values, the individual best positions Pi and the global best position Pg; set the iteration counter g to 0 and the iteration precision δ ≥ 0; the dimension of the particle search space is D and the number of particles is N;
The current position vector of each particle in the swarm is Xi = (xi,1, xi,2, ..., xi,D), i = 1, 2, ..., N; the current velocity vector is Vi = (vi,1, vi,2, ..., vi,D), i = 1, 2, ..., N; the individual best position vector is Pi = (pi,1, pi,2, ..., pi,D), i = 1, 2, ..., N; and the global best position vector is Pg = (pg,1, pg,2, ..., pg,D);
Step 3-2: In the g-th iteration, each particle updates its own velocity and position:
Vi(g+1) = ωVi(g) + c1r1(Pi(g) - Xi(g)) + c2r2(Pg(g) - Xi(g))   (1)
Xi(g+1) = Xi(g) + Vi(g)   (2)
where ω is the inertia factor, c1 and c2 are learning factors (constants), and r1 and r2 are random numbers in the range (0, 1);
Step 3-3: Apply the global mutation operator to change the positions of all particles in the entire swarm, or apply the local mutation operator to change the position of the elite particle in the swarm:
The global mutation operator and the local mutation operator change particle positions as follows:
Xi(g) = Xi(g)[1 + A1(0.5 - r3)δ], i = 1, 2, ..., N   (3)
where A1 and A2 are user-defined amplitude factors (constants), r3 and r4 are random numbers in the range (0, 1), the elite particle is the best particle in the current swarm, q is the number of new particles generated by the local mutation operator, f1 is the global mutation frequency, and f2 is the local mutation frequency;
Step 3-4: Check whether the particle velocities and positions exceed their bounds; if a value exceeds a bound, replace it with the bound it exceeds. The specific rule is as follows:
If Vi(g) ≤ Vl, then Vi(g) = Vl; if Vi(g) ≥ Vu, then Vi(g) = Vu; if Xi(g) ≤ Xl, then Xi(g) = Xl; if Xi(g) ≥ Xu, then Xi(g) = Xu;
where (Vl, Vu) is the velocity range of the particles and (Xl, Xu) is the position range of the particles;
Step 3-5: The optimal position obtained by executing steps 3-1 to 3-4 is the required hyperparameter value Xi(g).
5. The image classification method based on hyperparameter-optimized CNN according to claim 1, characterized in that the process of judging in step 6 whether the iteration has reached a termination condition is as follows:
Step 6-1: Compute the fitness function value of each particle of the periodic-mutation PSO:
where CNN(Xi(g)) is the accuracy of the classification results obtained in step 5 of claim 1, and Xi(g) is the hyperparameter value obtained in step 3 of claim 1;
Step 6-2: By comparing the fitness function values of the particles in the current iteration obtained in step 6-1, update each particle's individual historical best position Pi(g) and the swarm's best position Pg(g), and obtain the best particle of the current iteration, Xmin(g);
Step 6-3: Judge whether the increase in the fitness value of the best particle is smaller than the threshold ε, whether the update of the best particle position in the swarm is smaller than the prescribed minimum step size, and whether the iteration counter g has reached the maximum number of iterations gmax; if any one of these termination conditions is satisfied, terminate the iteration.
CN201910671268.0A 2019-07-24 2019-07-24 Image classification method based on hyper-parameter optimization CNN Active CN110399917B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910671268.0A CN110399917B (en) 2019-07-24 2019-07-24 Image classification method based on hyper-parameter optimization CNN

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910671268.0A CN110399917B (en) 2019-07-24 2019-07-24 Image classification method based on hyper-parameter optimization CNN

Publications (2)

Publication Number Publication Date
CN110399917A true CN110399917A (en) 2019-11-01
CN110399917B CN110399917B (en) 2023-04-18

Family

ID=68324921

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910671268.0A Active CN110399917B (en) 2019-07-24 2019-07-24 Image classification method based on hyper-parameter optimization CNN

Country Status (1)

Country Link
CN (1) CN110399917B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190180188A1 (en) * 2017-12-13 2019-06-13 Cognizant Technology Solutions U.S. Corporation Evolution of Architectures For Multitask Neural Networks
CN108334949A (en) * 2018-02-11 2018-07-27 浙江工业大学 A kind of tachytelic evolution method of optimization depth convolutional neural networks structure
CN109085469A (en) * 2018-07-31 2018-12-25 中国电力科学研究院有限公司 A kind of method and system of the signal type of the signal of cable local discharge for identification
CN109919202A (en) * 2019-02-18 2019-06-21 新华三技术有限公司合肥分公司 Disaggregated model training method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Jin et al.: "Improved particle swarm optimization-based feature selection and joint parameter optimization algorithm for support vector machines", Journal of Computer Applications *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110942090A (en) * 2019-11-11 2020-03-31 北京迈格威科技有限公司 Model training method, image processing method, device, electronic equipment and storage medium
CN110942090B (en) * 2019-11-11 2024-03-29 北京迈格威科技有限公司 Model training method, image processing device, electronic equipment and storage medium
CN111160459A (en) * 2019-12-30 2020-05-15 上海依图网络科技有限公司 Device and method for optimizing hyper-parameters
CN112197876A (en) * 2020-09-27 2021-01-08 中国科学院光电技术研究所 Single far-field type depth learning wavefront restoration method based on four-quadrant discrete phase modulation

Also Published As

Publication number Publication date
CN110399917B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
Ren et al. A scatter learning particle swarm optimization algorithm for multimodal problems
Wang et al. Stud krill herd algorithm
CN110399917A (en) A kind of image classification method based on hyperparameter optimization CNN
Jovanovic et al. Ant colony optimization algorithm with pheromone correction strategy for the minimum connected dominating set problem
Luo et al. A clonal selection algorithm for dynamic multimodal function optimization
CN110363344A (en) Probability integral parameter prediction method based on MIV-GP algorithm optimization BP neural network
CN106502092B (en) A kind of thermal process model parameter identification method using improvement Hybrid Particle Swarm
Hoseini et al. Efficient contrast enhancement of images using hybrid ant colony optimisation, genetic algorithm, and simulated annealing
CN107392919B (en) Adaptive genetic algorithm-based gray threshold acquisition method and image segmentation method
Lan et al. A two-phase learning-based swarm optimizer for large-scale optimization
CN107506865B (en) Load prediction method and system based on LSSVM optimization
CN108399450A (en) Improvement particle cluster algorithm based on biological evolution principle
CN105868775A (en) Imbalance sample classification method based on PSO (Particle Swarm Optimization) algorithm
CN105844628B (en) It is a kind of based on the table ore zoning map of krill optimization algorithm as split plot design
CN108564592A (en) Based on a variety of image partition methods for being clustered to differential evolution algorithm of dynamic
Li et al. Dynamic community detection algorithm based on incremental identification
CN113011076A (en) Efficient particle swarm optimization method based on RBF proxy model
CN110097176A (en) A kind of neural network structure searching method applied to air quality big data abnormality detection
CN107292381A (en) A kind of method that mixed biologic symbiosis for single object optimization is searched for
CN110287985A (en) A kind of deep neural network image-recognizing method based on the primary topology with Mutation Particle Swarm Optimizer
CN112733458A (en) Engineering structure signal processing method based on self-adaptive variational modal decomposition
CN108985323A (en) A kind of short term prediction method of photovoltaic power
CN106485030B (en) A kind of symmetrical border processing method for SPH algorithm
CN109034479B (en) Multi-target scheduling method and device based on differential evolution algorithm
CN105205534B (en) A kind of three value FPRM circuit areas and power consumption optimum polarity search method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant