CN107516128A - Flower recognition method using a convolutional neural network with ReLU activation functions - Google Patents

Flower recognition method using a convolutional neural network with ReLU activation functions Download PDF

Info

Publication number
CN107516128A
CN107516128A CN201710436948.5A
Authority
CN
China
Prior art keywords
cnn
function
layer
flowers
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710436948.5A
Other languages
Chinese (zh)
Inventor
郭子琰
舒心
刘常燕
李雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications
Priority to CN201710436948.5A
Publication of CN107516128A
Legal status: Pending


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/28 - Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a flower recognition method based on a convolutional neural network (CNN) with ReLU activation functions, belonging to the technical field of image recognition. The method comprises the steps of: setting the basic parameters of the CNN; initializing the weights and bias terms and designing the convolutional and down-sampling layers layer by layer; generating a random sequence and selecting 50 samples at a time for batch training, completing the forward pass, error propagation and gradient computation, and summing the gradients into the weight model for the next weight update; and calling the configured training and update functions to train the network and test the accuracy on the test samples. The invention can perform fast and effective flower recognition under varying illumination, rotation and occlusion conditions.

Description

Flower recognition method using a convolutional neural network with ReLU activation functions
Technical field
The invention belongs to the technical field of image recognition, and more particularly relates to a flower recognition method based on a convolutional neural network with a ReLU activation function.
Background technology
With the rapid development of science and technology and the popularity of smartphones, people increasingly tend to replace cumbersome text with more vivid and easily understood pictures. However, the pictorialization of information also creates problems. With traditional written records we can directly search keywords to retrieve the corresponding content, but when information is expressed as pictures we cannot directly search or process it. Although advances in computer technology allow us to process pictures and extract important information from them, little of this work has addressed flower recognition, and the existing flower recognition experiments and apps suffer from low recognition rates and slow computation. Better methods are therefore needed to recognize flowers more reliably.
Since the 1960s, when Hubel and Wiesel's studies of the visual cortex laid the groundwork for convolutional neural networks, CNNs have developed steadily and attracted wide attention. Because a CNN can take an image as direct input, it avoids complex early-stage image processing, and it recognizes two-dimensional patterns well under displacement, scaling and other forms of distortion, so it is widely used. Neural networks are a common method in image processing, but an image is usually represented by its pixel vector, so the data volume is very large and grows exponentially through the multi-layer computations of an ordinary network, making training practically infeasible. A CNN, through local receptive fields and weight sharing, effectively reduces the number of parameters and improves training speed. We therefore use a convolutional neural network for flower recognition, to improve recognition accuracy and promote the development of related techniques. In existing CNN frameworks, sigmoid-family functions are commonly used as activation functions; to further improve the CNN's computing speed, we adopt a convolutional neural network based on the approximate biological activation function ReLU for flower recognition.
Summary of the invention
The technical problem to be solved by the invention is to overcome the deficiencies of the prior art by providing a flower recognition method based on a convolutional neural network with ReLU functions, which solves the problems of low recognition rate and slow recognition speed in the prior art and thereby improves flower recognition technology.
To solve the above technical problem, the invention specifically adopts the following technical scheme:
A flower recognition method based on a convolutional neural network with ReLU functions specifically includes the following steps:
Step 1: pre-process the flower images by image graying and bilinear interpolation;
Step 2: configure the basic parameters of the CNN and initialize the CNN's weights and bias terms; the basic parameters of the CNN include the number of convolutional and down-sampling layers, the size of the convolution kernels, the down-sampling factor, the network structure and the training parameters;
Step 3: set the functions used to train the CNN, namely the output function of the forward pass and the gradient computation function of the backward pass, and select N flower image samples for batch training, where N is a positive integer;
The functions used to train the CNN are set as follows:
Step 3.1: using the output function of the forward pass, apply the function to the input of each CNN layer and pass the function outputs on layer by layer;
Step 3.2: compute the error between the actual and predicted values and, using the gradient computation function of the backward pass, modify the weights and bias terms of each CNN layer by gradient minimization;
Step 4: train with and call the functions set in step 3 to complete the recognition of the selected N flower image samples.
As a further preferred scheme of the flower recognition method based on a convolutional neural network with ReLU functions, in step 1 the flower images are scaled by bilinear interpolation.
As a further preferred scheme of the flower recognition method based on a convolutional neural network with ReLU functions, in step 4 the CNN specifically comprises the following structure:
an input layer, for reading in images that meet simple formatting rules;
a convolutional layer, for feature extraction: each neuron's input is locally connected to the previous layer and extracts local features;
a down-sampling layer, for further feature extraction and for reducing the dimensionality of the convolution results, thereby reducing the amount of computation;
an output layer, for outputting the picture features extracted by the convolutional neural network.
As a further preferred scheme of the flower recognition method based on a convolutional neural network with ReLU functions, in step 4 the output function of the forward pass is specifically expressed as follows:
x^l = f(u^l), u^l = W^l x^{l-1} + b^l
where f is the ReLU activation function, W is the weight matrix, b is the bias term, and x^{l-1} is the output of layer l-1, i.e. the input of layer l.
As a further preferred scheme of the flower recognition method based on a convolutional neural network with ReLU functions, in step 4 the gradient computation function of the backward pass is specifically expressed as follows:
If the CNN structure is a convolutional layer, the gradient computation function of the backward pass is:
∂E/∂b_j = Σ_{u,v} (δ_j^l)_{uv}
where E is the error, b is the bias term, δ_j^l denotes the j-th sensitivity map in layer l, and (u, v) is the position of an element in the sensitivity map;
If the CNN structure is a down-sampling layer, the gradient computation function of the backward pass is:
∂E/∂β_j = Σ_{u,v} (δ_j^l ∘ d_j^l)_{uv}
where E is the error, δ_j^l denotes the j-th sensitivity map in layer l, β is the multiplicative bias, d_j^l = down(x_j^{l-1}), and down(·) denotes the down-sampling function.
Compared with the prior art, the invention achieves the following technical effects by adopting the above technical scheme:
1. Compared with conventional image recognition methods, the invention uses the local receptive fields and weight sharing of a convolutional neural network, which effectively reduces the huge number of parameters in image recognition and improves training speed;
2. The invention can act directly on two-dimensional grayscale images, breaking with the traditional approach of manually extracting empirically chosen features from flower images and then classifying on those features, and thereby avoiding the blindness of manual intervention;
3. Compared with a traditional CNN, which typically chooses a sigmoid-family activation function and requires pre-training (otherwise the gradient vanishes and the network fails to converge), the approximate biological activation function ReLU without pre-training trains better than ordinary activation functions, in some cases even better than ordinary activation functions after pre-training, and trains faster.
Brief description of the drawings:
Fig. 1 is a comparison of the graphs of the ReLU and Softplus functions used in the present invention;
Fig. 2 is a comparison of the graphs of the ReLU and Sigmoid functions used in the present invention;
Fig. 3 is the structure diagram of the CNN used by the present invention;
Fig. 4 is the flow chart of the method of the present invention.
Detailed description of the embodiments
The technical scheme of the present invention is described in further detail below with reference to the accompanying drawings:
A flower recognition method based on a convolutional neural network with ReLU functions, as shown in Fig. 4, specifically includes the following steps:
Step 1: pre-process the flower images by image graying and bilinear interpolation;
Step 2: configure the basic parameters of the CNN and initialize the CNN's weights and bias terms; the basic parameters of the CNN include the number of convolutional and down-sampling layers, the size of the convolution kernels, the down-sampling factor, the network structure and the training parameters;
Step 3: set the functions used to train the CNN, namely the output function of the forward pass and the gradient computation function of the backward pass, and select N flower image samples for batch training, where N is a positive integer;
The functions used to train the CNN are set as follows:
Step 3.1: using the output function of the forward pass, apply the function to the input of each CNN layer and pass the function outputs on layer by layer;
Step 3.2: compute the error between the actual and predicted values and, using the gradient computation function of the backward pass, modify the weights and bias terms of each CNN layer by gradient minimization;
Step 4: train with and call the functions set in step 3 to complete the recognition of the selected N flower image samples.
The specific embodiment is as follows:
1 Image preprocessing
The experiments were implemented on the MATLAB R2014a platform. Pre-processing was carried out by image graying and bilinear interpolation.
Color can interfere with the identification of flower species, and color images require large storage and are inconvenient to process. Color images are therefore converted into grayscale images that contain the same information but are simpler and faster to process. This process, called graying, facilitates modular processing of the image, helps remove image noise to obtain a better binary image, and reduces the amount of computation in image processing.
After graying, the input images may still differ in size: some have a higher resolution, some a lower one, and their aspect ratios are not necessarily the same. Since the convolutional neural network used here requires a fixed input size, images of different sizes must be scaled, and in most cases this is done by bilinear interpolation, so that the output image has a fixed resolution. A minimal sketch of this preprocessing follows.
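The following numpy sketch illustrates the two preprocessing steps. It is an illustration only: the patent reports a MATLAB R2014a implementation without code, so the BT.601 luma weights and the 28 x 28 default target size are assumptions (the target size is taken from the input requirements described below).

```python
import numpy as np

def to_gray(rgb):
    # Weighted average of the R, G, B channels (assumed BT.601 luma weights).
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def bilinear_resize(img, out_h=28, out_w=28):
    # Map each output pixel to fractional input coordinates and blend the
    # four surrounding input pixels.
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, in_h - 1)
    x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]              # vertical blend weights
    wx = (xs - x0)[None, :]              # horizontal blend weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```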
2 Optimization of the activation function with the ReLU function
As shown in Fig. 1, traditional neural networks commonly use the Sigmoid function as the activation function:
Sigmoid(x) = 1/(1 + e^(-x)) (1)
Mathematically, the nonlinear Sigmoid function has a large signal gain in its central region and a small gain toward both sides, which works well for mapping signals into feature space. The output of the standard sigmoid, however, is not sparse: penalty factors such as L1, L1/L2 or Student-t must be used to train away the redundant values close to 0 and produce sparse data. Pre-training is therefore required, otherwise the gradient vanishes and the network cannot converge. At present, a class of approximate biological activation functions, mainly the ReLU and Softplus functions, is widely used in convolutional neural networks; the two are compared below.
The ReLU function is defined as:
ReLU(x) = max(0, x) (2)
ReLU is a linear rectification function: if the computed value is less than 0 it is set to 0, otherwise it is kept unchanged. This forcibly sets part of the data to 0, but it has been shown that networks trained in this way exhibit moderate sparsity, and the visualized features after training closely resemble those obtained with traditional pre-training, which shows that ReLU can guide moderate sparsity on its own. It therefore has a clear advantage over the Sigmoid function.
As shown in Fig. 2, the Softplus function is another approximate biological activation function; its graph is similar to that of ReLU but smoother. It is defined as follows:
Softplus(x) = ln(1 + e^x) (3)
However, deep networks depend less on strong nonlinearity, and sparse features do not require the network to have a powerful mechanism for handling linear inseparability, so the simple and fast linear activation function ReLU is the more suitable choice. A small numeric comparison of the three functions is sketched below.
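The sketch below, an illustration rather than part of the patented method, evaluates the three activation functions of equations (1)-(3) on the same inputs; it shows that ReLU zeroes negative inputs exactly while Softplus only approaches zero.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))    # Eq. (1)

def relu(x):
    return np.maximum(0.0, x)          # Eq. (2): max(0, x)

def softplus(x):
    return np.log1p(np.exp(x))         # Eq. (3): ln(1 + e^x)

x = np.linspace(-5.0, 5.0, 11)
print(relu(x))      # exact zeros on the negative side (true sparsity)
print(softplus(x))  # small positive values that only approach zero
print(sigmoid(x))   # saturates toward 0 and 1 at both ends
```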
3 Parameter settings
The basic parameters of the CNN are configured first, including the number of convolutional and down-sampling layers, the size of the convolution kernels, the down-sampling factor, the network structure and the training parameters. The convolution kernels are then initialized, the biases are set, and the final single-layer perceptron is designed. Because the convolutional and down-sampling layers are designed layer by layer, the initial weights are set to random numbers between -1 and 1, and the weights and thresholds of the final single-layer perceptron are designed separately; an illustrative initialization is sketched below.
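A minimal initialization sketch follows. The map counts and feature length are inferred from the architecture description below (12 maps in S1, 24 maps in C2, 5 x 5 kernels, pooling factor 2 on a 28 x 28 input); the zero biases and the fixed random seed are illustrative assumptions, not taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shapes inferred from Fig. 3 as described below: C1 -> 12 maps of 24x24,
# S1 -> 12x12, C2 -> 24 maps of 8x8, S2 -> 4x4, so the flattened feature
# vector has 24 * 4 * 4 entries.
num_maps_c1, num_maps_c2 = 12, 24
feat_dim = num_maps_c2 * 4 * 4
num_classes = 2                      # roses vs. daisies in the experiments

kernels_c1 = rng.uniform(-1.0, 1.0, size=(num_maps_c1, 5, 5))
kernels_c2 = rng.uniform(-1.0, 1.0, size=(num_maps_c2, num_maps_c1, 5, 5))
biases_c1 = np.zeros(num_maps_c1)
biases_c2 = np.zeros(num_maps_c2)

# Final single-layer perceptron: weights and thresholds, also in [-1, 1].
w_out = rng.uniform(-1.0, 1.0, size=(num_classes, feat_dim))
b_out = rng.uniform(-1.0, 1.0, size=num_classes)
```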
The flower recognition CNN framework designed on this basis is shown in Fig. 3.
The CNN framework for flower recognition is specifically as follows:
(1) Input. When the original image is not a grayscale image, it is grayed first; when its size is not 28 x 28, it is scaled by bilinear interpolation to meet the input requirements.
(2) C1 layer. C1 is a convolutional layer that uses convolution kernels of size 5 x 5 and finally obtains feature maps of size 24 x 24. The convolution result is not stored directly in C1: it first passes through an activation function, and the result serves as the feature value of a C1 neuron. The traditional choice is a sigmoid-family activation function, but that choice requires pre-training, otherwise the gradient vanishes and the network cannot converge; the approximate biological activation function ReLU, with no pre-training, trains better than ordinary activation functions, in some cases even better than ordinary activation functions after pre-training, and trains faster. In practice a bias term is also added during convolution. For an image block x convolved with kernel w, with bias term b and convolution output y, the operation (sketched below) is:
y = ReLU(wx + b) = max(0, wx + b) (4)
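A direct numpy sketch of equation (4), assuming a 'valid' convolution so that a 28 x 28 input and a 5 x 5 kernel yield a 24 x 24 map as described above:

```python
import numpy as np

def conv2d_valid(x, k):
    # 'Valid' 2-D correlation: the output shrinks by the kernel size - 1.
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for u in range(oh):
        for v in range(ow):
            out[u, v] = np.sum(x[u:u + kh, v:v + kw] * k)
    return out

def conv_layer(x, k, b):
    # Eq. (4): y = ReLU(w x + b) = max(0, w x + b)
    return np.maximum(0.0, conv2d_valid(x, k) + b)
```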
(3) S1 layer. S1 is a sub-sampling layer that obtains 12 feature maps of size 12 x 12. Each output value is formed by summing a non-overlapping 2 x 2 sub-block x of C1, multiplying by a weight w and adding a bias term b. The sub-sampling computation (sketched below) is:
y = ReLU(w Σ x_i + b) = max(0, w Σ x_i + b) (5)
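A vectorized sketch of equation (5); the single scalar weight and bias per map follow the description above:

```python
import numpy as np

def subsample(x, w, b, s=2):
    # Eq. (5): sum each non-overlapping s x s block, scale by the scalar
    # weight w, add the bias b, then rectify.
    h = x.shape[0] // s * s
    width = x.shape[1] // s * s
    blocks = x[:h, :width].reshape(h // s, s, width // s, s).sum(axis=(1, 3))
    return np.maximum(0.0, w * blocks + b)
```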
(4) C2 layer. C2 is also a feature extraction layer. It resembles C1 but differs in certain respects: C2 has 24 feature maps in total, and each feature map in C2 is obtained by convolving a combination of several, or all, of the feature maps of S1.
(5) Output layer. The output layer is fully connected with S2 (the second sub-sampling layer, which follows C2 in Fig. 3 analogously to S1): every neuron in S2 is connected to each neuron of the output layer. Softmax regression is used for classification because it produces a good probability distribution over the outputs; the activation values finally obtained are the picture features extracted by the convolutional neural network. A sketch of this output stage follows.
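An illustrative sketch of the softmax output stage; output_layer, w_out, b_out and features are names introduced here for illustration, not taken from the patent:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: shift by the maximum before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def output_layer(features, w_out, b_out):
    # Full connection from the flattened S2 features to the class scores,
    # followed by softmax regression.
    return softmax(w_out @ features + b_out)
```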
4 Function setup
The functions used to train the CNN are set, comprising the output function of the forward pass (the activation function) and the gradient computation function of the backward pass. N flower image samples are selected for batch training. First, using the output function of the forward pass, the function is applied to the input of each layer (that is, the output of the previous layer) and the results are passed on layer by layer. Then the error between the actual and predicted values is computed; to minimize the error, the gradient function is set, the gradients of the layers are summed, and the weights and bias terms of each preceding layer are modified by gradient minimization, making the CNN more precise.
The functions comprise the output function of the forward pass and the gradient computation function of the backward pass:
(1) Output function of the forward pass:
In a CNN, forward propagation focuses on the forward pass through the input layer, the convolutional layers and the pooling layers. Denoting the current layer by l, the output of the current layer can be expressed as:
x^l = f(u^l), u^l = W^l x^{l-1} + b^l (6)
As mentioned above, the output activation function chosen is the ReLU function, which trains better. In the formula above, f is the ReLU activation function, W is the weight matrix, b is the bias term, and x^{l-1} is the output of layer l-1, i.e. the input of layer l.
In the forward pass, the input parameters are obtained first, two rounds of down-sampling are then applied, and the batch of data is fed into the final single-layer perceptron, with the output layer obtained through full connection; a sketch of this layer-by-layer pass follows.
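A minimal sketch of equation (6) applied layer by layer for the fully connected case; the convolutional and sub-sampling layers use the conv_layer and subsample sketches given earlier:

```python
import numpy as np

def forward(x, weights, biases):
    # Eq. (6) per layer: x_l = f(W_l x_{l-1} + b_l), with f the ReLU.
    # All intermediate outputs are kept so the backward pass can reuse them.
    activations = [x]
    for W, b in zip(weights, biases):
        x = np.maximum(0.0, W @ x + b)
        activations.append(x)
    return activations
```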
(2) Gradient computation function of the backward pass:
On the basis of the forward-pass results, the network error is computed and propagated, and the gradients are calculated. When extracting the error, convolutional layers and down-sampling layers are treated separately:
1. If the layer is a convolutional layer
The error is passed in from the down-sampling layer that follows it, and the error of the down-sampling layer can be expanded and propagated. Since during training the error of the convolutional layer is derived from the data of the down-sampling layer through the ReLU function, the error obtained in the convolutional layer is obtained by differentiating the ReLU function;
In a convolutional layer, the feature maps of the preceding layer are convolved with learnable kernels and passed through the activation function ReLU to form the output feature maps. Each output map may combine convolutions of several input maps; in general:
x_j^l = f(Σ_{i∈M_j} x_i^{l-1} * k_{ij}^l + b_j^l) (7)
where x_j^l denotes the j-th feature map of layer l (a convolutional layer), k_{ij}^l is the kernel connecting input map i to output map j, f(·) denotes the activation function, and M_j denotes the set of input maps.
For each map j in the convolutional layer, its sensitivity is computed from the corresponding map of the down-sampling layer:
δ_j^l = β_j^{l+1} (f'(u_j^l) ∘ up(δ_j^{l+1})) (8)
where β is the multiplicative bias, δ_j^l denotes the j-th sensitivity map in layer l, and up(·) denotes the up-sampling operation.
The bias gradient is computed by summing over all entries of δ_j^l:
∂E/∂b_j = Σ_{u,v} (δ_j^l)_{uv} (9)
where E is the error, b is the bias term, δ_j^l denotes the j-th sensitivity map in layer l, and (u, v) is the position of an element in the sensitivity map.
Finally, the gradient of the kernel weights is computed by back-propagation, summing all the gradients in which the weight is involved:
∂E/∂k_{ij}^l = Σ_{u,v} (δ_j^l)_{uv} (p_i^{l-1})_{uv} (10)
where k_{ij}^l is the convolution kernel used in the connection between the i-th input feature map of layer l and the j-th output feature map, and (p_i^{l-1})_{uv} denotes the patch of x_i^{l-1} that was multiplied elementwise by k_{ij}^l during convolution to produce the element at position (u, v) of the output map. A sketch of these computations follows.
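The sketch below implements equations (8)-(10) for a single map, under the stated assumptions that f is the ReLU (so f'(u) is 1 where u > 0 and 0 elsewhere) and that the following down-sampling layer sums 2 x 2 blocks:

```python
import numpy as np

def up(delta, s=2):
    # up(.): replicate each sensitivity entry over an s x s block.
    return np.kron(delta, np.ones((s, s)))

def conv_sensitivity(u_l, delta_next, beta_next, s=2):
    # Eq. (8): delta_j^l = beta_j^{l+1} (f'(u_j^l) o up(delta_j^{l+1})).
    return beta_next * ((u_l > 0).astype(float) * up(delta_next, s))

def conv_grads(x_prev, delta_l, kh=5, kw=5):
    # Eq. (9): the bias gradient sums the sensitivity map over (u, v).
    grad_b = delta_l.sum()
    # Eq. (10): the kernel gradient sums, over (u, v), the sensitivity
    # times the input patch (p_i^{l-1})_{uv} it multiplied.
    grad_k = np.zeros((kh, kw))
    oh, ow = delta_l.shape
    for u in range(oh):
        for v in range(ow):
            grad_k += delta_l[u, v] * x_prev[u:u + kh, v:v + kw]
    return grad_k, grad_b
```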
2. If the layer is a down-sampling layer
A down-sampling layer produces down-sampled versions of its input maps: with N inputs there are N outputs, and each output can be expressed as:
x_j^l = f(β_j^l down(x_j^{l-1}) + b_j^l) (11)
where down(·) denotes the down-sampling function. This function makes every output smaller than its input in each dimension, and each output has its own multiplicative bias β and additive bias b.
The gradient of the additive bias b is simply the sum of the elements of the error signal map:
∂E/∂b_j = Σ_{u,v} (δ_j^l)_{uv} (12)
where E is the error, b is the bias term, δ_j^l denotes the j-th sensitivity map in layer l, and (u, v) is the position of an element in the sensitivity map.
The multiplicative bias β involves the original down-sampled map of the current layer computed during forward propagation, so it is advantageous to save these maps during the forward pass. Define:
d_j^l = down(x_j^{l-1})
Then the gradient of β is:
∂E/∂β_j = Σ_{u,v} (δ_j^l ∘ d_j^l)_{uv} (13)
where E is the error, δ_j^l denotes the j-th sensitivity map in layer l, β is the multiplicative bias, and down(·) denotes the down-sampling function. A sketch of these gradients follows.
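A sketch of equations (12)-(13), assuming the down-sampling sums non-overlapping 2 x 2 blocks as in the S1 layer:

```python
import numpy as np

def down(x, s=2):
    # down(.): sum non-overlapping s x s blocks.
    h = x.shape[0] // s * s
    w = x.shape[1] // s * s
    return x[:h, :w].reshape(h // s, s, w // s, s).sum(axis=(1, 3))

def pool_grads(x_prev, delta_l):
    # Eq. (12): additive-bias gradient = sum of the sensitivity map.
    grad_b = delta_l.sum()
    # Eq. (13): multiplicative-bias gradient, with d_j^l = down(x_j^{l-1})
    # saved from the forward pass.
    d = down(x_prev)
    grad_beta = (delta_l * d).sum()
    return grad_beta, grad_b
```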
Finally the gradient update is carried out, updating the weights of the feature extraction layers and of the final single-layer perceptron.
5 Training the CNN
The functions set in step 3 are trained and called, completing the recognition of the selected N flower image samples.
6 Experimental verification
To verify the feasibility of flower recognition with the ReLU-based convolutional neural network, we selected 300 pictures each of roses and daisies for the experiments, with 50 samples of each flower used as the training set. A recognition error rate of 7.5% was obtained, showing good recognition performance.
To verify the superiority of the ReLU activation function chosen here, a series of comparison experiments was carried out on the MNIST handwritten digit dataset, the CIFAR-10 basic dataset and the JC-NORB dataset, comparing the recognition error rates of the ReLU, Sigmoid and Softplus functions without pre-training. Table 1 shows the recognition error rates of the different activation functions on the different datasets:
Table 1
The experimental results in Table 1 show that, without pre-training, the approximate biological activation function ReLU has a large advantage over the Sigmoid and Softplus functions. Its recognition rate is close to, and somewhat better than, that of the Softplus function, and because the ReLU function is simple and efficient it achieves a faster recognition speed, further demonstrating the feasibility and superiority of ReLU as an activation function.
In summary, the flower recognition method based on a convolutional neural network with ReLU functions proposed by the invention keeps a traditional CNN framework and improves the activation function, and the results obtained are better than those of existing CNN image recognition methods based on sigmoid activation functions.
The embodiments of the present invention have been explained in detail above with reference to the accompanying drawings, but the invention is not limited to these embodiments; those of ordinary skill in the art can make various changes within the scope of their knowledge without departing from the concept of the invention.

Claims (5)

1. A flower recognition method based on a convolutional neural network with ReLU functions, characterized by specifically including the following steps:
Step 1: pre-process the flower images by image graying and bilinear interpolation;
Step 2: configure the basic parameters of the CNN and initialize the CNN's weights and bias terms; the basic parameters of the CNN include the number of convolutional and down-sampling layers, the size of the convolution kernels, the down-sampling factor, the network structure and the training parameters;
Step 3: set the functions used to train the CNN, namely the output function of the forward pass and the gradient computation function of the backward pass, and select N flower image samples for batch training, where N is a positive integer;
The functions used to train the CNN are set as follows:
Step 3.1: using the output function of the forward pass, apply the function to the input of each CNN layer and pass the function outputs on layer by layer;
Step 3.2: compute the error between the actual and predicted values and, using the gradient computation function of the backward pass, modify the weights and bias terms of each CNN layer by gradient minimization;
Step 4: train with and call the functions set in step 3 to complete the recognition of the selected N flower image samples.
2. The flower recognition method based on a convolutional neural network with ReLU functions according to claim 1, characterized in that: in step 1, the flower images are scaled by bilinear interpolation.
3. The flower recognition method based on a convolutional neural network with ReLU functions according to claim 1, characterized in that in step 4 the CNN specifically comprises the following structure:
an input layer, for reading in images that meet simple formatting rules;
a convolutional layer, for feature extraction: each neuron's input is locally connected to the previous layer and extracts local features;
a down-sampling layer, for further feature extraction and for reducing the dimensionality of the convolution results, thereby reducing the amount of computation;
an output layer, for outputting the picture features extracted by the convolutional neural network.
4. The flower recognition method based on a convolutional neural network with ReLU functions according to claim 1, characterized in that in step 4 the output function of the forward pass is specifically expressed as follows:
x^l = f(u^l), u^l = W^l x^{l-1} + b^l
where f is the ReLU activation function, W is the weight matrix, b is the bias term, and x^{l-1} is the output of layer l-1, i.e. the input of layer l.
5. The flower recognition method based on a convolutional neural network with ReLU functions according to claim 1, characterized in that in step 4 the gradient computation function of the backward pass is specifically expressed as follows:
if the CNN structure is a convolutional layer, the gradient computation function of the backward pass is:
∂E/∂b_j = Σ_{u,v} (δ_j^l)_{uv}
where E is the error, b is the bias term, δ_j^l denotes the j-th sensitivity map in layer l, and (u, v) is the position of an element in the sensitivity map;
if the CNN structure is a down-sampling layer, the gradient computation function of the backward pass is:
∂E/∂β_j = Σ_{u,v} (δ_j^l ∘ d_j^l)_{uv}
where E is the error, δ_j^l denotes the j-th sensitivity map in layer l, β is the multiplicative bias, d_j^l = down(x_j^{l-1}), and down(·) denotes the down-sampling function.
CN201710436948.5A 2017-06-12 2017-06-12 Flower recognition method using a convolutional neural network with ReLU activation functions Pending CN107516128A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710436948.5A CN (en) 2017-06-12 2017-06-12 Flower recognition method using a convolutional neural network with ReLU activation functions

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710436948.5A CN (en) 2017-06-12 2017-06-12 Flower recognition method using a convolutional neural network with ReLU activation functions

Publications (1)

Publication Number Publication Date
CN107516128A true CN107516128A (en) 2017-12-26

Family

ID=60721751

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710436948.5A Pending CN107516128A (en) Flower recognition method using a convolutional neural network with ReLU activation functions

Country Status (1)

Country Link
CN (1) CN107516128A (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104850845A (en) * 2015-05-30 2015-08-19 大连理工大学 Traffic sign recognition method based on asymmetric convolution neural network
CN106778902A (en) * 2017-01-03 2017-05-31 河北工业大学 Milk cow individual discrimination method based on depth convolutional neural networks

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
许可: "卷积神经网络在图像识别上的应用的研究" (Research on the Application of Convolutional Neural Networks in Image Recognition), 《中国优秀硕士学位论文全文数据库》 (China Masters' Theses Full-text Database) *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108171275A (en) * 2018-01-17 2018-06-15 百度在线网络技术(北京)有限公司 Method and apparatus for identifying flowers
CN108898059A (en) * 2018-05-30 2018-11-27 上海应用技术大学 Flower recognition method and device
CN109325484A (en) * 2018-07-30 2019-02-12 北京信息科技大学 Flower image classification method based on background prior saliency
CN109325484B (en) * 2018-07-30 2021-08-24 北京信息科技大学 Flower image classification method based on background prior saliency
CN108858201A (en) * 2018-08-15 2018-11-23 深圳市烽焌信息科技有限公司 Robot for child care and storage medium
CN108994855A (en) * 2018-08-15 2018-12-14 深圳市烽焌信息科技有限公司 Periodic garbage cleaning method and robot
CN109978036A (en) * 2019-03-11 2019-07-05 华瑞新智科技(北京)有限公司 Deep learning model training method for object detection and object detection method
CN110228673A (en) * 2019-04-17 2019-09-13 华南师范大学 Intelligent waste-sorting bin
CN113191955A (en) * 2021-06-17 2021-07-30 江苏奥易克斯汽车电子科技股份有限公司 Method and device for image super-resolution reconstruction
CN116824512A (en) * 2023-08-28 2023-09-29 西华大学 27.5kV visual grounding disconnecting link state identification method and device
CN116824512B (en) * 2023-08-28 2023-11-07 西华大学 27.5kV visual grounding disconnecting link state identification method and device

Similar Documents

Publication Publication Date Title
CN107516128A (en) Flower recognition method using a convolutional neural network with ReLU activation functions
Rahman et al. A new benchmark on american sign language recognition using convolutional neural network
Phung et al. A deep learning approach for classification of cloud image patches on small datasets
CN103955702B (en) SAR image terrain classification method based on depth RBF network
CN109711426B (en) Pathological image classification device and method based on GAN and transfer learning
CN108304826A (en) Facial expression recognizing method based on convolutional neural networks
CN107798381A (en) Image recognition method based on convolutional neural networks
CN112862690B (en) Transformers-based low-resolution image super-resolution method and system
CN104866810A (en) Face recognition method of deep convolutional neural network
CN109785344A (en) Remote sensing image segmentation method based on a dual-channel residual network with feature recalibration
CN104517122A (en) Image target recognition method based on optimized convolution architecture
CN107491729B (en) Handwritten digit recognition method based on cosine similarity activated convolutional neural network
Ren et al. Convolutional neural network based on principal component analysis initialization for image classification
CN108416270A (en) Traffic sign recognition method based on multi-attribute joint features
CN106997463A (en) Guideboard recognition method based on the compressed sensing domain and convolutional neural networks
CN112733716A (en) SROCRN network-based low-resolution text image identification method
Bouchain Character recognition using convolutional neural networks
CN113205103A (en) Lightweight tattoo detection method
CN109508640A (en) Crowd emotion analysis method and device and storage medium
CN117593666B (en) Geomagnetic station data prediction method and system for aurora image
CN112818777B (en) Remote sensing image target detection method based on dense connection and feature enhancement
CN108764233A (en) Scene text recognition method based on continuous convolutional activations
CN109034279A (en) Handwriting model training method, handwritten character recognition method, device, equipment and medium
CN116246110A (en) Image classification method based on improved capsule network
CN110866866A (en) Image color-matching processing method and device, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171226