CN115661491A - Monitoring method for pest control in tea tree planting - Google Patents

Info

Publication number
CN115661491A
CN115661491A (application CN202211126212.5A)
Authority
CN
China
Prior art keywords
layer, tea, tea lesser, shallow, network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211126212.5A
Other languages
Chinese (zh)
Inventor
陈世春
王晓庆
江宏燕
胡翔
彭萍
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Academy of Agricultural Sciences
Original Assignee
Chongqing Academy of Agricultural Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Academy of Agricultural Sciences filed Critical Chongqing Academy of Agricultural Sciences
Priority to CN202211126212.5A
Publication of CN115661491A
Legal status: Pending (current)

Classifications

    • G06V 10/751: Image or video recognition or understanding; comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06M 11/00: Counting of objects distributed at random, e.g. on a surface
    • G06N 3/045: Neural networks; combinations of networks
    • G06N 3/08: Neural networks; learning methods
    • G06V 10/40: Extraction of image or video features

Abstract

The invention relates to a monitoring method for pest control in tea tree planting, which comprises the following steps: a. a backbone network first extracts features from an image sample, and the shallow features of the image sample are selected for the regression operation; b. the deep features of each layer of the backbone network are superimposed, layer by layer, onto the shallow features through a combination of upsampling and convolution layers; c. the deep features of each layer of the backbone network are fused with the corresponding superimposed shallow features to form the output layers of the neural network, on which regression-based recognition of tea lesser leafhoppers is completed; at the same time, based on the recognition and regression results and the delimited range of the yellow board, duplicate detections of leafhoppers at the same position on different output layers are removed, and the final count is calculated. The monitoring method efficiently completes the identification and counting of tea lesser leafhoppers with high accuracy, and saves time and labour.

Description

Monitoring method for pest control in tea tree planting
This application is a divisional application of application No. 202010967815.2, "Tea lesser leafhopper identification and counting method based on a convolutional neural network".
Technical Field
The invention relates to the technical field of tea garden pest control, in particular to a monitoring method for pest control in tea tree planting.
Background
The tea lesser leafhopper is one of the most widely distributed and most damaging pests in China's tea-growing regions and affects both the yield and the quality of tea. In the middle and lower reaches of the Yangtze River, the losses it causes amount to roughly 10 to 15 percent of the summer and autumn tea crop in a normal year, and to more than 50 percent in an outbreak year. Controlling the tea lesser leafhopper is therefore one of the primary tasks in tea garden pest management; however, before control measures can be taken, the population size and trend of the tea lesser leafhopper in the tea garden must be monitored. At present this is done mainly by manual survey: the insect population on 100 tender leaves (30 bud tips) is surveyed at random in the morning of sunny days when the leaves are dry, or throughout the day when it is overcast. This method places high demands on the technician's professional knowledge, judgment (the tea lesser leafhopper is agile, moves actively and climbs well) and eyesight (the insect is very small), and the prolonged manual screening, identification and counting is time-consuming and labour-intensive. Moreover, the accuracy is strongly affected by human factors: results fluctuate widely between surveys and carry large errors, so the tea lesser leafhopper cannot be monitored accurately, quickly and effectively.
Disclosure of Invention
The invention aims to provide a monitoring method for pest control in tea tree planting that efficiently completes the identification and counting of tea lesser leafhoppers with high accuracy while saving time and labour cost.
The purpose of the invention is realized by the following technical scheme:
a monitoring method for pest control in tea tree planting is characterized by comprising the following steps:
a. first, using a standard VGG16 network structure as the backbone network, extract the features of an image sample containing the full outline of the yellow board, and select the shallow features of the image sample for the regression operation;
b. superimpose the deep features of each layer of the backbone network, which carry rich semantic information, onto the shallow features layer by layer through a combination of upsampling and convolution layers, so as to enrich the semantic information of the shallow features; in step b the upsampling uses direct assignment at the corresponding positions and zero at the filling positions, that is, if data of width N and height M are upsampled by factors A and B respectively, data of width NA and height MB are obtained, in which the points (a·A, b·B), a = 1, …, N, b = 1, …, M, are the corresponding positions that directly take the values of the points (a, b) in the original data, while all remaining points are filling positions set to zero (a code sketch of this upsampling is given after step c);
c. fuse each layer of deep features of the backbone network with the corresponding superimposed shallow features of each layer to form the output layers of the neural network, on which regression-based recognition of the tea lesser leafhopper is completed; during regression recognition, leafhoppers at different positions are identified and duplicate detections of a leafhopper at the same position on the same output layer are removed; at the same time, based on the leafhopper recognition and the regression results (which contain the position information of each recognised leafhopper) together with the delimited range of the yellow board, duplicate detections of a leafhopper at the same position on different output layers are removed, and finally the number of tea lesser leafhoppers on the yellow board is calculated.
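By way of illustration, a minimal NumPy sketch of the zero-insertion upsampling described in step b is given below; the 0-based placement of each source value (a, b) at output position (a·A, b·B) is an assumption consistent with the description above, since the patent's own formula is supplied only as an image.

    import numpy as np

    def zero_insertion_upsample(x: np.ndarray, A: int, B: int) -> np.ndarray:
        """Upsample an (N, M) array to (N*A, M*B): direct assignment at the
        corresponding positions, zero at every filling position."""
        N, M = x.shape
        out = np.zeros((N * A, M * B), dtype=x.dtype)
        out[::A, ::B] = x   # corresponding positions take the original values directly
        return out          # all remaining positions stay zero (filling positions)

    # Example: a 2x3 map upsampled by A = B = 2 becomes a 4x6 map whose non-zero
    # entries sit on a sparse grid; the convolution layer that follows in the
    # network then spreads these values over the zero-filled positions.
    print(zero_insertion_upsample(np.array([[1, 2, 3], [4, 5, 6]]), 2, 2))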
The network is trained by iterative training of a general recognition model, with the training set obtained by manually box-selecting and labelling targets on different data. To weaken the influence of the randomness of camera imaging pixels and shooting distance, the size is estimated in step a by template matching, so that the optimal feature layer of the backbone network is selected as the input layer of the shallow network. The template matching is performed as follows:
first, square frames with side length h (in centimetres) are drawn at the four corners and the centre of the yellow board, and template matching with a rectangular frame is performed on the original image to locate the four corners and the centre of the board; the numbers of pixels contained in the matched rectangular frames at the four corners and the centre, denoted n_1, n_2, n_3, n_4 and n_0 respectively, are then counted, and the number of imaging pixels of a tea lesser leafhopper in the original image is estimated for each as:
x_i = n_i · (k / (10·h))², where i = 0, 1, 2, 3, 4;
where k is the body length of the tea lesser leafhopper in millimetres (the product 10·h expresses the square's side length in millimetres) and h is the side length of the drawn square frame in centimetres;
the rectangular yellow board is divided into four triangular areas through four top angles and a center, then the pixel size occupied by the average tea lesser leafhopper in each area is calculated, and the specific formula is as follows:
y_i = (x_0 + x_i + x_(i+1)) / 3, where i = 1, 2, 3, 4 and x_5 = x_1;
Finally, the optimal feature layer for identifying the tea lesser leafhopper in each region is selected according to the pixel size occupied by the leafhopper in that region:
k_i = ⌊ f(y_i) ⌋ for i = 1, 2, 3, 4, where f is given by a formula supplied as an image in the original document; ⌊·⌋ denotes rounding down, and k indicates that the feature layer output by the k-th block is selected as the recognition layer for the tea lesser leafhopper in that region, i.e. as the input layer of the shallow neural network. A code sketch of this size estimation and layer selection follows.
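By way of illustration, a minimal Python sketch of the size estimation and feature-layer selection is given below. It follows the formula reconstructions given above for x_i and y_i; the log2-based mapping from pixel size to block index, and the example pixel counts, are assumptions standing in for the patent's image-only formula and measured data.

    import math

    def estimate_insect_pixels(n_i: float, k_mm: float, h_cm: float) -> float:
        """Estimated pixel footprint of a leafhopper near reference square i
        (n_i pixels inside an h_cm square, body length k_mm)."""
        return n_i * (k_mm / (10.0 * h_cm)) ** 2

    def region_pixel_size(x_centre: float, x_i: float, x_next: float) -> float:
        """Average footprint in the triangular region bounded by two adjacent
        corners and the centre of the yellow board."""
        return (x_centre + x_i + x_next) / 3.0

    def feature_layer_index(y_i: float) -> int:
        """Assumed rule: pick the deepest block whose stride (2**k) still
        resolves an object roughly sqrt(y_i) pixels across."""
        return int(math.floor(math.log2(max(math.sqrt(y_i), 1.0))))

    # Example with a 3.5 mm leafhopper and 1 cm reference squares; the pixel
    # counts n_0..n_4 are invented for illustration.
    x = [estimate_insect_pixels(n, 3.5, 1.0) for n in (4200, 4000, 3900, 4100, 3800)]
    y_1 = region_pixel_size(x[0], x[1], x[2])       # region between corners 1 and 2
    print(round(y_1, 1), feature_layer_index(y_1))  # e.g. ~494 pixels -> block 4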
Because the tea lesser leafhopper's body is very short, it occupies an extremely small area of the whole image containing the yellow board. If deeper semantic information were obtained directly through a deep neural network, recognition performance and accuracy could in principle improve, but the features output by the deeper layers would simply lose the leafhopper's information and monitoring would fail; conversely, if a shallow neural network were used directly, detection accuracy would be extremely low for lack of high-level semantic information. In addition, different shooting distances change the pixel size of the leafhopper, which makes the choice of feature layer uncertain: too deep a layer loses the leafhopper information, while too shallow a layer lacks sufficient semantic information and receptive field, further reducing detection accuracy. The method therefore uses the VGG16 network structure as the backbone to extract features of the image containing the yellow board, selects shallow features for the regression operation, and superimposes the semantically rich deep features of the backbone onto the shallow features layer by layer through a combination of upsampling and convolution layers. This enriches the semantic information of the shallow features, prevents the leafhopper's image information from being lost, and allows the identification and counting of tea lesser leafhoppers, and hence their monitoring, to be completed efficiently and accurately while saving time, labour and labour cost.
As a further refinement, in step c, whether two detections at the same position are repeated identifications is judged by comparing the ratio of the intersection area to the union area of the two recognition frames with a threshold.
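By way of illustration, a minimal Python sketch of this same-position check is given below; the 0.5 threshold and the greedy keep-or-discard loop are assumptions, since the method only specifies that the ratio of intersection area to union area is compared with a threshold.

    def iou(a, b):
        """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
        return inter / union if union > 0 else 0.0

    def deduplicate(boxes, threshold=0.5):
        """Keep one recognition frame per position: a box overlapping an already
        kept box above the threshold is treated as a repeated identification."""
        kept = []
        for box in boxes:
            if all(iou(box, k) <= threshold for k in kept):
                kept.append(box)
        return kept

    # Two heavily overlapping boxes collapse to one; a distant box is kept.
    print(len(deduplicate([(10, 10, 20, 20), (11, 11, 21, 21), (50, 50, 60, 60)])))  # -> 2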
As a further refinement, in step c the range of the yellow board is delimited either from the matching results of the rectangular frames at the four corners of the board or by image segmentation (for example, exploiting the large colour difference between the yellow board and the background).
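By way of illustration, a minimal OpenCV sketch of delimiting the yellow-board range by colour segmentation is given below; the HSV bounds are assumed, illustrative values to be tuned for the camera, not values taken from the patent.

    import cv2
    import numpy as np

    def yellow_board_bbox(image_bgr: np.ndarray):
        """Return the (x1, y1, x2, y2) bounding box of the yellow board, found by
        thresholding the hue range where the board differs strongly from the background."""
        hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array([20, 80, 80]), np.array([35, 255, 255]))
        ys, xs = np.nonzero(mask)
        if xs.size == 0:
            return None   # no yellow region found
        return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())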
In step a, the image sample containing the full outline of the yellow board is an image frame captured either with a high-definition camera or a mobile phone, which is preprocessed and converted into a format that the deep learning framework can read directly for training; when the regression operation is performed in step a, the aspect ratio of the anchors is set to 1:k or k:1, where k is the body length of the tea lesser leafhopper.
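By way of illustration, a minimal sketch of the anchor aspect ratios described in step a is given below; the base size of 8 pixels is an assumed example value, while the 1:3.5 and 3.5:1 ratios match the 3.5 mm body length used in the embodiment.

    def anchor_shapes(base: float, k: float):
        """(width, height) pairs for the 1:k and k:1 anchors at a given base size."""
        return [(base, base * k), (base * k, base)]

    print(anchor_shapes(8.0, 3.5))   # [(8.0, 28.0), (28.0, 8.0)]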
The invention has the following technical effects:
the invention provides a monitoring method for pest control in tea tree planting, which is characterized in that a VGG16 network structure is used as a main network to extract the characteristics of tea lesser leafhoppers containing yellow boards, shallow layer characteristics are selected to perform regression operation, deep layer characteristics rich in semantic information of the main network are superposed on the shallow layer characteristics layer by layer in a mode of up-sampling and convolution layer combination, so that the semantic information of the shallow layer characteristics is enriched, the image information loss of the tea lesser leafhoppers is avoided, and the automatic identification, position regression and automatic calculation of the tea lesser leafhoppers are further completed; the monitoring method has the advantages that the precision of the method for identifying the tea lesser leafhoppers can reach 98% under the condition of 1% omission; meanwhile, the processing speed of high-definition pictures such as 1080P data reaches the ms level, the tea lesser leafhoppers can be identified and calculated quickly and accurately, the tea lesser leafhoppers in the tea garden can be monitored quickly and accurately, and the method can be widely applied to prevention and control of the tea lesser leafhoppers such as tea trees, fruit trees and the like. In addition, the monitoring method effectively avoids manual long-time screening, identification and counting, thereby saving time, labor and labor cost.
Drawings
Fig. 1 is a block diagram of a backbone network structure in an embodiment of the present invention.
FIG. 2 is a general block diagram of a convolutional neural network in an embodiment of the present invention.
Detailed Description
The technical solution of the present invention will be clearly and completely described below with reference to specific embodiments.
Examples
As shown in Figures 1 and 2, the monitoring method for pest control in tea tree planting comprises the following steps:
a. First, using a standard VGG16 network structure as the backbone network, extract the features of an image sample containing the full outline of the yellow board, and select the shallow features of the image sample for the regression operation. Specifically: the image sample containing the full outline of the yellow board is an image frame captured either with a high-definition camera or a mobile phone, which is preprocessed and converted into a format that the deep learning framework can read directly for training before feature extraction. To weaken the influence of the randomness of camera imaging pixels and shooting distance, the size is estimated by template matching so that the optimal feature layer of the backbone network is selected as the input layer of the shallow network, as follows:
first, square frames with side length h (for example 1 cm) are drawn at the four corners and the centre of the yellow board, and template matching with a rectangular frame is performed on the original image to locate the four corners and the centre of the board; the numbers of pixels contained in the matched rectangular frames at the four corners and the centre, denoted n_1, n_2, n_3, n_4 and n_0 respectively, are then counted, and the number of imaging pixels of a tea lesser leafhopper in the original image is estimated for each as:
x_i = n_i · (k / (10·h))², where i = 0, 1, 2, 3, 4;
where k is the body length of the tea lesser leafhopper in millimetres; taking, for example, a leafhopper with a body length of 3.5 mm and h = 1 cm, the formula above becomes x_i = 0.1225 · n_i, where i = 0, 1, 2, 3, 4;
the rectangular yellow board is divided into four triangular regions by its four corners and centre, and the average pixel size occupied by a tea lesser leafhopper in each region is then calculated as:
y_i = (x_0 + x_i + x_(i+1)) / 3, where i = 1, 2, 3, 4 and x_5 = x_1;
finally, the optimal feature layer for identifying the tea lesser leafhopper in each region is selected according to the pixel size occupied by the leafhopper in that region:
k_i = ⌊ f(y_i) ⌋ for i = 1, 2, 3, 4, where f is given by a formula supplied as an image in the original document; ⌊·⌋ denotes rounding down, and k indicates that the feature layer output by the k-th block is selected as the recognition layer for the tea lesser leafhopper in that region, i.e. as the input layer of the shallow neural network;
when the regression operation is performed, the aspect ratio of the anchors is set to 1:3.5 or 3.5:1.
b. Superimpose the deep features of each layer of the backbone network, which carry rich semantic information, onto the shallow features layer by layer through a combination of upsampling and convolution layers, so as to enrich the semantic information of the shallow features (as shown in Fig. 2); the upsampling uses direct assignment at the corresponding positions and zero at the filling positions.
c. Fuse each layer of deep features of the backbone network with the corresponding superimposed shallow features of each layer to form the output layers of the neural network (in Fig. 2, for example, the output layers are the layer-0 to layer-5 features), so that results are output at every level and regression-based recognition of the tea lesser leafhopper is completed. During regression recognition, leafhoppers at different positions are identified, and duplicate detections of a leafhopper at the same position on the same output layer are removed (whether two detections at the same position are repeats is judged by comparing the ratio of the intersection area to the union area of the two recognition frames with a threshold). At the same time, based on the leafhopper recognition and the regression results (which contain the position information of each recognised leafhopper) together with the delimited range of the yellow board (the range can be delimited by image segmentation, for example exploiting the large colour difference between the board and the background, or from the matching results of the rectangular frames at the four corners of the board), duplicate detections of a leafhopper at the same position on different output layers are removed, and finally the number of tea lesser leafhoppers on the yellow board is calculated. A sketch of the corresponding backbone-and-fusion structure follows.
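By way of illustration, a minimal PyTorch sketch of the backbone-and-fusion structure of Figs. 1 and 2 is given below: VGG16 stage outputs provide the multi-level features, each deeper map is upsampled and added onto the next shallower one, and every fused level becomes an output layer. It is a sketch under assumptions rather than the patented implementation: the channel widths are arbitrary, nearest-neighbour interpolation stands in for the zero-insertion upsampling sketched earlier, the five stage outputs stand in for the six levels of Fig. 2, and the stage indices follow the standard torchvision VGG16.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    import torchvision

    class VGG16TopDownFusion(nn.Module):
        def __init__(self, fused_channels: int = 128):
            super().__init__()
            self.backbone = torchvision.models.vgg16(weights=None).features
            # Indices of the max-pool layer that closes each of the five VGG16 blocks.
            self.stage_ends = {4, 9, 16, 23, 30}
            stage_channels = [64, 128, 256, 512, 512]
            # 1x1 convs bring every stage to a common width before fusion.
            self.lateral = nn.ModuleList([nn.Conv2d(c, fused_channels, 1) for c in stage_channels])
            # 3x3 convs smooth each fused map before it is used as an output layer.
            self.smooth = nn.ModuleList([nn.Conv2d(fused_channels, fused_channels, 3, padding=1)
                                         for _ in stage_channels])

        def forward(self, x: torch.Tensor):
            feats = []
            for i, layer in enumerate(self.backbone):
                x = layer(x)
                if i in self.stage_ends:
                    feats.append(x)
            laterals = [lat(f) for lat, f in zip(self.lateral, feats)]
            # Top-down pass: each deeper map is upsampled and added to the shallower one.
            for i in range(len(laterals) - 1, 0, -1):
                up = F.interpolate(laterals[i], size=laterals[i - 1].shape[-2:], mode="nearest")
                laterals[i - 1] = laterals[i - 1] + up
            # Every fused level is an output layer on which recognition/regression runs.
            return [sm(f) for sm, f in zip(self.smooth, laterals)]

    # Example: a 512x512 board image yields five output layers with strides 2 to 32.
    outputs = VGG16TopDownFusion()(torch.randn(1, 3, 512, 512))
    print([tuple(o.shape) for o in outputs])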
The network is trained by iterative training of a general recognition model, and the training set is obtained by manually box-selecting and labelling targets on different data.
The method identifies tea lesser leafhoppers with an accuracy of 98% at a miss rate of 1%; at the same time, high-definition images such as 1080P frames are processed at millisecond speed, so the leafhoppers can be identified and counted quickly and accurately, and the method can be widely applied to leafhopper control on tea trees, fruit trees and other crops.

Claims (5)

1. A monitoring method for pest control in tea tree planting, characterized by comprising the following steps:
a. first, using a standard VGG16 network structure as the backbone network, extracting the features of an image sample containing the full outline of the yellow board, and selecting the shallow features of the image sample for the regression operation;
b. superimposing the deep features of each layer of the backbone network, which contain rich semantic information, onto the shallow features layer by layer through a combination of upsampling and convolution layers, so as to enrich the semantic information of the shallow features; in step b the upsampling uses direct assignment at the corresponding positions and zero at the filling positions, that is, if data of width N and height M are upsampled by factors A and B respectively, data of width NA and height MB are obtained, in which the points (a·A, b·B), a = 1, …, N, b = 1, …, M, are the corresponding positions that directly take the values of the points (a, b) in the original data, while all remaining points are filling positions set to zero;
c. fusing each layer of deep features of the backbone network with the corresponding superimposed shallow features of each layer to form the output layers of the neural network, on which regression-based recognition of the tea lesser leafhopper is completed; during regression recognition, leafhoppers at different positions are identified and duplicate detections of a leafhopper at the same position on the same output layer are removed; at the same time, based on the leafhopper recognition and regression results together with the delimited range of the yellow board, duplicate detections of a leafhopper at the same position on different output layers are removed, and finally the number of tea lesser leafhoppers on the yellow board is calculated.
2. A method of monitoring pest control in tea tree planting according to claim 1, wherein: the network is trained by iterative training of a general recognition model, with the training set obtained by manually box-selecting and labelling targets on different data; in step a, the size is estimated by template matching, so that the optimal feature layer of the backbone network is selected as the input layer of the shallow network; the template matching is performed as follows:
first, square frames with side length h (in centimetres) are drawn at the four corners and the centre of the yellow board, and template matching with a rectangular frame is performed on the original image to locate the four corners and the centre of the board; the numbers of pixels contained in the matched rectangular frames at the four corners and the centre, denoted n_1, n_2, n_3, n_4 and n_0 respectively, are then counted, and the number of imaging pixels of a tea lesser leafhopper in the original image is estimated for each as:
x_i = n_i · (k / (10·h))², where i = 0, 1, 2, 3, 4;
where k is the body length of the tea lesser leafhopper in millimetres (the product 10·h expresses the square's side length in millimetres) and h is the side length of the drawn square frame in centimetres;
the rectangular yellow board is divided into four triangular areas through four vertex angles and a center, and then the pixel size occupied by the average tea lesser leafhopper in each area is calculated, wherein the specific formula is as follows:
Figure FDA0003848850160000021
wherein i =1,2,3,4; and x 5 =x 1
finally, the optimal feature layer for identifying the tea lesser leafhopper in each region is selected according to the pixel size occupied by the leafhopper in that region:
k_i = ⌊ f(y_i) ⌋ for i = 1, 2, 3, 4, where f is given by a formula supplied as an image in the original document; ⌊·⌋ denotes rounding down, and k indicates that the feature layer output by the k-th block is selected as the recognition layer for the tea lesser leafhopper in that region, i.e. as the input layer of the shallow neural network.
3. A method of monitoring pest control in tea tree planting according to claim 1 or claim 2, wherein: in step c, whether two detections at the same position are repeated identifications is judged by comparing the ratio of the intersection area to the union area of the two recognition frames with a threshold.
4. A method of monitoring pest control in tea tree planting according to any one of claims 1 to 3, wherein: in step c, the range of the yellow board is delimited from the matching results of the rectangular frames at the four corners of the board or by image segmentation.
5. A method of monitoring pest control in tea tree planting according to any one of claims 1 to 4, wherein: in step a, the image sample containing the full outline of the yellow board is an image frame captured either with a high-definition camera or a mobile phone, which is preprocessed and converted into a format that the deep learning framework can read directly for training; when the regression operation is performed in step a, the aspect ratio of the anchors is set to 1:k or k:1, where k is the body length of the tea lesser leafhopper.
CN202211126212.5A 2020-09-15 2020-09-15 Monitoring method for pest control in tea tree planting Pending CN115661491A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211126212.5A CN115661491A (en) 2020-09-15 2020-09-15 Monitoring method for pest control in tea tree planting

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010967815.2A CN112101455B (en) 2020-09-15 2020-09-15 Tea lesser leafhopper identification and counting method based on convolutional neural network
CN202211126212.5A CN115661491A (en) 2020-09-15 2020-09-15 Monitoring method for pest control in tea tree planting

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010967815.2A Division CN112101455B (en) 2020-09-15 2020-09-15 Tea lesser leafhopper identification and counting method based on convolutional neural network

Publications (1)

Publication Number Publication Date
CN115661491A true CN115661491A (en) 2023-01-31

Family

ID=73759106

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010967815.2A Active CN112101455B (en) 2020-09-15 2020-09-15 Tea lesser leafhopper identification and counting method based on convolutional neural network
CN202211126212.5A Pending CN115661491A (en) 2020-09-15 2020-09-15 Monitoring method for pest control in tea tree planting

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202010967815.2A Active CN112101455B (en) 2020-09-15 2020-09-15 Tea lesser leafhopper identification and counting method based on convolutional neural network

Country Status (1)

Country Link
CN (2) CN112101455B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114862833A (en) * 2022-06-01 2022-08-05 江苏东南生科茶业有限公司 Tea biological content detection method based on multi-section random identification

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108399362B (en) * 2018-01-24 2022-01-07 中山大学 Rapid pedestrian detection method and device
CN110770752A (en) * 2018-09-04 2020-02-07 安徽中科智能感知产业技术研究院有限责任公司 Automatic pest counting method combining multi-scale feature fusion network with positioning model
CN109815867A (en) * 2019-01-14 2019-05-28 东华大学 A kind of crowd density estimation and people flow rate statistical method
CN110781744A (en) * 2019-09-23 2020-02-11 杭州电子科技大学 Small-scale pedestrian detection method based on multi-level feature fusion

Also Published As

Publication number Publication date
CN112101455B (en) 2022-08-09
CN112101455A (en) 2020-12-18


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination