CN116824369A - Litchi disease real-time detection system and method based on edge computing and yolov7-tiny - Google Patents

Litchi disease real-time detection system and method based on edge computing and yolov7-tiny

Info

Publication number
CN116824369A
CN116824369A
Authority
CN
China
Prior art keywords
module
map
tiny
average
yolov7
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310721957.4A
Other languages
Chinese (zh)
Inventor
肖佳仪
岳学军
罗志环
曾凡国
宋庆奎
丁子予
李海锋
郑健宇
李炫天
陈俊致
钟文山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Agricultural University
Original Assignee
South China Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Agricultural University filed Critical South China Agricultural University
Priority to CN202310721957.4A priority Critical patent/CN116824369A/en
Publication of CN116824369A publication Critical patent/CN116824369A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/18 Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • Computational Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Analysis (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Operations Research (AREA)
  • Probability & Statistics with Applications (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Algebra (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a litchi disease real-time detection system and method based on edge computing and yolov7-tiny. The detection system comprises an image recognition module, a yolov7-tiny module, an attention CBAM module, a mAP module and an edge deployment module. The image recognition module receives the image data transmitted by the shooting device and forwards the data; the mAP module comprehensively evaluates the detection result, where mAP is calculated as the sum of the average precisions of all categories divided by the number of categories; the attention CBAM module is applied to the yolov7-tiny module so that the model focuses on the important features in the neural network; and the feature scaling module scales the image features. The litchi disease detection method is a litchi disease-condition acquisition, detection and information-transmission system integrating remote computer communication, computer hardware technology and multimedia technology; it can judge the litchi disease condition in real time, simplifies the detection steps and greatly improves the detection efficiency.

Description

Litchi disease real-time detection system and method based on edge computing and yolov7-tiny
Technical Field
The invention relates to the field of pest and disease detection, and in particular to a litchi disease real-time detection system and method based on edge computing and yolov7-tiny.
Background
At present, most litchi orchards in China are still managed manually in the traditional way: irrigation, fertilization, and the prediction, prevention and treatment of diseases and pests are all judged and decided by human experience. This manual management is coarse and poorly targeted, and traditional irrigation methods such as flood irrigation in particular cause serious waste of water resources.
Existing disease detection is usually carried out manually, which is inefficient; traditional detection can judge the damage state of a plant only from the outward appearance of the disease, so the precision of the detection result is poor.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: existing disease detection is usually carried out manually, which is inefficient; traditional detection can judge the damage state of a plant only from the outward appearance of the disease, so the precision of the detection result is poor.
The invention solves the above technical problem through the following technical scheme. The litchi disease real-time detection system based on edge computing and yolov7-tiny comprises an image recognition module, a yolov7-tiny module, an attention CBAM module, a mAP module and an edge deployment module;
the image recognition module is used for receiving the image data transmitted by the shooting device and forwarding the data;
the mAP module is used for comprehensively evaluating the detection result, calculated as mAP = the sum of the average precisions of all categories divided by the number of categories;
the attention CBAM module is applied to the yolov7-tiny module so that the model focuses on the important features in the neural network;
the feature scaling module is used for scaling the image features;
the improved yolov7-tiny module is used for learning from data that was not successfully detected, and the learning method is deep network learning based on yolov7;
the edge deployment module is used for storing the deployed weight parameters on the device and judging new disease image data;
the image recognition module, the mAP module, the improved yolov7-tiny module, the edge deployment module and the interaction module are connected in communication in sequence.
Preferably, the specific processing steps of the mAP module are as follows:
S1: the mAP module receives the detection data and compares it with the data stored in the model database; the comparison results fall into four types, TP, FP, FN and TN, where TP is a positive sample judged as positive, FP is a negative sample judged as positive, FN is a positive sample judged as negative, and TN is a negative sample judged as negative;
S2: calculating precision and recall;
the precision is calculated as: Precision = TP/(TP+FP)
the recall is calculated as: Recall = TP/(TP+FN);
S3: calculating the average precision of each category: Average Precision per class = (Precision1 + Precision2 + ... + PrecisionN)/N
and calculating the average recall of each category: Average Recall per class = (Recall1 + Recall2 + ... + RecallN)/N
where N is the total number of categories, Precision1, Precision2, ..., PrecisionN are the precision of each category, and Recall1, Recall2, ..., RecallN are the recall of each category;
S4: calculating the mean average precision over all categories: Mean Average Precision = (Average Precision1 + Average Precision2 + ... + Average PrecisionN)/N
and calculating the mean average recall over all categories: Mean Average Recall = (Average Recall1 + Average Recall2 + ... + Average RecallN)/N
where N is the total number of categories, Average Precision1, Average Precision2, ..., Average PrecisionN are the average precision of each category, and Average Recall1, Average Recall2, ..., Average RecallN are the average recall of each category;
S5: the mean of the per-category average precisions, mAP, is obtained from the above calculation; the closer mAP is to 1, the higher the detection accuracy, and the lower otherwise (a minimal calculation sketch follows).
Preferably, the scaling method of the feature scaling module is mean normalization, and the feature scaling formula is as follows:
a=(x-μ)/σ
where a is the normalized value, x is the original value, μ is the mean, and σ is the standard deviation.
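As a hedged illustration of the formula above (not code from the patent), the scaling can be applied to a feature array as follows; the NumPy usage and example values are assumptions.

```python
import numpy as np

def mean_normalize(x: np.ndarray) -> np.ndarray:
    """a = (x - mu) / sigma, computed over the whole feature array."""
    mu = x.mean()
    sigma = x.std()
    return (x - mu) / sigma

features = np.array([0.2, 0.5, 0.9, 1.4])
print(mean_normalize(features))  # zero-mean, unit-variance version of the features
```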
Preferably, the attention module specifically works as follows:
given an intermediate feature map, the CBAM module would infer the attention map sequentially along two independent dimensions (channels and spaces) and then multiply the attention map with the input feature map for adaptive feature optimization, since CBAM is a lightweight generic module, negligible overhead of this module would seamlessly integrate it into any CNN architecture and could perform end-to-end training with the underlying CNN.
Preferably, the edge deployment module uses a Jetson Nano device; the Jetson Nano provides a camera interface, through which a camera is connected to the Jetson Nano to acquire images.
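As a sketch of how frames might be acquired from a CSI camera attached to the Jetson Nano camera interface, one common approach uses a GStreamer pipeline with OpenCV; the pipeline string, resolution and frame rate below are assumptions and are not specified in the patent.

```python
import cv2

def gstreamer_pipeline(width: int = 1280, height: int = 720, fps: int = 30, flip: int = 0) -> str:
    # Assumed nvarguscamerasrc pipeline for a CSI camera on Jetson Nano (illustrative parameters).
    return (
        f"nvarguscamerasrc ! "
        f"video/x-raw(memory:NVMM), width={width}, height={height}, format=NV12, framerate={fps}/1 ! "
        f"nvvidconv flip-method={flip} ! "
        f"video/x-raw, format=BGRx ! videoconvert ! "
        f"video/x-raw, format=BGR ! appsink"
    )

cap = cv2.VideoCapture(gstreamer_pipeline(), cv2.CAP_GSTREAMER)
ok, frame = cap.read()
if ok:
    print("captured frame:", frame.shape)
cap.release()
```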
Preferably, the edge deployment module embeds an identification program, and the program includes a trained yolov7-tiny model.
The litchi disease real-time detection method based on edge computing and yolov7-tiny specifically comprises the following steps:
step one: constructing an improved yolov7-tiny model, wherein the improvement comprises a convolutional block attention module (CBAM) and a feature scaling module, and training the yolov7-tiny model with a training set to obtain the final trained model;
step two: deploying the trained yolov7-tiny model onto the Jetson Nano device of the edge deployment module;
step three: the image recognition module, namely the camera connected to the Jetson Nano, receives a real-time disease video stream, wherein the video stream comprises a plurality of consecutive images;
step four: feeding the video stream to the final target detection model deployed on the NVIDIA Jetson Nano development board for target detection, and outputting the target detection results (a minimal inference sketch is given after these steps).
Compared with the prior art, the invention has the following advantages:
the deep learning in the disease detection process is performed by using a deep network learning architecture of Y0L0v7-tiny in the prior art, so that the learning effect is stronger, the confidence is higher, and the mAP calculation can be more accurate;
by setting the feature scaling module, the features can be scaled and normalized, so that the features can be evenly amplified or reduced, further, the feature comparison result is more accurate in the mAP calculation process, meanwhile, the feature scaling module can be learned and stored in a model library by Y0L0v7tiny after scaling the features, thus, images with the same features are encountered later, the calculation result is faster,
the litchi disease detection method is a set of high-technology litchi disease condition collecting and detecting information transmission device integrating computer remote communication, computer hardware technology and multimedia technology, can judge the litchi disease condition in real time, simplifies detection steps and greatly improves detection efficiency.
Drawings
Fig. 1 is a system block diagram of the present invention.
Detailed Description
The following describes in detail the examples of the present invention, which are implemented on the premise of the technical solution of the present invention, and detailed embodiments and specific operation procedures are given, but the scope of protection of the present invention is not limited to the following examples.
As shown in fig. 1, this embodiment provides a technical solution: the litchi disease real-time detection system based on edge computing and yolov7-tiny comprises an image recognition module, a yolov7-tiny module, an attention CBAM module, a mAP module and an edge deployment module;
the image recognition module is used for receiving the image data transmitted by the shooting device and forwarding the data;
the mAP module is used for comprehensively evaluating the detection result, calculated as mAP = the sum of the average precisions of all categories divided by the number of categories;
the attention CBAM module is applied to the yolov7-tiny module so that the model focuses on the important features in the neural network;
the feature scaling module is used for scaling the image features;
the improved yolov7-tiny module is used for learning from data that was not successfully detected, and the learning method is deep network learning based on yolov7;
the edge deployment module is used for storing the deployed weight parameters on the device and judging new disease image data;
the image recognition module, the mAP module, the improved yolov7-tiny module, the edge deployment module and the interaction module are connected in communication in sequence.
The mAP module comprises the following specific processing steps:
S1: the mAP module receives the detection data and compares it with the data stored in the model database; the comparison results fall into four types, TP, FP, FN and TN, where TP is a positive sample judged as positive, FP is a negative sample judged as positive, FN is a positive sample judged as negative, and TN is a negative sample judged as negative;
S2: calculating precision and recall;
the precision is calculated as: Precision = TP/(TP+FP)
the recall is calculated as: Recall = TP/(TP+FN);
S3: calculating the average precision of each category: Average Precision per class = (Precision1 + Precision2 + ... + PrecisionN)/N
and calculating the average recall of each category: Average Recall per class = (Recall1 + Recall2 + ... + RecallN)/N
where N is the total number of categories, Precision1, Precision2, ..., PrecisionN are the precision of each category, and Recall1, Recall2, ..., RecallN are the recall of each category;
S4: calculating the mean average precision over all categories: Mean Average Precision = (Average Precision1 + Average Precision2 + ... + Average PrecisionN)/N
and calculating the mean average recall over all categories: Mean Average Recall = (Average Recall1 + Average Recall2 + ... + Average RecallN)/N
where N is the total number of categories, Average Precision1, Average Precision2, ..., Average PrecisionN are the average precision of each category, and Average Recall1, Average Recall2, ..., Average RecallN are the average recall of each category;
S5: the mean of the per-category average precisions, mAP, is obtained from the above calculation; the closer mAP is to 1, the higher the detection accuracy, and the lower otherwise.
The scaling method of the feature scaling module is mean normalization, and the feature scaling formula is as follows:
a=(x-μ)/σ
where a is the normalized value, x is the original value, μ is the mean, and σ is the standard deviation.
The attention module works specifically as follows:
given an intermediate feature map, the CBAM module would infer the attention map sequentially along two independent dimensions (channels and spaces) and then multiply the attention map with the input feature map for adaptive feature optimization, since CBAM is a lightweight generic module, negligible overhead of this module would seamlessly integrate it into any CNN architecture and could perform end-to-end training with the underlying CNN.
The edge deployment module uses a Jetson Nano device; the Jetson Nano provides a camera interface, through which a camera is connected to acquire images.
The edge deployment module embeds an identification program that includes a trained yolov7-tiny model.
The litchi disease real-time detection method based on edge computing and yolov7-tiny specifically comprises the following steps:
step one: constructing an improved yolov7-tiny model, wherein the improvement comprises a convolutional block attention module (CBAM) and a feature scaling module, and training the yolov7-tiny model with a training set to obtain the final trained model;
step two: deploying the trained yolov7-tiny model onto the Jetson Nano device of the edge deployment module;
step three: the image recognition module, namely the camera connected to the Jetson Nano, receives a real-time disease video stream, wherein the video stream comprises a plurality of consecutive images;
step four: feeding the video stream to the final target detection model deployed on the NVIDIA Jetson Nano development board for target detection, and outputting the target detection results.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and are not to be construed as limiting the invention; changes, modifications, substitutions and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.

Claims (6)

1. The litchi disease real-time detection system based on edge computing and yolov7-tiny is characterized by comprising an image recognition module, a yolov7-tiny module, an attention CBAM module, a mAP module and an edge deployment module;
the image recognition module is used for receiving the image data transmitted by the shooting device and forwarding the data;
the mAP module is used for comprehensively evaluating the detection result, calculated as mAP = the sum of the average precisions of all categories divided by the number of categories;
the attention CBAM module is applied to the yolov7-tiny module so that the model focuses on the important features in the neural network;
the feature scaling module is used for scaling the image features;
the improved yolov7-tiny module is used for learning from data that was not successfully detected, and the learning method is deep network learning based on yolov7;
the edge deployment module is used for storing the deployed weight parameters on the device and judging new disease image data;
the image recognition module, the mAP module, the improved yolov7-tiny module, the edge deployment module and the interaction module are connected in communication in sequence.
2. The litchi disease real-time detection system based on edge computing and yolov7-tiny according to claim 1, characterized in that the specific processing steps of the mAP module are as follows:
S1: the mAP module receives the detection data and compares it with the data stored in the model database; the comparison results fall into four types, TP, FP, FN and TN, where TP is a positive sample judged as positive, FP is a negative sample judged as positive, FN is a positive sample judged as negative, and TN is a negative sample judged as negative;
S2: calculating precision and recall;
the precision is calculated as: Precision = TP/(TP+FP)
the recall is calculated as: Recall = TP/(TP+FN);
S3: calculating the average precision of each category: Average Precision per class = (Precision1 + Precision2 + ... + PrecisionN)/N
and calculating the average recall of each category: Average Recall per class = (Recall1 + Recall2 + ... + RecallN)/N
where N is the total number of categories, Precision1, Precision2, ..., PrecisionN are the precision of each category, and Recall1, Recall2, ..., RecallN are the recall of each category;
S4: calculating the mean average precision over all categories: Mean Average Precision = (Average Precision1 + Average Precision2 + ... + Average PrecisionN)/N
and calculating the mean average recall over all categories: Mean Average Recall = (Average Recall1 + Average Recall2 + ... + Average RecallN)/N
where N is the total number of categories, Average Precision1, Average Precision2, ..., Average PrecisionN are the average precision of each category, and Average Recall1, Average Recall2, ..., Average RecallN are the average recall of each category;
S5: the mean of the per-category average precisions, mAP, is obtained from the above calculation; the closer mAP is to 1, the higher the detection accuracy, and the lower otherwise.
3. The litchi disease real-time detection system based on edge computing and yolov7-tiny according to claim 1, characterized in that the scaling method of the feature scaling module is mean normalization, and the feature scaling formula is as follows:
a=(x-μ)/σ
where a is the normalized value, x is the original value, μ is the mean, and σ is the standard deviation.
4. The litchi disease real-time detection system based on edge computing and yolov7-tiny according to claim 1, characterized in that the attention module works as follows:
given an intermediate feature map, the CBAM module infers attention maps sequentially along two independent dimensions (channel and spatial) and then multiplies each attention map with the input feature map for adaptive feature refinement; since CBAM is a lightweight, general-purpose module with negligible overhead, it can be seamlessly integrated into any CNN architecture and trained end-to-end together with the base CNN.
5. The litchi disease real-time detection system based on edge computing and yolov7-tiny according to claim 1, characterized in that the edge deployment module is embedded with an identification program, and the program comprises a trained yolov7-tiny model.
6. The litchi disease real-time detection method based on edge computing and yolov7-tiny is characterized by comprising the following steps:
step one: constructing an improved yolov7-tiny model, wherein the improvement comprises a convolutional block attention module (CBAM) and a feature scaling module, and training the yolov7-tiny model with a training set to obtain the final trained model;
step two: deploying the trained yolov7-tiny model onto the Jetson Nano device of the edge deployment module;
step three: the image recognition module, namely the camera connected to the Jetson Nano, receives a real-time disease video stream, wherein the video stream comprises a plurality of consecutive images;
step four: feeding the video stream to the deployed target detection model for target detection, and outputting the target detection results.
CN202310721957.4A 2023-06-19 2023-06-19 Litchi diseases real-time detection system and method based on edge calculation and yolov7-tiny Pending CN116824369A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310721957.4A CN116824369A (en) 2023-06-19 2023-06-19 Litchi diseases real-time detection system and method based on edge calculation and yolov7-tiny

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310721957.4A CN116824369A (en) 2023-06-19 2023-06-19 Litchi diseases real-time detection system and method based on edge calculation and yolov7-tiny

Publications (1)

Publication Number Publication Date
CN116824369A true CN116824369A (en) 2023-09-29

Family

ID=88123493

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310721957.4A Pending CN116824369A (en) 2023-06-19 2023-06-19 Litchi diseases real-time detection system and method based on edge calculation and yolov7-tiny

Country Status (1)

Country Link
CN (1) CN116824369A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117036361A (en) * 2023-10-10 2023-11-10 云南大学 Power grid transmission line smoke detection method, system, electronic equipment and medium
CN117036361B (en) * 2023-10-10 2024-02-20 云南大学 Power grid transmission line smoke detection method, system, electronic equipment and medium

Similar Documents

Publication Publication Date Title
CN111611878A (en) Method for crowd counting and future people flow prediction based on video image
CN106709462A (en) Indoor positioning method and device
CN112560745B (en) Method for discriminating personnel on electric power operation site and related device
CN111209832B (en) Auxiliary obstacle avoidance training method, equipment and medium for substation inspection robot
CN116824369A (en) Litchi diseases real-time detection system and method based on edge calculation and yolov7-tiny
WO2022048582A1 (en) Method and device for optical flow information prediction, electronic device, and storage medium
CN110633643A (en) Abnormal behavior detection method and system for smart community
CN111161315A (en) Multi-target tracking method and system based on graph neural network
CN113903081A (en) Visual identification artificial intelligence alarm method and device for images of hydraulic power plant
CN106846378A (en) Across video camera object matching and tracking that a kind of combination topology of spacetime is estimated
CN112001347A (en) Motion recognition method based on human skeleton shape and detection target
CN110519582A (en) A kind of crusing robot data collection system and collecting method
CN108764456A (en) Airborne target identification model construction platform, airborne target recognition methods and equipment
CN112349057A (en) Deep learning-based indoor smoke and fire detection method
CN109389156A (en) A kind of training method, device and the image position method of framing model
CN108537825A (en) A kind of method for tracking target based on transfer learning Recurrent networks
CN116824626A (en) Artificial intelligent identification method for abnormal state of animal
CN112766305B (en) Visual SLAM closed loop detection method based on end-to-end measurement network
CN114170686A (en) Elbow bending behavior detection method based on human body key points
CN110841143B (en) Method and system for predicting state of infusion pipeline
CN102497495B (en) Target association method for multi-camera monitoring system
CN114627496A (en) Robust pedestrian re-identification method based on depolarization batch normalization of Gaussian process
CN113344968A (en) Orchard fruit identification and yield statistical system and method
CN113516232A (en) Training method of neural network model based on self-attention mechanism
CN112257568B (en) Intelligent real-time supervision and error correction system and method for individual soldier queue actions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination