CN112926473A - Bee mite identification method and equipment based on image identification - Google Patents

Bee mite identification method and equipment based on image identification

Info

Publication number
CN112926473A
Authority
CN
China
Prior art keywords
bee
mite
image
mites
identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110246273.4A
Other languages
Chinese (zh)
Other versions
CN112926473B (en)
Inventor
王红芳
孙晓勇
韩金玉
胥保华
刘振国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Agricultural University
Original Assignee
Shandong Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Agricultural University filed Critical Shandong Agricultural University
Priority to CN202110246273.4A priority Critical patent/CN112926473B/en
Publication of CN112926473A publication Critical patent/CN112926473A/en
Application granted granted Critical
Publication of CN112926473B publication Critical patent/CN112926473B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/60 - Type of objects
    • G06V 20/69 - Microscopic objects, e.g. biological cells or cellular parts
    • G06V 20/695 - Preprocessing, e.g. image segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G06F 18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06M - COUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
    • G06M 11/00 - Counting of objects distributed at random, e.g. on a surface
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 - Administration; Management
    • G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/02 - Agriculture; Fishing; Forestry; Mining
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/20 - Image preprocessing
    • G06V 10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10004 - Still image; Photographic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20081 - Training; Learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 - Special algorithmic details
    • G06T 2207/20084 - Artificial neural networks [ANN]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30242 - Counting objects in image

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Tourism & Hospitality (AREA)
  • Evolutionary Biology (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • General Business, Economics & Management (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Marketing (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Mining & Mineral Resources (AREA)
  • Primary Health Care (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Animal Husbandry (AREA)
  • Agronomy & Crop Science (AREA)
  • Development Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Operations Research (AREA)

Abstract

The present disclosure provides a bee mite identification method and apparatus based on image identification. The bee mite identification method comprises the following steps: acquiring a honeycomb image; identifying the mite infestation condition by combining a deep learning method with an image processing method; and periodically executing these steps, compiling statistics on mite infestation to predict the incidence pattern of the bee mites, and generating a bee mite prevention and control scheme according to that pattern. By applying computer vision recognition technology, mite-infested bees are analyzed and counted to support breeding decisions, so that bee breeding becomes more intelligent, more efficient, more accurate and simpler to operate.

Description

Bee mite identification method and equipment based on image identification
Technical Field
The disclosure relates to the technical field of bee breeding, in particular to a bee mite identification method and device based on image identification.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
Bee mites are the most serious biological threat in global bee breeding and are currently controlled mainly with chemical miticides. In production, however, miticides are often misused: frequent and excessive application increases the drug resistance of the bee mites, while doses that are too low or applied too late fail to control the mites, causing serious colony losses. The main reason for these problems is that beekeepers cannot quickly and accurately measure the mite parasitism rate in a colony, predict the temporal and spatial pattern of mite outbreaks, or relate the parasitism rate to the required drug dosage; as a result, the right dose cannot be applied at the right time, and accurate preventive control measures cannot be taken in advance.
The inventor has found that the identification method generally adopted in bee breeding production is visual inspection. However, bee mites are small and the number of bees in a colony is large, so the mites cannot be identified one by one; the parasitism rate can only be estimated from how often mites are observed while inspecting the colony and from the body condition of the bees, which yields no precise data and allows only a rough high/medium/low grading. In research, a sampling survey method is generally adopted to obtain describable data: about 300 bees are taken at random from each colony, the mites are eluted from the bees by rinsing with detergent or alcohol, the mites are counted manually, and the parasitism rate (number of mites / number of bees) is calculated. This method is laborious, carries large sampling and systematic errors, and cannot quickly yield accurate data. Both beekeeping production and scientific research on bee mites therefore currently lack an efficient, accurate and easy-to-operate method for determining the mite parasitism rate.
Disclosure of Invention
In order to solve the above problems, the present disclosure provides a bee mite identification method and device based on image identification. By using computer vision recognition technology, mite-infested bees are analyzed and counted to support breeding decisions, so that bee breeding becomes more intelligent, more efficient, more accurate and simpler to operate.
In order to achieve this purpose, the present disclosure adopts the following technical scheme:
one or more embodiments provide a bee mite identification method based on image identification, including the steps of:
acquiring a honeycomb image;
identifying the mite infestation condition by combining a deep learning method with an image processing method;
and periodically executing the above steps, compiling statistics on mite infestation to predict the incidence pattern of the bee mites, and generating a bee mite prevention and control scheme according to the incidence pattern.
An electronic device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, the computer instructions, when executed by the processor, performing the steps of the above method.
A computer readable storage medium storing computer instructions which, when executed by a processor, perform the steps of the above method.
Compared with the prior art, the beneficial effects of the present disclosure are:
(1) In the bee mite identification method, a deep learning algorithm is combined with an image processing method to identify the mite infestation condition, which improves the identification efficiency and the degree of intelligence of bee mite identification.
(2) In the method, a YOLOv3 model and a MobileNetV2 model are used to identify the mite infestation condition; combining the two models ensures accuracy while keeping the system fast and lightweight.
Advantages of additional aspects of the disclosure will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the disclosure.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure, illustrate embodiments of the disclosure and together with the description serve to explain the disclosure and not to limit the disclosure.
Fig. 1 is a flowchart of a bee mite identification method according to embodiment 1 of the present disclosure;
fig. 2 is a fusion model structure diagram of embodiment 1 of the present disclosure.
Detailed description of the embodiments:
the present disclosure is further described with reference to the following drawings and examples.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit example embodiments according to the present disclosure. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of the stated features, steps, operations, devices, components, and/or combinations thereof. It should be noted that, where there is no conflict, the embodiments in the present disclosure and the features in the embodiments may be combined with each other. The embodiments will be described in detail below with reference to the accompanying drawings.
Example 1
In one or more embodiments, as shown in fig. 1-2, the bee mite identification method based on image identification includes the following steps:
step 1, acquiring a honeycomb image;
step 2, identifying the mite infestation condition by combining a deep learning method with an image processing method;
and step 3, periodically executing the above steps, compiling statistics on mite infestation to predict the incidence pattern of the bee mites, and generating a bee mite prevention and control scheme according to the incidence pattern.
In this embodiment, a deep learning algorithm is combined with an image processing method to identify the mite infestation condition, which improves the identification efficiency and the degree of intelligence of bee mite identification.
In step 1, a high-resolution camera is used to photograph the bees and collect images: the honeycomb is taken out of the beehive and the honeycomb image is captured with the high-resolution camera.
In step 2, the infestation condition may include the type of bees infested with bee mites, the number or proportion of each type of infested bee, the species of bee mite involved, and the proportion of each mite species.
Optionally, identifying the mite infestation of the bees in the image includes the following steps (a pipeline sketch covering these steps is given after the list):
2.1 preprocessing the honeycomb image, and using a deep learning algorithm on the preprocessed image to identify whether a bee is infested with bee mites;
2.2 if a bee is infested, continuing to identify the bee type with a deep learning algorithm and counting it, where the bee types include worker bees and drones;
2.3 using a deep learning counting method to identify whether each mite is a large bee mite (Varroa jacobsoni) or a small bee mite, and calculating the proportion of bees infested with each mite species.
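To make steps 2.1 to 2.3 concrete, the following is a minimal Python sketch of the two-stage recognition pipeline. The helper names (detect_bees, classify_bee_type, detect_mites), the Box and Mite types, and the statistics keys are hypothetical placeholders introduced for illustration; trained detection and classification models are assumed to be available, and the sketch does not reproduce the exact models of the disclosure.

```python
from collections import Counter
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Box:
    # Pixel coordinates of a detection box around one bee.
    x1: int
    y1: int
    x2: int
    y2: int

@dataclass
class Mite:
    # "large" (Varroa) or "small"; the labels are illustrative only.
    species: str

def analyse_comb_image(
    image,                                           # H x W x 3 array of the honeycomb photo
    detect_bees: Callable[[object], List[Box]],      # step 2.1: find bees on the comb
    classify_bee_type: Callable[[object], str],      # step 2.2: "worker" or "drone"
    detect_mites: Callable[[object], List[Mite]],    # step 2.3: mites on one bee patch
) -> Counter:
    """Return per-comb infestation statistics for one preprocessed honeycomb image."""
    stats: Counter = Counter()
    bee_boxes = detect_bees(image)
    stats["bees_total"] = len(bee_boxes)
    for box in bee_boxes:
        patch = image[box.y1:box.y2, box.x1:box.x2]  # crop the bee from the comb image
        mites = detect_mites(patch)
        if not mites:
            continue                                 # this bee carries no mites
        bee_type = classify_bee_type(patch)          # only infested bees are typed and counted
        stats[f"infested_{bee_type}"] += 1
        for mite in mites:
            stats[f"mite_{mite.species}"] += 1
    return stats
```

The proportions described in step 2.3 then follow by dividing the per-species mite counts by the total number of bees counted on the comb.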
Optionally, the deep learning method in steps 2.1 and 2.2 above may be a convolutional algorithm; specifically, a YOLOv3 model fused with a MobileNetV2 model may be adopted.
The YOLOv3 model and the MobileNetV2 model can be fused into the structure shown in FIG. 2, which comprises a YOLOv3 part and a MobileNetV2 part connected in sequence; the detection result of the YOLOv3 part is used as the input of the MobileNetV2 part.
Specifically, the YOLOv3 part comprises an input layer, a plurality of first convolution layers and a first output layer, and its detection result is the image matrix of the region framed around each bee; the MobileNetV2 part comprises a second convolution layer, a multi-layer inverted residual structure and a second output layer, where the inverted residual structure first raises the dimension of the image representation, then extracts features, and finally reduces the dimension, and the second output layer outputs the bee type as the identification result.
In this embodiment, the upper half of FIG. 2 is the YOLOv3 part and the lower half is the MobileNetV2 part: each layer of the YOLOv3 part is a convolution layer, its detection result is the image matrix of the framed bee region, and the lower half receives this matrix to judge the bee type.
The MobileNetV2 part comprises a multi-layer inverted residual structure, characterized in that the image representation is first raised in dimension, features are then extracted, and a dimension-reduction operation is finally carried out.
The YOLOv3 model comprises multi-layer convolution, residual and up-sampling structures, and outputs a convolution calculation result; the image recognition uses a model such as MobileNetV2, which comprises several convolution layers and fully connected layers and finally outputs a classification result.
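The inverted residual structure mentioned above (raise the channel dimension, extract features with a depthwise convolution, then project back to a lower dimension) can be sketched in PyTorch roughly as follows. This is an illustrative re-implementation under assumed defaults (expansion factor 6, stride 1), not the exact network of the disclosure.

```python
import torch
import torch.nn as nn

class InvertedResidual(nn.Module):
    """MobileNetV2-style block: 1x1 expand -> 3x3 depthwise -> 1x1 linear project."""

    def __init__(self, in_ch: int, out_ch: int, expansion: int = 6, stride: int = 1):
        super().__init__()
        hidden = in_ch * expansion
        self.use_skip = stride == 1 and in_ch == out_ch   # residual only when shapes match
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, hidden, kernel_size=1, bias=False),   # raise the dimension
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, hidden, kernel_size=3, stride=stride,
                      padding=1, groups=hidden, bias=False),        # depthwise feature extraction
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, out_ch, kernel_size=1, bias=False),   # reduce the dimension
            nn.BatchNorm2d(out_ch),                                  # linear bottleneck, no activation
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.block(x)
        return x + y if self.use_skip else y
```

In the fused arrangement of FIG. 2, each image patch framed by the YOLOv3 part would be resized and passed through a stack of such blocks and the second output layer to obtain the bee type.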
Optionally, in step 2.3, a deep learning counting method is used to identify whether each mite is a large bee mite (Varroa jacobsoni) or a small bee mite and to calculate the proportion of bees infested with each species. The mite-species identification specifically comprises: acquiring morphology pictures of large and small bee mites, training a YOLOv3 model on these pictures, inputting the acquired honeycomb images into the trained YOLOv3 model, identifying each mite as a large or small bee mite through deep learning on the images, and calculating the proportion of bees infested with large bee mites and the proportion infested with small bee mites.
In this embodiment, the YOLOv3 model and the MobileNetV2 model are used together to identify the mite infestation condition; combining the two models ensures accuracy while keeping the system fast and lightweight.
The counting method can be as follows: the YOLOv3 model places a marking frame (bounding box) at each mite position in the image, and the number of marking frames is counted, as in the sketch below.
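The box-based counting can be illustrated with the following short sketch. The detection tuple format (label, confidence, box) and the 0.5 confidence threshold are assumptions made for the example; they are not values fixed by the disclosure.

```python
from collections import Counter
from typing import Iterable, Tuple

# (class label, detection confidence, box as (x1, y1, x2, y2))
Detection = Tuple[str, float, Tuple[int, int, int, int]]

def count_marking_frames(detections: Iterable[Detection], min_conf: float = 0.5) -> Counter:
    """Count bounding boxes per mite class, keeping only confident detections."""
    counts: Counter = Counter()
    for label, conf, _box in detections:
        if conf >= min_conf:
            counts[label] += 1        # one marking frame corresponds to one counted mite
    return counts

# Example with three detected boxes, one of which is below the confidence threshold.
example = [("large_mite", 0.92, (10, 12, 34, 40)),
           ("small_mite", 0.81, (55, 60, 70, 78)),
           ("large_mite", 0.30, (90, 95, 110, 120))]
print(count_marking_frames(example))  # Counter({'large_mite': 1, 'small_mite': 1})
```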
The advantage of this bee mite detection method is that a machine algorithm simultaneously identifies the type and the proportion of bees infested with mites, which supports a comprehensive analysis of the infestation of the whole colony; meanwhile, counting directly from the marking frames gives high identification efficiency, fast data processing, and simple, convenient operation.
Furthermore, after the mite infestation of the bees in the image has been identified, compiling the infestation statistics to predict the incidence pattern of the bee mites comprises the following step:
acquiring the position data of each honeycomb at the same time as its image, and computing the relationship between the position of the honeycomb within the colony and the mite parasitism rate, so as to indicate the key areas for miticide application, as illustrated in the sketch below.
The honeycomb positions include side combs, middle combs and the like.
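As an illustration of this comb-position statistic, the records could be tabulated as follows; the position labels and per-comb numbers are hypothetical example values, not data from the disclosure.

```python
import pandas as pd

# Hypothetical per-comb records: position in the hive, bees counted, mites counted.
combs = pd.DataFrame({
    "position": ["side", "middle", "middle", "side", "middle"],
    "bees":     [1800, 2400, 2300, 1700, 2500],
    "mites":    [22, 65, 58, 18, 70],
})
combs["parasitism_rate"] = combs["mites"] / combs["bees"]

# Mean parasitism rate per comb position; the positions with the highest rate
# indicate the key areas for miticide application.
print(combs.groupby("position")["parasitism_rate"].mean().sort_values(ascending=False))
```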
Furthermore, after the mite infestation of the bees in the image has been identified, compiling the infestation statistics to predict the incidence pattern of the bee mites further comprises:
synchronously acquiring colony parameters while acquiring all the honeycomb images of the whole colony, computing the relationship between the mite parasitism rate and the colony parameters, predicting the optimal feeding-management conditions, and using them to guide production.
Optionally, the colony parameters include data such as colony strength, in-hive temperature, queen age, and queen egg-laying capacity.
The specific steps of compiling the infestation statistics to predict the incidence pattern of the bee mites are as follows (a statistics sketch follows the list):
3.1 obtaining the number of mites on worker bees and on drones from the identified types of infested bees and the mite counts, and drawing a distribution map;
3.2 calculating, from the distribution map, the colony mite parasitism rate under different worker-to-drone ratios, computing the correlation between the worker-to-drone ratio and the mite parasitism rate, and predicting the optimal ratio, so that the bee mite prevention and control scheme is configured with this optimal ratio.
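Steps 3.1 and 3.2 can be illustrated with the small statistics sketch below. The column names and the per-colony numbers are hypothetical, and Pearson correlation is one reasonable choice of measure rather than one mandated by the disclosure; the parasitism rate follows the definition given in the background section (number of mites / number of bees).

```python
import pandas as pd

# Hypothetical per-colony counts assembled from the image-recognition results.
df = pd.DataFrame({
    "colony":           ["A", "B", "C", "D"],
    "workers":          [9500, 8800, 10200, 9100],
    "drones":           [300, 550, 150, 420],
    "mites_on_workers": [180, 240, 90, 210],
    "mites_on_drones":  [60, 140, 20, 110],
})

# Step 3.1: mite counts per caste are kept separate; totals are derived for the rate.
df["bees_total"]  = df["workers"] + df["drones"]
df["mites_total"] = df["mites_on_workers"] + df["mites_on_drones"]
df["parasitism_rate"]    = df["mites_total"] / df["bees_total"]
df["worker_drone_ratio"] = df["workers"] / df["drones"]

# Step 3.2: correlation between the worker-to-drone ratio and the parasitism rate.
r = df["worker_drone_ratio"].corr(df["parasitism_rate"])
print(df[["colony", "worker_drone_ratio", "parasitism_rate"]])
print(f"Pearson correlation: {r:.2f}")
```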
This embodiment provides a visual display through the distribution map, which helps in determining the bee mite prevention and control scheme.
This embodiment also provides a bee mite identification device based on image recognition, comprising an image acquisition apparatus and a processor, wherein the processor executes the steps of the above method.
Example 2
The present embodiment provides an electronic device comprising a memory, a processor, and computer instructions stored on the memory and executable on the processor, wherein the computer instructions, when executed by the processor, perform the steps of the method of embodiment 1.
Example 3
The present embodiment provides a computer readable storage medium for storing computer instructions which, when executed by a processor, perform the steps of the method of embodiment 1.
The electronic device provided by the present disclosure may be a mobile terminal or a non-mobile terminal. The non-mobile terminal includes a desktop computer, and the mobile terminal includes a smart phone (such as an Android phone or an iOS phone), smart glasses, a smart watch, a smart bracelet, a tablet computer, a notebook computer, a personal digital assistant, and other mobile internet devices capable of wireless communication.
It should be understood that in the present disclosure, the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and so on. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory may include both read-only memory and random access memory, and may provide instructions and data to the processor, and a portion of the memory may also include non-volatile random access memory. For example, the memory may also store device type information.
In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of a method disclosed in connection with the present disclosure may be embodied directly in a hardware processor, or in a combination of hardware and software modules within the processor. The software modules may be located in RAM, flash memory, ROM, PROM or EPROM, registers, or other storage media well known in the art. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware. To avoid repetition, this is not described in detail here. Those of ordinary skill in the art will appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or as combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is merely a division of one logic function, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some interfaces, and may be in an electrical, mechanical or other form.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes. The above description is only a preferred embodiment of the present disclosure and is not intended to limit the present disclosure, and various modifications and changes may be made to the present disclosure by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.
Although the present disclosure has been described with reference to specific embodiments, it should be understood that the scope of the present disclosure is not limited thereto, and those skilled in the art will appreciate that various modifications and changes can be made without departing from the spirit and scope of the present disclosure.

Claims (10)

1. The bee mite identification method based on image identification is characterized by comprising the following steps of:
acquiring a honeycomb image;
identifying the mite infestation condition by combining a deep learning method with an image processing method;
and periodically executing the above steps, compiling statistics on mite infestation to predict the incidence pattern of the bee mites, and generating a bee mite prevention and control scheme according to the incidence pattern.
2. The bee mite identification method based on image identification as claimed in claim 1, wherein: the infestation condition comprises the type of bees infested with bee mites, the number or proportion of each type of infested bee, the species of bee mite involved, and the proportion of each mite species.
3. The bee mite identification method based on image identification as claimed in claim 1, wherein: identifying the mite infestation of the bees in the image comprises the following steps:
preprocessing the honeycomb image, and using a deep learning algorithm on the preprocessed image to identify whether a bee is infested with bee mites;
if a bee is infested, continuing to identify the bee type with a deep learning algorithm and counting it, wherein the bee types include worker bees and drones;
and identifying whether each mite is a large bee mite (Varroa jacobsoni) or a small bee mite by a deep learning counting method, and calculating the proportion of bees infested with large bee mites and with small bee mites.
4. The bee mite identification method based on image identification as claimed in claim 1, wherein: the deep learning method is a convolutional algorithm, specifically a YOLOv3 model fused with a MobileNetV2 model;
or the specific structure of the YOLOv3 model fused with the MobileNetV2 model is as follows: it comprises a YOLOv3 part and a MobileNetV2 part connected in sequence, wherein the detection result of the YOLOv3 part is used as the input of the MobileNetV2 part;
specifically, the YOLOv3 part comprises an input layer, a plurality of first convolution layers and a first output layer, and its detection result is the image matrix of the region framed around each bee; the MobileNetV2 part comprises a second convolution layer, a multi-layer inverted residual structure and a second output layer, wherein the inverted residual structure first raises the dimension of the image representation, then extracts features, and finally reduces the dimension, and the identification result is output through the second output layer.
5. The bee mite identification method based on image identification as claimed in claim 1, wherein: identifying whether a mite is a large bee mite (Varroa jacobsoni) or a small bee mite specifically comprises: acquiring morphology pictures of large and small bee mites, training a YOLOv3 model on these pictures, inputting the acquired honeycomb images into the trained YOLOv3 model, identifying each mite as a large or small bee mite through deep learning on the images, and calculating the proportion of bees infested with large bee mites and the proportion infested with small bee mites.
6. The bee mite identification method based on image identification as claimed in claim 1, wherein: the proportions of bees infested with large and small bee mites are calculated by counting, and the counting method specifically comprises: placing marking frames at the mite positions in the image using the YOLOv3 model, and counting the number of marking frames.
7. The bee mite identification method based on image identification as claimed in claim 1, wherein: after the mite infestation of the bees in the image has been identified, compiling the infestation statistics to predict the incidence pattern of the bee mites comprises the following step:
acquiring the honeycomb position data while acquiring the honeycomb image, and computing the relationship between the position of the honeycomb within the colony and the mite parasitism rate;
or, the method further comprises the following step:
synchronously acquiring colony parameters while acquiring all the honeycomb images of the whole colony, and computing the relationship between the mite parasitism rate and the colony parameters;
or
the step of compiling statistics to predict the incidence pattern of the bee mites comprises:
obtaining the number of mites on worker bees and on drones from the identified types of infested bees and the mite counts, and drawing a distribution map;
and calculating, from the distribution map, the colony mite parasitism rate under different worker-to-drone ratios, computing the correlation between the worker-to-drone ratio and the mite parasitism rate, and predicting the optimal ratio, so that the optimal ratio is adopted in the bee mite prevention and control scheme.
8. Bee mite identification equipment based on image recognition, characterized by: comprising image acquisition means and a processor performing the steps of the method according to any one of claims 1 to 7.
9. An electronic device comprising a memory and a processor and computer instructions stored on the memory and executable on the processor, the computer instructions when executed by the processor performing the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium storing computer instructions which, when executed by a processor, perform the steps of the method of any one of claims 1 to 7.
CN202110246273.4A 2021-03-05 2021-03-05 Bee mite identification method and equipment based on image identification Active CN112926473B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110246273.4A CN112926473B (en) 2021-03-05 2021-03-05 Bee mite identification method and equipment based on image identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110246273.4A CN112926473B (en) 2021-03-05 2021-03-05 Bee mite identification method and equipment based on image identification

Publications (2)

Publication Number Publication Date
CN112926473A true CN112926473A (en) 2021-06-08
CN112926473B CN112926473B (en) 2022-10-04

Family

ID=76173454

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110246273.4A Active CN112926473B (en) 2021-03-05 2021-03-05 Bee mite identification method and equipment based on image identification

Country Status (1)

Country Link
CN (1) CN112926473B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2858749A1 (en) * 2003-08-14 2005-02-18 Marcel Legris Bee-house for apiculture, has sloping section that forms funnel cup for collecting and driving detritus and Varroa jacobsoni towards passage that leads towards exterior of body
WO2013143552A2 (en) * 2012-03-27 2013-10-03 Humal Priit An apparatus for diagnosis and control of honeybee varroatosis, image processing method and software for recognition of parasite
CN106719463A (en) * 2017-02-22 2017-05-31 中国农业科学院蜜蜂研究所 The detection method and device of a kind of honeybee mite infestationss rate
CN206933010U (en) * 2017-06-23 2018-01-30 四川天府蜂谷科技有限公司 The automatic intelligent beehive for carrying out honeybee and controlling mite
CN108739719A (en) * 2018-05-07 2018-11-06 杨波 A kind of intelligent recognition laser bee mite kills robot and its killing method
CN110502987A (en) * 2019-07-12 2019-11-26 山东农业大学 A kind of plant pest recognition methods and system based on deep learning
CN111797934A (en) * 2020-07-10 2020-10-20 北京嘉楠捷思信息技术有限公司 Road sign identification method and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SIMON BILIK ET AL: "Visual diagnosis of the Varroa destructor parasitic mite in honeybees using object detector techniques", arXiv:2103.03133v1 *
刘劲军: "蜜蜂生态养殖" [Ecological Beekeeping of Honeybees], 31 May 2018 *
刘璇昕: "基于深度学习的昆虫轻量级检测模型研究" [Research on lightweight insect detection models based on deep learning], China Master's Theses Full-text Database, Agricultural Science and Technology *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113678794A (en) * 2021-10-26 2021-11-23 中国农业科学院蜜蜂研究所 Breeding method, device and application of bee mite-resistant bee species
CN114155757A (en) * 2021-12-03 2022-03-08 广东省健卫病媒预防控制中心(普通合伙) Quality certificate level identification platform in pest control industry

Also Published As

Publication number Publication date
CN112926473B (en) 2022-10-04

Similar Documents

Publication Publication Date Title
US11798662B2 (en) Methods for identifying biological material by microscopy
CN112926473B (en) Bee mite identification method and equipment based on image identification
WO2021093451A1 (en) Pathological section image processing method, apparatus, system, and storage medium
WO2021000423A1 (en) Pig weight measurement method and apparatus
CN111260677B (en) Cell analysis method, device, equipment and storage medium based on microscopic image
CN109964235A (en) For carrying out the prediction model of vision sorter to insect
CN116051574A (en) Semi-supervised segmentation model construction and image analysis method, device and system
CN106066934A (en) A kind of Alzheimer based on Spark platform assistant diagnosis system in early days
CN111402217A (en) Image grading method, device, equipment and storage medium
CN109784200A (en) Milk cow behavior image based on binocular vision obtains and body condition intelligent monitor system
Li et al. Cow individual identification based on convolutional neural network
Singh et al. Performance evaluation of plant leaf disease detection using deep learning models
CN113902669A (en) Method and system for reading urine exfoliative cell fluid-based smear
Vinicki et al. Using convolutional neural networks for determining reticulocyte percentage in cats
CN113780145A (en) Sperm morphology detection method, sperm morphology detection device, computer equipment and storage medium
CN107292340A (en) Lateral line scales recognition methods based on convolutional neural networks
Qin et al. Malaria cell detection using evolutionary convolutional deep networks
US20230245495A1 (en) Face recognition systems data collection process
CN107194918B (en) Data analysis method and device
Yoo et al. BeeNet: An End-To-End Deep Network For Bee Surveillance
Singh et al. Malaria parasite recognition in thin blood smear images using squeeze and excitation networks
Moen et al. Age interpretation of cod otoliths using deep learning
Mishra et al. Analysis of different Machine Learning and Deep Learning Techniques for Malaria Parasite Detection
CN117496274B (en) Classification counting method, system and storage medium based on liquid drop images
CN113706449B (en) Pathological image-based cell analysis method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant