CN113506579A - Insect pest recognition method and robot based on artificial intelligence and sound


Info

Publication number
CN113506579A
Authority
CN
China
Prior art keywords
sound
severity
insect
insect pest
type
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110584234.5A
Other languages
Chinese (zh)
Other versions
CN113506579B (en)
Inventor
朱定局
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Normal University
Original Assignee
South China Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Normal University
Priority to CN202110584234.5A
Publication of CN113506579A
Application granted
Publication of CN113506579B
Current legal status: Active

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L17/00 Speaker identification or verification techniques
    • G10L17/26 Recognition of special voice characteristics, e.g. for use in lie detectors; Recognition of animal voices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/27 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 characterised by the analysis technique
    • G10L25/30 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 characterised by the analysis technique using neural networks
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11C STATIC STORES
    • G11C7/00 Arrangements for writing information into, or reading information out from, a digital store
    • G11C7/16 Storage of analogue signals in digital stores using an arrangement comprising analogue/digital [A/D] converters, digital memories and digital/analogue [D/A] converters

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Signal Processing (AREA)
  • Catching Or Destruction (AREA)

Abstract

An insect pest identification method and robot based on artificial intelligence and sound, comprising: a position acquisition step; a sound collection step; an expert pest labeling step; an insect sound recognition model training step; and an insect sound recognition model prediction step. The method, system and robot open up a new mode of pest identification through insect sound recognition. Traditional vision-based pest identification is very difficult: pests are very small, hard for a camera to capture, and even harder for a computer to recognize. Insect sound, by contrast, is a highly distinctive feature of pests, and identifying pests by their sounds is an original approach; combined with artificial intelligence technology, it enables automatic and rapid identification and prediction, making precise pest killing possible.

Description

Insect pest recognition method and robot based on artificial intelligence and sound
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a method and a robot for identifying insect pests based on artificial intelligence and sound.
Background
In the process of implementing the invention, the inventor found that the prior art has at least the following problem: traditional vision-based pest identification is very difficult, because pests are very small, hard for a camera to capture, and even harder for a computer to recognize.
Accordingly, the prior art is yet to be improved and developed.
Disclosure of Invention
Based on this, it is necessary to provide an insect pest identification method and robot based on artificial intelligence and sound, to address the defects or deficiencies of the prior art, namely that pests are currently difficult to identify visually. Combined with artificial intelligence technology, automatic and rapid identification and prediction can be achieved, making precise pest killing and pest prevention possible.
In a first aspect, an embodiment of the present invention provides an artificial intelligence method, where the method includes:
a position acquisition step: acquiring position information of a plurality of positions where a plurality of sound recording devices in a preset place are located as a plurality of sound recording positions;
a sound collection step: collecting sound through the plurality of sound recording devices, and dividing each recorded sound file into sound segments of a preset duration, with the time period corresponding to each sound segment taken as its recording time period; each sound segment is associated with the recording device that collected it, the position of that recording device, and the recording time period, and is then stored in the sound big data;
an expert pest labeling step: acquiring an expert label for each sound segment, wherein the expert label comprises the type and severity of the insect pest;
an insect sound recognition model training step: acquiring sound segments for training and their expert labels, taking each sound segment as input and its expert label as expected output, and training a preset first deep learning model to obtain an insect sound recognition deep learning model;
an insect sound recognition model prediction step: acquiring a sound segment for prediction, and inputting it into the insect sound recognition deep learning model to obtain the type and severity of the insect pest corresponding to the sound segment.
Preferably, the method further comprises:
an insect pest visualization step: predicting the sound segments recorded by each recording device in the preset place, to obtain the type and severity of the insect pest corresponding to the sound segments recorded by each recording device in the preset place; the type and severity of the insect pest are marked on an electronic map with different signs (such as different colors and color depths, or different insect images and insect-density markers), the electronic map having not only a space dimension but also a time dimension.
Preferably, the method further comprises:
a future insect pest model training step: acquiring, for training, the type and severity of the insect pest in each time period at each recording position, taking the type and severity of the insect pest in the preceding M time periods at a recording position as input and the type and severity of the insect pest in the following N time periods at that recording position as expected output, and training a preset second deep learning model to obtain a future insect pest prediction model;
a future insect pest model prediction step: acquiring, for prediction, the type and severity of the insect pest in each time period at each recording position, inputting the type and severity of the insect pest in the latest M time periods at a recording position into the future insect pest prediction model, and calculating the type and severity of the insect pest in the future N time periods.
Preferably, the method further comprises:
an insect pest spatial fitting step: according to the types and severity of the insect pests predicted at the different recording positions, performing spatial fitting of the type and severity of the insect pests over the preset place, to obtain the spatial distribution of the type and severity of the insect pests over the preset place;
an insect pest space-time fitting step: according to the types and severity of the insect pests predicted in different time periods at the different recording positions, performing space-time fitting of the type and severity of the insect pests over the preset place, to obtain the space-time distribution of the type and severity of the insect pests over the preset place.
In a second aspect, an embodiment of the present invention provides an artificial intelligence apparatus, where the apparatus includes:
a position acquisition module: acquiring position information of a plurality of positions where a plurality of sound recording devices in a preset place are located, as a plurality of sound recording positions;
a sound collection module: collecting sound through the plurality of sound recording devices, and dividing each recorded sound file into sound segments of a preset duration, with the time period corresponding to each sound segment taken as its recording time period; each sound segment is associated with the recording device that collected it, the position of that recording device, and the recording time period, and is then stored in the sound big data;
an expert pest labeling module: acquiring an expert label for each sound segment, wherein the expert label comprises the type and severity of the insect pest;
an insect sound recognition model training module: acquiring sound segments for training and their expert labels, taking each sound segment as input and its expert label as expected output, and training a preset first deep learning model to obtain an insect sound recognition deep learning model;
an insect sound recognition model prediction module: acquiring a sound segment for prediction, and inputting it into the insect sound recognition deep learning model to obtain the type and severity of the insect pest corresponding to the sound segment.
Preferably, the apparatus further comprises:
an insect pest visualization module: predicting the sound segments recorded by each recording device in the preset place, to obtain the type and severity of the insect pest corresponding to the sound segments recorded by each recording device in the preset place; the type and severity of the insect pest are marked on an electronic map with different signs (such as different colors and color depths, or different insect images and insect-density markers), the electronic map having not only a space dimension but also a time dimension.
Preferably, the apparatus further comprises:
a future insect pest model training module: acquiring, for training, the type and severity of the insect pest in each time period at each recording position, taking the type and severity of the insect pest in the preceding M time periods at a recording position as input and the type and severity of the insect pest in the following N time periods at that recording position as expected output, and training a preset second deep learning model to obtain a future insect pest prediction model;
a future insect pest model prediction module: acquiring, for prediction, the type and severity of the insect pest in each time period at each recording position, inputting the type and severity of the insect pest in the latest M time periods at a recording position into the future insect pest prediction model, and calculating the type and severity of the insect pest in the future N time periods.
Preferably, the apparatus further comprises:
an insect pest spatial fitting module: according to the types and severity of the insect pests predicted at the different recording positions, performing spatial fitting of the type and severity of the insect pests over the preset place, to obtain the spatial distribution of the type and severity of the insect pests over the preset place;
an insect pest space-time fitting module: according to the types and severity of the insect pests predicted in different time periods at the different recording positions, performing space-time fitting of the type and severity of the insect pests over the preset place, to obtain the space-time distribution of the type and severity of the insect pests over the preset place.
In a third aspect, an embodiment of the present invention provides an artificial intelligence system, where the system includes the modules of the apparatus in any one of the embodiments of the second aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program is configured to, when executed by a processor, implement the steps of the method according to any one of the embodiments of the first aspect.
In a fifth aspect, an embodiment of the present invention provides a robot system, including a memory, a processor, and an artificial intelligence robot program stored in the memory and executable on the processor, where the processor executes the program to implement the steps of the method according to any one of the embodiments of the first aspect.
The insect pest identification method and robot based on artificial intelligence and sound provided by this embodiment comprise: a position acquisition step; a sound collection step; an expert pest labeling step; an insect sound recognition model training step; and an insect sound recognition model prediction step. The method, system and robot open up a new mode of pest identification through insect sound recognition. Traditional vision-based pest identification is very difficult: pests are very small, hard for a camera to capture, and even harder for a computer to recognize. Insect sound, by contrast, is a highly distinctive feature of pests, and identifying pests by their sounds is an original approach; combined with artificial intelligence technology, it enables automatic and rapid identification and prediction, making precise pest killing possible.
Drawings
FIG. 1 is a flow chart of an artificial intelligence method provided by an embodiment of the invention;
FIG. 2 is a flow chart of an artificial intelligence method provided by an embodiment of the invention;
FIG. 3 is a flow chart of an artificial intelligence method provided by an embodiment of the invention.
Detailed Description
The technical solutions in the examples of the present invention are described in detail below with reference to the embodiments of the present invention.
First, the basic embodiment of the present invention
In a first aspect, an embodiment of the present invention provides an artificial intelligence method, as shown in fig. 1, where the method includes: a position acquisition step; a sound collection step; an expert pest labeling step; an insect sound recognition model training step; and an insect sound recognition model prediction step. The technical effects are as follows: the method of insect sound recognition opens up a new mode of pest identification. Traditional vision-based pest identification is very difficult, because pests are very small, hard for a camera to capture, and even harder for a computer to recognize. Insect sound is a highly distinctive feature of pests, and identifying pests by their sounds is an original approach; combined with artificial intelligence technology, automatic and rapid identification and prediction can be achieved, making precise pest killing possible.
In a preferred embodiment, the method further comprises: an insect pest visualization step. The technical effects are as follows: through visualization, the user can intuitively perceive the type and severity of the insect pest and take corresponding pest-killing measures.
In a preferred embodiment, as shown in fig. 2, the method further comprises: a future insect pest model training step; and a future insect pest model prediction step. The technical effects are as follows: through prediction, the future development of the insect pests can be known and guarded against in advance, so that pest prevention and control are achieved at lower cost and with less pesticide.
In a preferred embodiment, as shown in fig. 3, the method further comprises: an insect pest spatial fitting step; and an insect pest space-time fitting step. The technical effects are as follows: through fitting, the pest situation over the full spatial and temporal domain, including locations and periods where no recordings could be collected, can be obtained, providing the possibility of all-round pest detection, prediction, and prevention and control for the preset place, and greatly improving the effect of pest control.
In a second aspect, an embodiment of the present invention provides an artificial intelligence apparatus, where the apparatus includes: a position acquisition module; a sound collection module; an expert pest labeling module; an insect sound recognition model training module; and an insect sound recognition model prediction module.
In a preferred embodiment, the apparatus further comprises: and (5) insect pest visualization model.
In a preferred embodiment, the apparatus further comprises: a future insect pest model training module; and a future insect pest model prediction module.
In a preferred embodiment, the apparatus further comprises: an insect pest spatial fitting module; and an insect pest space-time fitting module.
In a third aspect, an embodiment of the present invention provides an artificial intelligence system, where the system includes the modules of the apparatus in any one of the embodiments of the second aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program is configured to, when executed by a processor, implement the steps of the method according to any one of the embodiments of the first aspect.
In a fifth aspect, an embodiment of the present invention provides a robot system, including a memory, a processor, and an artificial intelligence robot program stored in the memory and executable on the processor, where the processor executes the program to implement the steps of the method according to any one of the embodiments of the first aspect.
Second, preferred embodiments of the invention
The chirping or wing-flapping sounds of pests in a specific place, such as a litchi orchard, are collected through devices with a recording function, such as a voice recorder, a mobile-phone recorder, or recording equipment connected to the Internet of Things. The best collection time is at night: the night is quiet and there is less noise interference, so pest chirping or wing-flapping sounds can be collected more clearly. The type and severity of pests are judged from features such as the amplitude and frequency of these sounds, providing an objective basis for selecting pesticides and setting dosages when spraying. The core technology for identifying pests from sound waves is an artificial intelligence model: first a model is built from a large amount of pest sound-wave data and expert knowledge, and then the model is used to predict pest types and severity.
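As a concrete illustration of such acoustic features, the following is a minimal sketch, assuming the librosa library and a hypothetical recording file; the patent itself does not prescribe any particular feature representation. It extracts a log-mel spectrogram (frequency content) and a frame-wise RMS curve (loudness), the kinds of amplitude and frequency features mentioned above.

```python
# Minimal feature-extraction sketch (assumptions: librosa is available,
# "litchi_orchard_night.wav" is a hypothetical recording).
import librosa
import numpy as np

y, sr = librosa.load("litchi_orchard_night.wav", sr=None)
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64)  # frequency content
log_mel = librosa.power_to_db(mel, ref=np.max)               # (n_mels, frames)
rms = librosa.feature.rms(y=y)                               # loudness over time
print(log_mel.shape, float(rms.mean()))
```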
1. Acquiring a preset place; the predetermined places include farms, forests, farmlands, orchards (litchi orchards, tomato orchards, etc.), and the like.
2. Acquiring position information of the plurality of positions where a plurality of recording devices (for example, mobile-phone recorders or dedicated recorders) in the preset place are located, as a plurality of recording positions (the position information can be acquired by a positioning device, for example via a mobile-phone electronic map);
3. displaying the plurality of sound recording devices at the plurality of locations on the electronic map;
4. collecting sound through the plurality of sound recording devices, and dividing each recorded sound file into sound segments of a preset duration, with the time period corresponding to each sound segment taken as its recording time period; each sound segment is associated with the recording device that collected it, the position of that recording device, and the recording time period, and is then stored in the sound big data, as sketched below;
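A minimal sketch of this step is given below, assuming 16-bit mono WAV input and illustrative field names (device_id, position, period); the patent does not fix a storage schema for the sound big data.

```python
# Split one recording into fixed-length segments and associate each segment
# with its recording device, position, and time period (step 4 above).
import wave
import numpy as np

def split_recording(path, device_id, position, start_ts, segment_seconds=10):
    with wave.open(path, "rb") as w:  # assumes 16-bit mono WAV
        sr = w.getframerate()
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    seg_len = sr * segment_seconds
    records = []
    for i in range(0, len(samples) - seg_len + 1, seg_len):
        records.append({
            "device_id": device_id,
            "position": position,  # e.g. (lat, lon)
            "period": (start_ts + i // sr, start_ts + (i + seg_len) // sr),
            "audio": samples[i:i + seg_len],
        })
    return records

# Hypothetical usage: one device in a litchi orchard, 10-second segments.
sound_big_data = split_recording("rec001.wav", "dev-07", (23.14, 113.35), 1622070000)
```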
5. acquiring an expert label for each sound segment, wherein the expert label comprises the type and severity of the insect pest;
6. acquiring sound segments for training and their expert labels, taking each sound segment as input and its expert label as expected output, and training a preset first deep learning model to obtain an insect sound recognition deep learning model, as sketched below;
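A minimal training sketch under stated assumptions follows: log-mel inputs of fixed shape, 20 pest types and 5 severity levels both treated as classification, and a small PyTorch CNN standing in for the preset first deep learning model, whose architecture the patent does not specify.

```python
# Sketch of step 6: train a model that maps a sound segment (as a log-mel
# spectrogram) to its expert label (pest type and severity). All sizes and
# the placeholder batch are illustrative assumptions.
import torch
import torch.nn as nn

NUM_TYPES, NUM_LEVELS = 20, 5  # assumed label spaces

class InsectSoundNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.type_head = nn.Linear(32, NUM_TYPES)       # pest type
        self.severity_head = nn.Linear(32, NUM_LEVELS)  # pest severity

    def forward(self, x):  # x: (batch, 1, n_mels, frames)
        h = self.backbone(x)
        return self.type_head(h), self.severity_head(h)

model = InsectSoundNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
train_loader = [(torch.randn(8, 1, 64, 100),           # placeholder batch in
                 torch.randint(0, NUM_TYPES, (8,)),    # place of real labeled
                 torch.randint(0, NUM_LEVELS, (8,)))]  # sound segments
for segments, type_labels, severity_labels in train_loader:
    type_logits, sev_logits = model(segments)
    loss = loss_fn(type_logits, type_labels) + loss_fn(sev_logits, severity_labels)
    opt.zero_grad(); loss.backward(); opt.step()
```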
7. acquiring a sound segment for prediction, and inputting it into the insect sound recognition deep learning model to obtain the type and severity of the insect pest corresponding to the sound segment;
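Prediction (step 7) then reduces to a forward pass; this short sketch reuses the hypothetical model from the previous sketch, with a random tensor standing in for a real log-mel segment.

```python
# Predict pest type and severity for one log-mel segment.
with torch.no_grad():
    type_logits, sev_logits = model(torch.randn(1, 1, 64, 100))  # one segment
    pest_type = type_logits.argmax(dim=1).item()
    severity = sev_logits.argmax(dim=1).item()
print(pest_type, severity)
```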
8. predicting the sound segments recorded by each recording device in the preset place, to obtain the type and severity of the insect pest corresponding to the sound segments recorded by each recording device in the preset place; marking the type and severity of the insect pest on an electronic map with different signs (such as different colors and color depths, or different insect images and insect-density markers), the electronic map having not only a space dimension but also a time dimension;
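A minimal visualization sketch is shown below, with matplotlib as an assumed stand-in for the electronic map and made-up positions and severity levels; a color scale plays the role of the "different signs" described above.

```python
# Mark predicted severity at each recording position with a colour scale.
import matplotlib.pyplot as plt

positions = [(113.35, 23.14), (113.36, 23.15), (113.34, 23.16)]  # (lon, lat), assumed
severity = [1, 4, 2]                                             # predicted levels
xs, ys = zip(*positions)
sc = plt.scatter(xs, ys, c=severity, cmap="Reds", s=120)
plt.colorbar(sc, label="pest severity")
plt.xlabel("longitude"); plt.ylabel("latitude")
plt.title("Predicted pest severity by recording position")
plt.show()
```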
9. acquiring, for training, the type and severity of the insect pest in each time period at each recording position, taking the type and severity of the insect pest in the preceding M time periods at a recording position as input and the type and severity of the insect pest in the following N time periods at that recording position as expected output, and training a preset second deep learning model to obtain a future insect pest prediction model;
10. acquiring, for prediction, the type and severity of the insect pest in each time period at each recording position, inputting the type and severity of the insect pest in the latest M time periods at a recording position into the future insect pest prediction model, and calculating the type and severity of the insect pest in the future N time periods;
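A minimal sketch of steps 9 and 10 follows, under stated assumptions: each time period's pest state is encoded as a type one-hot vector plus a severity scalar, and an LSTM stands in for the preset second deep learning model, which the patent leaves unspecified.

```python
# Map the last M time periods of pest state at one recording position to the
# next N periods. FEAT, M, N, and the random input are illustrative.
import torch
import torch.nn as nn

M, N, NUM_TYPES = 12, 4, 20
FEAT = NUM_TYPES + 1  # assumed encoding: type one-hot + severity scalar

class FuturePestModel(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(FEAT, hidden, batch_first=True)
        self.head = nn.Linear(hidden, N * FEAT)  # predict N future periods at once

    def forward(self, x):  # x: (batch, M, FEAT)
        _, (h, _) = self.lstm(x)
        return self.head(h[-1]).view(-1, N, FEAT)

future_model = FuturePestModel()
past = torch.randn(1, M, FEAT)    # pest state in the latest M periods
future = future_model(past)       # (1, N, FEAT): predicted next N periods
```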
11. performing spatial fitting (by interpolation fitting, or by fitting through a deep learning model, described below) of the type and severity of the insect pests over the preset place according to the types and severity of the insect pests predicted at the different recording positions, to obtain the spatial distribution of the type and severity of the insect pests over the preset place;
the method comprises the steps of fitting a deep learning model, wherein spatial distribution of types and severity of insect pests predicted and obtained at different recording positions for training, types and severity of insect pests in a preset place is obtained, the types and severity of insect pests predicted and obtained at different recording positions are used as input, the spatial distribution of the types and severity of insect pests in the preset place is used as expected output, and a preset third deep learning model is trained to obtain a spatial distribution prediction model of insect pests; taking the type and severity of the insect pest predicted at different recording positions for prediction as the input of an insect pest spatial distribution prediction model, and calculating to obtain the output as the spatial distribution of the type and severity of the insect pest in a preset place;
12. performing space-time fitting (by interpolation fitting, or by fitting through a deep learning model, described below) of the type and severity of the insect pests over the preset place according to the types and severity of the insect pests predicted in different time periods at the different recording positions, to obtain the space-time distribution of the type and severity of the insect pests over the preset place;
For fitting through a deep learning model: acquire, for training, the spatial distributions of pest type and severity over the preset place in the different recording time periods together with the corresponding space-time distribution of pest type and severity over the preset place; take the spatial distributions in the different recording time periods as input and the space-time distribution as expected output, and train a preset fourth deep learning model to obtain an insect pest space-time distribution prediction model. For prediction, take the spatial distributions of pest type and severity over the preset place in the different recording time periods as the input of the insect pest space-time distribution prediction model, and calculate the output as the space-time distribution of pest type and severity over the preset place.
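For the interpolation variant of step 12, time can simply be added as a third coordinate, as in this sketch (again assuming SciPy, with illustrative coordinates and values):

```python
# Space-time fitting: interpolate severity over (x, y, t) jointly.
import numpy as np
from scipy.interpolate import griddata

pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 1], [1, 1, 1]], dtype=float)  # (x, y, period)
sev = np.array([1.0, 4.0, 2.0, 3.0])
gx, gy, gt = np.mgrid[0:1:20j, 0:1:20j, 0:1:5j]
spacetime_field = griddata(pts, sev, (gx, gy, gt), method="linear")
```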
13. According to the predicted types and severity of the insect pests in the future N time periods, give pest prevention suggestions, including recommendations of pesticide types and of pesticide spraying and irrigation times; and according to the space-time distribution of pest types and severity over the preset place, give pest-killing suggestions, including recommendations of pesticide types and of pesticide spraying and irrigation places and times.
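One way to realize step 13 is a simple lookup from predicted pest type and severity to a control suggestion, as in this sketch; every pest name, pesticide name, and threshold here is a made-up placeholder, not taken from the patent.

```python
# Rule-table recommendation of pesticide type, time, and place (step 13).
RULES = {  # (pest type, severity) -> (suggestion, where to apply)
    ("litchi_stink_bug", "high"): ("pesticide A, spray within 2 days", "whole orchard"),
    ("litchi_stink_bug", "low"):  ("pesticide A, spray within 2 weeks", "affected rows only"),
}

def suggest(pest_type, severity):
    return RULES.get((pest_type, severity), ("monitor, no spraying yet", "-"))

advice, where = suggest("litchi_stink_bug", "high")
print(advice, "|", where)
```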
The above-mentioned embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that, for those skilled in the art, various changes and modifications can be made without departing from the spirit of the present invention, and these changes and modifications all fall within the scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An artificial intelligence method, the method comprising:
a position acquisition step: acquiring position information of a plurality of positions where a plurality of sound recording devices in a preset place are located as a plurality of sound recording positions;
a sound collection step: collecting sound through the plurality of sound recording devices, and dividing each recorded sound file into sound segments of a preset duration, with the time period corresponding to each sound segment taken as its recording time period; each sound segment is associated with the recording device that collected it, the position of that recording device, and the recording time period, and is then stored in the sound big data;
an expert pest labeling step: acquiring an expert label for each sound segment, wherein the expert label comprises the type and severity of the insect pest;
an insect sound recognition model training step: acquiring sound segments for training and their expert labels, taking each sound segment as input and its expert label as expected output, and training a preset first deep learning model to obtain an insect sound recognition deep learning model;
an insect sound recognition model prediction step: acquiring a sound segment for prediction, and inputting it into the insect sound recognition deep learning model to obtain the type and severity of the insect pest corresponding to the sound segment.
2. The artificial intelligence method of claim 1, wherein the method further comprises:
an insect pest visualization step: predicting the sound segments recorded by each recording device in the preset place, to obtain the type and severity of the insect pest corresponding to the sound segments recorded by each recording device in the preset place; the type and severity of the insect pest are marked on an electronic map with different signs (such as different colors and color depths, or different insect images and insect-density markers), the electronic map having not only a space dimension but also a time dimension.
3. The artificial intelligence method of claim 1, wherein the method further comprises:
a future insect pest model training step: acquiring, for training, the type and severity of the insect pest in each time period at each recording position, taking the type and severity of the insect pest in the preceding M time periods at a recording position as input and the type and severity of the insect pest in the following N time periods at that recording position as expected output, and training a preset second deep learning model to obtain a future insect pest prediction model;
a future insect pest model prediction step: acquiring, for prediction, the type and severity of the insect pest in each time period at each recording position, inputting the type and severity of the insect pest in the latest M time periods at a recording position into the future insect pest prediction model, and calculating the type and severity of the insect pest in the future N time periods.
4. The artificial intelligence method of claim 1, wherein the method further comprises:
an insect pest spatial fitting step: according to the types and severity of the insect pests predicted at the different recording positions, performing spatial fitting of the type and severity of the insect pests over the preset place, to obtain the spatial distribution of the type and severity of the insect pests over the preset place;
an insect pest space-time fitting step: according to the types and severity of the insect pests predicted in different time periods at the different recording positions, performing space-time fitting of the type and severity of the insect pests over the preset place, to obtain the space-time distribution of the type and severity of the insect pests over the preset place.
5. An artificial intelligence apparatus, the apparatus comprising:
a position acquisition module: acquiring position information of a plurality of positions where a plurality of sound recording devices in a preset place are located, as a plurality of sound recording positions;
a sound collection module: collecting sound through the plurality of sound recording devices, and dividing each recorded sound file into sound segments of a preset duration, with the time period corresponding to each sound segment taken as its recording time period; each sound segment is associated with the recording device that collected it, the position of that recording device, and the recording time period, and is then stored in the sound big data;
an expert pest labeling module: acquiring an expert label for each sound segment, wherein the expert label comprises the type and severity of the insect pest;
an insect sound recognition model training module: acquiring sound segments for training and their expert labels, taking each sound segment as input and its expert label as expected output, and training a preset first deep learning model to obtain an insect sound recognition deep learning model;
an insect sound recognition model prediction module: acquiring a sound segment for prediction, and inputting it into the insect sound recognition deep learning model to obtain the type and severity of the insect pest corresponding to the sound segment.
6. The artificial intelligence device of claim 5, wherein the device further comprises:
an insect pest visualization module: predicting the sound segments recorded by each recording device in the preset place, to obtain the type and severity of the insect pest corresponding to the sound segments recorded by each recording device in the preset place; the type and severity of the insect pest are marked on an electronic map with different signs (such as different colors and color depths, or different insect images and insect-density markers), the electronic map having not only a space dimension but also a time dimension.
7. The artificial intelligence device of claim 5, wherein the device further comprises:
a future insect pest model training module: acquiring, for training, the type and severity of the insect pest in each time period at each recording position, taking the type and severity of the insect pest in the preceding M time periods at a recording position as input and the type and severity of the insect pest in the following N time periods at that recording position as expected output, and training a preset second deep learning model to obtain a future insect pest prediction model;
a future insect pest model prediction module: acquiring, for prediction, the type and severity of the insect pest in each time period at each recording position, inputting the type and severity of the insect pest in the latest M time periods at a recording position into the future insect pest prediction model, and calculating the type and severity of the insect pest in the future N time periods.
8. The artificial intelligence device of claim 5, wherein the device further comprises:
an insect pest spatial fitting module: according to the types and severity of the insect pests predicted at the different recording positions, performing spatial fitting of the type and severity of the insect pests over the preset place, to obtain the spatial distribution of the type and severity of the insect pests over the preset place;
an insect pest space-time fitting module: according to the types and severity of the insect pests predicted in different time periods at the different recording positions, performing space-time fitting of the type and severity of the insect pests over the preset place, to obtain the space-time distribution of the type and severity of the insect pests over the preset place.
9. A robot comprising a memory, a processor and an artificial intelligence robot program stored on the memory and executable on the processor, wherein the steps of the method of any one of claims 1 to 4 are carried out when the program is executed by the processor.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 4.
CN202110584234.5A 2021-05-27 2021-05-27 Insect pest identification method and robot based on artificial intelligence and voice Active CN113506579B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110584234.5A CN113506579B (en) 2021-05-27 2021-05-27 Insect pest identification method and robot based on artificial intelligence and voice

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110584234.5A CN113506579B (en) 2021-05-27 2021-05-27 Insect pest identification method and robot based on artificial intelligence and voice

Publications (2)

Publication Number Publication Date
CN113506579A true CN113506579A (en) 2021-10-15
CN113506579B CN113506579B (en) 2024-01-23

Family

ID=78009240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110584234.5A Active CN113506579B (en) 2021-05-27 2021-05-27 Insect pest identification method and robot based on artificial intelligence and voice

Country Status (1)

Country Link
CN (1) CN113506579B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101976564A (en) * 2010-10-15 2011-02-16 中国林业科学研究院森林生态环境与保护研究所 Method for identifying insect voice
CN106094008A (en) * 2016-05-20 2016-11-09 渭南师范学院 A kind of grain storage pest sound detection identification system
CN108615046A (en) * 2018-03-16 2018-10-02 北京邮电大学 A kind of stored-grain pests detection recognition methods and device
CN110797033A (en) * 2019-09-19 2020-02-14 平安科技(深圳)有限公司 Artificial intelligence-based voice recognition method and related equipment thereof
CN112560964A (en) * 2020-12-18 2021-03-26 深圳赛安特技术服务有限公司 Method and system for training Chinese herbal medicine pest and disease identification model based on semi-supervised learning

Also Published As

Publication number Publication date
CN113506579B (en) 2024-01-23

Similar Documents

Publication Publication Date Title
US20220107298A1 (en) Systems and methods for crop health monitoring, assessment and prediction
CN109922310B (en) Target object monitoring method, device and system
CN107392091B (en) Agricultural artificial intelligence crop detection method, mobile terminal and computer readable medium
Boissard et al. A cognitive vision approach to early pest detection in greenhouse crops
WO2018047726A1 (en) Information processing device and information processing system
JP2021514548A (en) Target object monitoring methods, devices and systems
CN109492665A (en) Detection method, device and the electronic equipment of growth period duration of rice
CN115482465A (en) Crop disease and insect pest prediction method and system based on machine vision and storage medium
US20230326105A1 (en) Systems and methods for pest pressure heat maps
CN115294518B (en) Intelligent monitoring method and system for precise greenhouse cultivation of horticultural plants
US20220414795A1 (en) Crop disease prediction and treatment based on artificial intelligence (ai) and machine learning (ml) models
KR20190015656A (en) Fruit monitoring system and method at the same
Hamilton et al. When you can't see the koalas for the trees: Using drones and machine learning in complex environments
CA3174472C (en) Systems and methods for pest pressure heat maps
CN117094532B (en) Orchard intelligent monitoring system
US20210279639A1 (en) Systems and methods for predicting pest pressure using geospatial features and machine learning
CN113506579A (en) Insect pest recognition method based on artificial intelligence and sound and robot
CN116543347A (en) Intelligent insect condition on-line monitoring system, method, device and medium
Casoli et al. Parameterizing animal sounds and motion with animal-attached tags to study acoustic communication
CN114651283A (en) Seedling emergence by search function
CN111178354B (en) Mangrove pest monitoring method and system
US20240196879A1 (en) Spot weed detection and treatment within a field of view in accordance with machine learning training
Fraser et al. Bat Echolocation Research
Devi et al. Intelligence Surveillance Robot for Bug Detection
Reddy et al. Chapter-4 Forecasting and Expert Models in Plant Diseases Management: An Overview

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant