CN115272887A - Coastal zone garbage identification method, device and equipment based on unmanned aerial vehicle detection - Google Patents


Info

Publication number
CN115272887A
CN115272887A
Authority
CN
China
Prior art keywords
area
garbage
aerial
sample
identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210857999.6A
Other languages
Chinese (zh)
Inventor
李少瑞
祝振昌
徐南豪
朱琴
蔡宴朋
杨志峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN202210857999.6A priority Critical patent/CN115272887A/en
Publication of CN115272887A publication Critical patent/CN115272887A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/54 Extraction of image or video features relating to texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/70 Labelling scene content, e.g. deriving syntactic or semantic representations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20092 Interactive image processing based on input by user
    • G06T2207/20104 Interactive definition of region of interest [ROI]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Remote Sensing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of image detection, and in particular to a coastal zone garbage identification method based on unmanned aerial vehicle detection, comprising the following steps: acquiring a sample aerial photo set and a marked data set of a sample area photographed by an unmanned aerial vehicle; inputting the sample aerial photo set and the marked data set into a preset convolutional neural network model and acquiring the feature training database output by the model; and inputting the feature training database into a neural network model to be trained, and training it to obtain a coastal zone garbage recognition model. In response to an identification instruction, which includes an aerial image of the area to be identified shot by the unmanned aerial vehicle, the aerial image is input into the coastal zone garbage identification model, and a garbage area identification result for the image is acquired. In response to a display instruction, electronic map data corresponding to the aerial image are acquired, and the garbage identification area is displayed and labelled according to the garbage area identification result of the aerial image.

Description

Coastal zone garbage identification method, device and equipment based on unmanned aerial vehicle detection
Technical Field
The invention relates to the technical field of image detection, and in particular to a coastal zone garbage identification method, apparatus, device, and storage medium based on unmanned aerial vehicle detection.
Background
The coastline is the datum line that divides sea and land management areas, and it is an important subject in the study of sea-land interaction, the influence of marine activity on the coast, and the comprehensive management of the coastline and the offshore ecosystem.
The coastal zone, as the area where sea and land meet, is affected both by human activity and by natural factors such as tides and ocean currents, so a large amount of garbage easily accumulates there and seriously degrades the coastal environment. The traditional approach is to clean the coastal zone manually, section by section, which cannot locate the garbage quickly and accurately, cannot estimate its quantity, and wastes a great deal of manpower, material and financial resources.
Disclosure of Invention
Based on the above, an object of the present invention is to provide a method, apparatus, device, and storage medium for identifying garbage in a coastal zone based on unmanned aerial vehicle detection. Using a deep learning method, a feature training database is constructed from aerial images shot by an unmanned aerial vehicle and label data marking the garbage regions in those images, and a coastal zone garbage identification model is trained on it. The model can then quickly and accurately identify the garbage regions in aerial images to be identified, reducing the labor cost and time cost of coastal zone garbage identification.
In a first aspect, an embodiment of the present application provides a coastal zone garbage identification method based on unmanned aerial vehicle detection, including the following steps:
acquiring a sample aerial photo set and a mark data set of a sample area shot by an unmanned aerial vehicle, wherein the sample aerial photo set comprises a plurality of sample aerial photo images, the sample aerial photo images comprise garbage areas, and the mark data set comprises position parameters of the garbage areas of the plurality of sample aerial photo images;
inputting the sample aerial photographing set and the marking data set into a preset convolutional neural network model to obtain a feature training database output by the convolutional neural network model, inputting the feature training database into a neural network model to be trained for training to obtain a coastal zone garbage recognition model;
responding to an identification instruction, wherein the identification instruction comprises an aerial image of a to-be-identified area shot by an unmanned aerial vehicle, inputting the aerial image of the to-be-identified area into the coastal zone garbage identification model, and acquiring a garbage area identification result of the aerial image of the to-be-identified area;
and responding to a display instruction, acquiring electronic map data corresponding to the aerial image of the area to be identified, performing frame selection of the garbage area on the electronic map data according to the garbage area identification result of the aerial image to be identified, acquiring the garbage identification area of the aerial image to be identified, and displaying and marking the garbage identification area.
In a second aspect, an embodiment of the present application provides a coastal zone garbage identification device based on unmanned aerial vehicle detection, including:
the system comprises an acquisition module, a storage module and a processing module, wherein the acquisition module is used for acquiring a sample aerial photography set and a mark data set of a sample area shot by an unmanned aerial vehicle, the sample aerial photography set comprises a plurality of sample aerial images, the sample aerial images comprise garbage areas, and the mark data set comprises position parameters of the garbage areas of the plurality of sample aerial images;
the training module is used for inputting the sample aerial photography set and the marking data set into a preset convolutional neural network model, obtaining a characteristic training database output by the convolutional neural network model, inputting the characteristic training database into a neural network model to be trained, and training to obtain a coastal zone garbage recognition model;
the identifying module is used for responding to an identifying instruction, the identifying instruction comprises an aerial image of a to-be-identified area shot by an unmanned aerial vehicle, the aerial image of the to-be-identified area is input into the coastal zone garbage identifying model, and a garbage area identifying result of the aerial image of the to-be-identified area is obtained;
and the display module is used for responding to a display instruction, acquiring electronic map data corresponding to the aerial image of the area to be identified, carrying out frame selection on a garbage area on the electronic map data according to a garbage area identification result of the aerial image to be identified, acquiring a garbage identification area of the aerial image to be identified, and displaying and marking the garbage identification area.
In a third aspect, an embodiment of the present application provides a computer device, including: a processor, a memory, and a computer program stored on the memory and executable on the processor; the computer program, when executed by the processor, performs the steps of the coastal zone garbage identification method based on unmanned aerial vehicle detection according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a storage medium storing a computer program which, when executed by a processor, implements the steps of the coastal zone garbage identification method based on unmanned aerial vehicle detection according to the first aspect.
In the embodiments of the present application, a coastal zone garbage identification method, apparatus, device, and storage medium based on unmanned aerial vehicle detection are provided. Using a deep learning method, a feature training database is constructed from aerial images shot by an unmanned aerial vehicle and marking data associated with the garbage areas in those images, and a coastal zone garbage identification model is trained, so that the garbage areas in aerial images to be identified can be recognized quickly and accurately, reducing the labor cost and time cost of coastal zone garbage identification.
For a better understanding and practice, the invention is described in detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a schematic flowchart of a coastal zone garbage identification method based on unmanned aerial vehicle detection according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a coastal zone garbage identification method based on unmanned aerial vehicle detection according to another embodiment of the present application;
fig. 3 is a schematic flowchart of S2 in the coastal zone garbage identification method based on unmanned aerial vehicle detection according to an embodiment of the present application;
fig. 4 is a schematic flowchart of S202 in the coastal zone garbage identification method based on unmanned aerial vehicle detection according to an embodiment of the present application;
fig. 5 is a schematic flowchart of S4 in the coastal zone garbage identification method based on unmanned aerial vehicle detection according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a coastal zone garbage identification device based on unmanned aerial vehicle detection according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if" as used herein may be interpreted as "when", "upon", or "in response to a determination", depending on the context.
Referring to fig. 1, fig. 1 is a schematic flowchart of a coastal zone garbage identification method based on unmanned aerial vehicle detection according to an embodiment of the present application, where the method includes the following steps:
s1: and acquiring a sample aerial photographing set and a marking data set of the sample area photographed by the unmanned aerial vehicle.
The execution body of the coastal zone garbage identification method based on unmanned aerial vehicle detection is an identification device (hereinafter, the identification device). In an optional embodiment, the identification device may be a computer device, a server, or a server cluster formed by combining a plurality of computer devices.
The sample aerial photography set comprises a plurality of sample aerial photography images, the sample aerial photography images comprise garbage areas, and the marking data set comprises position parameters of the garbage areas of the plurality of sample aerial photography images.
In this embodiment, the identification device may acquire a plurality of sample aerial images of a coastal zone area photographed by the drone as the sample aerial photo set, and may acquire the position parameters of the garbage areas of the plurality of sample aerial images, input by a user, as the marker data set.
Referring to fig. 2, fig. 2 is a schematic flowchart of a coastal zone garbage identification method based on unmanned aerial vehicle detection according to another embodiment of the present application. It includes step S5, which precedes step S2, specifically as follows:
s5: and carrying out scaling treatment on a plurality of sample aerial images in the sample aerial photographing set to obtain a plurality of scaled sample aerial images.
Because the sample aerial images shot by the unmanned aerial vehicle have a very large pixel count, and in order to improve the efficiency and accuracy with which the convolutional neural network model constructs the feature training database, in this embodiment the identification device scales the plurality of sample aerial images in the sample aerial photo set to obtain a plurality of scaled sample aerial images, and acquires the position parameters of the garbage areas of the scaled sample aerial images, input by the user, as the marker data set.
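The patent gives no code for this step; as a minimal sketch (the function name, the 1024-pixel target long side, and the `(x1, y1, x2, y2)` box format are assumptions, not from the patent), scaling a sample image's dimensions together with its garbage-region position parameters might look like:

```python
def scale_image_and_boxes(width, height, boxes, target_long_side=1024):
    """Compute the scaled image size and rescale garbage-region boxes.

    width, height: original pixel dimensions of the aerial image.
    boxes: list of (x1, y1, x2, y2) garbage-region position parameters
           in original pixel coordinates.
    Returns (new_width, new_height, scaled_boxes).
    """
    scale = target_long_side / max(width, height)
    new_w, new_h = round(width * scale), round(height * scale)
    # The position parameters must be scaled by the same factor so the
    # marker data set stays aligned with the scaled images.
    scaled = [(x1 * scale, y1 * scale, x2 * scale, y2 * scale)
              for (x1, y1, x2, y2) in boxes]
    return new_w, new_h, scaled
```

Scaling the labels with the pixels keeps the marker data set consistent with the scaled sample aerial images.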
S2: inputting the sample aerial photography set and the marking data set into a preset convolutional neural network model, obtaining a feature training database output by the convolutional neural network model, inputting the feature training database into a neural network model to be trained, and training to obtain a coastal zone garbage recognition model.
The convolutional neural network model is a Mask R-CNN (Mask Region-based Convolutional Neural Network) model. Mask R-CNN is a convolutional neural network model that can effectively detect objects while outputting a high-quality instance segmentation feature map, and can be applied to instance segmentation, object detection, human key-point detection, and the like.
in this embodiment, the identification device inputs the sample aerial photography set and the labeled data set to a preset convolutional neural network model, obtains a feature training database output by the convolutional neural network model, inputs the feature training database to the neural network model to be trained, and trains the neural network model to obtain the coastal zone garbage identification model.
The Mask R-CNN network model comprises a feature extraction layer, a feature region selection layer and a frame cutting pooling layer connected in sequence. Referring to fig. 3, fig. 3 is a schematic flowchart of step S2 in the coastal zone garbage identification method based on unmanned aerial vehicle detection according to an embodiment of the present application, including steps S201 to S203, as follows:
s201: inputting the sample aerial photographing set and the marking data set into the feature extraction layer, carrying out scaling treatment on the garbage area of each sample aerial photographing image according to a plurality of preset scaling scales, obtaining a plurality of scaling sub-sample aerial photographing images corresponding to each sample aerial photographing image, and obtaining a first feature map of the plurality of scaling sub-sample aerial photographing images.
The feature extraction layer is a Backbone layer, and the Backbone layer adopts ResNet-50 or ResNet-101 as a feature extractor to extract features.
Low-level features often contain more detailed information, such as color, contour and texture. To build a feature training database that improves the accuracy of the trained coastal zone garbage recognition model, in this embodiment the recognition device presets a plurality of scaling scales, for example 1/4, 1/8, 1/16, 1/32 and 1/64.
The identification device inputs the sample aerial photo set and the marker data set into the feature extraction layer, performs scaling processing on the garbage area of each sample aerial image according to the plurality of preset scaling scales, obtains sub-sample aerial images at the plurality of scaling scales for each sample aerial image, performs convolution processing on these sub-sample aerial images, and obtains a first feature map of the sub-sample aerial images at the plurality of scaling scales.
S202: the method comprises the steps of obtaining a first feature map of a plurality of scaled sub-sample aerial images, inputting the first feature map into a feature area selection layer, obtaining a plurality of frames corresponding to all pixel points in the first feature map of the plurality of scaled sub-sample aerial images according to preset frame parameters, extracting a plurality of target frames from the plurality of frames corresponding to all pixel points, and obtaining a feature map corresponding to the plurality of target frames corresponding to all pixel points of the plurality of scaled sub-sample aerial images according to the plurality of target frames.
In this embodiment, the identification device obtains a plurality of frames corresponding to each pixel point in the feature maps of the sub-sample aerial images at the plurality of scaling scales. In an optional embodiment, the frame parameters include a frame area parameter and an aspect ratio parameter; the identification device can keep the frame area parameter unchanged while varying the aspect ratio parameter, so as to obtain a plurality of frames of different shapes and sizes for each pixel point. It then judges the degree of association between each frame and the garbage region according to the region of the first feature map covered by the frame, extracts a plurality of target frames from the frames corresponding to each pixel point, takes the regions of the first feature map covered by the target frames as the feature maps corresponding to the plurality of target frames, and thereby obtains, according to the plurality of target frames, the feature maps for the sub-sample aerial images at each scaling scale.
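The fixed-area, varying-aspect-ratio frames described above can be sketched as follows (the area and ratio values are illustrative defaults, not values from the patent):

```python
import math

def anchors_at_point(cx, cy, area=64 * 64, aspect_ratios=(0.5, 1.0, 2.0)):
    """Frames centred on pixel (cx, cy) that share one area parameter but
    differ in aspect ratio r = width / height, so w = sqrt(area * r) and
    h = sqrt(area / r). Returns (x1, y1, x2, y2) tuples."""
    boxes = []
    for r in aspect_ratios:
        w = math.sqrt(area * r)
        h = math.sqrt(area / r)
        boxes.append((cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2))
    return boxes
```

Every returned frame has the same area, which is what lets the aspect ratio parameter alone produce the different shapes per pixel point.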
Referring to fig. 4, fig. 4 is a schematic flow chart of S202 in the method for identifying garbage in a coastal zone based on unmanned aerial vehicle detection according to an embodiment of the present application, including steps S2021 to S2022, which are as follows:
s2021: and calculating the object detection probability of the plurality of frames corresponding to each pixel point.
In this embodiment, for each of the plurality of frames corresponding to the pixel points in the first feature map of the sub-sample aerial images at the plurality of scaling scales, the identification device may calculate the proportion of pixels in the covered first-feature-map region that belong to the garbage region, and use this proportion as the object detection probability, which reflects the degree of association between the frame and the garbage region.
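The described proportion can be illustrated in plain Python (the binary-mask and exclusive-bound box representation is an assumption for the sketch):

```python
def garbage_overlap_ratio(mask, box):
    """Fraction of pixels inside `box` labelled as garbage in `mask`.

    mask: 2-D list of 0/1 garbage labels for the feature-map region.
    box: (x1, y1, x2, y2) integer pixel bounds, x2/y2 exclusive.
    """
    x1, y1, x2, y2 = box
    total = (x2 - x1) * (y2 - y1)
    if total <= 0:
        return 0.0
    hits = sum(mask[y][x] for y in range(y1, y2) for x in range(x1, x2))
    return hits / total
```

A frame tightly enclosing the garbage pixels scores near 1.0, while a frame mostly over background scores near 0.0, giving the association measure used in S2022.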
S2022: and extracting a plurality of target frames from a plurality of frames corresponding to each pixel point according to the object detection probability and a preset detection probability threshold.
The detection probability threshold includes a first detection probability threshold and a second detection probability threshold, and in an optional embodiment, the first detection probability threshold may be set to 0.7, and the second detection probability threshold may be set to 0.3.
In this embodiment, for the first feature map of the plurality of scaled sub-sample aerial images, the identification device compares the object detection probability of each of the plurality of frames corresponding to each pixel point with the first detection probability threshold and the second detection probability threshold. Specifically, when the object detection probability is greater than or equal to the first detection probability threshold, reflecting that the region of the first feature map covered by the frame is associated with the garbage region, the frame is set as a positive target frame; when the object detection probability is less than or equal to the second detection probability threshold, reflecting that the covered region is not associated with the garbage region, the frame is set as a negative target frame; frames whose probability falls between the two thresholds are discarded. In this way, a plurality of target frames are extracted from the frames corresponding to the pixel points.
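Under the 0.7 and 0.3 thresholds given above, the selection rule can be sketched as follows (function and list names are assumptions):

```python
def select_training_anchors(probs, pos_thresh=0.7, neg_thresh=0.3):
    """Split frame indices by object detection probability.

    Frames at or above pos_thresh become positive target frames,
    frames at or below neg_thresh become negative target frames,
    and frames in between are ignored.
    """
    positives, negatives, ignored = [], [], []
    for i, p in enumerate(probs):
        if p >= pos_thresh:
            positives.append(i)
        elif p <= neg_thresh:
            negatives.append(i)
        else:
            ignored.append(i)
    return positives, negatives, ignored
```

Discarding the ambiguous middle band keeps only frames whose association with the garbage region is clear-cut.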
S203: inputting the feature maps corresponding to a plurality of target frames corresponding to each pixel point of the sub-sample aerial images with the scaling scales into the frame cutting pooling layer, cutting and pooling, and acquiring the processed feature areas corresponding to the feature maps corresponding to the target frames corresponding to each pixel point of the sub-sample aerial images with the scaling scales to serve as the feature training database.
The frame cutting pooling layer is an ROI Align layer, and the ROI Align layer extracts the characteristic regions corresponding to the characteristic graphs of the sub-sample aerial images with the scaling by adopting a bilinear interpolation method so as to prevent a part of pixel points in the characteristic regions from being lost.
In this embodiment, the recognition device inputs the feature maps corresponding to the target frames corresponding to the pixel points of the scaled sub-sample aerial images into the frame clipping pooling layer, performs clipping and pooling, and obtains feature areas corresponding to the feature maps corresponding to the target frames corresponding to the pixel points of the scaled sub-sample aerial images, as the feature training database.
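The bilinear interpolation that the ROI Align layer relies on can be illustrated with a single-point sampler (a simplified sketch of the core operation, not the full ROI Align pooling):

```python
def bilinear_sample(fmap, x, y):
    """Bilinearly interpolate feature map `fmap` at continuous (x, y).

    fmap: 2-D list of feature values, indexed fmap[row][col].
    Sampling at fractional coordinates avoids the quantisation that
    would drop pixel points from the cropped feature region.
    """
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(fmap[0]) - 1)
    y1 = min(y0 + 1, len(fmap) - 1)
    dx, dy = x - x0, y - y0
    top = fmap[y0][x0] * (1 - dx) + fmap[y0][x1] * dx
    bot = fmap[y1][x0] * (1 - dx) + fmap[y1][x1] * dx
    return top * (1 - dy) + bot * dy
```

ROI Align evaluates such samples at regularly spaced points inside each target frame and pools them, which is why no pixel points in the feature region are lost to rounding.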
S3: responding to an identification instruction, wherein the identification instruction comprises an aerial image of a to-be-identified area shot by an unmanned aerial vehicle, inputting the aerial image of the to-be-identified area into the coastal zone garbage identification model, and acquiring a garbage area identification result of the aerial image of the to-be-identified area.
The identification instruction is sent by the user and received by the identification device.
In this embodiment, the identification device acquires an identification instruction sent by a user, responds to the identification instruction, acquires an aerial image of a to-be-identified area shot by the unmanned aerial vehicle, inputs the aerial image of the to-be-identified area into the coastal zone garbage identification model, and acquires a garbage area identification result of the aerial image of the to-be-identified area.
S4: and responding to a display instruction, acquiring electronic map data corresponding to the aerial image of the area to be identified, performing frame selection of a garbage area on the electronic map data according to a garbage area identification result of the aerial image to be identified, acquiring the garbage identification area of the aerial image to be identified, and displaying and marking the garbage identification area.
The display instruction is sent by a user and received by the identification device.
In this embodiment, the identification device acquires a display instruction sent by a user and responds to it: it acquires the electronic map data corresponding to the aerial image of the region to be identified, frames the garbage region on the electronic map data according to the garbage region identification result of the aerial image to be identified, and obtains the garbage identification region of the aerial image. It then acquires the target pixel points in the garbage identification region, performs color filling around the target pixel points, obtains the color-filled garbage identification region, returns to the display interface of the identification device, and displays and labels the color-filled garbage identification region.
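A minimal illustration of the color-filling step (assuming an RGB raster stored as nested lists and an axis-aligned region; the embodiment itself fills around target pixel points, which this sketch reduces to a rectangle):

```python
def fill_region(image, box, color=(255, 0, 0)):
    """Overwrite the pixels of `box` (x1, y1, x2, y2; x2/y2 exclusive)
    with `color`, so the garbage identification region stands out when
    overlaid on the electronic map layer."""
    x1, y1, x2, y2 = box
    for y in range(y1, y2):
        for x in range(x1, x2):
            image[y][x] = color
    return image
```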
Referring to fig. 5, fig. 5 is a schematic flowchart of step S4 in the coastal zone garbage identification method based on unmanned aerial vehicle detection according to an embodiment of the present application, including steps S401 to S402, as follows:
s401: the method comprises the steps of obtaining the pixel area of a garbage recognition area of the aerial image to be recognized, and obtaining the area parameter of the garbage recognition area of the aerial image to be recognized according to the pixel area and a preset area calculation algorithm.
In this embodiment, the identification device obtains the pixel area of the trash identification region of the aerial image to be identified, and obtains the area parameter of the trash identification region of the aerial image to be identified according to the pixel area and a preset area conversion ratio.
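If the preset area conversion ratio is parameterised as a ground sample distance, i.e. the metres covered by one pixel side (an assumed but common parameterisation; the patent only states that a conversion ratio is used), the conversion reduces to:

```python
def pixel_area_to_ground_area(pixel_count, ground_sample_distance_m):
    """Convert a garbage-region pixel count to square metres.

    Each pixel covers ground_sample_distance_m ** 2 square metres,
    so the area parameter is the pixel count times that factor.
    """
    return pixel_count * ground_sample_distance_m ** 2
```

For example, 10,000 garbage pixels at a 5 cm ground sample distance correspond to roughly 25 square metres, the figure that would be labelled on the electronic map in S402.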
S402: displaying and labeling the area parameter on the electronic map data according to the area parameter of the garbage identification area of the aerial image to be identified.
In this embodiment, the identification device returns to its display interface according to the area parameter of the garbage identification area of the aerial image to be identified, and displays and labels the area parameter on the electronic map data.
Referring to Fig. 6, Fig. 6 is a schematic structural diagram of a coastal zone garbage identification apparatus based on unmanned aerial vehicle detection according to an embodiment of the present application. The apparatus may implement all or part of the coastal zone garbage identification method through software, hardware, or a combination of both, and the apparatus 6 includes:
the acquisition module 61 is configured to acquire a sample aerial photo set and a labeled data set of a sample area shot by an unmanned aerial vehicle, wherein the sample aerial photo set includes a plurality of sample aerial images, the sample aerial images include garbage areas, and the labeled data set includes the position parameters of the garbage areas of the plurality of sample aerial images;
the training module 62 is configured to input the sample aerial photography set and the labeled data set to a preset convolutional neural network model, obtain a feature training database output by the convolutional neural network model, input the feature training database to the neural network model to be trained, and train the neural network model to obtain a coastal zone garbage recognition model;
the recognition module 63 is used for responding to a recognition instruction, wherein the recognition instruction comprises an aerial image of a region to be recognized, which is shot by an unmanned aerial vehicle, inputting the aerial image of the region to be recognized into the coastal zone garbage recognition model, and acquiring a garbage region recognition result of the aerial image of the region to be recognized;
and the display module 64 is used for responding to a display instruction, acquiring electronic map data corresponding to the aerial image of the area to be identified, performing frame selection of a garbage area on the electronic map data according to a garbage area identification result of the aerial image to be identified, acquiring a garbage identification area of the aerial image to be identified, and displaying and marking the garbage identification area.
In the embodiment, a sample aerial photography set and a mark data set of a sample area shot by an unmanned aerial vehicle are obtained through an obtaining module, wherein the sample aerial photography set comprises a plurality of sample aerial images, the sample aerial images comprise garbage areas, and the mark data set comprises position parameters of the garbage areas of the plurality of sample aerial images; inputting the sample aerial photographing set and the marking data set into a preset convolutional neural network model through a training module to obtain a feature training database output by the convolutional neural network model, inputting the feature training database into a neural network model to be trained for training, and obtaining a coastal zone garbage recognition model; responding to an identification instruction by an identification module, wherein the identification instruction comprises an aerial image of a to-be-identified area shot by an unmanned aerial vehicle, inputting the aerial image of the to-be-identified area into the coastal zone garbage identification model, and acquiring a garbage area identification result of the aerial image of the to-be-identified area; the method comprises the steps of responding to a display instruction through a display module, obtaining electronic map data corresponding to an aerial image of a region to be identified, carrying out frame selection on a rubbish region on the electronic map data according to a rubbish region identification result of the aerial image to be identified, obtaining the rubbish identification region of the aerial image to be identified, and displaying and marking the rubbish identification region.
Through this deep learning method, the feature training database is constructed from the aerial images shot by the unmanned aerial vehicle and the label data associated with the garbage areas in those images, and the coastal zone garbage recognition model is trained on it, so that garbage areas in aerial images to be identified can be recognized quickly and accurately, reducing the labor and time costs of coastal zone garbage identification.
Referring to Fig. 7, Fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 7 includes: a processor 71, a memory 72, and a computer program 73 stored on the memory 72 and operable on the processor 71. The computer device may store a plurality of instructions suitable for being loaded by the processor 71 to execute the method steps in the embodiments shown in Fig. 1 to Fig. 5; for the specific execution process, refer to the descriptions of those embodiments, which are not repeated here.
Processor 71 may include one or more processing cores. The processor 71 connects various parts of the server by various interfaces and lines, and executes the functions of the coastal zone garbage identification apparatus 6 based on unmanned aerial vehicle detection and processes data by running or executing the instructions, programs, code sets, or instruction sets stored in the memory 72 and calling the data in the memory 72. Optionally, the processor 71 may be implemented in at least one hardware form among Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 71 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and the like; the GPU renders and draws the content to be displayed on the touch display screen; the modem handles wireless communications. It is understood that the modem may also be implemented as a single chip without being integrated into the processor 71.
The memory 72 may include Random Access Memory (RAM) or Read-Only Memory (ROM). Optionally, the memory 72 includes a non-transitory computer-readable medium. The memory 72 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 72 may include a program storage area and a data storage area: the program storage area may store instructions for implementing the operating system, instructions for at least one function (such as touch functions), instructions for implementing the above method embodiments, and the like; the data storage area may store the data involved in the above method embodiments. Optionally, the memory 72 may also be at least one storage device located remotely from the processor 71.
An embodiment of the present application further provides a storage medium. The storage medium may store a plurality of instructions suitable for being loaded by a processor to execute the method steps in the embodiments shown in Fig. 1 to Fig. 5; for the specific execution process, refer to the descriptions of those embodiments, which are not repeated here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc.
The present invention is not limited to the above-described embodiments, and various modifications and variations of the present invention are intended to be included within the scope of the claims and the equivalent technology of the present invention if they do not depart from the spirit and scope of the present invention.

Claims (9)

1. A coastal zone garbage identification method based on unmanned aerial vehicle detection is characterized by comprising the following steps:
acquiring a sample aerial photo set and a mark data set of a sample area shot by an unmanned aerial vehicle, wherein the sample aerial photo set comprises a plurality of sample aerial photo images, the sample aerial photo images comprise garbage areas, and the mark data set comprises position parameters of the garbage areas of the plurality of sample aerial photo images;
inputting the sample aerial photographing set and the marking data set into a preset convolutional neural network model to obtain a feature training database output by the convolutional neural network model, inputting the feature training database into a neural network model to be trained for training to obtain a coastal zone garbage recognition model;
responding to an identification instruction, wherein the identification instruction comprises an aerial image of a to-be-identified area shot by an unmanned aerial vehicle, inputting the aerial image of the to-be-identified area into the coastal zone garbage identification model, and acquiring a garbage area identification result of the aerial image of the to-be-identified area;
and responding to a display instruction, acquiring electronic map data corresponding to the aerial image of the area to be identified, performing frame selection of a garbage area on the electronic map data according to a garbage area identification result of the aerial image to be identified, acquiring the garbage identification area of the aerial image to be identified, and displaying and marking the garbage identification area.
2. The coastal zone garbage recognition method based on unmanned aerial vehicle detection as claimed in claim 1, wherein before inputting the sample aerial photography set and the labeled data set into a preset convolutional neural network model for training and obtaining a coastal zone garbage recognition model, the method comprises the following steps:
and zooming a plurality of sample aerial images in the sample aerial photographing set to obtain a plurality of zoomed sample aerial images.
3. The coastal zone garbage identification method based on unmanned aerial vehicle detection as claimed in claim 1, characterized in that: the convolutional neural network model is a MaskRCNN network model, and the MaskRCNN network model comprises a feature extraction layer, a feature region selection layer and a frame cutting pooling layer which are sequentially connected.
4. The coastal zone garbage recognition method based on unmanned aerial vehicle detection as claimed in claim 3, wherein the step of inputting the sample aerial photography set and the labeled data set into a preset convolutional neural network model to obtain a feature database output by the convolutional neural network model comprises the steps of:
inputting the sample aerial photographing set and the marking data set into the feature extraction layer, and carrying out scaling treatment on the garbage area of each sample aerial photographing image according to a plurality of preset scaling scales to obtain a plurality of scaling sub-sample aerial photographing images corresponding to each sample aerial photographing image and obtain a first feature map of the plurality of scaling sub-sample aerial photographing images;
obtaining a first feature map of the plurality of scaled sub-sample aerial images, inputting the first feature map into the feature area selection layer, obtaining a plurality of frames corresponding to each pixel point in the first feature map of the plurality of scaled sub-sample aerial images according to preset frame parameters, extracting a plurality of target frames from the plurality of frames corresponding to each pixel point, and obtaining a feature map corresponding to the plurality of target frames corresponding to each pixel point of the plurality of scaled sub-sample aerial images according to the plurality of target frames;
inputting the feature maps corresponding to a plurality of target frames corresponding to each pixel point of the sub-sample aerial images with the scaling scales into the frame cutting pooling layer, cutting and pooling, and acquiring the processed feature areas corresponding to the feature maps corresponding to the target frames corresponding to each pixel point of the sub-sample aerial images with the scaling scales to serve as the feature training database.
5. The coastal zone garbage identification method based on unmanned aerial vehicle detection as claimed in claim 4, wherein the step of extracting a plurality of target frames from a plurality of frames corresponding to the respective pixel points comprises the steps of:
calculating object detection probabilities of a plurality of frames corresponding to the pixel points;
and extracting a plurality of target frames from a plurality of frames corresponding to each pixel point according to the object detection probability and a preset detection probability threshold value, and acquiring feature maps corresponding to the plurality of target frames.
6. The coastal zone garbage identification method based on unmanned aerial vehicle detection as claimed in claim 1, wherein the step of obtaining the garbage identification area, displaying and labeling the garbage identification area, further comprises the steps of:
the method comprises the steps of obtaining the pixel area of the garbage recognition area of the aerial image to be recognized, obtaining the area parameter of the garbage recognition area of the aerial image to be recognized according to the pixel area and a preset area conversion proportion, and displaying and marking the area parameter on electronic map data according to the area parameter of the garbage recognition area of the aerial image to be recognized.
7. A coastal zone garbage identification apparatus based on unmanned aerial vehicle detection, characterized by comprising:
the system comprises an acquisition module, a storage module and a processing module, wherein the acquisition module is used for acquiring a sample aerial photography set and a mark data set of a sample area shot by an unmanned aerial vehicle, the sample aerial photography set comprises a plurality of sample aerial images, the sample aerial images comprise garbage areas, and the mark data set comprises position parameters of the garbage areas of the plurality of sample aerial images;
the training module is used for inputting the sample aerial photographing set and the marking data set into a preset convolutional neural network model, obtaining a characteristic training database output by the convolutional neural network model, inputting the characteristic training database into a neural network model to be trained, and training to obtain a coastal zone garbage recognition model;
the identification module is used for responding to an identification instruction, the identification instruction comprises an aerial image of a to-be-identified area shot by an unmanned aerial vehicle, the aerial image of the to-be-identified area is input into the coastal zone garbage identification model, and a garbage area identification result of the aerial image of the to-be-identified area is obtained;
and the display module is used for responding to a display instruction, acquiring electronic map data corresponding to the aerial image of the area to be identified, carrying out frame selection on a garbage area on the electronic map data according to a garbage area identification result of the aerial image to be identified, acquiring a garbage identification area of the aerial image to be identified, and displaying and marking the garbage identification area.
8. A computer device, comprising: a processor, a memory, and a computer program stored on the memory and executable on the processor; the computer program, when executed by the processor, implements the steps of the coastal zone garbage identification method based on unmanned aerial vehicle detection according to any one of claims 1 to 6.
9. A storage medium, characterized in that: the storage medium stores a computer program which, when executed by a processor, implements the steps of the coastal zone garbage identification method based on unmanned aerial vehicle detection according to any one of claims 1 to 6.
CN202210857999.6A 2022-07-20 2022-07-20 Coastal zone garbage identification method, device and equipment based on unmanned aerial vehicle detection Pending CN115272887A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210857999.6A CN115272887A (en) 2022-07-20 2022-07-20 Coastal zone garbage identification method, device and equipment based on unmanned aerial vehicle detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210857999.6A CN115272887A (en) 2022-07-20 2022-07-20 Coastal zone garbage identification method, device and equipment based on unmanned aerial vehicle detection

Publications (1)

Publication Number Publication Date
CN115272887A true CN115272887A (en) 2022-11-01

Family

ID=83768461

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210857999.6A Pending CN115272887A (en) 2022-07-20 2022-07-20 Coastal zone garbage identification method, device and equipment based on unmanned aerial vehicle detection

Country Status (1)

Country Link
CN (1) CN115272887A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230348120A1 (en) * 2023-07-10 2023-11-02 Brian Panahi Johnson System and method for identifying trash within a predetermined geographic boundary using unmanned aerial vehicles
CN116958815A (en) * 2023-07-26 2023-10-27 武汉市万睿数字运营有限公司 River bank garbage detection method, device, equipment and storage medium
CN117765482A (en) * 2024-02-22 2024-03-26 交通运输部天津水运工程科学研究所 garbage identification method and system for garbage enrichment area of coastal zone based on deep learning
CN117765482B (en) * 2024-02-22 2024-05-14 交通运输部天津水运工程科学研究所 Garbage identification method and system for garbage enrichment area of coastal zone based on deep learning


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination