CN109376660B - Target monitoring method, device and system - Google Patents

Target monitoring method, device and system

Info

Publication number
CN109376660B
CN109376660B (application CN201811261900.6A)
Authority
CN
China
Prior art keywords
target
data
client
unmanned aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811261900.6A
Other languages
Chinese (zh)
Other versions
CN109376660A (en)
Inventor
虞华
杨猛
黄紫橙
李启娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianyu Jingwei Beijing Technology Co ltd
China Oil and Gas Pipeline Network Corp East China Branch
Original Assignee
Tianyu Jingwei Beijing Technology Co ltd
China Oil and Gas Pipeline Network Corp East China Branch
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianyu Jingwei Beijing Technology Co ltd, China Oil and Gas Pipeline Network Corp East China Branch filed Critical Tianyu Jingwei Beijing Technology Co ltd
Priority to CN201811261900.6A
Publication of CN109376660A
Application granted
Publication of CN109376660B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Astronomy & Astrophysics (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention discloses a target monitoring method, device and system in the technical field of aircraft control. The target monitoring method applied to a ground server end comprises the following steps: receiving image data sent in real time by an unmanned aerial vehicle airborne end; classifying and filtering the image data, and screening out the classified image data that requires complex data analysis in the cloud; sending the classified image data to the cloud; receiving target data sent back by the cloud; identifying the target data to obtain target identification data; sending the target identification data to the client; and receiving instruction data sent by the client and forwarding it to the unmanned aerial vehicle airborne end, so that the airborne end controls the load according to the instruction data to carry out real-time target tracking shooting or target focusing. The invention offers high real-time performance, allows targets hundreds or even thousands of kilometers away to be controlled in real time from the client interface, and greatly improves crisis handling efficiency.

Description

Target monitoring method, device and system
Technical Field
The invention relates to the technical field of aircraft control, in particular to a target monitoring method, device and system.
Background
With the rapid economic and social development of China and the continuing reform of airspace management, low-altitude airspace is gradually being opened, and unmanned aerial vehicle airborne ends are seeing vigorous development and wide application in fields such as electric power, communications, meteorology, agriculture and forestry, oceanography, exploration and insurance. Specific applications include earth observation, forest fire prevention and suppression, disaster detection, communication relay, marine monitoring, oil and gas pipeline inspection, pesticide spraying, land resource surveys, wildlife monitoring, flood and drought monitoring, fish school detection, film and aerial photography, anti-drug-smuggling operations, border patrol, and public security and counter-terrorism. Target monitoring systems based on the unmanned aerial vehicle airborne end can therefore be expected to develop rapidly and find wide application.
Most target monitoring systems perform centralized remote monitoring and management by transmitting high-definition video or photographs recorded by a camera to a monitoring center in real time. However, continuous real-time delivery of video and photographs produces large-scale data streams: statistics show that monitoring the traffic flow on an 8-lane highway requires a data transmission speed of up to 40 Gb/s. The massive data generated around the clock by unmanned aerial vehicle airborne ends and wireless services poses a difficult challenge for a target monitoring system. If the data generated by many devices and sensors were all transmitted to the server end for processing, it would place a huge burden on network communication, and the computing capacity of the server end would also struggle to keep up with the continual growth of this massive data. Moreover, the server is far away from the user, and data transmission is limited by factors such as bandwidth, which often causes problems including long round-trip delay, network congestion, and degraded quality of service. In addition, the real-time target monitoring methods for unmanned aerial vehicles provided in the prior art are essentially processes of after-the-fact observation and manual interpretation.
These two defects of the prior art result in poor real-time performance of existing unmanned aerial vehicle target monitoring systems: users cannot control target monitoring in real time, and crisis handling efficiency is low.
Disclosure of Invention
Therefore, the technical problems to be solved by the embodiments of the present invention are that the target monitoring system in the prior art has poor real-time performance, a user cannot perform real-time control on target monitoring, and crisis processing efficiency is low.
Therefore, the target monitoring method provided by the embodiment of the invention is applied to a ground server side and comprises the following steps:
receiving image data sent by an airborne end of an unmanned aerial vehicle in real time;
classifying and filtering the image data, and screening out the classified image data that requires complex data analysis in the cloud;
sending the classified image data to a cloud end, and performing complex data analysis on the classified image data at the cloud end to obtain target data;
receiving target data sent by a cloud;
identifying the target data to obtain target identification data;
the target identification data is sent to a client side, and instruction data containing target tracking or target identification is obtained at the client side based on the target identification data;
and receiving the instruction data sent by the client, and forwarding the instruction data to the airborne terminal of the unmanned aerial vehicle, so that the airborne terminal of the unmanned aerial vehicle can control the load according to the instruction data to realize real-time target tracking shooting or target focusing operation.
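The steps above can be sketched as a single processing cycle. This is an illustrative sketch only, not the patent's implementation: the handler callables (`needs_cloud`, `analyse_in_cloud`, `identify`, `ask_client`, `forward_to_uav`) are hypothetical stand-ins for the cloud, client and UAV links.

```python
# Hedged sketch of the ground-server cycle (steps 2-7 above); all handler
# names are illustrative assumptions, not from the patent.

def classify_and_filter(frames, needs_cloud):
    """Step 2: keep only the frames that warrant complex cloud analysis."""
    return [f for f in frames if needs_cloud(f)]

def ground_server_cycle(frames, needs_cloud, analyse_in_cloud,
                        identify, ask_client, forward_to_uav):
    """One pass through steps 2-7 for a batch of received frames."""
    selected = classify_and_filter(frames, needs_cloud)        # step 2
    instructions = []
    for frame in selected:
        target = analyse_in_cloud(frame)                       # steps 3-4
        identified = identify(target)                          # step 5
        instruction = ask_client(identified)                   # step 6
        forward_to_uav(instruction)                            # step 7
        instructions.append(instruction)
    return instructions
```

The design point is that the ground server is only a router and light filter; the heavy analysis stays behind the `analyse_in_cloud` boundary.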
Preferably, the step of identifying the target data and obtaining target identification data includes:
constructing training set data and test set data, acquiring target data, and training a deep learning network;
predicting the target data to obtain a target data prediction result;
sending the target data prediction result to a client, and obtaining a target object judgment result and target identification data based on the target data prediction result at the client;
receiving target identification data sent by a client;
and adding the target identification data to training set data, retraining the deep learning network again, and obtaining an updated deep learning network.
The target monitoring method provided by the embodiment of the invention is applied to a cloud, and comprises the following steps:
receiving classified image data sent by a ground server;
acquiring image frames of adjacent n frames based on the classified image data;
registering the image frames to obtain registered images;
differentiating the registration image to obtain a differential image;
refining the difference image, and extracting static characteristics to obtain suspected target data;
repeating the steps from obtaining image frames of n adjacent frames based on the classified image data through refining the difference image and extracting suspected target data by static features K times, to obtain K pieces of suspected target data; performing target association processing on the K pieces of suspected target data; obtaining target data through dynamic feature screening; and filtering the track of the target to obtain target track data;
and sending the target data and the target track data to a ground server.
The target monitoring method provided by the embodiment of the invention is applied to an airborne end of an unmanned aerial vehicle, and comprises the following steps:
sending the image data to a ground server in real time;
receiving instruction data sent by a ground server;
judging whether the content contained in the instruction data is target tracking or target identification;
when the content contained in the instruction data is target tracking, generating a first parameter control value and outputting the first parameter control value to the load for controlling the load to realize target real-time tracking shooting operation;
and when the content contained in the instruction data is target identification, generating a second parameter control value and outputting the second parameter control value to the load for controlling the load to realize target focusing operation.
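The onboard branching above can be sketched as a small dispatch function. This is a hedged illustration: the field names (`"type"`, `"x"`, `"y"`, `"zoom"`) and the concrete parameter values are assumptions, not specified by the patent.

```python
# Illustrative onboard dispatch: tracking -> a first parameter control
# value, identification -> a second parameter control value. All field
# names and values here are hypothetical.

def dispatch_instruction(instruction):
    """Map an instruction to a load (payload) parameter control value."""
    kind = instruction.get("type")
    if kind == "track":
        # first parameter control value: steer the gimbal to keep the
        # target centred for real-time tracking shooting
        return {"mode": "follow", "pan": instruction["x"], "tilt": instruction["y"]}
    if kind == "identify":
        # second parameter control value: narrow the field of view to
        # focus on the target
        return {"mode": "focus", "zoom": instruction.get("zoom", 2.0)}
    raise ValueError(f"unknown instruction type: {kind!r}")
```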
The embodiment of the invention provides a target monitoring device for a ground server, which comprises:
the image data receiving unit is used for receiving image data sent by an airborne end of the unmanned aerial vehicle in real time;
the classification filtering unit is used for classifying and filtering the image data and screening classified image data needing complex data analysis at the cloud end;
the classified image data sending unit is used for sending the classified image data to a cloud end and carrying out complex data analysis on the classified image data at the cloud end to obtain target data;
the target data receiving unit is used for receiving target data sent by the cloud end;
the identification unit is used for identifying the target data to obtain target identification data;
the target identification data sending unit is used for sending the target identification data to the client and acquiring instruction data containing target tracking or target identification at the client based on the target identification data;
and the command data receiving and sending unit is used for receiving the command data sent by the client, forwarding the command data to the unmanned aerial vehicle airborne end and enabling the unmanned aerial vehicle airborne end to control the load according to the command data to realize target real-time tracking shooting or target focusing operation.
Preferably, the identification unit includes:
the deep learning network training unit is used for constructing training set data and test set data, acquiring target data and training a deep learning network;
the target data prediction unit is used for predicting the target data to obtain a target data prediction result;
a target data prediction result sending unit, configured to send the target data prediction result to a client, and configured to obtain, at the client, a target object determination result and target identification data based on the target data prediction result;
the target identification data receiving unit is used for receiving the target identification data sent by the client;
and the deep learning network updating unit is used for adding the target identification data to training set data, retraining the deep learning network again and obtaining the updated deep learning network.
The embodiment of the invention provides a target monitoring device for a cloud, which comprises:
the classified image data receiving unit is used for receiving the classified image data sent by the ground server;
an adjacent image frame acquiring unit for acquiring image frames of adjacent n frames based on the classified image data;
the registration unit is used for registering the image frames to obtain registered images;
the difference unit is used for carrying out difference on the registration image to obtain a difference image;
the suspected target data extraction unit is used for carrying out fine processing on the difference image and obtaining suspected target data through static characteristic extraction;
a target data screening unit, configured to repeat the steps from obtaining image frames of n adjacent frames based on the classified image data through refining the difference image and extracting suspected target data by static features K times to obtain K pieces of suspected target data, perform target association processing on the K pieces of suspected target data, obtain target data through dynamic feature screening, and filter the track of the target to obtain target track data;
and the target data sending unit is used for sending the target data and the target track data to the ground server side.
The embodiment of the invention provides a target monitoring device for an airborne end of an unmanned aerial vehicle, which comprises:
the image data sending unit is used for sending the image data to the ground server end in real time;
the command data receiving unit is used for receiving command data sent by the ground server;
a judging unit configured to judge whether content included in the instruction data is target tracking or target authentication;
the first parameter control value generation output unit is used for generating a first parameter control value and outputting the first parameter control value to the load when the content contained in the instruction data is target tracking, and is used for controlling the load to realize target real-time tracking shooting operation;
and the second parameter control value generation output unit is used for generating and outputting a second parameter control value to the load when the content contained in the instruction data is target identification, and is used for controlling the load to realize target focusing operation.
An object monitoring system according to an embodiment of the present invention includes: the system comprises an unmanned aerial vehicle airborne end, a ground server end, a cloud end and a client end, wherein the ground server end is respectively connected with the unmanned aerial vehicle airborne end, the cloud end and the client end in real time through an unmanned aerial vehicle measurement and control network;
the ground server end comprises a target monitoring device for the ground server end, and the target monitoring device is used for executing the target monitoring method;
the cloud comprises a target monitoring device used for the cloud and is used for executing the target monitoring method;
the unmanned aerial vehicle airborne end comprises a target monitoring device for the unmanned aerial vehicle airborne end, and the target monitoring device is used for executing the target monitoring method;
the client is used for receiving the target identification data sent by the ground server, performing interface display on the target identification data, acquiring instruction data containing target tracking or target identification of a processing mode selected by a user according to the target interface display, and sending the instruction data to the ground server.
Preferably, the client is further configured to receive a target data prediction result sent by the ground server, perform interface display on the target data prediction result, obtain a target object determination result and target identification data obtained by a user according to target object determination and identification performed by the interface display, and send the target identification data to the ground server.
The technical scheme of the embodiment of the invention has the following advantages:
according to the target monitoring method, device and system provided by the embodiment of the invention, the image data of the onboard end of the unmanned aerial vehicle is transmitted to the ground server end through the network, the ground server end receives the image data and completes the processes of primary classification, analysis, filtering and the like of the data, and then the data needing to be processed by the AI system is pushed to the cloud end for further data processing, so that the data distribution processing is realized, the excessive burden on the network communication of the cloud end is avoided, the massive computing pressure is provided for the cloud end, the network congestion and the computing time delay are reduced, and the real-time performance is effectively improved. Meanwhile, the ground server receives the data resolved by the AI system, completes labeling and superposition processing of the target, and finally distributes the processed data to the client interface, and the user gives a next processing instruction according to the content displayed by the interface, so that the user can control the target monitoring in real time, the target monitoring is controlled in real time from the client interface for hundreds of kilometers or even thousands of kilometers away, resources are saved, and the crisis processing efficiency is greatly improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a flowchart of a specific example of an object monitoring method in embodiment 1 of the present invention;
fig. 2 is a flowchart of a specific example of target identification in embodiment 1 of the present invention;
fig. 3 is a flowchart of a specific example of the target monitoring method in embodiment 2 of the present invention;
fig. 4 is a flowchart of a specific example of the target monitoring method in embodiment 3 of the present invention;
fig. 5 is a schematic block diagram of a specific example of an object monitoring apparatus according to embodiment 4 of the present invention;
fig. 6 is a schematic block diagram of a specific example of the object monitoring system in embodiment 7 of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In describing the present invention, it is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises" and/or "comprising," when used in this specification, are intended to specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The term "and/or" includes any and all combinations of one or more of the associated listed items. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The terms "connected" and "coupled" are to be interpreted broadly, e.g., as meaning either directly connected to one another or indirectly connected to one another through intervening elements, or both; either a wireless or a wired connection. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
While the exemplary embodiments are described as performing an exemplary process using multiple units, it is understood that the exemplary process can also be performed by one or more modules. In addition, it is to be understood that the term controller/control unit refers to a hardware device that includes a memory and a processor. The memory is configured as a memory module and the processor is specifically configured to execute the processes stored in the memory module to thereby execute one or more processes.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Example 1
This embodiment provides a target monitoring method applied to a ground server end. The system behind the method mainly comprises an unmanned aerial vehicle airborne end, a ground server end, a cloud end and a client end; the ground server end maintains real-time connections with the airborne end, the cloud and the client through an unmanned aerial vehicle measurement and control network, which carries the downlink of image data and the uplink of instruction data. The ground server is a data processing terminal deployed in a local machine room and mainly completes the simpler data processing, while the complex machine learning and target extraction processes are carried out on a high-performance server cluster in the cloud. The target monitoring method is shown in fig. 1 and comprises the following steps:
S11, receiving image data sent in real time by the unmanned aerial vehicle airborne end. The image data is acquired in real time by the unmanned aerial vehicle, and the airborne task computer collects and sends it. The image data is collected from the load (payload) end of the airborne end and must meet the basic requirement of being high-definition imagery; preferably, the collected data is compressed according to the H.264 standard, and air-to-ground real-time transmission is completed through the measurement and control network. Preferably, the image data may carry a timestamp recording when each frame entered the measurement and control network, so that a time sequence of entry into the network can be established and the ground server can process and distribute the image data according to this time information.
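The timestamps mentioned above admit a simple ordering scheme. This is a hedged sketch: the patent only says that frames carry a timestamp when entering the measurement and control network, so the buffer below is merely one possible way the ground server could process frames in time order even when the network delivers them out of order.

```python
# Hypothetical time-ordered frame buffer (not specified by the patent):
# a min-heap keyed on the frame's network-entry timestamp.
import heapq

class FrameBuffer:
    def __init__(self):
        self._heap = []

    def push(self, timestamp, frame):
        """Store a frame keyed by its network-entry timestamp."""
        heapq.heappush(self._heap, (timestamp, frame))

    def pop_earliest(self):
        """Release the frame with the smallest timestamp."""
        return heapq.heappop(self._heap)
```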
S12, classifying and filtering the image data, and screening out the classified image data that requires complex data analysis in the cloud. Preferably, the video data transmitted in step S11 is H.264-compressed content, and in this step it must first be decompressed after arriving at the ground server end.
S13, sending the classified image data to the cloud, where complex data analysis is performed on it to obtain target data. The cloud AI system's analysis of the data is the key to target detection: whether accurate identification of the target object can be achieved depends on the AI system having trained machine learning on a large number of relevant target object feature points. The cloud AI system performs key point recognition and feature matching on the data, intelligently interprets the target content, and finally transmits the target object back to the local server. The data to be processed by the AI system mainly consists of image information; through prior training and learning on a large amount of image data, the AI system is a mature system capable of executing tasks in real time.
S14, receiving target data sent by the cloud;
s15, identifying the target data to obtain target identification data; after the cloud identifies the target object, the work of identifying the target object, superposing related data information and distributing data is mainly realized on a ground server. The ground server end finishes marking and overlapping processing on the target and finally distributes the processed data to the client interface, preferably, the processed final data can be automatically updated and distributed to each client node according to network configuration;
S16, sending the target identification data to the client, where instruction data containing target tracking or target identification is obtained based on the target identification data. The client issues the next processing instruction through the content displayed on its interface: it displays the target object identified in the video, and after seeing the real-time data on the monitoring interface, the user performs a manual secondary interpretation of the target object and can select a processing option for it on the client's operation interface, completing remotely controllable operation of the distant target object.
And S17, receiving the instruction data sent by the client, and forwarding the instruction data to the unmanned aerial vehicle airborne terminal, so that the unmanned aerial vehicle airborne terminal can control the load according to the instruction data to realize real-time target tracking shooting or target focusing operation. Preferably, the task computer at the airborne terminal completes the analysis and forwarding work of the ground instruction, if the analyzed instruction content is specific to the target object, the instruction is transmitted to the target identification and tracking module, and the target identification and tracking module adjusts the load parameter, so that the real-time monitoring process of the interested target is realized.
According to this target monitoring method, image data from the unmanned aerial vehicle airborne end is transmitted over the network to the ground server end, which receives the image data, completes preliminary classification, analysis and filtering, and pushes only the data that needs AI processing to the cloud for further analysis. This distributes the data processing, avoids overburdening the cloud's network communication and computing resources, reduces network congestion and computing delay, and effectively improves real-time performance. Meanwhile, the ground server end receives the data resolved by the AI system, completes labeling and overlay processing of the target, and distributes the processed data to the client interface, where the user issues the next processing instruction according to the displayed content. The user can thus control target monitoring in real time from a client interface hundreds or even thousands of kilometers away, saving resources and greatly improving crisis handling efficiency.
Preferably, as shown in fig. 2, the step of identifying the target data in step S15 and obtaining the target identification data includes:
s15-1, constructing training set data;
s15-2, constructing test set data;
s15-3, acquiring target data and training a deep learning network;
s15-4, predicting the target data to obtain a target data prediction result;
s15-5, sending the target data prediction result to the client, and obtaining a target object judgment result and target identification data based on the target data prediction result at the client; by feeding the prediction result back to the client, the user can visually observe the result of the automatic target identification, and the user can judge and identify the result, so that the man-machine interaction is improved, and the reliability of the target identification is further improved.
And S15-6, receiving the target identification data sent by the client, returning to the step S15-1, adding the target identification data to the training set data, retraining the deep learning network, and obtaining the updated deep learning network.
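The S15-1 through S15-6 loop can be illustrated with a toy model. Everything below is a hedged sketch, not the patent's implementation: a 1-D nearest-centroid classifier stands in for the deep learning network, and `client_judge` is a hypothetical stand-in for the user's judgment at the client.

```python
# Toy human-in-the-loop retraining cycle: predict, get the client's
# confirmed label, fold it into the training set, retrain.

class NearestCentroid:
    """Stand-in for the deep learning network (illustrative only)."""
    def __init__(self):
        self.train_x, self.train_y = [], []

    def fit(self, xs, ys):
        self.train_x += list(xs)
        self.train_y += list(ys)
        buckets = {}
        for x, y in zip(self.train_x, self.train_y):
            buckets.setdefault(y, []).append(x)
        self.centroids = {y: sum(v) / len(v) for y, v in buckets.items()}

    def predict(self, x):
        return min(self.centroids, key=lambda y: abs(self.centroids[y] - x))

def identification_round(model, target_x, client_judge):
    """One S15 cycle: predict (S15-4), client judgment (S15-5), and
    retraining on the confirmed label (S15-6 back to S15-1)."""
    prediction = model.predict(target_x)
    confirmed = client_judge(target_x, prediction)
    model.fit([target_x], [confirmed])
    return confirmed
```

The point of the cycle is that each confirmed label enlarges the training set, so the next prediction round benefits from the user's judgment.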
The real-time identification information for the target object at the ground server end can include geographic information and attribute identification. For example, the geographic identifier may be longitude/latitude/altitude information used to delimit status information within the operating range, and specific geographic names may be annotated to improve recognizability. Attribute identification is mainly a classification specification of the target object, used to determine the processing level by type attribute.
Example 2
This embodiment provides a target monitoring method applied to the cloud, as shown in fig. 3, comprising the following steps:
S21, receiving classified image data sent by the ground server;
S22, acquiring n adjacent image frames based on the classified image data; after the data reaches the cloud AI system, the AI system samples the decoded image information at the set sampling frequency to acquire the image data to be processed;
S23, registering the image frames to obtain registered images; when the unmanned aerial vehicle moves, image data captured at different moments have different backgrounds, but the backgrounds of adjacent frames partially overlap, so adjacent images must be registered after acquisition to find the correspondence between the image backgrounds;
S24, differencing the registered images to obtain a difference image; once the correspondence between the image backgrounds is determined, the two images are differenced, and the resulting difference image reveals the change between them, which is the basis of foreground (i.e. target) detection;
S25, refining the difference image and extracting static features to obtain suspected target data; image registration is affected by interference and noise and is never error-free, so the correspondence between images also carries errors, and a raw difference image alone cannot determine the foreground change; refinement is therefore required, removing interference by methods such as morphological processing and gray-scale features to determine a suspected target; steps S22-S25 are repeated K times to obtain K items of suspected target data;
S26, performing target association processing on the K items of suspected target data, obtaining target data through dynamic-feature screening, and filtering the target's track to obtain target track data; several adjacent image pairs yield multiple suspected targets, some of which are the same object, so the suspected targets can be associated; the relationships among them (such as the object's moving speed and color change) then serve to judge whether a suspected target is the required target, and the target's track is filtered to obtain a more accurate actual track;
S27, sending the target data and the target track data to the ground server.
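Steps S22-S25 can be sketched end to end with plain NumPy. The registration below uses phase correlation to estimate a pure translation, and the "refinement" is a bare threshold; the patent leaves the concrete registration and morphological methods open, so everything here (function names, threshold, synthetic frames) is an assumed, simplified stand-in.

```python
import numpy as np

def register_shift(a, b):
    # S23: estimate the translation between two frames by phase correlation,
    # a stand-in for whatever registration method the system actually uses.
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-9)).real
    dy, dx = np.unravel_index(corr.argmax(), corr.shape)
    h, w = a.shape
    return (dy if dy <= h // 2 else dy - h), (dx if dx <= w // 2 else dx - w)

def detect(prev, curr, thresh=30):
    dy, dx = register_shift(curr, prev)                # S23: registration
    aligned = np.roll(prev, (dy, dx), axis=(0, 1))     # align backgrounds
    diff = np.abs(curr.astype(int) - aligned.astype(int))  # S24: difference
    mask = diff > thresh                               # S25: crude refinement
    ys, xs = np.nonzero(mask)
    # Centroid of the changed pixels = one item of suspected target data.
    return (ys.mean(), xs.mean()) if len(ys) else None

# Synthetic pair of adjacent frames: a textured background that shifts by
# (2, 3) pixels between frames, plus a bright object appearing in the second.
rng = np.random.default_rng(1)
bg = rng.integers(0, 20, (64, 64))
prev = bg.copy()
curr = np.roll(bg, (2, 3), axis=(0, 1)).copy()
curr[40:44, 10:14] = 255                               # the moving object
hit = detect(prev, curr)
print(hit is not None)
```

A production system would replace the threshold with morphological filtering and gray-scale feature analysis (step S25) and feed the detections of K such frame pairs into the association and track-filtering step S26.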
Example 3
This embodiment provides a target monitoring method applied to the airborne end of an unmanned aerial vehicle; as shown in fig. 4, the method comprises the following steps:
S31, sending the image data to the ground server in real time;
S32, receiving instruction data sent by the ground server;
S33, judging whether the content of the instruction data is target tracking or target identification, i.e. which load parameters to control; when the instruction data contains target tracking, the flow proceeds to step S34; when it contains target identification, the flow proceeds to step S35;
S34, generating a first parameter control value and outputting it to the load, the first parameter control value being used to control the load to perform the real-time target tracking and shooting operation;
S35, generating a second parameter control value and outputting it to the load, the second parameter control value being used to control the load to perform the target focusing operation. The parameter control content for the load is obtained through analysis, the corresponding parameter control value is sent to the load, the load adjusts the related parameters, and processing of the target instruction is completed.
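The S33-S35 dispatch on the airborne end reduces to a branch on the instruction content that emits the matching parameter control value. The sketch below assumes a dictionary-based instruction format and invented parameter names (`gimbal_follow`, `zoom`); the patent does not specify the load interface, so both are illustrative only.

```python
def handle_instruction(instruction: dict) -> dict:
    """S33: branch on the instruction content and build the control value."""
    kind = instruction.get("type")
    if kind == "target_tracking":
        # S34: first parameter control value, for real-time tracking/shooting.
        return {"mode": "track", "gimbal_follow": True,
                "target_box": instruction["box"]}
    if kind == "target_identification":
        # S35: second parameter control value, for the focusing operation.
        return {"mode": "focus", "zoom": instruction.get("zoom", 4.0)}
    raise ValueError(f"unknown instruction type: {kind!r}")

cmd = handle_instruction({"type": "target_tracking", "box": (120, 80, 40, 40)})
print(cmd["mode"])
```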
Example 4
Corresponding to embodiment 1, this embodiment provides a target monitoring device for the ground server end, as shown in fig. 5, including:
the image data receiving unit 11 is used for receiving image data sent by an airborne end of the unmanned aerial vehicle in real time;
the classification filtering unit 12 is configured to classify and filter the image data, and screen out classified image data that needs to be subjected to complex data analysis at the cloud;
a classified image data sending unit 13, configured to send the classified image data to a cloud, and perform complex data analysis on the classified image data at the cloud to obtain target data;
a target data receiving unit 14, configured to receive target data sent by a cloud;
an identification unit 15, configured to identify the target data to obtain target identification data;
a target identification data sending unit 16, configured to send the target identification data to the client, where instruction data containing target tracking or target identification is obtained based on the target identification data;
and the instruction data receiving and sending unit 17 is used for receiving instruction data sent by the client, forwarding the instruction data to the unmanned aerial vehicle onboard end, and enabling the unmanned aerial vehicle onboard end to control the load according to the instruction data to realize target real-time tracking shooting or target focusing operation.
With the above target monitoring device, image data from the unmanned aerial vehicle airborne end is transmitted over the network to the ground server end, which receives the image data and performs preliminary classification, analysis and filtering; only the data that requires processing by the AI system is then pushed to the cloud for further processing. This distributes the data processing load, avoids imposing an excessive network-communication burden and massive computing pressure on the cloud, reduces network congestion and computation delay, and effectively improves real-time performance. Meanwhile, the ground server receives the data resolved by the AI system, completes labeling and overlay processing of the target, and distributes the processed data to the client interface; the user issues the next processing instruction according to the content displayed on the interface. Target monitoring can thus be controlled in real time from a client interface hundreds or even thousands of kilometers away, which saves resources and greatly improves crisis-handling efficiency.
Preferably, the identification unit comprises:
the deep learning network training unit is used for constructing training set data and test set data and training a deep learning network;
the target data prediction unit is used for predicting the target data to obtain a target data prediction result;
the target data prediction result sending unit is used for sending the target data prediction result to the client and obtaining a target object judgment result and target identification data based on the target data prediction result at the client;
the target identification data receiving unit is used for receiving the target identification data sent by the client;
and the deep learning network updating unit is used for adding the target identification data to the training set data and retraining the deep learning network to obtain an updated deep learning network.
Example 5
Corresponding to embodiment 2, this embodiment provides a target monitoring device for the cloud, including:
the classified image data receiving unit is used for receiving the classified image data sent by the ground server;
an adjacent image frame acquiring unit for acquiring image frames of adjacent n frames based on the classified image data;
the registration unit is used for registering the image frames to obtain registered images;
the difference unit is used for carrying out difference on the registration image to obtain a difference image;
the suspected target data extraction unit is used for carrying out fine processing on the difference image and obtaining suspected target data through static characteristic extraction;
the target data screening unit is used for repeating, K times, the steps from acquiring n adjacent image frames based on the classified image data through refining the difference image and extracting suspected target data via static features, to obtain K items of suspected target data; performing target association processing on the K items of suspected target data; obtaining target data through dynamic-feature screening; and filtering the target's track to obtain target track data;
and the target data sending unit is used for sending the target data and the target track data to the ground server side.
Example 6
Corresponding to embodiment 3, this embodiment provides a target monitoring device for the unmanned aerial vehicle airborne end, including:
the image data sending unit is used for sending the image data to the ground server end in real time;
the command data receiving unit is used for receiving command data sent by the ground server;
a judging unit configured to judge whether the content of the instruction data is target tracking or target identification;
the first parameter control value generation output unit is used for generating a first parameter control value and outputting it to the load when the content of the instruction data is target tracking, for controlling the load to perform the real-time target tracking and shooting operation;
and the second parameter control value generation output unit is used for generating a second parameter control value and outputting it to the load when the content of the instruction data is target identification, for controlling the load to perform the target focusing operation.
Example 7
The present embodiment provides a target monitoring system, as shown in fig. 6, including: an unmanned aerial vehicle airborne end 3, a ground server end 1, a cloud 2 and a client 4, wherein the ground server end is connected in real time with the unmanned aerial vehicle airborne end, the cloud and the client respectively through an unmanned aerial vehicle measurement and control network;
the ground server side 1 comprises a target monitoring device 10 for the ground server side, which is used for executing the target monitoring method of the embodiment 1;
the cloud 2 comprises a target monitoring device 20 for the cloud, which is used for executing the target monitoring method of embodiment 2;
the unmanned aerial vehicle airborne terminal 3 comprises a target monitoring device 30 for the unmanned aerial vehicle airborne terminal, and is used for executing the target monitoring method of embodiment 3;
and the client 4 is used for receiving the target identification data sent by the ground server, displaying the target identification data on its interface, acquiring instruction data, containing target tracking or target identification, for the processing mode selected by the user according to the interface display, and sending the instruction data to the ground server.
Preferably, the client 4 is further configured to receive the target data prediction result sent by the ground server, display it on the interface, obtain the target object judgment result and target identification data produced by the user's judgment and identification of the target object based on the interface display, and send the target identification data to the ground server.
With the above target monitoring system, image data from the unmanned aerial vehicle airborne end is transmitted over the network to the ground server end, which receives the image data and performs preliminary classification, analysis and filtering; only the data that requires processing by the AI system is then pushed to the cloud for further processing. This distributes the data processing load, avoids imposing an excessive network-communication burden and massive computing pressure on the cloud, reduces network congestion and computation delay, and effectively improves real-time performance. Meanwhile, the ground server receives the data resolved by the AI system, completes labeling and overlay processing of the target, and distributes the processed data to the client interface; the user issues the next processing instruction according to the content displayed on the interface. Target monitoring can thus be controlled in real time from a client interface hundreds or even thousands of kilometers away, which saves resources and greatly improves crisis-handling efficiency.
It should be understood that the above examples are given only for clarity of illustration and are not intended to limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to exhaustively list all embodiments here, and obvious variations or modifications derived therefrom remain within the scope of the invention.

Claims (2)

1. An object monitoring system, comprising: the system comprises an unmanned aerial vehicle airborne end, a ground server end, a cloud end and a client end, wherein the ground server end is respectively connected with the unmanned aerial vehicle airborne end, the cloud end and the client end in real time through an unmanned aerial vehicle measurement and control network;
the ground server end comprises a target monitoring device for the ground server end and is used for receiving image data sent by the airborne end of the unmanned aerial vehicle in real time; classifying and filtering the image data, and screening classified image data needing complex data analysis at the cloud end; sending the classified image data to a cloud end, and performing complex data analysis on the classified image data at the cloud end to obtain target data; receiving target data sent by a cloud; identifying the target data to obtain target identification data; the target identification data is sent to a client side, and instruction data containing target tracking or target identification is obtained at the client side based on the target identification data; receiving the instruction data sent by the client, and forwarding the instruction data to the airborne terminal of the unmanned aerial vehicle, so that the airborne terminal of the unmanned aerial vehicle can control the load according to the instruction data to realize real-time target tracking shooting or target focusing operation;
the step of identifying the target data to obtain target identification data comprises:
constructing training set data and test set data, acquiring target data, and training a deep learning network; predicting the target data to obtain a target data prediction result; sending the target data prediction result to a client, and obtaining a target object judgment result and target identification data based on the target data prediction result at the client; receiving target identification data sent by a client; adding the target identification data to training set data, and retraining the deep learning network to obtain an updated deep learning network;
the cloud comprises a target monitoring device used for the cloud, and the target monitoring device is used for receiving classified image data sent by a ground server; acquiring image frames of adjacent n frames based on the classified image data; registering the image frames to obtain registered images; differentiating the registration image to obtain a differential image; refining the difference image, and extracting static characteristics to obtain suspected target data; repeating the step of obtaining the image frames of the adjacent n frames based on the classified image data to the step of refining the difference image, extracting suspected target data for K times through static characteristics to obtain K suspected target data, performing target association processing on the K suspected target data, obtaining target data through dynamic characteristic screening, and filtering the track of a target to obtain target track data; sending the target data and the target track data to a ground server;
the unmanned aerial vehicle airborne end comprises a target monitoring device for the unmanned aerial vehicle airborne end, and is used for sending image data to the ground server end in real time; receiving instruction data sent by a ground server; judging whether the content contained in the instruction data is target tracking or target identification; when the content contained in the instruction data is target tracking, generating a first parameter control value and outputting the first parameter control value to the load for controlling the load to realize target real-time tracking shooting operation; when the content contained in the instruction data is target identification, generating a second parameter control value and outputting the second parameter control value to the load for controlling the load to realize target focusing operation;
the client is used for receiving the target identification data sent by the ground server, performing interface display on the target identification data, acquiring instruction data containing target tracking or target identification of a processing mode selected by a user according to the target interface display, and sending the instruction data to the ground server.
2. The target monitoring system of claim 1, wherein the client is further configured to receive a target data prediction result sent by the ground server, perform interface display on the target data prediction result, obtain a target object determination result and target identification data obtained by a user according to target object determination and identification performed by the interface display, and send the target identification data to the ground server.
CN201811261900.6A 2018-10-26 2018-10-26 Target monitoring method, device and system Active CN109376660B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811261900.6A CN109376660B (en) 2018-10-26 2018-10-26 Target monitoring method, device and system


Publications (2)

Publication Number Publication Date
CN109376660A CN109376660A (en) 2019-02-22
CN109376660B true CN109376660B (en) 2022-04-08

Family

ID=65390071







Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210309

Address after: 100079 6357, 1-13, 6th floor, building 2, yard 1, Haiying Road, Fengtai District, Beijing

Applicant after: TIANYU JINGWEI (BEIJING) TECHNOLOGY Co.,Ltd.

Applicant after: East China branch of National Petroleum Pipeline Network Group Co.,Ltd.

Address before: 100079 room 617, 6th floor, building 2, yard 1, Hangfeng Road, Fengtai District, Beijing (room 717, elevator floor)

Applicant before: TIANYU JINGWEI (BEIJING) TECHNOLOGY Co.,Ltd.

GR01 Patent grant