CN115407803A - Target monitoring method and device based on unmanned aerial vehicle - Google Patents
- Publication number
- CN115407803A (publication number) · CN202211344787.4A (application number)
- Authority
- CN
- China
- Prior art keywords
- aerial vehicle
- unmanned aerial
- target
- information
- image information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The embodiment of the invention provides a target monitoring method and device based on an unmanned aerial vehicle, an electronic device and an unmanned aerial vehicle storage device, and relates to the technical field of unmanned aerial vehicle inspection. The method comprises the following steps: acquiring a first monitoring instruction and, based on the first monitoring instruction, sending it to a first unmanned aerial vehicle to instruct the first unmanned aerial vehicle to move to a target height and collect first image information of a target area; acquiring power information of the first unmanned aerial vehicle when the first unmanned aerial vehicle has moved to the target height; and executing a first relay operation when the power information satisfies a first condition. The invention solves the problems of high operation and maintenance cost and low monitoring efficiency, thereby achieving the effects of reducing operation and maintenance cost and improving monitoring efficiency.
Description
Technical Field
The embodiment of the invention relates to the field of communication, and in particular to a target monitoring method and device based on an unmanned aerial vehicle, a storage medium, an electronic device and an unmanned aerial vehicle storage device.
Background
In the field of traffic control, road traffic monitoring currently usually adopts radar fusion technology. However, owing to factors such as weather and vehicle distance, the characteristic features of a vehicle's license plate may be unclear, causing recognition errors.
Unmanned aerial vehicles are also used to patrol roads for traffic monitoring, but such schemes often have the problem that, as the road image keeps changing during cruising, a large amount of computing power must be consumed for real-time road recognition on the images, which increases the operation and maintenance cost of the equipment.
No satisfactory solution to the above problems currently exists.
Disclosure of Invention
The embodiment of the invention provides a target monitoring method and device based on an unmanned aerial vehicle, a storage medium, an electronic device and an unmanned aerial vehicle storage device, so as to at least solve the problem of high operation and maintenance cost in the related art.
According to one embodiment of the invention, a target monitoring method based on an unmanned aerial vehicle is provided, which comprises the following steps:
acquiring a first monitoring instruction and, based on the first monitoring instruction, sending it to a first unmanned aerial vehicle to instruct the first unmanned aerial vehicle to move to a target height and collect first image information of a target area;
acquiring power information of the first unmanned aerial vehicle when the first unmanned aerial vehicle has moved to the target height;
executing a first relay operation under the condition that the power information meets a first condition, wherein the first relay operation comprises the following steps: sending a first recovery instruction to the first drone to instruct the first drone to land to a target location along a first path; sending a second monitoring instruction to a second unmanned aerial vehicle to indicate the second unmanned aerial vehicle to move to the target height along a second path, and acquiring second image information of the target area;
and carrying out image relay fusion operation on the first image information and the second image information to obtain target image information.
In one exemplary embodiment, the image relay fusion operation comprises:
acquiring position information of all target objects contained in the first image information and the second image information;
determining an average position error of all the target objects in a target time frame based on the position information;
in case the average position error satisfies a second condition, switching the information stream to an information stream of second image information starting at the target time frame.
In an exemplary embodiment, in a case where the average position error satisfies a second condition, the method further includes:
acquiring the motion height information of the second unmanned aerial vehicle;
in a case where it is determined that the motion altitude information satisfies the fourth condition, switching the information stream to an information stream of second image information starting from the target time frame.
In an exemplary embodiment, before said sending the second monitoring instruction to the second drone, the method further comprises:
acquiring position coordinate information of the first unmanned aerial vehicle and the second unmanned aerial vehicle;
determining a position difference between the first drone and the second drone based on the position coordinate information;
and sending an adjusting instruction to the second unmanned aerial vehicle to indicate the second unmanned aerial vehicle to move along the second path under the condition that the position difference does not meet the safety condition.
In one exemplary embodiment, the method further comprises:
and under the condition that the first unmanned aerial vehicle and/or the second unmanned aerial vehicle move to the target position, performing synchronization time correction operation on the first unmanned aerial vehicle and/or the second unmanned aerial vehicle so as to enable the information flow time of the first unmanned aerial vehicle and/or the second unmanned aerial vehicle to be target time.
According to another embodiment of the present invention, there is provided an unmanned aerial vehicle-based target monitoring apparatus, including:
the instruction receiving and sending module is used for acquiring a first monitoring instruction and, based on the first monitoring instruction, sending it to a first unmanned aerial vehicle to instruct the first unmanned aerial vehicle to move to a target height and collect first image information of a target area;
the power monitoring module is used for acquiring power information of the first unmanned aerial vehicle when the first unmanned aerial vehicle has moved to the target height;
a first relay module, configured to execute a first relay operation when the power information satisfies a first condition, where the first relay operation includes: sending a first recovery instruction to the first drone to instruct the first drone to land to a target location along a first path; sending a second monitoring instruction to a second unmanned aerial vehicle to indicate the second unmanned aerial vehicle to move to the target height along a second path, and acquiring second image information of the target area;
and the image fusion module is used for carrying out image relay fusion operation on the first image information and the second image information so as to obtain target image information.
In one exemplary embodiment, the image fusion module includes:
the position information acquisition unit is used for acquiring the position information of all target objects contained in the first image information and the second image information;
an average error calculation unit, configured to determine an average position error of all the target objects in a target time frame based on the position information;
and an information stream access unit, configured to switch the information stream to an information stream of the second image information starting from the target time frame if the average position error satisfies a second condition.
According to another embodiment of the present invention, there is also provided an unmanned aerial vehicle storage device, including:
the parking place is used for parking the first unmanned aerial vehicle and the second unmanned aerial vehicle;
the communication module is in communication connection with the first unmanned aerial vehicle and the second unmanned aerial vehicle and is used for receiving a first monitoring instruction;
and the power supply module is used for charging the first unmanned aerial vehicle and the second unmanned aerial vehicle.
According to a further embodiment of the present invention, there is also provided a computer-readable storage medium having a computer program stored thereon, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
According to the invention, because the first unmanned aerial vehicle and the second unmanned aerial vehicle perform relay monitoring on the same target area, the computational stress caused by continuously changing images is avoided. The problem of high operation and maintenance cost can thus be solved, and the effects of reducing operation and maintenance cost and improving monitoring efficiency are achieved.
Drawings
Fig. 1 is a block diagram of a hardware structure of a mobile terminal of a target monitoring method based on an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 2 is a flowchart of a target monitoring method based on an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 3 is a block diagram of a target monitoring apparatus based on an unmanned aerial vehicle according to an embodiment of the present invention.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings in conjunction with the embodiments.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
The method embodiments provided in the embodiments of the present application may be executed in a mobile terminal, a computer terminal, or a similar computing device. Taking the operation on a mobile terminal as an example, fig. 1 is a hardware structure block diagram of the mobile terminal of the target monitoring method based on the unmanned aerial vehicle according to the embodiment of the present invention. As shown in fig. 1, the mobile terminal may include one or more (only one shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA), and a memory 104 for storing data, wherein the mobile terminal may further include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those of ordinary skill in the art that the structure shown in fig. 1 is only an illustration and is not intended to limit the structure of the mobile terminal. For example, the mobile terminal may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store a computer program, for example, a software program and a module of an application software, such as a computer program corresponding to the target monitoring method based on a drone in an embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, that is, implements the method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a Network adapter (NIC), which can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
In this embodiment, a target monitoring method based on an unmanned aerial vehicle is provided. Fig. 2 is a flowchart of the target monitoring method based on the unmanned aerial vehicle according to the embodiment of the present invention; as shown in fig. 2, the flow includes the following steps:
Step S202, a first monitoring instruction is obtained and, based on the first monitoring instruction, sent to a first unmanned aerial vehicle to instruct the first unmanned aerial vehicle to move to a target height and collect first image information of a target area;
In this embodiment, the first unmanned aerial vehicle is controlled, according to the first monitoring instruction, to move to the target height and perform monitoring. The maneuverability of the unmanned aerial vehicle enables flexible monitoring of the target area, avoiding both the lack of monitoring equipment in remote areas and the inability of fixed monitoring equipment to monitor distant targets in a timely manner.
The first unmanned aerial vehicle may be (but is not limited to being) deployed at a docking station, which is provided with devices or structures such as a communication module, a charger, parking positions and unmanned-aerial-vehicle fixing devices, so as to realize remote control and automatic management of the first unmanned aerial vehicle. Correspondingly, the first monitoring instruction may be sent by an external control center or a mobile control device (such as a mobile phone) in communication connection with the docking station, may be generated after timing by a timing module of the docking station itself, or may be sent in other manners.
The first image information includes, but is not limited to, pictures, video, sound, and the like.
It should be noted that there may be one or more first unmanned aerial vehicles, selected and adjusted according to the actual application environment.
Step S204, acquiring power information of the first unmanned aerial vehicle when the first unmanned aerial vehicle has moved to the target height;
In this embodiment, the power information is acquired so that the relay can be performed in advance, before the battery condition of the first unmanned aerial vehicle reaches its limit, thereby continuously monitoring the target area and avoiding any interruption of the image.
The power information includes (but is not limited to) the remaining battery capacity, engine temperature, engine speed, flight time and the like of the unmanned aerial vehicle.
Step S206, executing a first relay operation under the condition that the power information meets a first condition, wherein the first relay operation comprises the following steps: sending a first recovery instruction to the first drone to instruct the first drone to land to a target location along a first path; sending a second monitoring instruction to a second unmanned aerial vehicle to indicate the second unmanned aerial vehicle to move to the target height along a second path, and acquiring second image information of the target area;
In this embodiment, when the power of the first unmanned aerial vehicle is insufficient, the second unmanned aerial vehicle takes over, ensuring the continuity of image monitoring; the first path and the second path are set to avoid a collision between the first and second unmanned aerial vehicles, thereby ensuring their flight safety.
The first condition includes (but is not limited to) the remaining battery capacity being less than a threshold, the engine temperature being greater than a threshold, the flight time being greater than a threshold, the engine speed being greater than a threshold, and so on. The target location may be (but is not limited to) a parking position of a docking station, the ground, or a similar location. There may be one or more second unmanned aerial vehicles, and the first and second unmanned aerial vehicles may occupy different parking positions at the same docking station, or may be located at different docking stations in the same place.
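As a non-authoritative sketch of the relay trigger in step S206, the first condition can be modeled as a set of thresholds. The field names, threshold values and command tuples below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class PowerInfo:
    battery_pct: float       # remaining battery capacity, percent
    engine_temp_c: float     # engine temperature, Celsius
    flight_time_s: float     # elapsed flight time, seconds

def first_condition_met(p: PowerInfo,
                        min_battery: float = 30.0,    # illustrative thresholds,
                        max_temp: float = 80.0,       # not from the patent
                        max_flight: float = 1500.0) -> bool:
    """Return True when any power limit is reached and a relay should start."""
    return (p.battery_pct < min_battery
            or p.engine_temp_c > max_temp
            or p.flight_time_s > max_flight)

def relay_step(p: PowerInfo) -> list:
    """First relay operation: recover drone 1 along path 1, dispatch drone 2 along path 2."""
    if first_condition_met(p):
        return [("drone1", "recover_along_path", "path1"),
                ("drone2", "monitor_at_height", "path2")]
    return []
```

Any of the listed limits suffices to trigger the relay, matching the "or"-style first condition described above.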
And step S208, carrying out image relay fusion operation on the first image information and the second image information to obtain target image information.
In this embodiment, the image relay fusion operation performed on the first image information and the second image information connects the information flow of the first image information with that of the second image information, ensuring the integrity and accuracy of the monitoring image of the target area and avoiding monitoring blind periods.
It should be noted that, to ensure accurate relay between the first image information and the second image information, both the first drone and the second drone continue image acquisition during the recovery process and the movement process respectively, so that no images are missing while the first drone is being recovered and the second drone is moving into position.
Through the above steps, because the first unmanned aerial vehicle and the second unmanned aerial vehicle monitor the same target area in relay, the computing-power shortage caused by continuously changing images is avoided; at the same time, the target area is monitored flexibly by unmanned aerial vehicle, improving monitoring efficiency. The problems of high operation and maintenance cost and low monitoring efficiency can thus be solved, achieving the effects of reducing operation and maintenance cost and improving monitoring efficiency.
The main body of the above steps may be a base station, a terminal, etc., but is not limited thereto.
In an optional embodiment, the image relay fusion operation includes:
step S2082, obtaining position information of all target objects included in both the first image information and the second image information;
step S2084, based on the position information, determining the average position error of all the target objects in the target time frame;
step S2086, switching the information stream to the information stream of the second image information starting from the target time frame when the average position error satisfies a second condition.
In this embodiment, the average position error calculation of the position information of the target object is performed to ensure that the first image information and the second image information can be accurately matched when being fused, so as to avoid the loss of the related information in the second image information.
The target objects include (but are not limited to) vehicles, pedestrians and fixed structures (such as lamp posts and road posts) in the target area, and the position information includes the GPS/BeiDou longitude and latitude coordinates of the target objects. The average position error may be the average of the differences between the position coordinates of the same target objects in the two sets of image information. For example, if the coordinates of a lamp post are (x, y) in the first image information and (x1, y1) in the second image information, the position difference for the lamp post is (x - x1, y - y1); the same difference is computed for each vehicle and pedestrian, and the differences are averaged to obtain the final average position error (Δx, Δy). Satisfying the second condition means selecting the target time frame whose image information has the smallest average position error.
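The average-position-error calculation above can be sketched as follows. A flat (x, y) coordinate model stands in for real GPS/BeiDou coordinates, and the object identifiers are hypothetical:

```python
def average_position_error(objs1: dict, objs2: dict) -> tuple:
    """Average per-object coordinate difference between two frames.

    objs1/objs2 map an object id to its (x, y) position in the first
    and second image information for the same time frame.
    """
    common = objs1.keys() & objs2.keys()   # only objects seen in both frames
    if not common:
        raise ValueError("no common target objects")
    dx = sum(objs1[k][0] - objs2[k][0] for k in common) / len(common)
    dy = sum(objs1[k][1] - objs2[k][1] for k in common) / len(common)
    return dx, dy
```

For instance, with a lamp post at (10, 5) versus (9, 5) and a car at (20, 7) versus (19, 6), the result is (Δx, Δy) = (1.0, 0.5).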
It should be noted that, in order to ensure the accuracy of image matching, the first drone and the second drone may be time-calibrated first.
For example, suppose the image frames acquired by the first drone during the recovery process cover the time periods delimited by the points (a, b, c, d), that is, the periods ab, bc and cd, and the image frames acquired by the second drone during the movement process cover the periods delimited by (e, f, g, h), where point a corresponds to point e, point b corresponds to point f, and so on for the remaining points. The average position error is then calculated for each time period; assuming the fg period has the smallest average position error, the information stream is switched to the information stream of the second drone with point f as the starting point.
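Selecting the relay point then reduces to picking the period with the smallest error. A minimal sketch, assuming the per-period average position errors have already been reduced to scalar magnitudes (the period labels follow the example above; the values are made up):

```python
def pick_relay_period(errors: dict) -> str:
    """Return the label of the time period with the smallest average
    position error; the stream is switched at that period's start point."""
    return min(errors, key=errors.get)

# Hypothetical error magnitudes for the periods ef, fg and gh.
errors = {"ef": 2.4, "fg": 0.3, "gh": 1.1}
```

With these values `pick_relay_period(errors)` selects the fg period, so the switch-over would use point f as the starting point of the second stream.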
In an optional embodiment, in a case that the average position error satisfies a second condition, the method further comprises:
step S20862, obtaining the movement height information of the second unmanned aerial vehicle;
step S20864, in a case where it is determined that the motion altitude information satisfies the fourth condition, switching the information stream to the information stream of the second image information starting from the target time frame.
In this embodiment, since the images collected by an unmanned aerial vehicle change with its altitude during flight, the information stream is connected only after the movement altitude of the second unmanned aerial vehicle reaches a preset value. This guarantees the stability of image quality and image content and avoids interference with the image from factors such as wind direction during flight.
The fourth condition is a preset altitude value at which the acquisition range of the image covers the whole target area and the image quality meets the requirements of subsequent applications.
In an optional embodiment, before the sending the second monitoring instruction to the second drone, the method further comprises:
step S2062, acquiring the position coordinate information of the first unmanned aerial vehicle and the second unmanned aerial vehicle;
step S2064, determining the position difference between the first unmanned aerial vehicle and the second unmanned aerial vehicle based on the position coordinate information;
step S2066, in a case that the position difference does not satisfy the safety condition, sending an adjustment instruction to the second drone to instruct the second drone to move along the second path.
In this embodiment, the position difference is determined to guarantee a safe distance between the first unmanned aerial vehicle and the second unmanned aerial vehicle and to avoid damage to either unmanned aerial vehicle.
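The safety check of steps S2062 to S2066 might look like the following sketch; the 3-D Euclidean distance metric and the 10-meter minimum separation are assumptions for illustration:

```python
import math

def position_difference(p1: tuple, p2: tuple) -> float:
    """Euclidean distance between two drone positions given as (x, y, z)."""
    return math.dist(p1, p2)

def adjustment_needed(p1: tuple, p2: tuple,
                      min_separation: float = 10.0) -> bool:
    """True when the safety condition is violated, i.e. an adjustment
    instruction should be sent to the second drone."""
    return position_difference(p1, p2) < min_separation
```

When `adjustment_needed` returns True, the controller would send the adjustment instruction directing the second drone onto the second path, as in step S2066.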
In an optional embodiment, the method further comprises:
step S2010, when the first drone and/or the second drone move to the target position, performing a synchronization time correction operation on the first drone and/or the second drone so that an information flow time of the first drone and/or the second drone is a target time.
In this embodiment, the synchronization time correction operation may be (but is not limited to) NTP timing of the unmanned aerial vehicle. The NTP timing server is the same NTP server used at the central end, so as to ensure that the error between the NTP time used by the unmanned aerial vehicle's video stream and the central server time is sufficiently small; the NTP timestamp of each picture's time is recorded in the video stream transmitted by the unmanned aerial vehicle.
It should be noted that the synchronization time correction operation may be performed before the drone is used, during the charging of the drone, or during other periods.
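For reference, the clock offset that an NTP exchange yields is derived from four timestamps, per the standard NTP offset formula (RFC 5905); the helper names below are assumptions for illustration:

```python
def ntp_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """Standard NTP clock-offset estimate.

    t1: client send time, t2: server receive time,
    t3: server send time, t4: client receive time.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0

def corrected_time(local_time: float, offset: float) -> float:
    """Apply the estimated offset to a local timestamp before it is
    embedded in the video stream."""
    return local_time + offset
```

Using the same central NTP server on every drone keeps these offsets, and hence the frame timestamps used for relay matching, consistent across the fleet.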
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method according to the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
In this embodiment, a target monitoring device based on an unmanned aerial vehicle is further provided. The device is used for implementing the above embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the devices described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 3 is a block diagram of a target monitoring apparatus based on an unmanned aerial vehicle according to an embodiment of the present invention, and as shown in fig. 3, the apparatus includes:
the instruction receiving and sending module 32 is configured to obtain a first monitoring instruction and, based on the first monitoring instruction, send it to a first unmanned aerial vehicle to instruct the first unmanned aerial vehicle to move to a target height and acquire first image information of a target area;
the power monitoring module 34 is configured to obtain power information of the first unmanned aerial vehicle when the first unmanned aerial vehicle has moved to the target height;
a first relay module 36, configured to execute a first relay operation when the power information satisfies a first condition, where the first relay operation includes: sending a first recovery instruction to the first drone to instruct the first drone to land to a target location along a first path; sending a second monitoring instruction to a second unmanned aerial vehicle to indicate the second unmanned aerial vehicle to move to the target height along a second path, and acquiring second image information of the target area;
an image fusion module 38, configured to perform image relay fusion operation on the first image information and the second image information to obtain target image information.
In an alternative embodiment, the image fusion module 38 includes:
a position information acquisition unit 382, configured to acquire position information of all target objects included in the first image information and the second image information;
an average error calculation unit 384 for determining an average position error of all the target objects in a target time frame based on the position information;
an information stream accessing unit 386, configured to switch the information stream to the information stream of the second image information starting from the target time frame if the average position error satisfies a second condition.
In an optional embodiment, the information flow access unit 386 includes:
the height information acquisition unit 3862 is configured to acquire motion height information of the second unmanned aerial vehicle when the average position error satisfies a second condition;
an information stream accessing unit 3864, configured to switch the information stream to the information stream of the second image information starting from the target time frame if it is determined that the motion height information satisfies the fourth condition.
In an optional embodiment, the apparatus further comprises:
a position coordinate acquisition unit 362, configured to acquire position coordinate information of the first drone and the second drone before sending the second monitoring instruction to the second drone;
a position difference determination unit 364 for determining a position difference between the first drone and the second drone based on the position coordinate information;
an adjusting unit 366, configured to send an adjusting instruction to the second drone to instruct the second drone to move along the second path when the position difference does not satisfy a safety condition.
In an optional embodiment, the apparatus further comprises:
a time calibration module 310, configured to perform a synchronization time correction operation on the first drone and/or the second drone when the first drone and/or the second drone move to the target location, so that an information flow time of the first drone and/or the second drone is a target time.
It should be noted that the above modules may be implemented by software or hardware; for the latter, this may be achieved in, but is not limited to, the following ways: the modules are all located in the same processor, or the modules are located in different processors in any combination.
This embodiment further provides an unmanned aerial vehicle storage device, including:
the parking place is used for parking the first unmanned aerial vehicle and the second unmanned aerial vehicle;
the communication module is in communication connection with the first unmanned aerial vehicle and the second unmanned aerial vehicle and is used for receiving a first monitoring instruction;
and the power supply module is used for charging the first unmanned aerial vehicle and the second unmanned aerial vehicle.
The present invention is illustrated by the following specific examples.
On the hardware side, a set of drone workstations capable of automatic launch and recovery is involved (corresponding to the aforementioned docking station and unmanned aerial vehicle storage device); these workstations can be built near the drone's operating site. Such a workstation provides basic functions including automatic launch, charging, wireless video transmission, drone software and firmware updates, drone system time synchronization, power supply, and a fiber network, and can transmit the drone's video in real time to the central machine room for analysis and processing. Typically, one such workstation can manage two or more drones.
On the software side, functions such as drone management, video access, and video AI analysis are involved. The basic functions are as follows:
S1. The drones (corresponding to the first drone and/or the second drone) can take off and return at fixed times and fixed points. Through settings in the software, a given drone can be scheduled to fly to a preset coordinate and height at a fixed time, monitor a set section of road, and transmit real-time video back (corresponding to step S202). When the drone's battery runs low, a replacement drone is launched; once the replacement is in position, the previous drone returns to charge (corresponding to steps S204-S206). For example, when the remaining battery of the working drone drops to a set threshold, such as 10%, the monitoring system starts the replacement procedure and launches the replacement drone. It should be noted that a wider area can be covered by relaying across multiple drone working points.
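The battery-triggered relay described in step S1 can be sketched as follows. This is a minimal illustration with hypothetical names (the `Drone` class and its methods are not from any real drone SDK); only the 10% threshold and the order of operations (replacement flies out first, then the working drone returns) come from the text.

```python
BATTERY_THRESHOLD = 0.10  # start the relay when remaining charge falls to 10% (from the example)

class Drone:
    """Hypothetical stand-in for a managed drone; not a real SDK class."""
    def __init__(self, name, battery=1.0):
        self.name = name
        self.battery = battery          # remaining charge, 0.0 - 1.0
        self.at_work_point = False

    def fly_to_work_point(self):
        self.at_work_point = True

    def return_to_base(self):
        self.at_work_point = False

def relay_if_needed(working, standby):
    """If the working drone's battery is low, fly the standby to the work
    point first, and only then recall the working drone (steps S204-S206)."""
    if working.battery > BATTERY_THRESHOLD:
        return working                  # no relay needed yet
    standby.fly_to_work_point()
    if standby.at_work_point:           # hand over only once the standby is in place
        working.return_to_base()
    return standby

a = Drone("A", battery=0.09)
a.fly_to_work_point()
b = Drone("B")
active = relay_if_needed(a, b)
print(active.name)  # -> B; drone A has been recalled to charge
```

The key design point is the ordering: monitoring coverage is never interrupted, because the previous drone is recalled only after the replacement confirms it has reached the working point.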
Software such as video access is an existing system; the drone video is accessed into that software system. Meanwhile, a small-target recognition model needs to be trained, and vehicle recognition and tracking are performed on the drone's-eye-view video through this model.
Because the drone's working height and coverage are fixed, the distances and positions along the road can be calibrated, so that vehicle coordinates in the real world can be recovered, making speed calculation, event localization, and the like more accurate.
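Since the road surface is approximately planar and the drone hovers at a fixed working point, the calibration mentioned above can be done once with a pixel-to-road-plane homography. The sketch below uses the standard direct linear transform; the four point pairs are made-up illustrative values, and the patent does not specify this particular method.

```python
import numpy as np

def fit_homography(px, world):
    """Estimate the 3x3 homography H mapping pixel coords to road-plane
    coords from >= 4 correspondences (direct linear transform)."""
    A = []
    for (x, y), (X, Y) in zip(px, world):
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y, -X])
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y, -Y])
    # The null vector of A (last row of V^T) is the flattened homography.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def pixel_to_world(H, x, y):
    """Map an image pixel to road-plane coordinates (metres)."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]

# Calibrate once with four marked road points (illustrative coordinates:
# pixels -> metres along/across a 7.5 m wide, 50 m long road section).
px    = [(100, 400), (540, 400), (80, 120), (560, 120)]
world = [(0.0, 0.0), (7.5, 0.0), (0.0, 50.0), (7.5, 50.0)]
H = fit_homography(px, world)
print(pixel_to_world(H, 540, 400))  # approximately (7.5, 0.0)
```

With every detected vehicle mapped into road-plane metres per frame, speeds and event positions follow directly from the coordinate differences between frames.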
With this larger coverage, the traffic situation can be grasped as a whole: congestion conditions, traffic situations, vehicle speed, traffic flow, vehicle trajectories, and the like can all be analyzed, and rich highway traffic monitoring and management software can be developed on this basis, addressing the short coverage, high cost, and numerous blind spots of fixed cameras.
For some highway events, such as abnormal parking, congestion, or flame and smoke, identification or verification can be carried out from the drone's view; some events can also be judged jointly with event perception from fixed cameras.
The system can also be extended to urban scenes, although these present some difficulties: some roads cannot be fully covered because of tall buildings and tree cover, and some areas impose no-fly restrictions on drones.
S2. The drone workstation can update firmware and software while the drone charges, and performs NTP time synchronization on the drone system at every charge. The NTP server used is the same NTP server as the central end, ensuring that the NTP time used by the drone's video stream keeps a sufficiently small error relative to the central server time. The video stream transmitted by the drone carries an NTP timestamp in which the picture time is recorded (corresponding to step S2010).
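The "sufficiently small error" requirement in step S2 amounts to a skew check on each frame's NTP timestamp against the central server clock. A minimal sketch, assuming a 50 ms tolerance (the patent does not fix a value):

```python
from datetime import datetime, timedelta, timezone

MAX_SKEW = timedelta(milliseconds=50)  # assumed tolerance, not specified in the patent

def stream_time_ok(frame_ntp_time, server_time, max_skew=MAX_SKEW):
    """True if the NTP timestamp carried in a video frame is within
    the allowed skew of the central server clock."""
    return abs(frame_ntp_time - server_time) <= max_skew

now = datetime(2022, 10, 31, 12, 0, 0, tzinfo=timezone.utc)
ok  = stream_time_ok(now + timedelta(milliseconds=20), now)  # within skew
bad = stream_time_ok(now + timedelta(seconds=1), now)        # drifted too far
print(ok, bad)  # -> True False
```

A stream failing this check would be resynchronized at the next charging cycle, which is exactly when the workstation performs NTP correction.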
S3. Each drone has a fixed working point uniquely determined by longitude, latitude, and height. The working points of alternating drones can be set at the same height, separated by a recorded longitude/latitude offset, ensuring that the two drones stay at a safe distance and avoiding collision risk. This can be added to the monitoring system as a distance limit to ensure the drones' working positions are never too close (corresponding to steps S2062-S2066).
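Since the alternating work points share the same height, the distance limit in step S3 reduces to a horizontal great-circle distance between two latitude/longitude points. A sketch using the haversine formula; the 30 m minimum separation is an assumed value, as the patent does not specify one:

```python
import math

SAFE_DISTANCE_M = 30.0  # assumed minimum separation; the patent fixes no value

def horizontal_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (haversine)."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def separation_ok(pos1, pos2, min_dist=SAFE_DISTANCE_M):
    """pos = (lat, lon). Work points share the same height, so only the
    horizontal offset matters for the distance limit (S2062-S2066)."""
    return horizontal_distance_m(*pos1, *pos2) >= min_dist

print(separation_ok((31.2304, 121.4737), (31.2304, 121.4747)))  # ~95 m apart -> True
print(separation_ok((31.2304, 121.4737), (31.2304, 121.4737)))  # same point -> False
```

In the monitoring system this check would gate the second drone's approach: if `separation_ok` fails, an adjustment instruction is sent before the replacement continues along its path.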
S4. After a drone flies to its designated working point, it notifies the monitoring system, which starts the video stream switching procedure (corresponding to steps S20862-S20864).
The video streams of the working drone and the replacement drone are accessed into the video analysis system simultaneously for time alignment, and the vehicle targets on the road are matched in time and space, ensuring that targets map to their real-world positions and that the switch is continuous in both time and space.
Specifically, for the two video streams, each identified target has an absolute position at each moment (each frame). Within a certain time window, error estimation is performed between the two streams: after time matching, the average position error of the targets computed from the two streams is calculated, the time at which the average error between the two pictures is minimal is found, and the time difference between the two streams at that point is taken as the matching result. Once the time error is matched, the replacement drone's video stream is formally accessed and the relay process is complete (corresponding to step S208).
For example, if the total average error between the target positions in the working drone's picture at time T and those in the replacement drone's picture at time T+ΔT is minimal, then at time T the analysis formally switches to the replacement drone's video at time T+ΔT, ensuring continuity of the analysis.
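The ΔT search described above can be sketched as follows for a single tracked target (the patent averages over all targets; extending the sketch means averaging the per-target errors). Track representations and the search range are illustrative assumptions.

```python
import math

def avg_position_error(track_a, track_b, dt):
    """Mean distance between the world positions the two streams report
    for the same target, comparing frame t of stream A with frame t+dt of
    stream B. Tracks are dicts: frame index -> (x, y) world position."""
    errs = [math.dist(track_a[t], track_b[t + dt])
            for t in track_a if (t + dt) in track_b]
    return sum(errs) / len(errs) if errs else float("inf")

def best_offset(track_a, track_b, max_dt=5):
    """Search small frame offsets and return the one minimising the
    average position error - the matching result of step S208."""
    return min(range(-max_dt, max_dt + 1),
               key=lambda dt: avg_position_error(track_a, track_b, dt))

# Illustrative data: stream B reports the same target two frames "late".
a = {t: (t * 1.0, 0.0) for t in range(10)}
b = {t + 2: (t * 1.0, 0.0) for t in range(10)}
print(best_offset(a, b))  # -> 2
```

Once the offset is found, frame T of the outgoing stream is replaced by frame T+ΔT of the replacement stream, which is exactly the switch-over rule stated in the example.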
S5. After the video switching process is complete, the monitoring system instructs the previous working drone to return, completing the drone relay process.
Embodiments of the present invention also provide a computer-readable storage medium having a computer program stored thereon, wherein the computer program is arranged to perform the steps of any of the above-mentioned method embodiments when executed.
In an exemplary embodiment, the computer-readable storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention further provide an electronic device, comprising a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
In an exemplary embodiment, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
For specific examples in this embodiment, reference may be made to the examples described in the above embodiments and exemplary embodiments, and details of this embodiment are not repeated herein.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented on a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of multiple computing devices; and they may be implemented in program code executable by a computing device, so that they may be stored in a storage device and executed by a computing device. In some cases, the steps shown or described may be executed in an order different from that described herein, or they may be fabricated separately as individual integrated circuit modules, or multiple modules or steps among them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the principle of the present invention should be included in the protection scope of the present invention.
Claims (10)
1. A target monitoring method based on an unmanned aerial vehicle is characterized by comprising the following steps:
acquiring a first monitoring instruction, sending the first monitoring instruction to a first unmanned aerial vehicle to instruct the first unmanned aerial vehicle to move to a target height, and collecting first image information of a target area;
acquiring power information of the first unmanned aerial vehicle when the first unmanned aerial vehicle moves to the target height;
executing a first relay operation under the condition that the power information meets a first condition, wherein the first relay operation comprises the following steps: sending a first recovery instruction to the first drone to instruct the first drone to land to a target location along a first path; sending a second monitoring instruction to a second unmanned aerial vehicle to indicate the second unmanned aerial vehicle to move to the target height along a second path, and acquiring second image information of the target area;
and carrying out image relay fusion operation on the first image information and the second image information to obtain target image information.
2. The method according to claim 1, wherein the image relay fusion operation comprises:
acquiring position information of all target objects contained in the first image information and the second image information;
determining an average position error of all the target objects in a target time frame based on the position information;
switching the information stream to an information stream of second image information starting from the target time frame if the average position error satisfies a second condition.
3. The method according to claim 2, wherein in case the average position error satisfies a second condition, the method further comprises:
acquiring the motion height information of the second unmanned aerial vehicle;
in a case where it is determined that the motion altitude information satisfies the fourth condition, switching the information stream to an information stream of second image information starting from the target time frame.
4. The method of claim 1, wherein prior to said sending a second monitoring instruction to a second drone, the method further comprises:
acquiring position coordinate information of the first unmanned aerial vehicle and the second unmanned aerial vehicle;
determining a position difference between the first drone and the second drone based on the position coordinate information;
and sending an adjusting instruction to the second unmanned aerial vehicle to indicate the second unmanned aerial vehicle to move along the second path under the condition that the position difference does not meet the safety condition.
5. The method of claim 1, further comprising:
and under the condition that the first unmanned aerial vehicle and/or the second unmanned aerial vehicle move to the target position, performing synchronization time correction operation on the first unmanned aerial vehicle and/or the second unmanned aerial vehicle so as to enable the information flow time of the first unmanned aerial vehicle and/or the second unmanned aerial vehicle to be target time.
6. A target monitoring device based on an unmanned aerial vehicle, characterized by comprising:
an instruction receiving and sending module, configured to acquire a first monitoring instruction and send the first monitoring instruction to a first unmanned aerial vehicle to instruct the first unmanned aerial vehicle to move to a target height and collect first image information of a target area;
a power monitoring module, configured to acquire power information of the first unmanned aerial vehicle when the first unmanned aerial vehicle moves to the target height;
a first relay module, configured to execute a first relay operation when the power information satisfies a first condition, where the first relay operation includes: sending a first recovery instruction to the first drone to instruct the first drone to land to a target location along a first path; sending a second monitoring instruction to a second unmanned aerial vehicle to indicate the second unmanned aerial vehicle to move to the target height along a second path, and acquiring second image information of the target area;
and the image fusion module is used for carrying out image relay fusion operation on the first image information and the second image information so as to obtain target image information.
7. The apparatus of claim 6, wherein the image fusion module comprises:
the position information acquisition unit is used for acquiring the position information of all target objects contained in the first image information and the second image information;
an average error calculation unit, configured to determine an average position error of all the target objects in a target time frame based on the position information;
and an information stream access unit, configured to switch the information stream to an information stream of second image information starting from the target time frame when the average position error satisfies a second condition.
8. An unmanned aerial vehicle storage device, characterized by comprising:
the parking place is used for parking the first unmanned aerial vehicle and the second unmanned aerial vehicle;
the communication module is in communication connection with the first unmanned aerial vehicle and the second unmanned aerial vehicle and is used for receiving a first monitoring instruction;
and the power supply module is used for charging the first unmanned aerial vehicle and the second unmanned aerial vehicle.
9. A computer-readable storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 5 when executed.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211344787.4A CN115407803A (en) | 2022-10-31 | 2022-10-31 | Target monitoring method and device based on unmanned aerial vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115407803A true CN115407803A (en) | 2022-11-29 |
Family
ID=84167381
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211344787.4A Pending CN115407803A (en) | 2022-10-31 | 2022-10-31 | Target monitoring method and device based on unmanned aerial vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115407803A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105763423A (en) * | 2016-03-22 | 2016-07-13 | 临沂高新区翔鸿电子科技有限公司 | Information exchange method of unmanned aerial vehicles |
CN110176955A (en) * | 2019-07-01 | 2019-08-27 | 北京有感科技有限责任公司 | UAV Communication base station, communication system and communication system construction method |
CN111402286A (en) * | 2018-12-27 | 2020-07-10 | 杭州海康威视***技术有限公司 | Target tracking method, device and system and electronic equipment |
CN113052876A (en) * | 2021-04-25 | 2021-06-29 | 合肥中科类脑智能技术有限公司 | Video relay tracking method and system based on deep learning |
WO2022143181A1 (en) * | 2020-12-29 | 2022-07-07 | 青岛千眼飞凤信息技术有限公司 | Information processing method and apparatus, and information processing system |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117572885A (en) * | 2023-11-20 | 2024-02-20 | 鸣飞伟业技术有限公司 | Night tracking method, system and related device based on thermal infrared camera of unmanned aerial vehicle |
CN117572885B (en) * | 2023-11-20 | 2024-05-31 | 鸣飞伟业技术有限公司 | Night tracking method, system and related device based on thermal infrared camera of unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||