CN117950422B - Unmanned aerial vehicle inspection system and inspection method

Publication number: CN117950422B
Authority: CN (China). Legal status: Active (granted).
Application number: CN202410342861.1A (filed in Chinese (zh))
Other versions: CN117950422A
Inventors: 胡媛彦, 乔雷章
Current and original assignee: Hangda Holly Tianjin Aviation Technology Co., Ltd.
The application was filed by Hangda Holly Tianjin Aviation Technology Co., Ltd.; CN117950422A was published first, and the application was then granted and published as CN117950422B. Current legal status: Active.

Classification: Image Processing (AREA)

Abstract

The invention discloses an unmanned aerial vehicle inspection system and an inspection method, belonging to the technical field of unmanned aerial vehicle position and channel control. The inspection system comprises an information acquisition module, which acquires geographic information and/or real-time target state information by controlling the channel and three-dimensional position of the unmanned aerial vehicle, and which comprises an orthographic camera and an oblique camera; the lens of the orthographic camera faces straight down, the lens of the oblique camera faces obliquely downward, and the pictures taken by the two cameras at least partially overlap. A real-time detection module comprises a photoelectric pod carried by the unmanned aerial vehicle and transmits the video information acquired by the information acquisition module to a ground server in real time. A data management module supports user viewing and management, and a data interpretation and application module distinguishes target features and behavior, obtaining data on the related geographic information and the real-time state of the target. The invention can control and correct the three-dimensional position and channel of the unmanned aerial vehicle so as to acquire more accurate information.

Description

Unmanned aerial vehicle inspection system and inspection method
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle position and channel control, and particularly relates to an unmanned aerial vehicle inspection system and an inspection method.
Background
An unmanned aerial vehicle inspection system performs inspection and monitoring using UAV technology. It acquires images, video, and data in real time by controlling the UAV's channel and three-dimensional position and by using onboard equipment such as sensors and cameras, and transmits them to a ground control center for analysis and processing. UAV inspection systems can be applied in many fields, such as security, environmental monitoring, and agriculture, and have the advantages of speed, efficiency, and flexibility. Such a system can replace manual inspection of dangerous or hard-to-reach areas, improving work efficiency and safety. In addition, a UAV inspection system can incorporate artificial intelligence techniques to realize functions such as autonomous flight and target recognition, further raising the system's level of intelligence.
When a UAV performs tasks such as aerial photography, map making, and target reconnaissance, accurate geographic information and target positions must be obtained. Existing UAV cruising systems generally use a single camera to collect information; the collected information is not accurate enough, and the target is easily lost during collection. How to control and correct the three-dimensional position and channel of the UAV so as to acquire more accurate information is therefore a problem to be solved.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides an unmanned aerial vehicle inspection system and inspection method that control and correct the three-dimensional position and channel of the UAV so as to acquire more accurate information, thereby solving the problems in the prior art.
The unmanned aerial vehicle inspection system comprises a central processing module in signal connection with an information acquisition module. The information acquisition module acquires geographic information and/or real-time target state information by controlling the channel and three-dimensional position of the unmanned aerial vehicle, and comprises an orthographic camera and an oblique camera; the lens of the orthographic camera faces straight down, the lens of the oblique camera faces obliquely downward, and the pictures taken by the two cameras at least partially overlap.
The central processing module is also in signal connection with a real-time detection module, a data management module, a data interpretation and application module, and a feedback control module. The real-time detection module comprises a photoelectric pod carried by the unmanned aerial vehicle and transmits the video information acquired by the information acquisition module to a ground server in real time. The data management module divides the data received by the ground server into photos, orthographic TIFF images, and video, and supports user viewing and management. The data interpretation and application module distinguishes target features and judges target behavior, obtaining data on the related geographic information and the real-time state of the target; the feedback control module further corrects the channel and three-dimensional position of the unmanned aerial vehicle based on the resulting geographic-information and target-state data.
Preferably, the information acquisition module includes:
An oblique camera angle adjusting unit, for adjusting the tilt angle of the oblique camera according to the flight height and the size of the target area;
A dual-camera shooting trigger unit, which triggers the oblique camera to shoot after the flight controller issues a shooting signal; when the oblique camera shoots, it simultaneously sends a level trigger signal through the hot-shoe cable to trigger the orthographic camera, so that a single signal controls both cameras to shoot at the same time;
A flight route design unit, for designing a flight route according to the task area and the orientation of the oblique camera, automatically tracking a target monitored in real time;
And an information recording unit, for recording the position of the unmanned aerial vehicle each time the flight controller triggers the cameras, forming POS information that records the shooting position, attitude, and height.
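The single-signal trigger chain and POS recording described by these units can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class and method names are hypothetical, and the hot-shoe level signal is modeled as a direct method call.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PosRecord:
    """One POS entry: position and attitude of the UAV at trigger time."""
    lat: float
    lon: float
    alt: float    # flight height in metres
    roll: float   # attitude angles in degrees
    pitch: float
    yaw: float

@dataclass
class DualCameraTrigger:
    """Flight control fires the oblique camera; the oblique camera's hot-shoe
    level signal (modeled here as a direct call) fires the orthographic camera,
    so one flight-control signal yields one POS record and one photo pair."""
    pos_log: List[PosRecord] = field(default_factory=list)
    oblique_shots: int = 0
    ortho_shots: int = 0

    def on_flight_control_signal(self, pos: PosRecord) -> None:
        self.pos_log.append(pos)     # record POS at the moment of triggering
        self._fire_oblique_camera()

    def _fire_oblique_camera(self) -> None:
        self.oblique_shots += 1
        self._on_hot_shoe_level()    # hot-shoe cable relays a level trigger signal

    def _on_hot_shoe_level(self) -> None:
        self.ortho_shots += 1        # orthographic camera shoots simultaneously
```

Because the POS record and both exposures share one trigger event, the POS point count always equals the photo count, which is the mismatch that the Y-cable approach discussed later in the description suffers from.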
As a preferred embodiment of the present invention, the dual-camera shooting trigger unit is in signal connection with a multi-angle trigger subunit; when the orthographic camera and the oblique camera reach a preset included angle, both cameras automatically take pictures.
As a preferred embodiment of the present invention, the real-time detection module includes:
The first signal communication unit, which connects the pod's video and control directly to ground host-computer software through the image transmission link, allowing the pod to be controlled and the video viewed on the host computer;
The second signal communication unit, which transmits the pod video directly to the ground through the image transmission link; the ground station controls pod motion and tracking through a pod plug-in, the pod communicates with the flight controller through the flight-control serial port, and both the real-time image and the pod's actual observation direction can be viewed at the ground station;
The medium-to-long-distance real-time reconnaissance unit, which performs wide-area aerial searching and monitoring and returns live video in real time;
And the short-distance real-time reconnaissance unit, which has the unmanned aerial vehicle hover at high altitude and search and monitor by controlling pod motion and zoom, returning live video in real time; after a target is found, double-clicking tracks and locks the target, and the UAV flies to the target position displayed by the ground station, hovers above the vicinity of the target point, and observes target details by zooming.
As a preferred embodiment of the present invention, the medium-and-long-distance real-time reconnaissance unit includes:
A first task application subunit: responsible for receiving and processing task applications, including the task type and target area information, and for performing preliminary evaluation and assignment;
A duty assignment subunit: according to the task application and actual conditions, assigns a specific on-duty unmanned aerial vehicle and a fixed operating base (FBO) to execute the task;
A route and airspace optimization subunit: performs route planning and airspace optimization based on task requirements and flight safety rules, ensuring that the UAV's flight route is efficient and complies with relevant regulations and requirements;
A weather condition detection subunit: monitors and evaluates the weather conditions of the target area to provide important information on the environment and flight safety, terminating the mission if conditions are unsuitable;
An airspace detection subunit: detects the airspace conditions of the target area, including available airspace, restricted airspace, and no-fly zones for the UAV, ensuring that flight activities comply with regulations and terminating the mission if they do not;
A first task execution subunit: covers pre-flight preparation, task flight, and landing; it executes the specified tasks, performs wide-area aerial searching and monitoring during flight, returns live video in real time, uses the multi-zoom photoelectric pod for target identification, tracking, and detail observation, and flies to the vicinity of the target to hover as required so as to acquire the needed information.
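The gating behavior of the weather and airspace detection subunits (terminate the mission at the first failed check, otherwise clear the task flight) can be sketched as follows; the check names and return shape are illustrative, not prescribed by the patent:

```python
from typing import Callable, List, Tuple

def run_preflight_checks(checks: List[Tuple[str, Callable[[], bool]]]) -> Tuple[bool, str]:
    """Run the gating checks in order (e.g. weather, then airspace); the mission
    is terminated at the first check that fails, mirroring the subunits above."""
    for name, check in checks:
        if not check():
            return False, f"mission terminated: {name} check failed"
    return True, "cleared for task flight"
```

A mission would call this with the weather and airspace evaluations before entering the task execution subunit.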
As a preferred embodiment of the present invention, the short-range real-time reconnaissance unit includes:
A second task application subunit: responsible for receiving and processing second task applications, including the task type and target area information, and for performing preliminary evaluation and assignment;
A take-off/landing point and task route determination subunit: after receiving the second task application, determines a suitable take-off and landing point and plans the task route;
A site preparation subunit: checks and evaluates in real time the surroundings of the take-off point, the flight area, and the flight height;
A weather condition judgment subunit: responsible for evaluating the meteorological conditions of the current task area, including wind speed, temperature, and weather, and for determining the feasibility of the flight according to the established limiting conditions, jumping back to the site preparation subunit if the conditions are not met;
An available-airspace judgment subunit: judges the airspace available for flight in the task area according to local regulations and policy requirements, ensuring that task execution complies with the relevant airspace limits and jumping back to the site preparation subunit if the conditions are not met;
A second task execution subunit: covers pre-flight preparation, task flight, and landing; it is responsible for executing the specified tasks, performing close-range real-time monitoring according to task requirements, searching and monitoring by means of the photoelectric pod's motion control and zoom functions, and returning live video in real time;
And a data post-processing and uploading subunit: responsible for post-processing and analyzing the acquired data, including image processing, labeling, and sorting, and for uploading the processed data to a designated location or server for further analysis and use.
Preferably, the data management module includes:
The data classification unit, which classifies the data and displays it according to different labels, displays photos on a GIS map according to the longitude and latitude of the actual shooting position, and allows files to be selected for direct viewing, downloading, or deletion;
The label insertion unit, which inserts labels into the data so that it can be displayed by label;
The picture management unit, for downloading and deleting pictures;
The data comparison unit, which selects photos of the same position from different periods for comparison, observing how the environment near the target point changes over time;
And the video management unit, for previewing, downloading, and deleting video.
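The data comparison unit's selection of same-position photos from different periods can be sketched as grouping by rounded longitude/latitude. The record fields and the rounding precision are assumptions for illustration; the patent does not specify how "same position" is matched.

```python
from collections import defaultdict

def group_by_position(photos, precision=4):
    """Bucket photo records by rounded (lat, lon) so shots of the same target
    point taken in different periods can be pulled up side by side."""
    buckets = defaultdict(list)
    for p in photos:
        key = (round(p["lat"], precision), round(p["lon"], precision))
        buckets[key].append(p)
    for shots in buckets.values():
        shots.sort(key=lambda p: p["time"])  # oldest first, to observe change over time
    return buckets
```

Rounding to four decimal places groups photos within roughly ten metres of one another; a real system might instead use a distance threshold.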
Preferably, the data interpretation and application module includes:
The face recognition unit, which can perform face recognition on the acquired video, quickly identifying face information in the video and classifying and sorting the face screenshots into folders, each folder containing one person's face screenshots at different moments; the screenshots can subsequently be matched against the personnel databases of different departments to quickly output personnel information;
A data acquisition unit: obtains the needed information on ships, navigation marks, and floating objects from data such as the images and video transmitted by the unmanned aerial vehicle;
A feature extraction unit: extracts features from the acquired images, obtaining the key features for subsequent object recognition and behavior analysis;
A data processing unit: processes and analyzes the extracted features, for example for the classification, categorization, and comparison of ships, navigation marks, and floating objects;
A data classification and identification unit: compares the processed features with an established database or trained model to achieve rapid classification and identification of the objects (ships, navigation marks, and floating objects) in the image;
And a system comparison and judgment unit: judges and evaluates the identification of the object according to the comparison result and outputs related information or warnings as required.
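A minimal sketch of the data classification and identification unit's database comparison, using nearest-neighbour distance on illustrative feature vectors. The real features, reference database, and matching model are not specified by the patent; the vectors below are toy values.

```python
import math

# Toy reference database: class name -> a representative feature vector.
# Real features would come from the feature extraction unit.
REFERENCE_DB = {
    "ship":            [0.9, 0.8, 0.1],
    "navigation mark": [0.1, 0.9, 0.8],
    "floating object": [0.2, 0.1, 0.9],
}

def classify(features, db=REFERENCE_DB):
    """Return the database label whose representative vector lies nearest
    (Euclidean distance) to the extracted feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(db, key=lambda label: dist(features, db[label]))
```

The comparison and judgment unit would then threshold the winning distance and emit information or a warning accordingly.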
Preferably, the feedback control module includes:
An error calculation unit: calculates an error value from the difference between the current position of the unmanned aerial vehicle and the expected channel or target position;
A control instruction generation unit: uses a control algorithm to generate the corresponding control instruction from the error value;
A correction unit: calculates the required course correction from the target position and the current position of the UAV and generates the corresponding control instruction to correct the course; calculates the required height correction from the terrain height of the target area and adjusts the UAV's height through a control instruction; and calculates the required channel correction from the target position and the current position of the UAV, generating a new channel;
And a real-time monitoring and adjustment unit: continuously monitors the UAV's channel, altitude, and course during correction, adjusting and optimizing the correction algorithm according to real-time data so as to improve correction accuracy and stability.
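A minimal sketch of the error calculation and correction units, assuming a simple proportional control law; the patent does not fix a particular control algorithm, and the gains and units here are illustrative.

```python
class CourseCorrector:
    """Proportional feedback sketch: the error between the current state and the
    desired course/height produces heading and altitude correction commands."""

    def __init__(self, kp_heading=1.0, kp_alt=0.5):
        self.kp_heading = kp_heading  # illustrative gains
        self.kp_alt = kp_alt

    def heading_error(self, current_heading, target_heading):
        """Smallest signed angle in degrees from current to target heading."""
        return (target_heading - current_heading + 180.0) % 360.0 - 180.0

    def corrections(self, current_heading, target_heading, current_alt, target_alt):
        """Error values -> correction commands (degrees of heading change,
        metres of altitude change)."""
        return {
            "heading": self.kp_heading * self.heading_error(current_heading, target_heading),
            "altitude": self.kp_alt * (target_alt - current_alt),
        }
```

The real-time monitoring and adjustment unit would tune gains like `kp_heading` from observed tracking performance.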
The unmanned aerial vehicle inspection method uses the unmanned aerial vehicle inspection system and comprises the following steps:
the lens of the orthographic camera faces straight down, collecting geographic information and/or real-time target state information directly below, and the lens of the oblique camera faces obliquely downward, collecting geographic information and/or real-time target state information obliquely below;
transmitting the video information acquired by the information acquisition module to a ground server in real time;
dividing the data received by the ground server into photos, orthographic TIFF images, and video, and supporting user viewing and management;
and distinguishing target features and judging target behavior, obtaining data on the related geographic information and real-time state of the target.
Compared with the prior art, the invention has the following beneficial effects:
1. Through the cooperation of the orthographic camera and the oblique camera, the invention can acquire richer and more accurate data. The real-time detection module, data management module, and data interpretation and application module transmit data to the control center, manage and analyze it, and obtain result data; the feedback control module then further corrects the channel and three-dimensional position of the unmanned aerial vehicle based on the result data, acquiring still richer and more accurate data.
2. The invention brings notable improvements to the inspection system in work efficiency, data collection, cost saving, and safety, with the following advantages:
data can be uploaded promptly, and camera technology can be used to resolve object features and judge behavior, helping to quickly acquire data on the geographic information and real-time state of the target;
excessive dependence on ground personnel is reduced, for example by automatically classifying and sorting face screenshots and reducing manual operation, saving labor and material costs and improving work efficiency while also reducing the safety hazards that manual operation may cause;
information is acquired with dual cameras carried by a small or medium-sized vertical take-off and landing (VTOL) fixed-wing UAV, improving geographic information acquisition efficiency; the acquired photos better match human visual perception, making problems easier to find;
real-time monitoring and reconnaissance tasks are realized with small and medium-sized VTOL fixed-wing UAVs and multi-rotor UAVs carrying the photoelectric pod. These UAVs offer a high-altitude viewing angle, simple operation, flexibility, and maneuverability; they can quickly find targets and return video in real time, helping personnel understand the real-time state of the target area.
Drawings
Fig. 1 is a system block diagram of the unmanned aerial vehicle inspection system provided by an embodiment of the invention;
Fig. 2 is a schematic position diagram of the orthographic camera and oblique camera of the unmanned aerial vehicle inspection system provided by an embodiment of the invention;
Fig. 3 is a schematic flow chart of the unmanned aerial vehicle inspection system provided by an embodiment of the invention;
Fig. 4 is a detailed system block diagram of the information acquisition module provided by an embodiment of the invention;
Fig. 5 is a schematic flow chart of the dual-camera shooting trigger unit provided by an embodiment of the invention;
Fig. 6 is a system block diagram of the unmanned aerial vehicle inspection system with the multi-angle trigger subunit added, provided by an embodiment of the invention;
Fig. 7 is a detailed system block diagram of the real-time detection module provided by an embodiment of the invention;
Fig. 8 is a schematic flow chart of the first signal communication unit provided by an embodiment of the invention;
Fig. 9 is a schematic flow chart of the second signal communication unit provided by an embodiment of the invention;
Fig. 10 is a detailed system block diagram of the medium-to-long-distance real-time reconnaissance unit provided by an embodiment of the invention;
Fig. 11 is a block flow diagram of the medium-to-long-distance real-time reconnaissance unit provided by an embodiment of the invention;
Fig. 12 is a schematic flow chart of the medium-to-long-distance real-time reconnaissance unit provided by an embodiment of the invention;
Fig. 13 is a detailed system block diagram of the short-distance real-time reconnaissance unit provided by an embodiment of the invention;
Fig. 14 is a block flow diagram of the short-distance real-time reconnaissance unit provided by an embodiment of the invention;
Fig. 15 is a detailed system block diagram of the data management module provided by an embodiment of the invention;
Fig. 16 is a detailed system block diagram of the data interpretation and application module provided by an embodiment of the invention;
Fig. 17 is a detailed system block diagram of the feedback control module provided by an embodiment of the invention.
Detailed Description
For a further understanding of the invention, its features and advantages, reference is now made to the following examples, which are illustrated in the accompanying drawings.
The structure of the present invention will be described in detail with reference to the accompanying drawings.
Referring to figs. 1-3, the unmanned aerial vehicle inspection system provided by an embodiment of the invention comprises a central processing module in signal connection with an information acquisition module. The information acquisition module acquires geographic information and/or real-time target state information by controlling the channel and three-dimensional position of the unmanned aerial vehicle, and comprises an orthographic camera and an oblique camera; the lens of the orthographic camera faces straight down, the lens of the oblique camera faces obliquely downward, and the pictures taken by the two cameras at least partially overlap.
The central processing module is also in signal connection with a real-time detection module, a data management module, a data interpretation and application module, and a feedback control module. The real-time detection module comprises a photoelectric pod carried by the unmanned aerial vehicle and transmits the video information acquired by the information acquisition module to a ground server in real time. The data management module divides the data received by the ground server into photos, orthographic TIFF images, and video, and supports user viewing and management. The data interpretation and application module distinguishes target features and judges target behavior, obtaining data on the related geographic information and the real-time state of the target; the feedback control module further corrects the channel and three-dimensional position of the unmanned aerial vehicle based on the resulting geographic-information and target-state data.
Illustratively, resolving target features and discriminating their behavior includes: performing face recognition and tracking on a target person and automatically classifying and sorting the face screenshots; or identifying an object target, for example judging whether it is a ship, a navigation mark, or an ordinary floating object, tracking the object, and obtaining its real-time state data.
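The automatic classification of face screenshots into per-person folders can be sketched as greedy grouping on face-embedding vectors. The embedding source, the vector length, and the similarity threshold are all assumptions for illustration; the patent does not specify a recognition algorithm.

```python
def group_face_screenshots(screenshots, threshold=0.8):
    """Greedy grouping: each screenshot carries a (hypothetical) embedding
    vector; a screenshot joins the first folder whose representative it matches
    by cosine similarity, else it opens a new folder - one folder per person."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb)

    folders = []  # each folder: {"rep": embedding, "files": [...]}
    for shot in screenshots:
        for folder in folders:
            if cosine(shot["embedding"], folder["rep"]) >= threshold:
                folder["files"].append(shot["file"])
                break
        else:
            folders.append({"rep": shot["embedding"], "files": [shot["file"]]})
    return folders
```

Each resulting folder then holds one person's screenshots at different moments, ready to be matched against a personnel database.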
Through the cooperation of the orthographic camera and the oblique camera, richer and more accurate data can be acquired. The real-time detection module, data management module, and data interpretation and application module transmit data to the control center, manage and analyze it, and obtain result data; the feedback control module then further corrects the channel and three-dimensional position of the unmanned aerial vehicle based on the result data, acquiring still richer and more accurate data.
With this arrangement, the inspection system brings notable improvements in work efficiency, data collection, cost saving, and safety, with the following advantages:
data can be uploaded promptly, and camera technology can be used to resolve object features and judge behavior, helping to quickly acquire data on the geographic information and real-time state of the target;
excessive dependence on ground personnel is reduced, for example by automatically classifying and sorting face screenshots and reducing manual operation, saving labor and material costs and improving work efficiency while also reducing the safety hazards that manual operation may cause;
information is acquired with dual cameras carried by a small or medium-sized VTOL fixed-wing UAV, improving geographic information acquisition efficiency; the acquired photos better match human visual perception, making problems easier to find;
real-time monitoring and reconnaissance tasks are realized with small and medium-sized VTOL fixed-wing UAVs and multi-rotor UAVs carrying the photoelectric pod. These UAVs offer a high-altitude viewing angle, simple operation, flexibility, and maneuverability; they can quickly find targets and return video in real time, helping personnel understand the real-time state of the target area.
It should be noted that the combination of the orthographic camera and the oblique camera has the following advantages:
1. Multi-angle shooting: through the cooperation of the orthographic and oblique cameras, the target area can be shot from different angles and viewpoints. The orthographic camera, with its lens pointing straight down, provides an overhead view that presents the overall picture of the target area; the oblique camera, with its lens pointing obliquely downward, captures images with more detail and a stronger sense of depth.
2. Geographic information acquisition: the orthographic camera is mainly used to acquire orthographic images, providing high-resolution, true-scale image data for map making and for geographic information tasks such as measuring the size of ground objects. The oblique camera, through its tilt angle, captures spatial details and features beyond the orthographic image, providing richer geographic information.
3. Real-time monitoring of the target area: using the orthographic and oblique cameras together, image data of the target area can be acquired in real time, and object changes, activities, or behaviors within it can be monitored and analyzed. Because the two cameras' pictures at least partially overlap, the real-time state of the target area can be judged and tracked more accurately, improving the accuracy and efficiency of monitoring and feedback.
4. Comprehensive analysis and decision-making: the data collected by the orthographic and oblique cameras can be comprehensively analyzed, displayed as overlays on a map or three-dimensional model, and combined with other information such as sensor data, helping users understand the target area more completely and accurately and make decisions and plans on that basis.
In summary, using the orthographic and oblique cameras in combination provides multi-angle shooting, rich geographic information acquisition, real-time monitoring of the target area, and support for comprehensive analysis and decision-making. The combination gives a more complete and accurate understanding of the target area and better supports inspection and monitoring tasks in various application scenarios.
Referring to fig. 4 and 5, the information acquisition module includes:
An oblique camera angle adjusting unit for adjusting the oblique angle of the oblique camera according to the flying height and the size of the target area;
And the double-camera shooting trigger unit (refer to fig. 5) triggers the oblique shooting camera to shoot after the flight control sends out shooting signals, and simultaneously sends out level trigger signals through the hot shoe wire when the oblique shooting camera shoots, triggers the orthoshooting camera to shoot simultaneously, and realizes the function of controlling the double cameras to shoot simultaneously by a single signal. It should be noted that the normal camera may be triggered first, and then the oblique camera may be triggered by the normal camera. In the prior art, unmanned aerial vehicle only has one path of photographing signal generally, adopts the Y line mode to carry out signal triggering to two cameras, and the actual measurement in-process can discover unmanned aerial vehicle POS point number and the inconsistent condition of photo quantity, can't one-to-one, leads to each photo to take a photograph the time longitude and latitude unable position of determining. And through the trigger unit that shoots of dual camera, the photo uniformity of shooing is higher, can solve above-mentioned problem.
The flight route design unit is used for automatically tracking a real-time monitored target according to the task area and the direction of the oblique camera to design a flight route;
And the information recording unit, which records the position of the unmanned aerial vehicle each time the flight controller triggers the cameras to take a picture, forming POS information that records the shooting position, attitude and height.
With this arrangement, the pictures from the two cameras only need to overlap; there is no lateral overlap-ratio requirement, which effectively enlarges the field of view to about three times that of a single orthographic camera and greatly improves information acquisition efficiency. The information acquisition module can collect geographic information as well as specific target information. Through oblique-camera angle adjustment, dual-camera shooting triggering, flight route design and information recording, the module achieves efficient camera cooperation for acquiring geographic information and/or specific target information; this arrangement not only meets acquisition requirements but also improves flight efficiency and solves the problem of photos whose positions cannot be determined. Moreover, by automatically customizing a more scientific inspection route and automatically tracking and planning routes, the unmanned aerial vehicle inspection task becomes automated and efficient, greatly improving working efficiency.
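As a concrete illustration, the one-to-one correspondence between POS points and photo pairs guaranteed by single-signal triggering can be sketched as follows (a minimal Python sketch; the record fields and function names are illustrative, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class PosRecord:
    """One POS entry logged at each shutter trigger (fields are illustrative)."""
    lon: float    # longitude at trigger time, degrees
    lat: float    # latitude, degrees
    alt: float    # flight height, metres
    roll: float   # attitude angles, degrees
    pitch: float
    yaw: float

def match_photos_to_pos(pos_records, photo_count):
    """With single-signal dual-camera triggering, every POS point corresponds
    to exactly one photo pair, so the counts must satisfy photos = 2 * POS."""
    if photo_count != 2 * len(pos_records):
        raise ValueError("POS count and photo count are inconsistent")
    # Pair each POS record with the indices of its nadir/oblique photos.
    return [(rec, 2 * i, 2 * i + 1) for i, rec in enumerate(pos_records)]
```

With a Y-cable, dropped frames make the count check fail and geotagging becomes ambiguous; with the single-signal trigger the check holds by construction.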
Illustratively, adjusting the angle of the oblique camera to accommodate different fly heights and target area sizes may be performed as follows:
1. Acquiring flight altitude and target area size information: and acquiring the flying height of the current unmanned aerial vehicle, and determining the size of a target area to be covered according to task requirements.
2. Calculating the required inclination angle: according to the flying height and the size of the target area, the inclination angle required by the oblique-shooting camera is calculated by combining the technical parameters of the camera (such as focal length, sensor size and the like).
3. Adjusting the angle of the oblique camera: adjust the camera angle with the oblique-camera angle adjusting unit according to the calculated oblique angle. The adjustment may be implemented by motors, servos or other mechanical means; following the design of the angle adjusting unit, the unit is controlled to set the tilt according to a preset rule. For example, a remote control, a ground control station or an automated program may be used to achieve precise adjustment.
4. Confirming the angle adjusting effect: after the angle adjustment is completed, whether the angle of the oblique-shooting camera is properly adjusted is verified by monitoring or playing back the photographed image or video in real time. It is necessary to ensure that the shots overlap at least partially to obtain complete geographical information.
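The tilt-angle calculation in step 2 can be sketched with simple flat-terrain geometry (a hedged example: the formula and parameter names are an assumption for illustration, not the patent's actual method):

```python
import math

def oblique_tilt_angle(flight_height_m, target_half_width_m,
                       focal_length_mm, sensor_width_mm):
    """Pick a tilt (degrees from nadir) so the oblique camera's outer image
    edge reaches the far boundary of the target strip while its inner edge
    still overlaps the nadir camera's footprint. Assumes flat terrain."""
    # Half field of view from focal length and sensor width.
    half_fov = math.atan(sensor_width_mm / (2.0 * focal_length_mm))
    # Angle from nadir to the far edge of the target strip.
    edge_angle = math.atan(target_half_width_m / flight_height_m)
    # Centre the optical axis so the outer ray hits the strip edge;
    # clamp at 0 when the lens already covers the strip pointing straight down.
    return max(0.0, math.degrees(edge_angle - half_fov))
```

For example, at 100 m height with a 100 m half-width strip and a 35 mm lens on a 36 mm-wide sensor, the sketch yields a tilt of roughly 18 degrees.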
Referring to fig. 6, the dual-camera shooting trigger unit is connected with a multi-angle trigger subunit: when the included angle between the orthographic camera and the oblique camera reaches a preset value, both cameras automatically take pictures. For example, one group of photos is taken automatically when the included angle is 30 degrees, and another group when it is 35 degrees.
When the normal shooting camera and the oblique shooting camera reach a preset included angle, the automatic shooting of the photo has the following advantages:
1. multi-angle viewing angle: through automatically taking photos under different included angles, the visual angles of multiple angles of the target area can be obtained, so that the obtained geographic information is more comprehensive and has diversity, and more accurate analysis and evaluation of the target area are facilitated.
2. Rich detailed information: the orthographic camera and the oblique camera provide different image characteristics under different included angles, and more detail information including height change, topographic texture, object shape and the like can be obtained by combining photos under different included angles, so that the target area can be more comprehensively known.
3. Three-dimensional and depth perception: because the oblique camera is tilted, the images it captures have a stronger sense of depth and three-dimensionality. By triggering the orthographic and oblique cameras simultaneously, subsequent processing can apply stereoscopic imaging techniques or three-dimensional reconstruction algorithms to generate more realistic three-dimensional models or elevation data, further improving the quality and visual effect of the geographic information.
4. Data registration and alignment: through the pictures taken under different included angles, the image registration and comparison can be carried out, and through the alignment of the images with different included angles, the conditions of change, object movement and the like can be detected more easily, and accurate measurement and analysis can be carried out.
In summary, through automatically taking the photos under different included angles, the orthographic camera and the oblique camera can provide multi-angle viewing angles, abundant detailed information, stereoscopic impression and depth impression, and more accurate data registration and comparison. This allows for more comprehensive and accurate collection of geographic information and improves the reliability and effectiveness of subsequent processing and analysis.
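A minimal sketch of how the multi-angle trigger subunit might decide when to fire (the preset angles 30° and 35° come from the example above; the tolerance parameter and function name are assumptions):

```python
def angle_trigger(included_angle_deg, presets=(30.0, 35.0), tol=0.5):
    """Return the preset included angle (degrees) that the current camera
    geometry matches to within `tol`, or None if no photo should be taken."""
    for preset in presets:
        if abs(included_angle_deg - preset) <= tol:
            return preset  # fire both cameras for this preset angle
    return None
```

A real implementation would also latch each preset so a group of photos is taken only once per pass through the angle.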
Referring to fig. 7-9, the real-time investigation module includes:
Referring to fig. 8, in the first signal communication unit, the pod video and control are connected directly to the ground host-computer software through the image transmission link, and control and video viewing are performed through the host computer;
Referring to fig. 9, in the second signal communication unit, the pod video is transmitted directly to the ground through the image transmission link while control passes through the flight-control serial port; pod motion and tracking are controlled through the pod plug-in at the ground station, where the real-time image and the pod's actual observation direction can both be viewed;
The medium-long distance real-time reconnaissance unit is used for carrying out large-scale air searching and monitoring and returning live videos in real time;
And the short-distance real-time reconnaissance unit, which makes the unmanned aerial vehicle hover at high altitude and search and monitor by controlling pod motion and zoom, returning the live video in real time; after a target is found, it is tracked and locked by double-click, the vehicle flies to the target position displayed by the ground station and hovers above the target point, and the target details are observed by zooming.
The modules form a real-time reconnaissance system, which allows long-distance and short-distance tasks to be performed, and various technologies and devices are utilized to realize the searching and monitoring of the target area and the tracking and detail observation of the target. The system provides high-efficiency and flexible reconnaissance capability, can be applied to various demand scenes, and enhances the capability of on-site reconnaissance and information collection.
Referring to fig. 10-12, the middle-long distance real-time reconnaissance unit includes:
A first task application subunit: the method comprises the steps of being responsible for receiving and processing task applications, including task types and target area information, and performing preliminary evaluation and distribution;
assigning duty subunits: according to the task application and actual conditions, a specific duty unmanned aerial vehicle and a fixed operation base (FBO) are assigned to execute the task;
Optimizing a route and a airspace subunit: based on task requirements and flight safety rules, route planning and airspace optimization are carried out, so that the flight route of the unmanned aerial vehicle is ensured to be efficient and accords with related regulations and requirements;
Weather condition detection subunit: monitors and evaluates the weather conditions of the target area, including parameters such as wind speed, rain and snow, to provide key information for environmental and flight-safety evaluation; if the parameters do not meet the requirements, the task is terminated;
An airspace detection subunit: detects the airspace conditions of the target area, including the available airspace, restricted airspace and no-fly zones for the unmanned aerial vehicle, ensuring that flight activities comply with regulations; if the requirements are not met, the mission is terminated.
A first task execution subunit: covers pre-flight preparation, task flight and landing, and executes the specified tasks; during the flight it carries out large-scale aerial searching and monitoring, transmits the field video back in real time, performs target identification, tracking and detail observation with the multi-zoom photoelectric pod, and flies to hover near the target as required to acquire the needed information.
The functions of the sub-units work cooperatively, so that the medium and long-distance real-time reconnaissance task can efficiently plan a route, optimize airspace use, consider meteorological factors, ensure flight safety, and execute a real-time monitoring task in a target area to provide instant and accurate field video data.
A small or medium-sized vertical take-off and landing fixed-wing unmanned aerial vehicle carries a photoelectric pod to execute the real-time monitoring task. The vehicle can search and monitor over a large area from the air and return live video in real time; through the multi-zoom photoelectric pod it can clearly identify a target object from high altitude, track and lock the target by double-click, then fly to hover above the target point according to the target position displayed by the ground station and observe the target details by zooming. The system operates from unmanned aerial vehicle fixed operation bases (FBOs) deployed along the coastline; exploiting the composite fixed-wing vehicle's fast take-off and landing response, high flight speed and long endurance, it can quickly reach a target site, continuously monitor the target and transmit the site video back in real time.
Referring to fig. 13 to 14, the close-range real-time reconnaissance unit includes:
A second task application subunit: the application is responsible for receiving and processing a second task, comprises a task type and target area information, and performs preliminary evaluation and distribution;
and the take-off and landing point and task route determining subunit: after receiving the second task application, determining a suitable take-off and landing point and planning a task route;
a field preparation subunit: real-time checking and evaluating the surrounding environment, flying area and flying height of the flying spot;
Weather condition judging subunit: responsible for evaluating the meteorological conditions of the current task area, including wind speed, temperature and weather conditions, and determining the feasibility of the flight according to the established limiting conditions; if the conditions are not met, control jumps back to the site preparation subunit;
And a suitable-airspace judging subunit: judges the airspace available for flight in the task area according to local regulations and policy requirements, ensuring that task execution complies with the relevant airspace limits; if the conditions are not met, control jumps back to the site preparation subunit;
a second task execution subunit: the method comprises the steps of preparation before voyage, task flight and landing, and is responsible for executing specified tasks, performing close-range real-time monitoring according to task requirements, searching and monitoring by means of the motion control and zooming functions of the photoelectric pod, and returning live videos in real time.
And a data post-processing and uploading subunit: and the method is responsible for carrying out post-processing and analysis on the acquired data, including image processing, labeling and sorting, and uploading the processed data to a designated place or server for further analysis and use.
The combination of the subunits enables the short-distance real-time reconnaissance unit to meet the requirements of task application, lifting point determination, site preparation, weather judgment, airspace judgment, task execution and the like, realizes the rapid response and execution of the multi-rotor unmanned aerial vehicle in a task area, and provides reliable site data. Simultaneously, make full use of many rotor unmanned aerial vehicle's characteristics, make it be applicable to face that the region is little, task demand is frequent, respond rapidly's condition.
The multi-rotor unmanned aerial vehicle does not depend on a prepared landing point, has a short preparation time and responds quickly, so a single operator can drive to the destination and start work rapidly. Because it works within suitable airspace, no airspace application is needed and operational requests can be answered quickly: after receiving a task, the pilot carries the unmanned aerial vehicle, drives directly to the destination and can have it working within 5 minutes. It is also easy to relocate, making it suitable for tasks covering a small area with frequent demands and rapid response.
Referring to fig. 15, the data management module includes:
The data classifying unit classifies the data, displays the data according to different labels, displays the photos on the GIS map according to the longitude and latitude of the actual shooting position, and can select files for direct viewing, downloading or deleting;
The label inserting unit is used for inserting labels into the data so that the data can be displayed according to different labels;
The picture management unit is used for downloading and deleting pictures;
The data comparison unit is used for selecting photos of the same position in different periods for comparison and observing the change of the environment near the target point along with time;
And the video management unit is used for previewing, downloading and deleting the video.
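The data comparison unit's period-over-period check can be illustrated with a naive pixel-difference sketch (this assumes the two photos of the same position are already co-registered and converted to equal-length grey-value sequences; the threshold and names are illustrative):

```python
def compare_period_photos(pixels_a, pixels_b, threshold=30):
    """Return the fraction of pixels whose grey value changed by more than
    `threshold` between two co-registered photos of the same position
    taken in different periods (each photo given as a flat list of ints)."""
    if len(pixels_a) != len(pixels_b):
        raise ValueError("photos must be co-registered to the same size")
    changed = sum(1 for a, b in zip(pixels_a, pixels_b) if abs(a - b) > threshold)
    return changed / len(pixels_a)
```

A high changed fraction would flag the target point for the user to inspect the environmental change over time on the GIS map.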
Referring to fig. 16, the data interpretation and application module includes:
The face recognition unit can recognize faces in the acquired video, quickly identifying the face information relevant to the task. Face screenshots are classified and sorted into folders, each folder holding the screenshots of a single person at different moments; subsequently these can be matched against the personnel information bases of different departments to quickly output personnel information;
a data acquisition unit: the information of the needed ships, navigation marks and floaters is acquired from the data such as images and videos transmitted by the unmanned aerial vehicle;
feature extraction unit: extracting features from the acquired images, and extracting key features for subsequent object recognition and behavior analysis;
A data processing unit: the extracted features are processed and analyzed, for example, for classification, categorization, comparison, etc. of vessels, beacons, and floats.
Data classification and identification unit: comparing the processed characteristics with the database by using the established database or the trained model, and realizing the rapid classification and identification of objects (ships, navigation marks and floaters) in the image;
and a system comparison judging unit: and judging and evaluating the identification of the object according to the comparison result, and outputting related information or warning according to the requirement.
The units are matched with each other, and the identification and behavior analysis of ships, navigation marks and floaters are realized in the unmanned aerial vehicle system. Through data acquisition, feature extraction, processing and classification, the system can efficiently judge the object in the image and carry out the next decision or feedback according to the recognition result. Such functionality can provide more accurate and useful information while displaying images in real time, helping users to perform object recognition and behavioral analysis.
The system platform displays the images, videos and other data transmitted by the unmanned aerial vehicle in real time, extracts features from the collected images, and, by comparison with the database, quickly distinguishes whether an object in the image is a ship, a navigation mark or another floater, outputting the result efficiently.
It can also perform behavior recognition of ships: the behavior data of a ship is analysed and processed from its running state at sea, including speed, steering, track, acceleration and obstacles ahead, so that the ship's behavior is recorded, counted and resolved (classified and compared), the running state of the ship and the driving characteristics of the pilot are predicted, and the ship's behavior is recognized.
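A toy rule-based version of the ship behaviour-recognition step described above (the thresholds and labels are illustrative assumptions, standing in for the patent's database/model comparison):

```python
def classify_vessel_behaviour(speeds_kn, headings_deg):
    """Label a vessel track from speed samples (knots) and heading samples
    (degrees). Thresholds are illustrative, not from the patent."""
    mean_speed = sum(speeds_kn) / len(speeds_kn)
    heading_span = max(headings_deg) - min(headings_deg)
    if mean_speed < 0.5:
        return "anchored/drifting"   # essentially stationary
    if heading_span > 90:
        return "manoeuvring"         # large heading changes along the track
    return "transiting"              # steady course at way-making speed
```

In the described system, this rule-based step would be replaced or refined by the trained model and database comparison of the data classification and identification unit.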
Referring to fig. 17, the feedback control module includes:
an error calculation unit: calculating an error value according to the difference between the current position of the unmanned aerial vehicle and the expected channel or the target position;
control instruction generation unit: generating a corresponding control instruction by using a control algorithm according to the error value;
and a correction unit: calculating a required course correction amount according to the target position and the current position of the unmanned aerial vehicle, and generating a corresponding control instruction to correct the course; calculating a required height correction amount according to the terrain height of the target area, and adjusting the height of the unmanned aerial vehicle through a control instruction; calculating a required channel correction amount according to the target position and the current position of the unmanned aerial vehicle, and generating a new channel;
Real-time monitoring and adjusting unit: in the correction process, the navigation channel, the altitude and the course of the unmanned aerial vehicle are continuously monitored, and the correction algorithm is adjusted and optimized according to real-time data, so that the correction accuracy and stability are improved.
Through the cooperative work of the units, the designed feedback control module can further correct the channel and the three-dimensional position of the unmanned aerial vehicle by utilizing the geographic information and the result data of the target real-time state. The feedback control can enable the unmanned aerial vehicle to accurately execute tasks, adapt to different scenes and environments, and improve flight safety and task execution effects.
Specific examples of the examples are as follows:
Assuming that the drone is performing a task, it is required to fly to near the target point and maintain a specific heading and altitude.
The error calculation unit calculates a heading error in the horizontal direction and a altitude error in the vertical direction according to the difference between the current position of the unmanned aerial vehicle and the expected channel or target position. For example, assuming that the desired heading of the target point is 90 degrees and the current actual heading of the drone is 80 degrees, the heading error is 10 degrees.
The control instruction generation unit converts the heading error and the altitude error into corresponding control instructions by using a control algorithm. For example, determining a heading angle to be adjusted according to the heading error, and generating a control surface control instruction; and determining the thrust output to be adjusted according to the height error, and generating an accelerator control instruction.
And the course correction unit calculates course correction according to the target position and the current position of the unmanned aerial vehicle, namely a required course adjustment value. For example, if the target point is on the left side relative to the current position of the unmanned aerial vehicle, the heading correction unit calculates an angle at which the heading needs to be adjusted to the right, and generates a corresponding control surface control instruction.
The height correction unit calculates a required height correction amount, i.e., a required height adjustment value, based on the terrain height of the target area. For example, if the terrain of the target area is high, the altitude correction unit calculates a value for improving the altitude of the unmanned aerial vehicle, and generates a corresponding throttle control instruction.
In the correction process, the heading, the altitude and the position of the unmanned aerial vehicle are continuously monitored, and the correction algorithm is adjusted and optimized according to the real-time data. For example, if the corrected heading or altitude is found to be unstable or out of the expected range, the real-time monitoring and adjusting unit may perform fine adjustment on the correction algorithm according to the actual situation, so as to improve the accuracy and stability of the correction.
Through the cooperative work of the units, the feedback control module can further correct the channel and the three-dimensional position of the unmanned aerial vehicle by utilizing the geographic information and the result data of the target real-time state. The unmanned plane can be adjusted according to the correction instructions of the heading and the altitude, so that the unmanned plane can more accurately approach the target point in the flight process, and the required heading and the required altitude can be maintained. The feedback control mechanism can improve the flight precision and task execution effect of the unmanned aerial vehicle.
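The worked example above (desired heading 90°, actual 80° → 10° error) can be condensed into a proportional-control sketch (the gains, terrain clearance and command names are illustrative assumptions, not the patent's control algorithm):

```python
def heading_error(desired_deg, actual_deg):
    """Signed shortest-path heading error in degrees (positive = turn right)."""
    return (desired_deg - actual_deg + 180.0) % 360.0 - 180.0

def correction_commands(desired_heading, actual_heading,
                        terrain_alt, actual_alt,
                        k_heading=0.8, k_alt=0.5, clearance=50.0):
    """Error calculation plus control-instruction generation, as in the
    feedback control module: proportional commands from heading and
    altitude errors (desired altitude = terrain height + clearance)."""
    h_err = heading_error(desired_heading, actual_heading)
    alt_err = (terrain_alt + clearance) - actual_alt
    return {
        "rudder_deg": k_heading * h_err,    # control-surface instruction
        "throttle_delta": k_alt * alt_err,  # throttle instruction
    }
```

The modulo form of `heading_error` also handles wrap-around correctly, e.g. desired 10° with actual 350° yields +20° rather than −340°.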
The unmanned aerial vehicle inspection method uses the unmanned aerial vehicle inspection system and comprises the following steps:
step S1, the lens of the orthographic camera faces straight down to collect geographic information and/or target real-time state information directly below, while the lens of the oblique camera faces obliquely downward to collect geographic information and/or target real-time state information in the oblique direction;
step S2, transmitting the video information acquired by the information acquisition module to a ground server in real time;
Step S3, dividing the data received by the ground server into photos, orthographic tiff images and videos, and supporting the user to view and manage them;
And distinguishing the characteristics of the target and judging the behavior of the target, and acquiring data related to geographic information and real-time state of the target.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. The unmanned aerial vehicle inspection system comprises a central processing module, wherein the central processing module is in signal connection with an information acquisition module, and the information acquisition module acquires geographic information and/or target real-time state information by controlling a channel and a three-dimensional position of an unmanned aerial vehicle;
The information acquisition module comprises an orthographic camera and an oblique camera, wherein the lens of the orthographic camera faces downwards, the lens of the oblique camera faces downwards obliquely, and shooting pictures of the orthographic camera and the oblique camera are at least partially overlapped;
the real-time investigation module comprises a photoelectric pod carried by an unmanned aerial vehicle, and video information acquired by the information acquisition module is transmitted to a ground server in real time;
The data management module divides the data received by the ground server into a photo, an orthographic tiff image and a video, and supports the user to view and manage;
the data interpretation and application module is used for distinguishing the characteristics of the target and judging the behavior of the target, and obtaining the result data of the related geographic information and the real-time state of the target;
the feedback control module is used for further correcting the channel and the three-dimensional position of the unmanned aerial vehicle through the geographic information and the result data of the target real-time state;
the information acquisition module comprises:
An oblique camera angle adjusting unit for adjusting the oblique angle of the oblique camera according to the flying height and the size of the target area;
The dual-camera shooting trigger unit triggers the oblique camera to shoot after the flight controller sends out a shooting signal; when the oblique camera shoots, it sends a level trigger signal through the hot-shoe wire that triggers the orthographic camera to shoot simultaneously, achieving control of both cameras by a single signal;
The flight route design unit is used for automatically tracking a real-time monitored target according to the task area and the direction of the oblique camera to design a flight route;
The information recording unit is used for recording the position of the unmanned aerial vehicle when the flight controller triggers the cameras to take a picture, forming POS information and recording the shooting position, attitude and height;
The double-camera shooting trigger unit is in signal connection with a multi-included angle trigger subunit, and when the normal shooting camera and the oblique shooting camera reach a preset included angle, the oblique shooting camera and the normal shooting camera automatically shoot pictures;
When the included angle between the orthographic camera and the oblique camera is alpha degrees, one group of photos is automatically shot, and when the included angle between the orthographic camera and the oblique camera is beta degrees, another group of photos is shot.
2. An unmanned aerial vehicle inspection system as claimed in claim 1, wherein:
The real-time investigation module comprises:
The first signal communication unit is used for directly connecting the pod video and control with ground upper computer software through image transmission and controlling and viewing the video through the upper computer;
The second signal communication unit is used for transmitting the nacelle video to the ground directly through image transmission, controlling the nacelle motion and tracking control through the nacelle plug-in unit at the ground station and controlling the nacelle to be communicated with the flight control through the flight control serial port, and viewing the real-time image, and simultaneously viewing the actual observation direction of the nacelle at the ground station;
The medium-long distance real-time reconnaissance unit is used for carrying out large-scale air searching and monitoring and returning live videos in real time;
And the short-distance real-time reconnaissance unit, which makes the unmanned aerial vehicle hover at high altitude and search and monitor by controlling pod motion and zoom, returning the live video in real time; after a target is found, it is tracked and locked by double-click, the vehicle flies to the target position displayed by the ground station and hovers above the target point, and the target details are observed by zooming.
3. An unmanned aerial vehicle inspection system as claimed in claim 2, wherein:
The medium-and-long-distance real-time reconnaissance unit comprises:
A first task application subunit: the method comprises the steps of being responsible for receiving and processing task applications, including task types and target area information, and performing preliminary evaluation and distribution;
Assigning duty subunits: according to the task application and actual conditions, a specific duty unmanned aerial vehicle and a fixed operation base are assigned to execute the task;
Optimizing a route and a airspace subunit: based on task requirements and flight safety rules, route planning and airspace optimization are carried out, so that the flight route of the unmanned aerial vehicle is ensured to be efficient and accords with related regulations and requirements;
weather condition detects subunit: monitoring and evaluating weather conditions of the target area to provide important information about the environment and flight safety evaluation, and if not, terminating the mission;
an airspace detection subunit: responsible for detecting the airspace conditions of the target area, including the available airspace, restricted airspace and no-fly zones of the unmanned aerial vehicle, ensuring that the flight activities meet the regulations, and terminating the task if the requirements are not met;
A first task execution subunit: the method comprises the steps of preparation before voyage, task flight and landing, executing specified tasks, carrying out large-scale aerial searching and monitoring during the flight, transmitting back to a field video in real time, carrying out target identification, tracking and observing details by utilizing a multi-zoom photoelectric pod, and flying to the vicinity of the target to hover as required so as to acquire required information.
4. An unmanned aerial vehicle inspection system as claimed in claim 2, wherein:
the close-range real-time reconnaissance unit comprises:
a second task application subunit: responsible for receiving and processing a second task application, including the task type and target-area information, and performing preliminary evaluation and assignment;
a take-off/landing point and task route determination subunit: determining a suitable take-off/landing point and planning a task route after the second task application is received;
a site preparation subunit: checking and evaluating in real time the surroundings of the take-off/landing point, the flight area and the flight altitude;
a weather condition judgment subunit: responsible for evaluating the weather conditions of the current task area, including wind speed, temperature and weather state, and determining the feasibility of the flight against the established limits; if the conditions are not met, returning to the site preparation subunit;
a flyable airspace judgment subunit: judging the available flyable airspace in the task area according to local regulations and policy requirements, ensuring that task execution complies with the relevant airspace restrictions; if the conditions are not met, returning to the site preparation subunit;
a second task execution subunit: comprising pre-flight preparation, task flight and landing; responsible for executing the specified task, performing close-range real-time monitoring according to the task requirements, searching and monitoring by means of the motion control and zoom functions of the photoelectric pod, and transmitting live video back in real time;
and a data post-processing and uploading subunit: responsible for post-processing and analyzing the acquired data, including image processing, labeling and sorting, and uploading the processed data to a designated location or server for further analysis and use.
5. An unmanned aerial vehicle inspection system as claimed in claim 1, wherein:
the data management module comprises:
a data classification unit, which classifies the data, displays it according to different labels, displays photos on a GIS map according to the longitude and latitude of the actual shooting position, and allows files to be selected for direct viewing, downloading or deletion;
a label insertion unit, which inserts labels into the data so that the data can be displayed according to the different labels;
a picture management unit, which downloads and deletes pictures;
a data comparison unit, which selects photos of the same position taken in different periods for comparison, so as to observe how the environment near the target point changes over time;
and a video management unit, which previews, downloads and deletes videos.
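The data comparison unit's period-over-period photo comparison can be illustrated with a simple per-pixel difference measure. This is a minimal hypothetical sketch, not the system's actual comparison algorithm; images are modeled as same-size 2-D lists of 0-255 grayscale values and the 30-level threshold is an assumption:

```python
def change_fraction(img_a, img_b, threshold=30):
    """Fraction of pixels whose grayscale value differs by more than
    `threshold` between two photos of the same position taken in
    different periods (a stand-in for the unit's visual comparison)."""
    total = changed = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            if abs(pa - pb) > threshold:
                changed += 1
    return changed / total if total else 0.0
```

In practice the two photos would first be aligned using the GIS longitude/latitude metadata the classification unit already stores, so that pixel positions correspond.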
6. An unmanned aerial vehicle inspection system as claimed in claim 1, wherein:
the data interpretation and application module comprises:
a face recognition unit, which performs face recognition on the acquired video, quickly identifies the task's face information in the video, and classifies and sorts the face screenshots into a plurality of folders, each folder containing the face screenshots of a single person at different moments; the face screenshots can subsequently be matched against the personnel information bases of different departments to quickly output personnel information;
a data acquisition unit: acquiring the required information on ships, navigation marks and floating objects from the images, videos and other data transmitted by the unmanned aerial vehicle;
a feature extraction unit: extracting features from the acquired images, obtaining the key features used for subsequent object recognition and behavior analysis;
a data processing unit: processing and analyzing the extracted features;
a data classification and identification unit: comparing the processed features with a database, using the established database or a trained model, so as to achieve rapid classification and identification of the objects in the image;
and a system comparison and judgment unit: judging and evaluating the identification of the object according to the comparison result, and outputting relevant information or a warning as required.
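Taken together, the classification-and-identification and comparison-judgment units amount to matching an extracted feature vector against a reference database and warning on unrecognized objects. A minimal sketch under stated assumptions: the feature vectors, the Euclidean metric, and the distance threshold are all illustrative, since the claims do not specify the model or comparison method:

```python
import math

# Hypothetical feature database: object label -> reference feature vector.
FEATURE_DB = {
    "ship":    [0.9, 0.1, 0.2],
    "buoy":    [0.1, 0.8, 0.3],
    "floater": [0.2, 0.2, 0.9],
}

def classify(features, max_dist=0.5):
    """Nearest-neighbour comparison of an extracted feature vector
    against the database; returns (label, warning_flag), with the
    warning raised when no database entry is close enough."""
    best_label, best_dist = None, float("inf")
    for label, ref in FEATURE_DB.items():
        dist = math.dist(features, ref)
        if dist < best_dist:
            best_label, best_dist = label, dist
    if best_dist > max_dist:
        return "unknown", True   # unrecognised object -> warn
    return best_label, False
```

A trained model would replace the literal vector table, but the judgment step, thresholding the best match and emitting a warning otherwise, keeps the same shape.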
7. An unmanned aerial vehicle inspection system as claimed in claim 1, wherein:
the feedback control module comprises:
an error calculation unit: calculating an error value from the difference between the current position of the unmanned aerial vehicle and the expected channel or target position;
a control instruction generation unit: generating a corresponding control instruction from the error value using a control algorithm;
a correction unit: calculating the required course correction from the target position and the current position of the unmanned aerial vehicle and generating a corresponding control instruction to correct the course; calculating the required altitude correction from the terrain height of the target area and adjusting the altitude of the unmanned aerial vehicle through a control instruction; and calculating the required channel correction from the target position and the current position of the unmanned aerial vehicle and generating a new channel;
and a real-time monitoring and adjustment unit: continuously monitoring the channel, altitude and course of the unmanned aerial vehicle during the correction process, and adjusting and optimizing the correction algorithm according to the real-time data, so as to improve the accuracy and stability of the correction.
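The error-calculation, instruction-generation and correction units together form one step of a feedback loop. The sketch below uses simple proportional control; the gain, the 50 m terrain clearance, and the flat (x, y, z) coordinate convention are assumptions for illustration, since the claims name no specific control algorithm:

```python
import math

def correction_commands(current, target, terrain_height, kp=0.5):
    """One proportional-control step mirroring the claimed units:
    compute course, altitude and channel errors, then scale each by
    the gain kp to produce correction commands.
    `current` and `target` are (x, y, z) positions in metres."""
    dx, dy = target[0] - current[0], target[1] - current[1]
    course_error = math.degrees(math.atan2(dy, dx))        # heading to target
    altitude_error = (terrain_height + 50.0) - current[2]  # keep 50 m clearance
    channel_error = math.hypot(dx, dy)                     # distance off target
    return {
        "course_cmd": kp * course_error,
        "altitude_cmd": kp * altitude_error,
        "channel_cmd": kp * channel_error,
    }
```

The real-time monitoring unit would call this repeatedly with fresh telemetry and could tune `kp` online, which is the "adjusting and optimizing the correction algorithm" role described above.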
8. A method of unmanned aerial vehicle inspection, characterized in that the unmanned aerial vehicle inspection system of any one of claims 1 to 7 is used, and the method comprises the following steps:
collecting geographic information and/or target real-time state information by controlling the channel and three-dimensional position of the unmanned aerial vehicle, the lens of the orthographic camera facing straight down to collect the geographic information and/or target real-time state information directly below, and the lens of the oblique camera facing obliquely down to collect the geographic information and/or target real-time state information obliquely below;
transmitting the video information acquired by the information acquisition module to a ground server in real time;
dividing the data received by the ground server into photos, orthographic tiff images and videos, and supporting user viewing and management;
and distinguishing the features of the target, judging its behavior, and acquiring data on the relevant geographic information and the real-time state of the target.
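The third step of the method, splitting the data received by the ground server into photos, orthographic tiff images and videos, can be sketched as a simple dispatch on file type. The extension lists are assumptions, since the claims name only the three categories and mention the tiff format:

```python
def sort_received(files):
    """Split files received by the ground server into photos,
    orthographic tiff images and videos, as in the claimed method.
    Extension sets are illustrative, not specified by the claims."""
    buckets = {"photo": [], "ortho_tiff": [], "video": []}
    for name in files:
        lower = name.lower()
        if lower.endswith((".tif", ".tiff")):
            buckets["ortho_tiff"].append(name)
        elif lower.endswith((".jpg", ".jpeg", ".png")):
            buckets["photo"].append(name)
        elif lower.endswith((".mp4", ".mov", ".avi")):
            buckets["video"].append(name)
    return buckets
```

The resulting buckets map directly onto the data management module: photos feed the GIS display and comparison units, videos feed the video management unit.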
CN202410342861.1A 2024-03-25 2024-03-25 Unmanned aerial vehicle inspection system and inspection method Active CN117950422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410342861.1A CN117950422B (en) 2024-03-25 2024-03-25 Unmanned aerial vehicle inspection system and inspection method

Publications (2)

Publication Number Publication Date
CN117950422A CN117950422A (en) 2024-04-30
CN117950422B CN117950422B (en) 2024-06-18

Family

ID=90803254

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410342861.1A Active CN117950422B (en) 2024-03-25 2024-03-25 Unmanned aerial vehicle inspection system and inspection method

Country Status (1)

Country Link
CN (1) CN117950422B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111846267A (en) * 2020-08-06 2020-10-30 成都市玄上科技有限公司 Shooting method and device for oblique photography three-dimensional modeling
CN115580708A (en) * 2022-09-15 2023-01-06 中国人民解放军国防科技大学 Unmanned aerial vehicle inspection method for optical cable line

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106813648A (en) * 2015-11-30 2017-06-09 北京中天易观信息技术有限公司 A kind of 3 camera aviation three dimensional data collection systems based on unmanned aerial vehicle platform
US20170269592A1 (en) * 2016-03-18 2017-09-21 Oceaneering International, Inc. Use of Unmanned Aerial Vehicles for NDT Inspections
KR102113807B1 (en) * 2017-10-20 2020-05-21 주식회사 삼진엘앤디 Uav patrol system and patrol method to maintain safety in the designated district
CN110189411A (en) * 2019-06-12 2019-08-30 中国民用航空飞行学院 Emergency management and rescue Search Area method for visualizing after a kind of accident of aircraft
JP2022023592A (en) * 2020-07-27 2022-02-08 株式会社トプコン Survey system, survey method and program for survey
CN111959803B (en) * 2020-08-11 2021-09-07 中国地质科学院矿产资源研究所 Unmanned aerial vehicle slope shooting platform and slope shooting unmanned aerial vehicle
CN116892992A (en) * 2023-07-14 2023-10-17 天津市勘察设计院集团有限公司 Investigation method for accurately measuring and calculating mine landslide volume



Similar Documents

Publication Publication Date Title
CN111145545B (en) Road traffic behavior unmanned aerial vehicle monitoring system and method based on deep learning
CN109765930B (en) Unmanned aerial vehicle vision navigation
US11017228B2 (en) Method and arrangement for condition monitoring of an installation with operating means
CN108496129B (en) Aircraft-based facility detection method and control equipment
CN108109437B (en) Unmanned aerial vehicle autonomous route extraction and generation method based on map features
CN114373138A (en) Full-automatic unmanned aerial vehicle inspection method and system for high-speed railway
CN105527969B (en) A kind of mountain garden belt investigation and monitoring method based on unmanned plane
CN111679695B (en) Unmanned aerial vehicle cruising and tracking system and method based on deep learning technology
CN105759834A (en) System and method of actively capturing low altitude small unmanned aerial vehicle
CN103941746A (en) System and method for processing unmanned aerial vehicle polling image
CN104168455A (en) Air-based large-scene photographing system and method
US20100157056A1 (en) Tracking and imaging data fusion
CN112162565B (en) Uninterrupted self-main-pole tower inspection method based on multi-machine collaborative operation
CN112327906A (en) Intelligent automatic inspection system based on unmanned aerial vehicle
CN112326686A (en) Unmanned aerial vehicle intelligent cruise pavement disease detection method, unmanned aerial vehicle and detection system
CN103942273A (en) Dynamic monitoring system and method for aerial quick response
WO2019061111A1 (en) Path adjustment method and unmanned aerial vehicle
CN108258613A (en) Intelligent line patrolling photoelectric nacelle and the method for realizing line walking
CN110647170A (en) Navigation mark inspection device and method based on unmanned aerial vehicle
CN115580708A (en) Unmanned aerial vehicle inspection method for optical cable line
CN210835732U (en) Beacon inspection device based on unmanned aerial vehicle
CN115649501A (en) Night driving illumination system and method for unmanned aerial vehicle
CN112950671A (en) Real-time high-precision parameter measurement method for moving target by unmanned aerial vehicle
CN105810023A (en) Automatic airport undercarriage retraction and extension monitoring system and method
Rojas-Perez et al. Real-time landing zone detection for UAVs using single aerial images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant