CN112288818A - Unmanned quick shooting modeling method below ten thousand square meters - Google Patents

Unmanned quick shooting modeling method below ten thousand square meters

Info

Publication number
CN112288818A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
imaging area
modeling
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011300874.0A
Other languages
Chinese (zh)
Inventor
陈彬彬
陈贵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wenzhou Huxue Technology Co ltd
Original Assignee
Wenzhou Huxue Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wenzhou Huxue Technology Co ltd filed Critical Wenzhou Huxue Technology Co ltd
Priority to CN202011300874.0A priority Critical patent/CN112288818A/en
Publication of CN112288818A publication Critical patent/CN112288818A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75Determining position or orientation of objects or cameras using feature-based methods involving models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30232Surveillance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30244Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)

Abstract

The unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters comprises the following steps: S1, arranging an unmanned aerial vehicle in each monitoring imaging area; S2, setting flight parameters of the unmanned aerial vehicle and generating its flight route; S3, monitoring and patrolling the monitoring imaging area with the unmanned aerial vehicle and acquiring imaging-area information by aerial photography; S4, processing the collected images at the background modeling terminal to obtain the landform and scene of the project; S5, modeling with the modeling software according to the processed information; and S6, searching the imaging area for a fire source point according to the three-dimensional model and sending the fire source point information to the fire-fighting command center in real time. The invention specifically optimizes the aerial photography path and achieves immediate shooting, immediate transmission and immediate processing, which greatly reduces the post-collection workload and working time and improves working efficiency. Modeling can be carried out quickly and accurately, the modeling approach is more convenient, and the model coverage is wider, which helps fires to be discovered quickly and handled in time, giving a good use effect.

Description

Unmanned quick shooting modeling method below ten thousand square meters
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, and in particular to an unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters.
Background
An unmanned aerial vehicle (UAV) is an aircraft controlled by radio remote-control equipment and an on-board program control device. It has no cockpit but carries an autopilot, a program control device and related instruments. Personnel at a ground, shipborne or mother-aircraft remote control station track, locate, remotely control, telemeter and digitally communicate with it through equipment such as radar, and unmanned aerial vehicles are widely used for aerial reconnaissance, surveillance, communication, anti-submarine operations, electronic jamming and the like. The unmanned aerial vehicle acts as an aerial platform and is an efficient tool for acquiring real-scene data: paired with a high-definition camera or a laser radar, it efficiently collects spatial data of the real environment, and the data are corrected and processed by algorithms to finally obtain point cloud data or a three-dimensional model representing the real scene. At present, forest fire-prevention monitoring relies mainly on manual patrols. Because forests cover large areas, many personnel are required; manual patrolling is time-consuming, labor-intensive and inefficient, and a fire is difficult to discover quickly, so improvement is needed.
Disclosure of Invention
(I) Objects of the invention
In order to solve the technical problems in the background art, the invention provides an unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters. The aerial photography path is specifically optimized to achieve immediate shooting, immediate transmission and immediate processing, which greatly reduces the post-collection workload and working time and improves working efficiency. Modeling can be carried out quickly and accurately, the modeling approach is more convenient, and the model coverage is wider, which helps fires to be discovered quickly and handled in time and gives a good use effect.
(II) Technical scheme
The invention provides an unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters, comprising the following steps:
S1, acquiring GIS map information, calculating the area of the designated region, dividing the designated region into a plurality of monitoring imaging areas according to the area calculation result, and arranging an unmanned aerial vehicle in each monitoring imaging area (an illustrative sketch of this division is given after these steps);
S2, setting flight parameters of the unmanned aerial vehicle and generating a flight route for the unmanned aerial vehicle, wherein the flight track of the unmanned aerial vehicle adopts a spiral circling track and the unmanned aerial vehicle flies multiple circuits over the monitoring imaging area;
S3, monitoring and patrolling the monitoring imaging area with the unmanned aerial vehicle and acquiring imaging-area information by aerial photography, wherein the unmanned aerial vehicle sends the imaging-area information to the background modeling terminal in real time;
S4, receiving the collected information at the background modeling terminal and processing the collected images to obtain the landform and scene of the project;
S5, modeling with the modeling software according to the processed information to generate a three-dimensional model of the imaging area;
S6, searching the imaging area for a fire source point according to the three-dimensional model, and sending the fire source point information to the fire-fighting command center in real time once a fire source point is found in the imaging area;
S7, the fire-fighting command center formulates a rescue scheme according to the fire source point information, introduces 5G technology for information transmission, determines the bearing with the aid of the map and the three-dimensional model of the imaging area, and finally dispatches personnel for fire-fighting treatment.
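As an illustration of the division in S1, the following is a minimal Python sketch. It assumes the designated region is supplied as an axis-aligned rectangle in projected (meter) coordinates and caps each monitoring imaging area at ten thousand square meters; the function name, the Cell structure and the near-square grid heuristic are illustrative assumptions rather than features prescribed by the method.

```python
from dataclasses import dataclass
from math import ceil, sqrt

@dataclass
class Cell:
    """One monitoring imaging area (axis-aligned rectangle, meters)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

def divide_region(width_m: float, height_m: float,
                  max_cell_area_m2: float = 10_000.0) -> list[Cell]:
    """Split a rectangular designated region into monitoring imaging areas,
    each no larger than max_cell_area_m2, with one UAV per cell (step S1)."""
    total_area = width_m * height_m
    n_cells = max(1, ceil(total_area / max_cell_area_m2))
    # Choose a near-square grid with at least n_cells cells.
    cols = ceil(sqrt(n_cells * width_m / height_m))
    rows = ceil(n_cells / cols)
    dx, dy = width_m / cols, height_m / rows
    return [Cell(c * dx, r * dy, (c + 1) * dx, (r + 1) * dy)
            for r in range(rows) for c in range(cols)]

# Example: a 300 m x 250 m forest plot -> 8 monitoring imaging areas.
cells = divide_region(300.0, 250.0)
print(len(cells), "monitoring imaging areas, one UAV each")
```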
Preferably, the flight parameters include the flight speed of the unmanned aerial vehicle, the gimbal pitch angle, the take-off and landing flight height, the minimum flight height and the maximum flight height.
Preferably, in S2, the specific steps of generating the flight route of the unmanned aerial vehicle are as follows:
acquiring the central point of the monitoring imaging area and the midpoints of its edges, and obtaining the lengths of the lines connecting the midpoints of the two pairs of opposite edges of the monitoring imaging area;
generating an elliptical virtual route by taking the central point of the monitoring imaging area as the center of an ellipse and taking the lengths of the two midpoint connecting lines as the major-axis length and the minor-axis length of the ellipse respectively;
selecting a plurality of reference points on the midpoint connecting line of one pair of opposite edges, drawing through each reference point a reference line perpendicular to that midpoint connecting line, and taking the intersection points of the reference lines with the elliptical virtual route as waypoint coordinates; connecting the waypoint coordinates in sequence to generate an elliptical flight route;
and reducing the major-axis length and the minor-axis length by a given numerical step, generating a plurality of elliptical flight routes according to the above steps, and connecting adjacent elliptical flight routes head to tail to finally form the spiral circling track.
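A minimal sketch of this route construction follows (Python/NumPy), assuming a rectangular monitoring imaging area centered at the origin whose midpoint connecting lines lie along the coordinate axes; the function names, the 20% axis-reduction factor and the waypoint count are illustrative assumptions, not values prescribed by the method.

```python
import numpy as np

def elliptical_ring(a: float, b: float, n_ref: int) -> np.ndarray:
    """Waypoints of one elliptical route centered on the imaging-area center.
    Reference points are spaced along the major-axis midpoint line; through
    each a perpendicular reference line is drawn, and its intersections with
    the elliptical virtual route are taken as waypoint coordinates."""
    x_ref = np.linspace(-a, a, n_ref)                 # reference points on the midpoint line
    y = b * np.sqrt(np.clip(1.0 - (x_ref / a) ** 2, 0.0, None))
    upper = np.column_stack([x_ref, y])               # intersections above the line
    lower = np.column_stack([x_ref[::-1], -y[::-1]])  # intersections below, reversed
    return np.vstack([upper, lower])                  # one closed circuit

def spiral_track(width: float, height: float,
                 shrink: float = 0.2, n_ref: int = 9) -> np.ndarray:
    """Spiral circling track: successive elliptical routes with the major and
    minor axes reduced step by step, joined head to tail."""
    a, b = width / 2.0, height / 2.0
    rings = []
    while a > 1.0 and b > 1.0:                        # stop once the ellipse is tiny
        rings.append(elliptical_ring(a, b, n_ref))
        a *= (1.0 - shrink)
        b *= (1.0 - shrink)
    return np.vstack(rings)                           # adjacent rings connected end to start

track = spiral_track(100.0, 80.0)                     # one 100 m x 80 m imaging area
print(track.shape[0], "waypoints in the spiral circling track")
```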
Preferably, in S4, the aerial images acquired in real time are processed as follows:
reconstructing each pixel of the aerial image;
verifying hypothesized plane parameters for each pixel of the acquired aerial image to obtain the depth of each pixel and extracting dense point cloud data;
cross-validating the extracted dense point cloud, and filtering out noise points according to the collected point cloud quality information and the image quality information of the reconstructed points;
and tiling the point cloud into surface facets through a Delaunay model, establishing a directed acyclic graph from the occlusion information of the facets, and further filtering out noise facets.
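The method does not prescribe a particular noise-filtering algorithm, so the sketch below uses a standard statistical k-nearest-neighbour outlier filter as a stand-in for the noise-point removal step; the function name, the neighbour count and the threshold ratio are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def filter_noise_points(points: np.ndarray, k: int = 8,
                        std_ratio: float = 2.0) -> np.ndarray:
    """Statistical outlier filter for a dense point cloud (N x 3, meters):
    a point whose mean distance to its k nearest neighbours is far above the
    cloud-wide average is treated as a noise point and removed."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)        # first column is the point itself
    mean_knn = dists[:, 1:].mean(axis=1)
    threshold = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn < threshold]

# Example: a noisy synthetic patch -> the isolated outliers are dropped.
rng = np.random.default_rng(0)
cloud = np.vstack([rng.normal(0.0, 1.0, size=(1000, 3)),    # dense surface points
                   rng.uniform(-20.0, 20.0, size=(20, 3))])  # sparse noise
print(filter_noise_points(cloud).shape[0], "points kept of", cloud.shape[0])
```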
Preferably, the process further comprises the following operations:
detecting and matching features in the aerial images acquired by the unmanned aerial vehicle and constructing an epipolar geometry graph; estimating the camera poses and the scene structure in a hybrid manner, and optimizing both by bundle adjustment to obtain a dense description of the scene; and deriving the geometry, texture and reflection properties of the scene.
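As a concrete illustration of the feature matching, epipolar geometry and pose estimation described above, the following is a minimal two-view sketch using OpenCV; it assumes calibrated camera intrinsics K, omits the multi-view bundle adjustment, and the function name and ratio-test threshold are illustrative assumptions.

```python
import cv2
import numpy as np

def two_view_pose(img1_path: str, img2_path: str, K: np.ndarray):
    """Minimal two-view step of the reconstruction: detect and match features,
    estimate the epipolar geometry (essential matrix), and recover the relative
    camera pose (R, t) between two aerial images."""
    img1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    # Ratio-test matching of the descriptors.
    matcher = cv2.BFMatcher()
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < 0.75 * n.distance]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    # The essential matrix encodes the epipolar geometry between the two views.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t, pts1, pts2, mask
```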
Preferably, image analysis is performed by aerial triangulation to convert the series of two-dimensional aerial images into a three-dimensional dense point cloud.
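Continuing the two-view sketch above under the same assumptions, the matched image points can be triangulated into sparse 3D points; this illustrates only the 2D-to-3D conversion of one image pair, not a full aerial-triangulation block adjustment.

```python
import cv2
import numpy as np

def triangulate(R, t, pts1, pts2, K):
    """Triangulate matched image points (from the two-view sketch above) into
    a sparse 3D point cloud; one building block of the 2D-to-3D conversion."""
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # first camera at the origin
    P2 = K @ np.hstack([R, t])                          # second camera from the recovered pose
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (pts4d[:3] / pts4d[3]).T                     # homogeneous -> Euclidean, N x 3
```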
Preferably, S5 further comprises the following steps:
identifying interfering objects to be removed on the three-dimensional model, selecting and deleting the interfering objects, fitting a curved surface to each hole left after deletion using the elevations around the hole, filling the hole with the fitted surface, and reconstructing the triangular mesh on the model surface so that the three-dimensional model acquires a new geometric structure; removing all interfering objects and repairing the remaining holes in turn to form the three-dimensional model with the new geometric structure.
The invention also provides an unmanned aerial vehicle rapid photography modeling system, which comprises a data acquisition unit, a signal transmission unit, a remote control terminal and a background modeling terminal; the data acquisition unit is communicatively connected with the remote control terminal and the background modeling terminal through the signal transmission unit;
the data acquisition unit comprises an unmanned aerial vehicle, a high-definition camera module and a positioning module; the signal transmission unit comprises a transmitter, a receiver and a controller; the remote control terminal comprises a remote control module, a navigation module, a display module and a route planning module; and the background modeling terminal comprises a processing computer, a modeling software module, a storage module and a display screen.
Preferably, the unmanned aerial vehicle further comprises a microprocessor, a power supply module, an electric quantity monitoring module and an electric quantity analysis module;
the power supply module is electrically connected with the power-consuming components, the electric quantity monitoring module is communicatively connected with the electric quantity analysis module, and the electric quantity analysis module is communicatively connected with the microprocessor; the electric quantity monitoring module monitors the remaining electric quantity of the power supply module and sends the data to the electric quantity analysis module, the electric quantity analysis module analyzes how long the unmanned aerial vehicle can continue operating from the remaining-quantity data and sends the analysis information to the microprocessor in real time, and the microprocessor sends the information back to the remote control terminal; when the remaining electric quantity of the power supply module falls below a set value, the remote operator commands the unmanned aerial vehicle to return, preventing it from exhausting its power and crashing.
Preferably, the power supply module comprises a rechargeable lithium battery and a voltage stabilizing circuit; the lithium battery provides power and the voltage stabilizing circuit stabilizes the voltage to ensure a stable supply voltage; a solar panel is arranged on the unmanned aerial vehicle and is electrically connected with the power supply module.
The technical scheme of the invention has the following beneficial technical effects:
An unmanned aerial vehicle is arranged in each monitoring imaging area; the flight track of the unmanned aerial vehicle adopts a spiral circling track and the unmanned aerial vehicle flies multiple circuits over the monitoring imaging area. The unmanned aerial vehicle then monitors and patrols the monitoring imaging area and acquires imaging-area information by aerial photography, sending that information to the background modeling terminal in real time. The background modeling terminal receives the collected information and processes the collected images to obtain the landform and scene of the project. Finally, the modeling software models from the processed information to generate a three-dimensional model of the imaging area. A fire source point in the imaging area is searched for according to the three-dimensional model and the fire source point information is sent to the fire-fighting command center in real time, so that a forest fire can be discovered in time and responded to quickly; the use effect is excellent and popularization is facilitated;
the invention specially optimizes the aerial photography path, realizes the functions of immediate photography, immediate transmission and immediate processing, greatly reduces the workload and the working time after collection, improves the working efficiency, can quickly and accurately carry out modeling, has more convenient modeling mode and wider model coverage.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the following embodiments. It should be understood that the description is intended to be exemplary only and is not intended to limit the scope of the present invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as not to unnecessarily obscure the concepts of the present invention.
The invention provides an unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters, comprising the following steps:
S1, acquiring GIS map information, calculating the area of the designated region, dividing the designated region into a plurality of monitoring imaging areas according to the area calculation result, and arranging an unmanned aerial vehicle in each monitoring imaging area;
S2, setting flight parameters of the unmanned aerial vehicle and generating a flight route for the unmanned aerial vehicle, wherein the flight track of the unmanned aerial vehicle adopts a spiral circling track and the unmanned aerial vehicle flies multiple circuits over the monitoring imaging area;
S3, monitoring and patrolling the monitoring imaging area with the unmanned aerial vehicle and acquiring imaging-area information by aerial photography, wherein the unmanned aerial vehicle sends the imaging-area information to the background modeling terminal in real time;
S4, receiving the collected information at the background modeling terminal and processing the collected images to obtain the landform and scene of the project;
S5, modeling with the modeling software according to the processed information to generate a three-dimensional model of the imaging area;
S6, searching the imaging area for a fire source point according to the three-dimensional model, and sending the fire source point information to the fire-fighting command center in real time once a fire source point is found in the imaging area;
S7, the fire-fighting command center formulates a rescue scheme according to the fire source point information, introduces 5G technology for information transmission, determines the bearing with the aid of the map and the three-dimensional model of the imaging area, and finally dispatches personnel for fire-fighting treatment.
In an alternative embodiment, the flight parameters include the flight speed of the unmanned aerial vehicle, the gimbal pitch angle, the take-off and landing flight height, the minimum flight height and the maximum flight height.
In an alternative embodiment, in S2, the specific steps of generating the flight route of the unmanned aerial vehicle are as follows:
acquiring the central point of the monitoring imaging area and the midpoints of its edges, and obtaining the lengths of the lines connecting the midpoints of the two pairs of opposite edges of the monitoring imaging area; generating an elliptical virtual route by taking the central point of the monitoring imaging area as the center of an ellipse and taking the lengths of the two midpoint connecting lines as the major-axis length and the minor-axis length of the ellipse respectively; selecting a plurality of reference points on the midpoint connecting line of one pair of opposite edges, drawing through each reference point a reference line perpendicular to that midpoint connecting line, and taking the intersection points of the reference lines with the elliptical virtual route as waypoint coordinates; connecting the waypoint coordinates in sequence to generate an elliptical flight route; and reducing the major-axis length and the minor-axis length by a given numerical step, generating a plurality of elliptical flight routes according to the above steps, and connecting adjacent elliptical flight routes head to tail to finally form the spiral circling track.
In an alternative embodiment, the aerial images acquired in real time are processed in S4 as follows: reconstructing each pixel of the aerial image; verifying hypothesized plane parameters for each pixel of the acquired aerial images to obtain the depth of each pixel, extracting dense point cloud data, and performing image analysis by aerial triangulation to convert the series of two-dimensional aerial images into a three-dimensional dense point cloud; cross-validating the extracted dense point cloud and filtering out noise points according to the collected point cloud quality information and the image quality information of the reconstructed points; and tiling the point cloud into surface facets through a Delaunay model, establishing a directed acyclic graph from the occlusion information of the facets, and further filtering out noise facets.
In an optional embodiment, the process further comprises the following operations: detecting and matching features in the aerial images acquired by the unmanned aerial vehicle and constructing an epipolar geometry graph; estimating the camera poses and the scene structure in a hybrid manner, and optimizing both by bundle adjustment to obtain a dense description of the scene; and deriving the geometry, texture and reflection properties of the scene.
In an optional embodiment, S5 further comprises the following steps: identifying interfering objects to be removed on the three-dimensional model, selecting and deleting the interfering objects, fitting a curved surface to each hole left after deletion using the elevations around the hole, filling the hole with the fitted surface, and reconstructing the triangular mesh on the model surface so that the three-dimensional model acquires a new geometric structure; removing all interfering objects and repairing the remaining holes in turn to form the three-dimensional model with the new geometric structure.
The invention also provides an unmanned aerial vehicle rapid photography modeling system, which comprises a data acquisition unit, a signal transmission unit, a remote control terminal and a background modeling terminal; the data acquisition unit is communicatively connected with the remote control terminal and the background modeling terminal through the signal transmission unit;
the data acquisition unit comprises an unmanned aerial vehicle, a high-definition camera module and a positioning module; the signal transmission unit comprises a transmitter, a receiver and a controller; the remote control terminal comprises a remote control module, a navigation module, a display module and a route planning module; and the background modeling terminal comprises a processing computer, a modeling software module, a storage module and a display screen.
In an optional embodiment, the unmanned aerial vehicle further comprises a microprocessor, a power supply module, an electric quantity monitoring module and an electric quantity analysis module; the power supply module is electrically connected with the power-consuming components, the electric quantity monitoring module is communicatively connected with the electric quantity analysis module, and the electric quantity analysis module is communicatively connected with the microprocessor; the electric quantity monitoring module monitors the remaining electric quantity of the power supply module and sends the data to the electric quantity analysis module, the electric quantity analysis module analyzes how long the unmanned aerial vehicle can continue operating from the remaining-quantity data and sends the analysis information to the microprocessor in real time, and the microprocessor sends the information back to the remote control terminal; when the remaining electric quantity of the power supply module falls below a set value, the remote operator commands the unmanned aerial vehicle to return, preventing it from exhausting its power and crashing;
the power supply module comprises a rechargeable lithium battery and a voltage stabilizing circuit; the lithium battery provides power and the voltage stabilizing circuit stabilizes the voltage to ensure a stable supply voltage; a solar panel is arranged on the unmanned aerial vehicle; the solar panel converts solar energy into electric energy and is electrically connected with the power supply module, which helps improve the endurance of the unmanned aerial vehicle and gives a good use effect.
In use, GIS map information is first acquired, the area of the designated region is calculated, the designated region is divided into a plurality of monitoring imaging areas according to the area calculation result, and an unmanned aerial vehicle is arranged in each monitoring imaging area. The flight parameters of the unmanned aerial vehicle are then set to generate its flight route; the flight track adopts a spiral circling track and the unmanned aerial vehicle flies multiple circuits over the monitoring imaging area. The unmanned aerial vehicle then monitors and patrols the monitoring imaging area and acquires imaging-area information by aerial photography, sending the information to the background modeling terminal in real time. The background modeling terminal receives the collected information and processes the collected images to obtain the landform and scene of the project, and the modeling software models from the processed information to generate a three-dimensional model of the imaging area. A fire source point in the imaging area is searched for according to the three-dimensional model; once a fire source point is found, its information is sent to the fire-fighting command center in real time. The fire-fighting command center formulates a rescue scheme according to the fire source point information, introduces 5G technology for information transmission, determines the bearing with the aid of the map and the three-dimensional model of the imaging area, and finally dispatches personnel for fire-fighting treatment, so that a forest fire can be discovered in time and responded to quickly; the use effect is excellent and popularization is facilitated;
in addition, the aerial photography path is specifically optimized, immediate shooting, immediate transmission and immediate processing are achieved, the post-collection workload and working time are greatly reduced, working efficiency is improved, modeling can be carried out quickly and accurately, the modeling approach is more convenient, and the model coverage is wider.
It is to be understood that the above-described embodiments of the present invention merely illustrate or explain the principles of the invention and are not to be construed as limiting the invention. Therefore, any modification, equivalent replacement, improvement and the like made without departing from the spirit and scope of the present invention shall be included in the protection scope of the present invention. Further, the appended claims are intended to cover all such variations and modifications as fall within the scope and boundaries of the appended claims or the equivalents of such scope and boundaries.

Claims (10)

1. An unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters, characterized by comprising the following steps:
S1, acquiring GIS map information, calculating the area of the designated region, dividing the designated region into a plurality of monitoring imaging areas according to the area calculation result, and arranging an unmanned aerial vehicle in each monitoring imaging area;
S2, setting flight parameters of the unmanned aerial vehicle and generating a flight route for the unmanned aerial vehicle, wherein the flight track of the unmanned aerial vehicle adopts a spiral circling track and the unmanned aerial vehicle flies multiple circuits over the monitoring imaging area;
S3, monitoring and patrolling the monitoring imaging area with the unmanned aerial vehicle and acquiring imaging-area information by aerial photography, wherein the unmanned aerial vehicle sends the imaging-area information to the background modeling terminal in real time;
S4, receiving the collected information at the background modeling terminal and processing the collected images to obtain the landform and scene of the project;
S5, modeling with the modeling software according to the processed information to generate a three-dimensional model of the imaging area;
S6, searching the imaging area for a fire source point according to the three-dimensional model, and sending the fire source point information to the fire-fighting command center in real time once a fire source point is found in the imaging area;
S7, the fire-fighting command center formulates a rescue scheme according to the fire source point information, introduces 5G technology for information transmission, determines the bearing with the aid of the map and the three-dimensional model of the imaging area, and finally dispatches personnel for fire-fighting treatment.
2. The unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters of claim 1, wherein the flight parameters include the flight speed of the unmanned aerial vehicle, the gimbal pitch angle, the take-off and landing flight height, the minimum flight height and the maximum flight height.
3. The unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters of claim 1, wherein in S2 the specific steps of generating the flight route of the unmanned aerial vehicle are as follows:
acquiring the central point of the monitoring imaging area and the midpoints of its edges, and obtaining the lengths of the lines connecting the midpoints of the two pairs of opposite edges of the monitoring imaging area;
generating an elliptical virtual route by taking the central point of the monitoring imaging area as the center of an ellipse and taking the lengths of the two midpoint connecting lines as the major-axis length and the minor-axis length of the ellipse respectively;
selecting a plurality of reference points on the midpoint connecting line of one pair of opposite edges, drawing through each reference point a reference line perpendicular to that midpoint connecting line, and taking the intersection points of the reference lines with the elliptical virtual route as waypoint coordinates; connecting the waypoint coordinates in sequence to generate an elliptical flight route;
and reducing the major-axis length and the minor-axis length by a given numerical step, generating a plurality of elliptical flight routes according to the above steps, and connecting adjacent elliptical flight routes head to tail to finally form the spiral circling track.
4. The unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters of claim 1, wherein in S4 the aerial images acquired in real time are processed as follows:
reconstructing each pixel of the aerial image;
verifying hypothesized plane parameters for each pixel of the acquired aerial image to obtain the depth of each pixel and extracting dense point cloud data;
cross-validating the extracted dense point cloud, and filtering out noise points according to the collected point cloud quality information and the image quality information of the reconstructed points;
and tiling the point cloud into surface facets through a Delaunay model, establishing a directed acyclic graph from the occlusion information of the facets, and further filtering out noise facets.
5. The method of claim 4, wherein the process further comprises the following operations:
detecting and matching features in the aerial images acquired by the unmanned aerial vehicle and constructing an epipolar geometry graph; estimating the camera poses and the scene structure in a hybrid manner, and optimizing both by bundle adjustment to obtain a dense description of the scene; and deriving the geometry, texture and reflection properties of the scene.
6. The method of claim 4, wherein image analysis is performed by aerial triangulation to convert the series of two-dimensional aerial images into a three-dimensional dense point cloud.
7. The unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters of claim 1, wherein S5 further comprises the following steps:
identifying interfering objects to be removed on the three-dimensional model, selecting and deleting the interfering objects, fitting a curved surface to each hole left after deletion using the elevations around the hole, filling the hole with the fitted surface, and reconstructing the triangular mesh on the model surface so that the three-dimensional model acquires a new geometric structure; removing all interfering objects and repairing the remaining holes in turn to form the three-dimensional model with the new geometric structure.
8. The unmanned aerial vehicle rapid photography modeling method for areas under ten thousand square meters according to any one of claims 1 to 7, further providing an unmanned aerial vehicle rapid photography modeling system, characterized in that the system comprises a data acquisition unit, a signal transmission unit, a remote control terminal and a background modeling terminal; the data acquisition unit is communicatively connected with the remote control terminal and the background modeling terminal through the signal transmission unit;
the data acquisition unit comprises an unmanned aerial vehicle, a high-definition camera module and a positioning module; the signal transmission unit comprises a transmitter, a receiver and a controller; the remote control terminal comprises a remote control module, a navigation module, a display module and a route planning module; and the background modeling terminal comprises a processing computer, a modeling software module, a storage module and a display screen.
9. The unmanned aerial vehicle rapid photography modeling system of claim 8, wherein the unmanned aerial vehicle further comprises a microprocessor, a power supply module, an electric quantity monitoring module and an electric quantity analysis module;
the power supply module is electrically connected with the power-consuming components, the electric quantity monitoring module is communicatively connected with the electric quantity analysis module, and the electric quantity analysis module is communicatively connected with the microprocessor; the electric quantity monitoring module monitors the remaining electric quantity of the power supply module and sends the data to the electric quantity analysis module, the electric quantity analysis module analyzes how long the unmanned aerial vehicle can continue operating from the remaining-quantity data and sends the analysis information to the microprocessor in real time, and the microprocessor sends the information back to the remote control terminal; when the remaining electric quantity of the power supply module falls below a set value, the remote operator commands the unmanned aerial vehicle to return, preventing it from exhausting its power and crashing.
10. The unmanned aerial vehicle rapid photography modeling system of claim 9, wherein the power supply module comprises a rechargeable lithium battery and a voltage stabilizing circuit; the lithium battery provides power and the voltage stabilizing circuit stabilizes the voltage to ensure a stable supply voltage; a solar panel is arranged on the unmanned aerial vehicle and is electrically connected with the power supply module.
CN202011300874.0A 2020-11-19 2020-11-19 Unmanned quick shooting modeling method below ten thousand square meters Pending CN112288818A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011300874.0A CN112288818A (en) 2020-11-19 2020-11-19 Unmanned quick shooting modeling method below ten thousand square meters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011300874.0A CN112288818A (en) 2020-11-19 2020-11-19 Unmanned quick shooting modeling method below ten thousand square meters

Publications (1)

Publication Number Publication Date
CN112288818A 2021-01-29

Family

ID=74398227

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011300874.0A Pending CN112288818A (en) 2020-11-19 2020-11-19 Unmanned quick shooting modeling method below ten thousand square meters

Country Status (1)

Country Link
CN (1) CN112288818A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113344956A (en) * 2021-06-21 2021-09-03 深圳市武测空间信息有限公司 Ground feature contour extraction and classification method based on unmanned aerial vehicle aerial photography three-dimensional modeling
CN113867410A (en) * 2021-11-17 2021-12-31 武汉大势智慧科技有限公司 Unmanned aerial vehicle aerial photography data acquisition mode identification method and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107341851A (en) * 2017-06-26 2017-11-10 深圳珠科创新技术有限公司 Real-time three-dimensional modeling method and system based on unmanned plane image data
CN108871289A (en) * 2018-06-01 2018-11-23 广州中科云图智能科技有限公司 A kind of circular airborne survey method and system based on unmanned plane
US20190066283A1 (en) * 2017-08-23 2019-02-28 General Electric Company Three-dimensional modeling of an object
CN110264570A (en) * 2019-06-13 2019-09-20 咏峰(大连)科技有限公司 A kind of autonomous cruising inspection system in forest land based on unmanned plane
CN111091613A (en) * 2019-10-31 2020-05-01 中国化学工程第六建设有限公司 Three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107341851A (en) * 2017-06-26 2017-11-10 深圳珠科创新技术有限公司 Real-time three-dimensional modeling method and system based on unmanned plane image data
US20190066283A1 (en) * 2017-08-23 2019-02-28 General Electric Company Three-dimensional modeling of an object
CN108871289A (en) * 2018-06-01 2018-11-23 广州中科云图智能科技有限公司 A kind of circular airborne survey method and system based on unmanned plane
CN110264570A (en) * 2019-06-13 2019-09-20 咏峰(大连)科技有限公司 A kind of autonomous cruising inspection system in forest land based on unmanned plane
CN111091613A (en) * 2019-10-31 2020-05-01 中国化学工程第六建设有限公司 Three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113344956A (en) * 2021-06-21 2021-09-03 深圳市武测空间信息有限公司 Ground feature contour extraction and classification method based on unmanned aerial vehicle aerial photography three-dimensional modeling
CN113344956B (en) * 2021-06-21 2022-02-01 深圳市武测空间信息有限公司 Ground feature contour extraction and classification method based on unmanned aerial vehicle aerial photography three-dimensional modeling
CN113867410A (en) * 2021-11-17 2021-12-31 武汉大势智慧科技有限公司 Unmanned aerial vehicle aerial photography data acquisition mode identification method and system
CN113867410B (en) * 2021-11-17 2023-11-03 武汉大势智慧科技有限公司 Unmanned aerial vehicle aerial photographing data acquisition mode identification method and system

Similar Documents

Publication Publication Date Title
He et al. Research of multi-rotor UAVs detailed autonomous inspection technology of transmission lines based on route planning
CN107504957B (en) Method for rapidly constructing three-dimensional terrain model by using unmanned aerial vehicle multi-view camera shooting
CN108701373B (en) Three-dimensional reconstruction method, system and device based on unmanned aerial vehicle aerial photography
CN104168455B (en) A kind of space base large scene camera system and method
CN108614274B (en) Cross type crossing line distance measuring method and device based on multi-rotor unmanned aerial vehicle
CN111091613A (en) Three-dimensional live-action modeling method based on unmanned aerial vehicle aerial survey
CN111415409B (en) Modeling method, system, equipment and storage medium based on oblique photography
CN114020002B (en) Method, device and equipment for unmanned aerial vehicle to inspect fan blade, unmanned aerial vehicle and medium
CN109238240A (en) A kind of unmanned plane oblique photograph method that taking landform into account and its camera chain
CN110530366A (en) A kind of flight course planning system and method for transmission line of electricity modeling
CN208027170U (en) A kind of power-line patrolling unmanned plane and system
CN108344397A (en) Automation modeling method, system and its auxiliary device based on oblique photograph technology
CN110428501B (en) Panoramic image generation method and device, electronic equipment and readable storage medium
CN109035665A (en) A kind of novel forest fire early-warning system and fire alarm method
CN112288818A (en) Unmanned quick shooting modeling method below ten thousand square meters
CN109541613A (en) Aerial high-voltage conducting wire cruising inspection system and method for inspecting based on single line laser ranging
CN105758384A (en) Unmanned aerial vehicle rocking oblique photograph system
CN111244822B (en) Fixed-wing unmanned aerial vehicle line patrol method, system and device in complex geographic environment
CN113433971A (en) Method, device, equipment and storage medium for acquiring data of high-rise building exterior wall
CN110647170A (en) Navigation mark inspection device and method based on unmanned aerial vehicle
CN111522360A (en) Banded oblique photography automatic route planning method based on electric power iron tower
CN113077561A (en) Intelligent inspection system for unmanned aerial vehicle
CN210835732U (en) Beacon inspection device based on unmanned aerial vehicle
CN116129064A (en) Electronic map generation method, device, equipment and storage medium
CN206649347U (en) A kind of application deployment system based on unmanned vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210129