CN112181958A - Method for quickly eliminating five-lens redundant data of oblique photography of unmanned aerial vehicle - Google Patents

Method for quickly eliminating five-lens redundant data of oblique photography of unmanned aerial vehicle

Info

Publication number
CN112181958A
CN112181958A (application CN202010947842.3A)
Authority
CN
China
Prior art keywords
lens
aerial vehicle
unmanned aerial
pos data
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010947842.3A
Other languages
Chinese (zh)
Other versions
CN112181958B (en)
Inventor
林志军
夏斌
张红阳
张浩民
***
关俊峰
郑日平
陈剑平
景行
杨玺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangmen Power Supply Bureau of Guangdong Power Grid Co Ltd
Original Assignee
Jiangmen Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangmen Power Supply Bureau of Guangdong Power Grid Co Ltd filed Critical Jiangmen Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority to CN202010947842.3A priority Critical patent/CN112181958B/en
Publication of CN112181958A publication Critical patent/CN112181958A/en
Application granted granted Critical
Publication of CN112181958B publication Critical patent/CN112181958B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval of structured data, e.g. relational data
    • G06F 16/21: Design, administration or maintenance of databases
    • G06F 16/215: Improving data quality; Data cleansing, e.g. de-duplication, removing invalid entries or correcting typographical errors
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval of structured data, e.g. relational data
    • G06F 16/24: Querying
    • G06F 16/245: Query processing
    • G06F 16/2457: Query processing with adaptation to user needs

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Computational Linguistics (AREA)
  • Studio Devices (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to the field of processing images captured by an unmanned aerial vehicle, and in particular to a method for quickly removing five-lens redundant data from unmanned aerial vehicle oblique photography, comprising the following steps. Step one: organize the POS data of the unmanned aerial vehicle. Step two: determine the starting-position index of the POS data of each flight strip of the unmanned aerial vehicle. Step three: judge whether the second flight strip lies on the left or the right of the first strip, and thereby determine whether the first strip discards its left-lens or right-lens POS data. Step four: remove the redundant data within the flight strips. Step five: save the remaining POS data. By identifying the redundant data that the unmanned aerial vehicle can safely discard and removing it in batches, the method speeds up redundancy removal and shortens data-processing time. Because all of the removed overlapping images were shot outside the modelling area, removing them does not affect the accuracy of three-dimensional modelling, and the accuracy of the finished three-dimensional model is ensured.

Description

Method for quickly eliminating five-lens redundant data of oblique photography of unmanned aerial vehicle
Technical Field
The invention relates to the field of processing images captured by an unmanned aerial vehicle, and in particular to a method for quickly removing five-lens redundant data from unmanned aerial vehicle oblique photography.
Background
In current electric-power infrastructure construction, power corridors often need to be rapidly modelled in three dimensions. Present-day real-scene three-dimensional modelling mostly uses 5-lens oblique data, and the flight coverage must be extended to guarantee sufficient forward and side overlap, which introduces a degree of data redundancy, roughly 20% of the data volume. In the modelling process, 20% more data typically means about 50% more processing time. Quickly rejecting the redundant data of 5-lens unmanned aerial vehicle oblique photography is therefore a problem that urgently needs to be solved.
Chinese patent document CN110267101A, published 20 September 2019, discloses an automatic key-frame extraction method for unmanned aerial vehicle aerial video based on rapid three-dimensional mosaicking. The original video is trimmed by pre-processing to remove invalid segments recorded before take-off, so that the number of key frames is reduced as far as possible while the overlap between key frames is preserved, greatly increasing the speed of three-dimensional video mosaicking.
However, because that method discards key frames, it speeds up three-dimensional modelling but reduces its accuracy, since some key frames are lost. It also requires selecting key frames and judging thresholds, so removing the key frames itself consumes considerable time.
Disclosure of Invention
The invention aims to overcome the reduced three-dimensional-modelling accuracy and the long data-removal time of the prior art, and provides a method for quickly removing five-lens redundant data from unmanned aerial vehicle oblique photography that removes the redundant data quickly while preserving the accuracy of three-dimensional modelling.
To solve the above technical problems, the invention adopts the following technical scheme. The method for quickly removing the five-lens redundant data of unmanned aerial vehicle oblique photography comprises the following steps:
Step one: organize the POS data of the unmanned aerial vehicle and divide it into left-lens, right-lens, front-lens, rear-lens and down-lens POS data. During flight operations, the images acquired by the unmanned aerial vehicle usually carry matching POS data, which makes later image processing more convenient. The POS data consist mainly of GPS and IMU data, i.e. the exterior-orientation elements of oblique photogrammetry: latitude, longitude, elevation, heading angle, pitch angle and roll angle. The unmanned aerial vehicle carries 5 lenses but records only one set of POS data, so the POS data of the five lenses are identical; the left-lens POS data and image names are located in the data, the left-lens POS data are copied four times, and the image names are changed to obtain the POS data of the other lenses.
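As a rough illustration of step one, the duplication of the single shared POS record to all five lenses might be sketched as follows; the whitespace-separated file layout, the field order and the lens-name suffixes are assumptions for illustration, not taken from the patent:

```python
# Assumed POS line layout: image_name latitude longitude elevation
# heading pitch roll (one exposure per line).
LENSES = ["left", "right", "front", "back", "down"]

def expand_pos_to_five_lenses(pos_lines):
    """Copy the single shared POS record to all five lenses,
    renaming the image file for each lens."""
    per_lens = {lens: [] for lens in LENSES}
    for line in pos_lines:
        fields = line.split()
        name, values = fields[0], fields[1:]
        stem = name.rsplit(".", 1)[0]
        for lens in LENSES:
            # Same exterior-orientation values, lens-specific image name.
            per_lens[lens].append([f"{stem}_{lens}.jpg"] + values)
    return per_lens
```

The five per-lens record lists can then be filtered independently in the later steps.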
Step two: determine the starting-position index of the POS data of each flight strip of the unmanned aerial vehicle; the POS data to be removed are then found through these starting-position indices.
Step three: judge whether the second flight strip of the unmanned aerial vehicle lies on the left or the right of the first strip. If the second strip lies on the left of the first strip, remove the right-lens POS data of the first strip and the left-lens POS data of the second strip; if the second strip lies on the right of the first strip, remove the left-lens POS data of the first strip and the right-lens POS data of the second strip. The unmanned aerial vehicle flies in an S pattern, so the lens whose POS data are removed from each odd-numbered strip is the same as for the first strip, and the lens whose POS data are removed from each even-numbered strip is the same as for the second strip.
Step four: remove the redundant data within the flight strips of the unmanned aerial vehicle.
step five: the remaining POS data is saved.
In this technical scheme, the redundant data that the unmanned aerial vehicle can discard are identified and removed in batches, which speeds up redundancy removal. All of the removed overlapping images were shot outside the modelling area, so removing them does not affect the accuracy of three-dimensional modelling.
Preferably, in step one, while organizing the POS data of the unmanned aerial vehicle, the ground POS data of the unmanned aerial vehicle are deleted. Pictures taken on the ground are thus removed, reducing the data volume.
Preferably, in step one, POS data whose elevation differs from the average elevation by more than the relative flying height of the flight mission are removed; the relative flying height of the mission is set to 45-55 metres. This test automatically removes camera exposures triggered by ground test shots or accidental triggers, deleting that redundant data and effectively reducing the redundancy.
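The elevation screen described above can be sketched as a simple filter. The record layout is an assumption, and the 50 m default is the value used in the embodiment:

```python
def remove_ground_shots(records, relative_height=50.0):
    """Drop records whose elevation differs from the mean elevation by
    more than the mission's relative flying height.
    records: list of (image_name, elevation) tuples."""
    mean_elev = sum(e for _, e in records) / len(records)
    return [r for r in records if abs(r[1] - mean_elev) <= relative_height]
```

Note that a large number of ground shots would pull the mean down; the sketch assumes, as the patent implies, that airborne exposures dominate the data set.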
Preferably, in step two, the starting position of each flight strip is determined by calculating the signs of the longitude and latitude differences between consecutive points: the point at which the sign changes is the starting-position index of that strip's POS data.
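A minimal sketch of this sign-change test, assuming latitude is the dominant flight axis (for east-west strips the same test would use longitude instead):

```python
def strip_start_indices(latitudes):
    """Return the indices where a new strip begins: index 0, plus every
    point at which the sign of the point-to-point latitude difference
    flips (the S-pattern reverses direction there)."""
    starts = [0]
    prev_sign = 0
    for i in range(1, len(latitudes)):
        diff = latitudes[i] - latitudes[i - 1]
        sign = (diff > 0) - (diff < 0)
        if prev_sign and sign and sign != prev_sign:
            starts.append(i)  # direction reversed: next strip starts here
        if sign:
            prev_sign = sign
    return starts
```

Real GPS data would need a small tolerance instead of an exact sign test, which the patent does not specify.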
Preferably, in step three, whether the second flight strip is on the left or the right of the first strip is determined by comparing the average longitude and latitude of the second strip with those of the first strip.
Preferably, in step four, the left-lens or right-lens POS data of the first N and last N flight strips of the unmanned aerial vehicle that fall outside the survey area are removed.
Preferably, only the left-lens or right-lens POS data of the first three and last three flight strips of the unmanned aerial vehicle that fall outside the survey area are removed. Removing only the redundant data beyond the survey area reduces the volume of data to be processed while avoiding the deletion of useful data, so the accuracy of three-dimensional modelling is not affected.
Preferably, the down-lens POS data of the first and last flight strips of the unmanned aerial vehicle are removed. Because the down lens points straight down, the first and last strips lie outside the survey area however the aircraft flies; their data are redundant and can be deleted directly.
Preferably, the left, right, front and rear lenses of the unmanned aerial vehicle are all at 45 degrees to the horizontal plane, and the down lens of the unmanned aerial vehicle points vertically downwards.
Compared with the prior art, the beneficial effects are: the redundant data the unmanned aerial vehicle can discard are identified and removed in batches, which speeds up redundancy removal and shortens data-processing time. All of the removed overlapping images were shot outside the modelling area, so removing them does not affect the accuracy of three-dimensional modelling, and the accuracy of the finished model is ensured.
Drawings
FIG. 1 is a flow chart of a method for rapidly eliminating redundant data of five lenses of unmanned aerial vehicle oblique photography according to the invention;
fig. 2 is a schematic view of the flight direction and flight band distribution of the drone of the present invention;
fig. 3 is another schematic view of the flight direction and flight band distribution of the drone of the present invention;
fig. 4 is a schematic view of another flight direction and flight band distribution of the drone of the present invention;
fig. 5 is another schematic view of the flight direction and flight band distribution of the drone of the present invention;
fig. 6 is a schematic view of another flight direction and flight band distribution of the drone of the present invention;
fig. 7 is another schematic view of the flight direction and flight band distribution of the drone of the present invention;
fig. 8 is a schematic view of another flight direction and flight band distribution of the drone of the present invention;
fig. 9 is another schematic view of the flight direction and flight band distribution of the drone of the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent; for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted. The positional relationships depicted in the drawings are for illustrative purposes only and are not to be construed as limiting the present patent.
The same or similar reference numerals in the drawings of the embodiments of the present invention correspond to the same or similar components; in the description of the present invention, it should be understood that if there are terms such as "upper", "lower", "left", "right", "long", "short", etc., indicating orientations or positional relationships based on the orientations or positional relationships shown in the drawings, it is only for convenience of description and simplicity of description, but does not indicate or imply that the device or element referred to must have a specific orientation, be constructed in a specific orientation, and be operated, and therefore, the terms describing the positional relationships in the drawings are only used for illustrative purposes and are not to be construed as limitations of the present patent, and specific meanings of the terms may be understood by those skilled in the art according to specific situations.
The technical scheme of the invention is further described in detail by the following specific embodiments in combination with the attached drawings:
Embodiment
Fig. 1 shows an embodiment of a method for quickly removing redundant data of five lenses in oblique photography of an unmanned aerial vehicle, which includes the following steps:
Step one: the left, right, front and rear lenses of the unmanned aerial vehicle are all at 45 degrees to the horizontal plane, and the down lens points vertically downwards. Organize the POS data of the unmanned aerial vehicle and divide it into left-lens, right-lens, front-lens, rear-lens and down-lens POS data. The POS data consist mainly of GPS and IMU data, i.e. the exterior-orientation elements of oblique photogrammetry: latitude, longitude, elevation, heading angle, pitch angle and roll angle. The unmanned aerial vehicle carries 5 lenses but records only one set of POS data, so the POS data of the five lenses are identical; the left-lens POS data and image names are located in the data, the left-lens POS data are copied four times, and the image names are changed to obtain the POS data of the other lenses. During data organization, POS data whose elevation differs from the average elevation by more than the relative flying height of the flight mission are removed, with the relative flying height set to 50 metres; this removes camera exposures triggered by ground test shots or accidental triggers, deleting that redundant data and effectively reducing the redundancy.
Step two: the starting position of each flight strip is judged by calculating the signs of the longitude and latitude differences between consecutive points; the point at which the sign changes is the starting-position index of that strip's POS data. The starting-position index of each flight strip of the unmanned aerial vehicle is thus confirmed, and the POS data to be removed are found through these indices.
Step three: judge whether the second flight strip of the unmanned aerial vehicle lies on the left or the right of the first strip. If the second strip lies on the left of the first strip, remove the right-lens POS data of the first strip and the left-lens POS data of the second strip; if the second strip lies on the right of the first strip, remove the left-lens POS data of the first strip and the right-lens POS data of the second strip. The unmanned aerial vehicle flies in an S pattern, so the lens whose POS data are removed from each odd-numbered strip is the same as for the first strip, and the lens whose POS data are removed from each even-numbered strip is the same as for the second strip.
in the step, according to the actual flight direction of the airplane, the airplane is distinguished according to the azimuth angles and is divided into eight directions, namely north, northeast, east, southeast, south, southwest, west and northwest. The starting swath direction may be summarized as the eight cases of the red arrow in the following figures, where the red arrow represents the starting swath direction.
As shown in figs. 2-9, arrow 1 indicates the flight direction of the starting strip (i.e. the first flight strip). Arrow 2 represents the flight direction of the second strip, which has two cases: the first to the left of the starting direction (arrow 2' in the figures); the second to the right of the starting direction (arrow 2 in the figures).
Determining the starting flight direction of the starting flight zone
The direction of the starting strip is judged from the signs of the latitude and longitude differences of the first flight strip.
For fig. 2: the latitude difference ΔB > 0 and the longitude difference ΔL > 0, which indicates that the aircraft's first strip flies towards the north-east, as shown by the red arrow in the figure.
The starting flight directions of the strips shown in figs. 3-9 are judged from the signs of the longitude and latitude differences in the same way.
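The eight-direction classification behind figs. 2-9 can be sketched from the signs of the two differences. The handling of an exactly zero difference is an assumption, since the patent does not state a tolerance:

```python
def strip_direction(dB, dL):
    """Classify the first strip's heading from the sign of the latitude
    difference dB and the longitude difference dL over the strip."""
    ns = "north" if dB > 0 else "south" if dB < 0 else ""
    ew = "east" if dL > 0 else "west" if dL < 0 else ""
    return (ns + ew) or "undetermined"
```

With ΔB > 0 and ΔL > 0, as in fig. 2, this yields "northeast", matching the red arrow in that figure.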
Determining whether the second flight band is to the left or right of the starting flight band
By comparing the average latitude of the second strip (denoted B̄₂) with the average latitude of the first strip (denoted B̄₁), it is determined whether the second strip is to the left or the right of the starting strip.
For fig. 2:
If B̄₂ < B̄₁, the average latitude of the second strip is smaller than that of the first strip, i.e. the second strip is on the right of the first strip (arrow 2 in the figure), and the survey area is also on the right of the first strip;
if B̄₂ > B̄₁, the average latitude of the second strip is larger than that of the first strip, i.e. the second strip is on the left of the first strip (arrow 2' in the figure), and the survey area is also on the left of the first strip.
similarly, the position of the second flight band of fig. 3-5 is determined by comparing the average latitudes. The location of the second flight band of fig. 6-9 is determined by comparing the average longitudes.
Step four: remove the left-lens or right-lens POS data of the first three and last three flight strips of the unmanned aerial vehicle that fall outside the survey area, and remove the down-lens POS data of the first and last strips. Removing only the redundant data beyond the survey area reduces the volume of data to be processed while avoiding the deletion of useful data, so the accuracy of three-dimensional modelling is not affected. Because the down lens points straight down, the first and last strips lie outside the survey area however the aircraft flies; their data are redundant and can be deleted directly.
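A hedged sketch of step four, assuming each strip is a dict mapping lens name to its POS records, and that the caller knows which side lens looks away from the survey area on the leading and trailing strips (the `outer_lens_*` parameters are illustrative names, not from the patent):

```python
def remove_edge_redundancy(strips, outer_lens_first, outer_lens_last, n=3):
    """Drop the outward-facing side-lens POS data of the first n and
    last n strips (those images fall outside the survey area), and the
    down-lens data of the very first and very last strip."""
    for strip in strips[:n]:
        strip.pop(outer_lens_first, None)
    for strip in strips[-n:]:
        strip.pop(outer_lens_last, None)
    # The down lens of the first and last strip is always redundant.
    strips[0].pop("down", None)
    strips[-1].pop("down", None)
    return strips
```

The sketch assumes at least 2n strips; with fewer, the leading and trailing slices would overlap and both side lenses of some strips would be dropped.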
Step five: the remaining POS data is saved.
The beneficial effects of this embodiment: the redundant data the unmanned aerial vehicle can discard are identified and removed in batches, which speeds up redundancy removal and shortens data-processing time. All of the removed overlapping images were shot outside the modelling area, so removing them does not affect the accuracy of three-dimensional modelling, and the accuracy of the finished model is ensured.
It should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the invention and are not intended to limit its embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to enumerate all embodiments exhaustively here. Any modification, equivalent replacement or improvement made within the spirit and principle of the present invention shall be included in the protection scope of the claims of the present invention.

Claims (10)

1. A method for quickly eliminating five-lens redundant data of unmanned aerial vehicle oblique photography is characterized by comprising the following steps:
the method comprises the following steps: arranging POS data of the unmanned aerial vehicle, and dividing the POS data into left lens POS data, right lens POS data, front lens POS data, rear lens POS data and lower lens POS data;
step two: confirming the initial position index of the POS data of each flight zone of the unmanned aerial vehicle;
step three: judging whether the second flight strip of the unmanned aerial vehicle is on the left or the right of the first flight strip; if the second strip is on the left of the first strip, rejecting the right-lens POS data of the first strip and the left-lens POS data of the second strip; if the second strip is on the right of the first strip, rejecting the left-lens POS data of the first strip and the right-lens POS data of the second strip; the unmanned aerial vehicle flies in an S pattern, the lens whose POS data are rejected in each odd-numbered strip being the same as for the first strip, and the lens whose POS data are rejected in each even-numbered strip being the same as for the second strip;
step four: redundant data in the unmanned aerial vehicle flight zone are removed;
step five: the remaining POS data is saved.
2. The method for rapidly eliminating the five-lens redundant data of the oblique photography of the unmanned aerial vehicle as claimed in claim 1, wherein in the step one, the ground POS data of the unmanned aerial vehicle is deleted during the process of arranging the POS data of the unmanned aerial vehicle.
3. The method for rapidly eliminating the five-lens redundant data of the unmanned aerial vehicle oblique photography according to claim 2, wherein in the first step, POS data with the absolute value of the difference between the elevation and the average elevation value larger than the relative altitude of the flight mission are eliminated.
4. The method for rapidly eliminating the five-lens redundant data of the unmanned aerial vehicle oblique photography according to claim 3, wherein the set value of the relative flight altitude of the flight mission is 45-55 m.
5. The method as claimed in claim 1, wherein in step two the starting position of each flight strip is determined by calculating the signs of the longitude and latitude differences between consecutive points, the point at which the sign changes being the starting-position index of the POS data of that strip.
6. The method as claimed in claim 1, wherein in step three, the second navigation band is determined to be on the left side or the right side of the first navigation band by calculating the average longitude and latitude of the second navigation band and the average longitude and latitude of the first navigation band.
7. The method for rapidly eliminating five-lens redundant data of unmanned aerial vehicle oblique photography according to claim 1, wherein in step four the left-lens or right-lens POS data of the first N and last N flight strips of the unmanned aerial vehicle that fall outside the survey area are rejected.
8. The method for rapidly rejecting the five-lens redundant data of the oblique photography of the unmanned aerial vehicle as claimed in claim 7, wherein the left lens POS data or the right lens POS data of the unmanned aerial vehicle with the first three flight zones and the last three flight zones beyond the measuring area range are rejected.
9. The method for rapidly rejecting the five-lens redundant data of the oblique photography of the unmanned aerial vehicle as claimed in claim 7, wherein the lower lens POS data of the first aerial zone and the last aerial zone of the unmanned aerial vehicle are rejected.
10. The method for rapidly rejecting the five-lens redundant data of the oblique photography of the unmanned aerial vehicle as claimed in any one of claims 1 to 9, wherein the left lens, the right lens, the front lens and the rear lens of the unmanned aerial vehicle are all at an angle of 45 degrees with respect to the horizontal plane, and the lower lens of the unmanned aerial vehicle faces vertically downwards.
CN202010947842.3A 2020-09-10 2020-09-10 Method for quickly eliminating five-lens redundant data of oblique photography of unmanned aerial vehicle Active CN112181958B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010947842.3A CN112181958B (en) 2020-09-10 2020-09-10 Method for quickly eliminating five-lens redundant data of oblique photography of unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010947842.3A CN112181958B (en) 2020-09-10 2020-09-10 Method for quickly eliminating five-lens redundant data of oblique photography of unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN112181958A true CN112181958A (en) 2021-01-05
CN112181958B CN112181958B (en) 2023-01-24

Family

ID=73921791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010947842.3A Active CN112181958B (en) 2020-09-10 2020-09-10 Method for quickly eliminating five-lens redundant data of oblique photography of unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN112181958B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103033805A (en) * 2012-12-25 2013-04-10 西安煤航信息产业有限公司 Automatic removal method for redundant data between air strips of airborne laser radar
CN106949880A (en) * 2017-03-10 2017-07-14 中国电建集团昆明勘测设计研究院有限公司 Method for processing overhigh local overlapping degree of unmanned aerial vehicle images in measurement area with large elevation fluctuation


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Yuan Hui et al.: "Fully automatic organization method for photographic flight strips using flight-control data of low-altitude unmanned aerial vehicles", Science of Surveying and Mapping (《测绘科学》) *

Also Published As

Publication number Publication date
CN112181958B (en) 2023-01-24

Similar Documents

Publication Publication Date Title
CN112164015B (en) Monocular vision autonomous inspection image acquisition method and device and power inspection unmanned aerial vehicle
KR102018892B1 (en) Method and apparatus for controlling take-off and landing of unmanned aerial vehicle
US10621456B2 (en) Distance measurement method and apparatus, and unmanned aerial vehicle
KR101711602B1 (en) Safety inspection system using unmanned aircraft and method for controlling the same
WO2018209898A1 (en) Information processing device, aerial photographing path generation method, aerial photographing path generation system, program and recording medium
CN108460815A (en) Map road element edit methods and device
CN107194989A (en) The scene of a traffic accident three-dimensional reconstruction system and method taken photo by plane based on unmanned plane aircraft
CN106155086A (en) A kind of Road Detection unmanned plane and automatic cruising method thereof
CN109035294B (en) Image extraction system and method for moving target
CN111578904B (en) Unmanned aerial vehicle aerial surveying method and system based on equidistant spirals
US9299129B2 (en) Method and apparatus for removing shadow from aerial or satellite photograph
CN105872479A (en) Community grid managing, monitoring and early warning system based on unmanned aerial vehicle
JP2012137933A (en) Position specifying method of planimetric features to be photographed, program thereof, display map, photographic position acquiring method, program thereof and photographic position acquiring device
CN111913492A (en) Unmanned aerial vehicle safe landing method and device
CN105973206B (en) One kind is based on the air strips division methods of the algorithm of Douglas-general gram
CN113406014A (en) Oil spilling monitoring system and method based on multispectral imaging equipment
CN207068060U (en) The scene of a traffic accident three-dimensional reconstruction system taken photo by plane based on unmanned plane aircraft
CN111754451A (en) Surveying and mapping unmanned aerial vehicle achievement detection method and device, electronic equipment and storage medium
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
CN112181958B (en) Method for quickly eliminating five-lens redundant data of oblique photography of unmanned aerial vehicle
KR102488553B1 (en) Drone used 3d mapping method
CN111982076B (en) Single-lens unmanned aerial vehicle flight parameter setting method
CN111444385B (en) Electronic map real-time video mosaic method based on image corner matching
CN115950435A (en) Real-time positioning method for unmanned aerial vehicle inspection image
CN115512056A (en) Scene three-dimensional reconstruction method, device and equipment based on unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant