CN116164711A - Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer - Google Patents

Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer

Info

Publication number
CN116164711A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
photo
mapping
control points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310222060.7A
Other languages
Chinese (zh)
Other versions
CN116164711B (en)
Inventor
夏磊
贾明利
吴庆波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Jingyi Space Information Technology Co ltd
Original Assignee
Guangdong Jingyi Space Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Jingyi Space Information Technology Co ltd filed Critical Guangdong Jingyi Space Information Technology Co ltd
Priority to CN202310222060.7A priority Critical patent/CN116164711B/en
Publication of CN116164711A publication Critical patent/CN116164711A/en
Application granted granted Critical
Publication of CN116164711B publication Critical patent/CN116164711B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/14 Transformations for image registration, e.g. adjusting or mapping for alignment of images
    • G06T 3/147 Transformations for image registration, e.g. adjusting or mapping for alignment of images using affine transformations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4046 Scaling of whole images or parts thereof, e.g. expanding or contracting using neural networks
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to an unmanned aerial vehicle mapping method, a system, a medium and a computer, which have the following beneficial effects: by laying image control points on the ground of the mapping area and then using a deep-learning neural network to transform and cut the photos, the photos taken by the unmanned aerial vehicle can be spliced together more accurately; and by controlling the unmanned aerial vehicle to fly along an undulating rectangular-wave route, splicing errors caused by altitude errors during flight can be avoided, improving the accuracy of photo splicing.

Description

Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer
Technical Field
The invention relates to the technical field of remote control mapping, in particular to an unmanned aerial vehicle mapping method, an unmanned aerial vehicle mapping system, a medium and a computer.
Background
In the prior art, a specific area is usually mapped by nadir (vertically downward) photography from an unmanned aerial vehicle, generally as follows: the unmanned aerial vehicle is controlled to fly above the mapping area and take nadir photographs of it, correspondingly obtaining a number of photos covering parts of the mapping area, and the photos are then spliced together using image-stitching techniques to form a complete photo of the mapping area.
However, the above unmanned aerial vehicle mapping technique has the following problems: 1. The camera is not always pointed vertically downward during flight, so the photographs have a certain tilt. 2. To increase the coverage of each shot, a wide-angle lens is usually adopted, so the photographs show some distortion, which is largest near the edges. 3. Because the flying height of the unmanned aerial vehicle cannot be kept perfectly constant, the scale of the photographs has a certain error, and image misalignment easily occurs during splicing.
Disclosure of Invention
In view of the defects of the prior art, the invention aims to provide an unmanned aerial vehicle mapping method, system, medium and computer, so as to solve the problem that inconsistent photo scale easily causes photo misalignment in existing unmanned aerial vehicle mapping.
The technical aim of the invention is realized by the following technical scheme: a method of unmanned aerial vehicle mapping, comprising:
s1, arranging image control points in a mapping area;
s2, starting the unmanned aerial vehicle, controlling the unmanned aerial vehicle to perform self-checking, judging whether the unmanned aerial vehicle has faults, and if so, giving an alarm; if not, executing S3;
s3, controlling the unmanned aerial vehicle to take off, controlling the unmanned aerial vehicle to fly along a preset route, and shooting a mapping area to obtain a plurality of first photos containing image control points;
s4, respectively carrying out orthodontic treatment on the first photos to correspondingly obtain second photos;
and S5, splicing the second photos to correspondingly obtain a complete image of the mapping region.
Optionally, the laying of the image control points in the mapping area includes: and arranging image control points in a rectangular lattice mode in a mapping area.
Optionally, the orthorectification of the plurality of first photos includes:
s41, judging whether the number of the image control points on a plurality of first photos is larger than a preset number threshold one by one, if so, executing the step S42, and if not, discarding the first photos;
s42, extracting features of the imaging control points in the first photo, and correspondingly generating coordinates of the imaging control points on the first photo;
s43, cutting the first photo according to the coordinates to obtain a corresponding cut photo;
s44, carrying out affine transformation on the cut photo according to the coordinates to obtain a corresponding second photo.
Optionally, the step S43 further includes: acquiring the definition of the cut photo, judging whether the definition is larger than a definition threshold, if not, discarding the cut photo and re-executing the step S41; if yes, go to step S44.
Optionally, the controlling the unmanned aerial vehicle to fly along a predetermined route includes: controlling the unmanned aerial vehicle to fly with a vertically undulating motion so that its flight route forms a rectangular-wave route, and shooting the mapping area at the right-angle turns of the rectangular-wave route.
Optionally, the unmanned aerial vehicle is controlled to fly repeatedly for 2-4 times along the preset route.
Optionally, according to the distance between the image control points, the photos shot by the unmanned aerial vehicle at different heights are rescaled, and first photos with the same size proportion are correspondingly obtained.
An unmanned aerial vehicle mapping system, comprising:
an unmanned aerial vehicle self-checking module: used to control the unmanned aerial vehicle to perform a self-check and to judge whether the unmanned aerial vehicle has a fault;
a route setting module: used to set the flight route of the unmanned aerial vehicle;
an image control point detection module: used to detect the number of image control points on a first photo and to judge whether that number is larger than a preset number threshold;
a photo cutting module: used to cut the first photo according to the coordinates to obtain a corresponding cut photo;
a definition judging module: used to obtain the definition of the cut photo and to judge whether the definition is larger than a definition threshold;
a photo fusion module: used to splice the second photos to correspondingly obtain a complete image of the mapping area.
A computer device comprising a memory storing a computer program and a processor implementing the steps of the method described above when the processor executes the computer program.
A computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the method described above.
In summary, the invention has the following beneficial effects: by laying image control points on the ground of the mapping area and then using a deep-learning neural network to transform and cut the photos, the photos taken by the unmanned aerial vehicle can be spliced together more accurately; and by controlling the unmanned aerial vehicle to fly along an undulating rectangular-wave route, splicing errors caused by altitude errors during flight can be avoided, improving the accuracy of photo splicing.
Drawings
FIG. 1 is a flow chart of an unmanned aerial vehicle mapping method of the present invention;
FIG. 2 is a block diagram of an unmanned aerial vehicle mapping system of the present invention;
fig. 3 is an internal structural diagram of a computer device in an embodiment of the present invention.
In the figure: 1. the unmanned aerial vehicle self-checking module; 2. a route setting module; 3. an image control point detection module; 4. a photo cutting module; 5. a definition judging module; 6. and a photo fusion module.
Detailed Description
In order that the objects, features and advantages of the invention will be readily understood, a more particular description of the invention will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Several embodiments of the invention are presented in the figures. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein.
In the present invention, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances. The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature.
In the present invention, unless expressly stated or limited otherwise, a first feature "above" or "below" a second feature may include both the first and second features being in direct contact, as well as the first and second features not being in direct contact but being in contact with each other through additional features therebetween. Moreover, a first feature being "above," "over" and "on" a second feature includes the first feature being directly above and obliquely above the second feature, or simply indicating that the first feature is higher in level than the second feature. The first feature being "under", "below" and "beneath" the second feature includes the first feature being directly under and obliquely below the second feature, or simply means that the first feature is less level than the second feature. The terms "vertical," "horizontal," "left," "right," "up," "down," and the like are used for descriptive purposes only and are not to indicate or imply that the devices or elements being referred to must have a particular orientation, be constructed and operated in a particular orientation, and therefore should not be construed as limiting the invention.
The present invention will be described in detail below with reference to the accompanying drawings and examples.
The invention provides an unmanned aerial vehicle mapping method, as shown in fig. 1, comprising the following steps:
s1, arranging image control points in a mapping area;
s2, starting the unmanned aerial vehicle, controlling the unmanned aerial vehicle to perform self-checking, judging whether the unmanned aerial vehicle has faults, and if so, giving an alarm; if not, executing S3;
s3, controlling the unmanned aerial vehicle to take off, controlling the unmanned aerial vehicle to fly along a preset route, and shooting a mapping area to obtain a plurality of first photos containing image control points;
s4, respectively carrying out orthodontic treatment on the first photos to correspondingly obtain second photos;
and S5, splicing the second photos to correspondingly obtain a complete image of the mapping region.
In practical applications, image control points are control points that are laid out and measured directly in the field for photogrammetric control-point densification or mapping. Their distribution, number and tie-in measurement accuracy are determined by the aerial photography data, the subsequent processing workflow and the required mapping accuracy. Before the actual flight, the unmanned aerial vehicle must perform a self-check covering the power system, energy system, sensor system, camera system and communication system; if a problem is found during the self-check, the take-off procedure is stopped and an alarm is issued, preventing danger during flight. After take-off, the unmanned aerial vehicle is controlled to fly along the predetermined route and continuously photograph the mapping area, correspondingly obtaining photos of the corresponding parts of the mapping area.
Further, the arranging the image control points in the mapping area includes: and arranging image control points in a rectangular lattice mode in a mapping area.
In practical applications, the image control points provide a standard, accurate reference for the photos. To make the photos easier to process accurately, the image control points are arranged in a rectangular lattice, which allows the photos to be processed more reliably and quickly.
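Purely as an illustration (not part of the patent disclosure), a rectangular lattice of ground positions for the control points could be planned with a sketch like the following, where the area extents and spacing are assumed values:

```python
# Minimal sketch (assumed parameters): plan image control points on a regular
# rectangular lattice covering the mapping area.

def plan_control_points(x_min, y_min, x_max, y_max, spacing_m):
    """Return ground (x, y) positions on a rectangular lattice."""
    points = []
    y = y_min
    while y <= y_max:
        x = x_min
        while x <= x_max:
            points.append((x, y))
            x += spacing_m
        y += spacing_m
    return points

if __name__ == "__main__":
    # Example: a 500 m x 300 m mapping area with 100 m spacing (assumed figures).
    grid = plan_control_points(0.0, 0.0, 500.0, 300.0, 100.0)
    print(len(grid), "control points planned")
```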
Further, the orthorectification of the plurality of first photos respectively includes:
s41, judging whether the number of the image control points on a plurality of first photos is larger than a preset number threshold one by one, if so, executing the step S42, and if not, discarding the first photos;
s42, extracting features of the imaging control points in the first photo, and correspondingly generating coordinates of the imaging control points on the first photo;
s43, cutting the first photo according to the coordinates to obtain a corresponding cut photo;
s44, carrying out affine transformation on the cut photo according to the coordinates to obtain a corresponding second photo.
In practical applications, the footprint of each photo taken by the unmanned aerial vehicle is determined by the flying height and the predetermined route, so the number of image control points captured in a photo cannot be fully guaranteed. For example, a photo may contain only one image control point, or the few control points it does contain may lie far off-centre in the frame; such photos contribute little to the overall orthorectification. To avoid unsatisfactory orthorectification results, these photos are discarded, so that only well-rectified photos are used for splicing.
Specifically, the number of image control points in a first photo is determined by detecting the control points with a pre-trained object-detection neural network model. Photos of image control points are first used to train the network, which adopts a YOLO-v5 structure with a MobileNetV3 backbone. The trained model marks candidate control points with bounding boxes, each with a confidence score indicating how likely the boxed target is to be an image control point; the higher the confidence, the more likely the boxed target is a control point. In actual use, a confidence threshold is set according to the training results, and only targets in boxes whose confidence exceeds the threshold are identified as image control points. The number of image control points in each first photo is then counted and compared with the preset number threshold: first photos with more control points than the threshold are retained, and those with fewer are discarded.
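As a rough illustration of this screening step (the detector itself is only sketched: `detect_control_points` is a hypothetical stand-in for the pre-trained YOLO-v5/MobileNetV3 model, and both thresholds are assumed values):

```python
# Minimal sketch of step S41: count detected image control points per photo and
# keep only photos whose count exceeds a preset threshold.

CONFIDENCE_THRESHOLD = 0.5   # assumed; set from training results in practice
MIN_CONTROL_POINTS = 3       # assumed preset number threshold

def count_control_points(image, detect_control_points):
    """Count detections whose confidence exceeds the confidence threshold."""
    detections = detect_control_points(image)  # [(x1, y1, x2, y2, confidence), ...]
    return sum(1 for *_, conf in detections if conf > CONFIDENCE_THRESHOLD)

def screen_photos(photos, detect_control_points):
    """Return only the photos that contain enough image control points."""
    return [p for p in photos
            if count_control_points(p, detect_control_points) > MIN_CONTROL_POINTS]
```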
Feature extraction of the image control points mainly comprises the following steps: inputting a first photo containing image control points into a pre-trained convolutional neural network model with a Res2Net_v1b backbone, which generates a number of bounding boxes containing the image control points; convolving the first photo with the convolution kernels to generate the corresponding feature maps; and marking the coordinates of the image control points in the photo on the feature maps.
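For step S42, once bounding boxes are available the coordinates of the control points on the photo can be derived from them; a minimal sketch, assuming a (x1, y1, x2, y2, confidence) box format, is:

```python
# Minimal sketch of S42 (box format is an assumption): take the centre of each
# accepted bounding box as the pixel coordinate of an image control point.

def control_point_coordinates(detections, confidence_threshold=0.5):
    """Return the centre (u, v) pixel coordinate of each accepted bounding box."""
    coords = []
    for x1, y1, x2, y2, conf in detections:
        if conf > confidence_threshold:
            coords.append(((x1 + x2) / 2.0, (y1 + y2) / 2.0))
    return coords
```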
The first photo is then cut according to these coordinates to obtain the corresponding cut photo. The image control points are mainly used to trim away the heavily distorted edge regions of the photo, preventing edges that cannot be properly rectified from introducing errors into the splicing.
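A minimal sketch of this cutting step, assuming the photo is held as an image array and using an assumed pixel margin around the outermost control points, is:

```python
# Minimal sketch of S43: crop the first photo to the rectangle spanned by the
# control-point coordinates plus a margin, discarding the distorted edges.
# `image` is assumed to be an H x W x C ndarray (e.g. as loaded by OpenCV).

def crop_to_control_points(image, coords, margin_px=50):
    us = [u for u, _ in coords]
    vs = [v for _, v in coords]
    h, w = image.shape[:2]
    left = max(int(min(us)) - margin_px, 0)
    right = min(int(max(us)) + margin_px, w)
    top = max(int(min(vs)) - margin_px, 0)
    bottom = min(int(max(vs)) + margin_px, h)
    return image[top:bottom, left:right]
```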
During actual shooting the camera may be tilted (not perpendicular to the horizontal plane), so the photo needs to be adjusted. The specific adjustment is as follows: corner detection is performed on the cut photo using two groups of straight-line approximations to obtain its four corner points; based on these four corner points, the image is corrected with an inverse perspective transformation, yielding the corrected image, i.e. the second photo.
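OpenCV provides the standard building blocks for this kind of correction; a minimal sketch, assuming the four corners have already been found and that the output size is chosen freely, is:

```python
# Minimal sketch of the perspective correction described above, using OpenCV.
# Corner ordering and output size are assumptions.

import cv2
import numpy as np

def rectify(cut_photo, corners, out_w=2000, out_h=1500):
    """corners: four (u, v) points ordered top-left, top-right, bottom-right, bottom-left."""
    src = np.float32(corners)
    dst = np.float32([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]])
    m = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(cut_photo, m, (out_w, out_h))
```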
Further, the step S43 further includes: acquiring the definition of the cut photo, judging whether the definition is larger than a definition threshold, if not, discarding the cut photo and re-executing the step S41; if yes, go to step S44.
In practical applications, the definition (sharpness) of the photo is judged by inputting it into a pre-trained convolutional neural network model to generate a cross-entropy loss, and regressing the loss to obtain the corresponding definition score. In practice the definition threshold is 0.72; that is, when the definition obtained in this way is less than 0.72, the photo is judged insufficiently sharp and is discarded.
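The gate itself is a simple threshold comparison. The patent's score comes from a CNN; the sketch below substitutes a classical Laplacian-variance measure purely as a stand-in, so the normalisation constant is an assumption and the 0.72 threshold then applies to this normalised proxy rather than to the CNN output:

```python
# Minimal sketch of the definition (sharpness) gate, with a classical stand-in
# score instead of the CNN described in the text.

import cv2

DEFINITION_THRESHOLD = 0.72

def definition_score(photo_bgr, norm=1000.0):
    """Rough [0, 1] sharpness proxy from the variance of the Laplacian."""
    gray = cv2.cvtColor(photo_bgr, cv2.COLOR_BGR2GRAY)
    variance = cv2.Laplacian(gray, cv2.CV_64F).var()
    return min(variance / norm, 1.0)

def keep_if_sharp(photo_bgr):
    """Return False (discard) when the score falls below the threshold."""
    return definition_score(photo_bgr) >= DEFINITION_THRESHOLD
```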
Further, the controlling the unmanned aerial vehicle to fly along a predetermined route includes: controlling the unmanned aerial vehicle to fly with a vertically undulating motion so that its flight route forms a rectangular-wave route, and shooting the mapping area at the right-angle turns of the rectangular-wave route.
In practical applications, even when the unmanned aerial vehicle flies at a nominally constant level its flying height cannot be kept perfectly constant, so the scale of the photos has a certain error. To prevent this error from affecting the splicing, the present application controls the unmanned aerial vehicle to fly with a deliberate up-and-down undulation, which makes the scale differences between the photos more pronounced; the photos are then rescaled, enabling accurate splicing.
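To make the idea concrete, a waypoint list for one survey line of such a rectangular-wave profile might be generated as in the sketch below; the two altitudes, leg length and photo-trigger convention are all assumed values:

```python
# Minimal sketch of a rectangular-wave vertical profile along one survey line.
# Each waypoint is (x, y, altitude, take_photo); photos are triggered at the
# right-angle corners where level flight meets a climb or descent.

def rectangular_wave_route(start_x, end_x, y, low_alt, high_alt, leg_m):
    waypoints = [(start_x, y, low_alt, False)]
    x, alt = start_x, low_alt
    while x < end_x:
        x = min(x + leg_m, end_x)
        waypoints.append((x, y, alt, True))      # corner at the end of a level leg
        alt = high_alt if alt == low_alt else low_alt
        waypoints.append((x, y, alt, True))      # corner at the end of a vertical leg
    return waypoints

if __name__ == "__main__":
    # Example: a 400 m line flown between 80 m and 100 m with 50 m legs (assumed).
    for wp in rectangular_wave_route(0.0, 400.0, 0.0, 80.0, 100.0, 50.0):
        print(wp)
```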
Further, the unmanned aerial vehicle is controlled to fly repeatedly for 2-4 times along the preset route.
In practical applications, because the quality requirements on the photos are high, some photos have to be discarded during splicing. The route is therefore flown repeatedly several times to ensure complete photographic coverage and to avoid gaps caused by the discarded photos.
Further, according to the distance between the image control points, the photos shot by the unmanned aerial vehicle at different heights are rescaled, and first photos with the same size proportion are correspondingly obtained.
In practical applications, the size of the first photo may be adjusted using the OpenCV resize function, or a Gaussian pyramid or Laplacian pyramid may be used to correspondingly enlarge or shrink the first photo.
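A minimal sketch of this rescaling, assuming the scale of each photo is characterised by the pixel spacing of adjacent image control points and using the OpenCV calls mentioned above, is:

```python
# Minimal sketch: bring photos taken at different heights to a common scale by
# matching their control-point spacing to a reference value (assumed below).

import cv2

def rescale_to_reference(photo, observed_spacing_px, reference_spacing_px=200.0):
    """Resize `photo` so its control-point spacing matches the reference spacing."""
    scale = reference_spacing_px / observed_spacing_px
    h, w = photo.shape[:2]
    interp = cv2.INTER_AREA if scale < 1.0 else cv2.INTER_LINEAR
    return cv2.resize(photo, (int(w * scale), int(h * scale)), interpolation=interp)

# For factor-of-two changes, a Gaussian pyramid step (cv2.pyrDown / cv2.pyrUp)
# can be used instead of resize.
```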
As shown in fig. 2, the present invention further provides an unmanned aerial vehicle mapping system, including:
an unmanned aerial vehicle self-checking module: used to control the unmanned aerial vehicle to perform a self-check and to judge whether the unmanned aerial vehicle has a fault;
a route setting module: used to set the flight route of the unmanned aerial vehicle;
an image control point detection module: used to detect the number of image control points on a first photo and to judge whether that number is larger than a preset number threshold;
a photo cutting module: used to cut the first photo according to the coordinates to obtain a corresponding cut photo;
a definition judging module: used to obtain the definition of the cut photo and to judge whether the definition is larger than a definition threshold;
a photo fusion module: used to splice the second photos to correspondingly obtain a complete image of the mapping area.
For specific limitations on an unmanned aerial vehicle mapping system, reference may be made to the limitations of an unmanned aerial vehicle mapping method hereinabove, and no further description is given here. Each of the modules in the unmanned aerial vehicle mapping system may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 3. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage medium. The computer program is executed by the processor to implement an unmanned aerial vehicle mapping method.
It will be appreciated by those skilled in the art that the structure shown in fig. 3 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein; when the processor executes the computer program, the following steps are implemented:
s1, arranging image control points in a mapping area;
s2, starting the unmanned aerial vehicle, controlling the unmanned aerial vehicle to perform self-checking, judging whether the unmanned aerial vehicle has faults, and if so, giving an alarm; if not, executing S3;
s3, controlling the unmanned aerial vehicle to take off, controlling the unmanned aerial vehicle to fly along a preset route, and shooting a mapping area to obtain a plurality of first photos containing image control points;
s4, respectively carrying out orthodontic treatment on the first photos to correspondingly obtain second photos;
and S5, splicing the second photos to correspondingly obtain a complete image of the mapping region.
In one embodiment, the disposing the image control point in the mapping area includes: and arranging image control points in a rectangular lattice mode in a mapping area.
In one embodiment, the orthorectification of the first photos includes:
s41, judging whether the number of the image control points on a plurality of first photos is larger than a preset number threshold one by one, if so, executing the step S42, and if not, discarding the first photos;
s42, extracting features of the imaging control points in the first photo, and correspondingly generating coordinates of the imaging control points on the first photo;
s43, cutting the first photo according to the coordinates to obtain a corresponding cut photo;
s44, carrying out affine transformation on the cut photo according to the coordinates to obtain a corresponding second photo.
In one embodiment, the step S43 further includes: acquiring the definition of the cut photo, judging whether the definition is larger than a definition threshold, if not, discarding the cut photo and re-executing the step S41; if yes, go to step S44.
In one embodiment, the controlling the unmanned aerial vehicle to fly along a predetermined route includes: controlling the unmanned aerial vehicle to fly with a vertically undulating motion so that its flight route forms a rectangular-wave route, and shooting the mapping area at the right-angle turns of the rectangular-wave route.
In one embodiment, the drone is controlled to fly repeatedly 2-4 times along a predetermined route.
In one embodiment, according to the distance between the image control points, the photos shot by the unmanned aerial vehicle at different heights are rescaled, and first photos with the same size proportion are correspondingly obtained.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the various embodiments provided herein may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous Link DRAM (SLDRAM), memory bus direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above description is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to the above examples, and all technical solutions belonging to the concept of the present invention belong to the protection scope of the present invention. It should be noted that modifications and adaptations to the present invention may occur to one skilled in the art without departing from the principles of the present invention and are intended to be within the scope of the present invention.

Claims (10)

1. A method of unmanned aerial vehicle mapping, comprising:
s1, arranging image control points in a mapping area;
s2, starting the unmanned aerial vehicle, controlling the unmanned aerial vehicle to perform self-checking, judging whether the unmanned aerial vehicle has faults, and if so, giving an alarm; if not, executing S3;
s3, controlling the unmanned aerial vehicle to take off, controlling the unmanned aerial vehicle to fly along a preset route, and shooting a mapping area to obtain a plurality of first photos containing image control points;
s4, respectively carrying out orthodontic treatment on the first photos to correspondingly obtain second photos;
and S5, splicing the second photos to correspondingly obtain a complete image of the mapping region.
2. The unmanned aerial vehicle mapping method of claim 1, wherein the deploying image control points in the mapping area comprises: and arranging image control points in a rectangular lattice mode in a mapping area.
3. The unmanned aerial vehicle mapping method of claim 2, wherein the orthorectification of the plurality of first photos respectively comprises:
s41, judging whether the number of the image control points on a plurality of first photos is larger than a preset number threshold one by one, if so, executing the step S42, and if not, discarding the first photos;
s42, extracting features of the imaging control points in the first photo, and correspondingly generating coordinates of the imaging control points on the first photo;
s43, cutting the first photo according to the coordinates to obtain a corresponding cut photo;
s44, carrying out affine transformation on the cut photo according to the coordinates to obtain a corresponding second photo.
4. The unmanned aerial vehicle mapping method of claim 3, wherein step S43 further comprises:
acquiring the definition of the cut photo, judging whether the definition is larger than a definition threshold, if not, discarding the cut photo and re-executing the step S41; if yes, go to step S44.
5. The unmanned aerial vehicle mapping method of claim 1, wherein controlling the unmanned aerial vehicle to fly along the predetermined route comprises:
controlling the unmanned aerial vehicle to fly with a vertically undulating motion so that its flight route forms a rectangular-wave route, and shooting the mapping area at the right-angle turns of the rectangular-wave route.
6. The unmanned aerial vehicle mapping method of claim 5, wherein the unmanned aerial vehicle is controlled to fly repeatedly 2-4 times along the predetermined route.
7. The unmanned aerial vehicle mapping method according to claim 6, wherein, according to the distance between the image control points, the photos shot by the unmanned aerial vehicle at different heights are rescaled, and first photos with the same size proportion are correspondingly obtained.
8. An unmanned aerial vehicle mapping system, comprising:
an unmanned aerial vehicle self-checking module: used to control the unmanned aerial vehicle to perform a self-check and to judge whether the unmanned aerial vehicle has a fault;
a route setting module: used to set the flight route of the unmanned aerial vehicle;
an image control point detection module: used to detect the number of image control points on a first photo and to judge whether that number is larger than a preset number threshold;
a photo cutting module: used to cut the first photo according to the coordinates to obtain a corresponding cut photo;
a definition judging module: used to obtain the definition of the cut photo and to judge whether the definition is larger than a definition threshold;
a photo fusion module: used to splice the second photos to correspondingly obtain a complete image of the mapping area.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1 to 7 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
CN202310222060.7A 2023-03-09 2023-03-09 Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer Active CN116164711B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310222060.7A CN116164711B (en) 2023-03-09 2023-03-09 Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310222060.7A CN116164711B (en) 2023-03-09 2023-03-09 Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer

Publications (2)

Publication Number Publication Date
CN116164711A true CN116164711A (en) 2023-05-26
CN116164711B CN116164711B (en) 2024-03-29

Family

ID=86416402

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310222060.7A Active CN116164711B (en) 2023-03-09 2023-03-09 Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer

Country Status (1)

Country Link
CN (1) CN116164711B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108961150B (en) * 2018-04-11 2019-05-03 西安科技大学 Photo control point method of deploying to ensure effective monitoring and control of illegal activities automatically based on unmanned plane image
CN108548525A (en) * 2018-06-14 2018-09-18 浙江鼎测地理信息技术有限公司 A method of carrying out field mapping using unmanned plane aeroplane photography
CN111860375A (en) * 2020-07-23 2020-10-30 南京科沃信息技术有限公司 Plant protection unmanned aerial vehicle ground monitoring system and monitoring method thereof

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103438869A (en) * 2013-08-28 2013-12-11 中国水利水电科学研究院 Aerial dynamic large-scale vegetation coverage acquisition system
CN105045279A (en) * 2015-08-03 2015-11-11 余江 System and method for automatically generating panorama photographs through aerial photography of unmanned aerial aircraft
CN105352481A (en) * 2015-10-23 2016-02-24 武汉苍穹电子仪器有限公司 High-precision unmanned aerial vehicle image non-control points surveying and mapping method and system thereof
CN108562279A (en) * 2018-03-06 2018-09-21 平湖市城工建设测绘设计有限责任公司 A kind of unmanned plane mapping method
CN114220029A (en) * 2021-12-14 2022-03-22 中广核太阳能开发有限公司 Detection method and device for rotary joint of groove type photo-thermal power station
CN114689030A (en) * 2022-06-01 2022-07-01 中国兵器装备集团自动化研究所有限公司 Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116772803A (en) * 2023-08-24 2023-09-19 陕西德鑫智能科技有限公司 Unmanned aerial vehicle detection method and device
CN116772803B (en) * 2023-08-24 2024-02-09 陕西德鑫智能科技有限公司 Unmanned aerial vehicle detection method and device
CN117537790A (en) * 2024-01-09 2024-02-09 深圳市国测测绘技术有限公司 Three-dimensional map mapping method, device and system based on unmanned aerial vehicle
CN117537790B (en) * 2024-01-09 2024-04-09 深圳市国测测绘技术有限公司 Three-dimensional map mapping method, device and system based on unmanned aerial vehicle

Also Published As

Publication number Publication date
CN116164711B (en) 2024-03-29

Similar Documents

Publication Publication Date Title
CN116164711B (en) Unmanned aerial vehicle mapping method, unmanned aerial vehicle mapping system, unmanned aerial vehicle mapping medium and unmanned aerial vehicle mapping computer
US10089530B2 (en) Systems and methods for autonomous perpendicular imaging of test squares
WO2021035731A1 (en) Control method and apparatus for unmanned aerial vehicle, and computer readable storage medium
CN110718137B (en) Method and device for constructing density distribution map of target object, terminal and mobile device
CN111583119B (en) Orthoimage splicing method and equipment and computer readable medium
US20190042829A1 (en) Systems and methods for autonomous perpendicular imaging of test squares
CN111246098B (en) Robot photographing method and device, computer equipment and storage medium
CN110209847A (en) Quasi real time processing method, device and storage medium on Airborne Data Classification machine
CN115311346A (en) Power inspection robot positioning image construction method and device, electronic equipment and storage medium
WO2020114433A1 (en) Depth perception method and apparatus, and depth perception device
CN111445513B (en) Plant canopy volume acquisition method and device based on depth image, computer equipment and storage medium
Dlesk et al. Structure from motion processing of analogue images captured by Rollei metric camera, digitized with various scanning resolution
CN110940318A (en) Aerial remote sensing real-time imaging method, electronic equipment and storage medium
CN111539964B (en) Plant canopy surface area acquisition method and device based on depth image, computer equipment and storage medium
Valkov et al. Calibration of digital non-metric cameras for measuring works
CN115376018A (en) Building height and floor area calculation method, device, equipment and storage medium
US20220222909A1 (en) Systems and Methods for Adjusting Model Locations and Scales Using Point Clouds
CN116518981B (en) Aircraft visual navigation method based on deep learning matching and Kalman filtering
CN117422650B (en) Panoramic image distortion correction method and device, electronic equipment and medium
WO2023127020A1 (en) Information processing system, method, and program
CN116337015A (en) Aerial photogrammetry production method and aerial photogrammetry production system without field control point
CN118172410A (en) Method and device for mapping photovoltaic inspection image to station building orthographic image
CN118037854A (en) Calibration method and device for multiple cameras, computer equipment and storage medium
CN118400607A (en) Automatic calculation-based oblique photography camera control method and related equipment
CN116664762A (en) Three-dimensional modeling method, device and equipment for real estate and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant