CN114782841B - Correction method and device based on landing pattern - Google Patents

Correction method and device based on landing pattern

Info

Publication number
CN114782841B
CN114782841B
Authority
CN
China
Prior art keywords
image
landing
unmanned aerial
aerial vehicle
geometric
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210422078.7A
Other languages
Chinese (zh)
Other versions
CN114782841A (en)
Inventor
Zhou Jiacheng (周佳澄)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Imapcloud Intelligent Technology Co ltd
Original Assignee
Guangzhou Imapcloud Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Imapcloud Intelligent Technology Co ltd filed Critical Guangzhou Imapcloud Intelligent Technology Co ltd
Priority to CN202210422078.7A priority Critical patent/CN114782841B/en
Publication of CN114782841A publication Critical patent/CN114782841A/en
Application granted granted Critical
Publication of CN114782841B publication Critical patent/CN114782841B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The application provides a correction method and device based on a landing pattern. A calibration image of the landing pattern is compared spectrally with the original image of the pattern on the basis of the Apriltag tag to obtain spectral difference information, and compared geometrically with the original image on the basis of the geometric figure elements to obtain geometric difference information. Each acquired image is then corrected spectrally and geometrically using the spectral difference information and the geometric difference information. By taking a landing pattern comprising an Apriltag tag and geometric figure elements as a standard reference, the scheme corrects the images captured by the unmanned aerial vehicle, ensures the accuracy of the spectral and geometric features of the corrected images, and avoids subsequent image-based recognition errors.

Description

Correction method and device based on landing pattern
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to a correction method and device based on landing patterns.
Background
With the development of unmanned flight control technology, unmanned aerial vehicles are now widely used in many fields. For example, an unmanned aerial vehicle may collect images of terrain, landforms, and the like, and the land, water sources, farmland, forests, and so on in a region are then analyzed on the basis of the collected images. Such analysis is generally performed by image recognition processing, including, for example, color-based recognition and graphics-based analysis.
This requires that the colors and shapes in the images acquired by the drone be consistent with reality. However, the image capturing apparatus mounted on the unmanned aerial vehicle may distort both color and geometry when capturing an image, so using the captured images directly for subsequent processing can lead to defects such as low recognition accuracy or outright recognition errors.
Disclosure of Invention
An object of the application is, for example, to provide a correction method and device based on a landing pattern that can ensure the accuracy of the corrected image in terms of color and geometric characteristics.
Embodiments of the application may be implemented as follows:
in a first aspect, the present application provides a correction method based on a landing pattern, the landing pattern including an Apriltag tag and a geometric figure element, the landing pattern being disposed on a landing platform, the method comprising:
acquiring a calibration image comprising the landing pattern, acquired at a preset position during the flight of the unmanned aerial vehicle;
performing spectrum comparison on the calibration image and an original image comprising the landing pattern based on an Apriltag label in the calibration image to obtain spectrum difference information of the calibration image;
comparing the geometric information of the calibration image with that of the original image based on the geometric figure elements, and obtaining the geometric difference information of the calibration image;
and carrying out spectrum correction on each image acquired by the unmanned aerial vehicle in the flight process based on the spectrum difference information, and carrying out geometric correction on each image based on the geometric difference information.
In an alternative embodiment, the method further comprises:
acquiring multi-frame image frames which are continuously acquired by the unmanned aerial vehicle in the landing process and comprise the landing patterns;
obtaining the current position offset of the unmanned aerial vehicle according to the information of the landing pattern in the multi-frame image frame;
and adjusting the flight angle and the flight speed of the unmanned aerial vehicle according to the position offset, so that the unmanned aerial vehicle lands to the central position of the landing platform.
In an alternative embodiment, the Apriltag tag is an RGB image;
the step of performing spectrum comparison on the calibration image and the original image comprising the landing pattern based on the Apriltag label therein, to obtain the spectrum difference information of the calibration image, comprises the following steps:
obtaining standard parameters of each color channel of the original image of the landing pattern in an RGB color space;
obtaining calibration parameters of each color channel of the calibration image in an RGB color space;
and comparing the corresponding standard parameters with the calibration parameters to obtain the spectrum difference information of the calibration image in each color channel.
In an optional embodiment, the step of comparing the geometric information of the calibration image with the geometric figure elements of the original image to obtain geometric difference information of the calibration image includes:
obtaining world coordinate values of key points calibrated by geometric figure elements in the calibration image;
converting the world coordinate values of the key points in the calibration image into rectangular coordinate values, and converting the world coordinate values of the corresponding key points in the original image into rectangular coordinate values;
converting each key point into a polar coordinate system based on the rectangular coordinate values, and obtaining the polar angle value and polar radius value of each key point in the polar coordinate system;
and calculating the geometric difference information of the calibration image from the polar angle values and polar radius values of the key points in the original image and the polar angle values and polar radius values of the corresponding key points in the calibration image.
In an alternative embodiment, the step of performing geometry correction on each image based on the geometry difference information includes:
for each image, obtaining the rectangular coordinate values of each pixel point in the image;
converting the rectangular coordinate values of each pixel point into polar coordinate values, and calculating corrected polar coordinate values from the polar coordinate values and the geometric difference information;
obtaining corrected rectangular coordinate values from the corrected polar coordinate values;
and obtaining a corrected image from the corrected rectangular coordinate values of each pixel point and the pixel value of each pixel point.
In an alternative embodiment, the position offset includes a current landing acceleration of the unmanned aerial vehicle and a 3D position of the landing platform relative to a current position of the unmanned aerial vehicle;
the step of obtaining the current position offset of the unmanned aerial vehicle according to the information of the landing pattern in the multi-frame image frame comprises the following steps:
for each image frame, extracting the Apriltag tag contained in the landing pattern in that image frame;
According to the difference information of the corresponding Apriltag labels between every two adjacent frames, calculating to obtain the landing acceleration of the unmanned aerial vehicle;
and calculating to obtain the 3D azimuth of the landing platform where the Apriltag tag is positioned relative to the current position of the unmanned plane according to the information of the Apriltag tag in the current frame image frame and the image frame of the previous preset frame image frame.
In an optional embodiment, the step of adjusting the flight angle and the flight speed of the unmanned aerial vehicle according to the position offset to make the unmanned aerial vehicle land to the central position of the landing platform includes:
adjusting the flight angle of the unmanned aerial vehicle relative to the landing platform according to the 3D azimuth;
obtaining the distance between the landing platform and the unmanned aerial vehicle according to the 3D azimuth and the current position of the unmanned aerial vehicle;
and adjusting the flight speed of the unmanned aerial vehicle according to the landing acceleration and the distance so as to enable the unmanned aerial vehicle to land to the central position of the landing platform.
In an alternative embodiment, the Apriltag tag is formed by nesting at least two Apriltag icons of different sizes, so that the drone can identify at least one of the Apriltag icons at different heights relative to the Apriltag tag.
In an alternative embodiment, the geometric figure elements in the landing pattern comprise a square at the outermost periphery, a circle inscribed in the square, and two diagonals of the square, wherein the two diagonals divide the square and the circle into a plurality of sub-areas;
the Apriltag label comprises a plurality of three-primary-color Apriltag icons, and three-primary-color Apriltag icons with three colors are arranged in each subarea, so that the unmanned aerial vehicle can perform spectral correction based on information of the three-primary-color Apriltag icons in any subarea.
In a second aspect, the present application provides a correction device based on a landing pattern, the landing pattern comprising an Apriltag tag and a geometric figure element, the landing pattern being provided on a landing platform, the device comprising:
the acquisition module is used for acquiring a calibration image comprising the landing pattern acquired at a preset position in the flight process of the unmanned aerial vehicle;
the first comparison module is used for carrying out spectrum comparison on the calibration image and an original image comprising the landing pattern based on an Apriltag label in the calibration image to obtain spectrum difference information of the calibration image;
the second comparison module is used for comparing the geometric information of the calibration image with that of the original image based on the geometric figure elements, to obtain the geometric difference information of the calibration image;
the correction module is used for carrying out spectrum correction on each image acquired by the unmanned aerial vehicle in the flight process based on the spectrum difference information, and carrying out geometric correction on each image based on the geometric difference information.
The beneficial effects of the embodiment of the application include, for example:
the application provides a correction method and device based on a landing pattern. A landing pattern comprising an Apriltag tag and geometric figure elements is arranged on a landing platform, and during the flight of the unmanned aerial vehicle a calibration image containing the landing pattern is acquired at a preset position. The calibration image is compared spectrally with the original image of the landing pattern on the basis of the Apriltag tag to obtain its spectral difference information, and compared geometrically with the original image on the basis of the geometric figure elements to obtain its geometric difference information. Each image acquired by the unmanned aerial vehicle is then corrected spectrally and geometrically using the spectral difference information and the geometric difference information. In this scheme, the landing pattern comprising the Apriltag tag and the geometric figure elements serves as a standard reference for correcting the images captured by the unmanned aerial vehicle, which ensures the accuracy of the corrected images in terms of color and geometric features and avoids the problem of subsequent image-based recognition errors.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a correction method based on landing patterns according to an embodiment of the present application;
fig. 2 is a flowchart of a landing correction method in the landing pattern-based correction method according to the embodiment of the present application;
FIG. 3 is a flow chart of sub-steps included in step S202 of FIG. 2;
FIG. 4 is a flow chart of sub-steps included in step S203 of FIG. 2;
FIG. 5 is a flow chart of sub-steps included in step S102 of FIG. 1;
FIG. 6 is a flow chart of sub-steps included in step S103 of FIG. 1;
FIG. 7 is a schematic illustration of a landing pattern according to an embodiment of the present application;
FIG. 8 is a flow chart of sub-steps included in step S104 of FIG. 1;
FIG. 9 is a second schematic diagram of a landing pattern according to an embodiment of the present application;
fig. 10 is a block diagram of an electronic device according to an embodiment of the present application;
Fig. 11 is a functional block diagram of a correction device based on a landing pattern according to an embodiment of the present application.
Reference numerals: 110-storage medium; 120-processor; 130-correction device based on a landing pattern; 131-acquisition module; 132-first comparison module; 133-second comparison module; 134-correction module; 140-communication interface.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present application, it should be noted that features in the embodiments of the present application may be combined with each other without conflict.
Referring to fig. 1, which shows a flowchart of a correction method based on a landing pattern according to an embodiment of the present application, the method may be executed by an unmanned aerial vehicle, for example by a processing device on the unmanned aerial vehicle. The landing pattern is arranged on a landing platform and comprises an Apriltag tag and geometric figure elements.
The specific flow shown in fig. 1 will be described in detail.
S101, acquiring a calibration image comprising the landing pattern, which is acquired at a preset position in the flight process of the unmanned aerial vehicle.
S102, performing spectrum comparison on the calibration image and an original image comprising the landing pattern based on the Apriltag label, and obtaining spectrum difference information of the calibration image.
S103, comparing the geometric information of the calibration image with that of the original image based on the geometric figure elements, and obtaining the geometric difference information of the calibration image.
S104, carrying out spectrum correction on each image acquired by the unmanned aerial vehicle in the flight process based on the spectrum difference information, and carrying out geometric correction on each image based on the geometric difference information.
In this embodiment, the unmanned aerial vehicle is mounted with an imaging device, such as a multispectral camera. The multispectral camera is a common load of the unmanned aerial vehicle, and the calibration of geometric parameters and spectral characteristics of the camera is a basic requirement for ensuring the accurate measurement and high-precision monitoring of the unmanned aerial vehicle.
And after the landing pattern is designed, an original image containing the landing pattern can be generated and attached to the landing platform. In addition, the electronic device generating the landing pattern can also send the original image and the related information of the original image to the unmanned aerial vehicle, so that the unmanned aerial vehicle can correct the original image as a standard.
In the flight process of the unmanned aerial vehicle, a calibration image comprising a landing pattern can be acquired at a preset position through the camera equipment. The preset position may be a plurality of different positions, for example positions of different heights, different angles with respect to the landing pattern.
The calibration image acquired by the unmanned aerial vehicle contains the landing pattern, and the landing pattern contains an Apriltag tag and geometric figure elements.
Remote-sensing images of an unmanned aerial vehicle are acquired under varying conditions of weather, illumination, sensor configuration, and so on, so the brightness values and colors of the same ground object are inconsistent from image to image. If the images go into subsequent processing, such as mosaicking, without color adjustment, then even when geometric registration is accurate and the overlapping areas match well, the tonal difference between the images on either side of the mosaic is obvious and the seam line is very prominent. Besides being unattractive, this also hampers the analysis and identification of ground-object imagery and the professional information it carries, reducing the value of the application; the images likewise lose credibility as a basis for comparing the same ground object across different periods.
To compare these images accurately, it is necessary to normalize the color reproduction of the image capturing apparatus, thereby achieving high accuracy, including high color fidelity. The process of normalizing color reproduction is called spectral calibration.
In addition, the imaging process of the imaging apparatus involves conversions between different coordinate systems: points in space are converted from the world coordinate system to the camera coordinate system, then projected onto the imaging plane, and finally the data on the imaging plane are converted to the image pixel coordinate system. Deviations in lens manufacturing accuracy and in the assembly process may introduce distortion, so the image becomes distorted. Lens distortion, for example, is classified into radial distortion and tangential distortion.
Radial distortion is distributed along the radius of the lens. It arises because light rays are bent more far from the center of the lens than near the center, an effect that is more apparent in ordinary lenses. Radial distortion mainly comprises barrel distortion and pincushion distortion.
Tangential distortion arises because the lens itself is not parallel to the camera sensor plane, i.e. the image plane; it is often the result of mounting deviations when the lens is attached to the lens module.
It follows that geometric distortion and color distortion generally occur in an image captured by an image capturing apparatus.
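To make the two distortion types concrete, the sketch below applies the widely used Brown-Conrady lens model to a normalized image point. This model and the coefficient names k1, k2 (radial) and p1, p2 (tangential) are standard in camera calibration and are offered only as an illustration, not as the model used by the patent.

```python
def distort_point(x, y, k1, k2, p1, p2):
    """Apply the Brown-Conrady distortion model to a normalized point (x, y).

    k1, k2 drive radial distortion (barrel when k1 < 0, pincushion when
    k1 > 0); p1, p2 drive tangential distortion, which appears when the
    lens plane is not parallel to the sensor plane.
    """
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# A point near the image corner shifts visibly under barrel distortion.
print(distort_point(0.8, 0.6, k1=-0.10, k2=0.01, p1=0.001, p2=0.001))
```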
Therefore, in the embodiment, the information which can be used for correction can be obtained based on the calibration image acquired by the unmanned aerial vehicle in the flight process. Since the original image containing the landing pattern is directly derived based on the generation information of the landing pattern, the original image can be taken as a standard.
The calibration image and the original image both contain an Apriltag tag, which is a visual fiducial tag that can be recognized by machine vision. The Apriltag tag can be composed of the three primary colors and carries a two-dimensional-code-like pattern with a certain coding rule, so it presents a color pattern. Spectral comparison can therefore be performed on the Apriltag tags in the calibration image and in the original image, yielding the spectral difference information of the calibration image.
In addition, the calibration image and the original image both contain geometric figure elements in the landing pattern, and the geometric figure elements can be figures or lines which show standard shapes in the landing pattern, such as circles, squares, diagonals and the like.
In this embodiment, geometric information comparison may be performed based on geometric graphic elements in the calibration image and the original image, so as to obtain geometric difference information of the calibration image.
The spectrum difference information and the geometric difference information of the calibration image shot by the unmanned aerial vehicle are also applicable to all other images shot by the unmanned aerial vehicle, such as an image of a water source shot by the unmanned aerial vehicle, an image of a farmland, an image of a highland and the like. Accordingly, the spectral and geometric difference information calculated based on the landing pattern can be used for spectral and geometric correction of images acquired by the unmanned aerial vehicle during flight, respectively.
Therefore, the accuracy of the corrected image of the unmanned aerial vehicle on the color information and the geometric information can be guaranteed, and the corrected image can be put into subsequent analysis processing.
In the correction scheme provided by the embodiment, the landing pattern comprising the Apriltag tag and the geometric figure element is used as a standard reference to realize correction of the image shot by the unmanned aerial vehicle, so that the purpose of ensuring the accuracy of the corrected image on colors and figures can be realized, and the problem of subsequent recognition errors based on the image can be avoided.
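As a structural illustration only, steps S101 to S104 can be read as the following pipeline. The function parameters and the idea of passing the four operations in as callables are our own scaffolding, since the patent does not prescribe an implementation.

```python
def correct_flight_images(calibration_image, original_image, flight_images,
                          compare_spectra, compare_geometry,
                          correct_spectrum, correct_geometry):
    """Hypothetical orchestration of steps S101-S104.

    The four callables stand for the concrete routines sketched later in
    this description; the calibration image (S101) is assumed to have been
    captured at the preset position already.
    """
    spectral_diff = compare_spectra(calibration_image, original_image)    # S102
    geometric_diff = compare_geometry(calibration_image, original_image)  # S103
    # S104: every image captured during the flight gets both corrections.
    return [correct_geometry(correct_spectrum(image, spectral_diff),
                             geometric_diff)
            for image in flight_images]
```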
In addition, the development of intelligent flight technology and payload technology has greatly extended the breadth and depth of unmanned-aerial-vehicle remote sensing, and the demand for automation and intelligence has become increasingly urgent, which has in turn given rise to drone-nest technology. An intelligent drone nest can automatically store the unmanned aerial vehicle, support remote precision take-off and landing, charge automatically and intelligently, monitor status in real time, transmit data automatically, and so on, meeting the service requirements of high-frequency, continuous, routine patrol flights.
In order to ensure safe and accurate landing of the unmanned aerial vehicle, in the embodiment, the landing pattern on the landing platform can be utilized to assist the landing of the unmanned aerial vehicle, and the flight information of the landing process of the unmanned aerial vehicle is corrected. Referring to fig. 2, the correction method provided in the present embodiment may further include the following steps:
S201, acquiring multi-frame image frames which are continuously acquired by the unmanned aerial vehicle in the landing process and comprise the landing pattern.
S202, obtaining the current position offset of the unmanned aerial vehicle according to the information of the landing pattern in the multi-frame image frame.
S203, adjusting the flight angle and the flight speed of the unmanned aerial vehicle according to the position offset, so that the unmanned aerial vehicle drops to the center position of the landing platform.
From the moment the unmanned aerial vehicle is triggered to execute the landing task, it can use the camera equipment to capture images while landing. Multiple image frames may be captured consecutively, each of which may include the landing pattern.
The Apriltag tag is a tag graphic, similar to a two-dimensional code, that follows a certain coding rule. From the coding information of the Apriltag tag, the pose of the landing pattern relative to the unmanned aerial vehicle can be calculated as a 6-degree-of-freedom parameter set covering three position components and three angle components. That is, the current positional offset of the drone includes the relative position of the drone with respect to the landing platform, which may be a 3D azimuth.
To ensure that the unmanned aerial vehicle can land safely and accurately on the landing platform, it is necessary not only to steer the unmanned aerial vehicle to descend toward the landing platform, but also to ensure that its speed on reaching the platform is extremely small or zero, so that it touches down without impact.
Therefore, in this embodiment, the flight angle and the flight speed of the unmanned aerial vehicle can be adjusted according to the calculated position offset, so that the unmanned aerial vehicle is ensured to drop to the center position of the landing platform.
In this embodiment, the landing pattern on the landing platform is used as the reference information for correcting the flight speed and the flight azimuth in the landing process, so that the unmanned aerial vehicle can be ensured to land safely and accurately.
In this embodiment, the current position offset of the unmanned aerial vehicle calculated based on the landing pattern in the multi-frame image frame includes the current landing acceleration of the unmanned aerial vehicle and the 3D azimuth of the landing platform relative to the current position of the unmanned aerial vehicle. The calculation of the position offset may be implemented as follows, please refer to fig. 3 in combination:
S2021, for each image frame, extracting the Apriltag tag contained in the landing pattern in that image frame.
S2022, calculating to obtain the landing acceleration of the unmanned aerial vehicle according to the difference information of the corresponding Apriltag labels between every two adjacent frames.
S2023, calculating to obtain the 3D azimuth of the landing platform where the Apriltag tag is located relative to the current position of the unmanned plane according to the information of the Apriltag tag in the current frame image frame and the image frame of the previous preset frame image frame.
Based on the coding information of the Apriltag tag in each frame of image frame shot by the unmanned aerial vehicle in the landing process, the real-time changing relative position information between the landing platform and the unmanned aerial vehicle can be calculated.
In addition, as the corresponding relative position can be obtained for each frame of image frame and the acquisition time difference between two adjacent frames of image frames can be obtained, the landing acceleration of the unmanned aerial vehicle in the landing process can be obtained based on the difference between the relative positions calculated by different image frames.
In this embodiment, when calculating the 3D azimuth of the landing platform relative to the current position of the unmanned aerial vehicle, if the image information of the Apriltag tag contained in the current image frame is clear, the 3D azimuth of the landing platform can generally be calculated directly from the information of the Apriltag tag in the current frame. If the image information in the current frame is not clear, the calculation may time out without producing a final result. In that case, the 3D azimuth between the unmanned aerial vehicle and the landing platform at the moment each frame was captured can be calculated from the information of the Apriltag tag in each of a preset number of image frames preceding the current frame.
Fitting is performed based on a plurality of 3D orientations corresponding to the previous preset frame, so that a fitting curve about the 3D orientations can be obtained. And 3D azimuth of the landing platform relative to the unmanned aerial vehicle when the unmanned aerial vehicle shoots the current frame image frame can be predicted based on the fitting curve.
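The following sketch shows one way these two calculations could look, assuming the per-frame relative positions and orientations have already been decoded from the Apriltag tags. The finite-difference acceleration and the per-axis polynomial fit (numpy.polyfit) are our assumptions, as the patent does not name a specific fitting method.

```python
import numpy as np

def landing_acceleration(rel_positions, timestamps):
    """Step S2022 sketch: acceleration from consecutive relative positions.

    rel_positions: (N, 3) platform position relative to the UAV per frame.
    timestamps:    (N,)  acquisition time of each frame, in seconds.
    """
    pos = np.asarray(rel_positions, dtype=float)
    t = np.asarray(timestamps, dtype=float)
    vel = np.diff(pos, axis=0) / np.diff(t)[:, None]      # inter-frame velocity
    acc = np.diff(vel, axis=0) / np.diff(t[1:])[:, None]  # inter-frame acceleration
    return acc[-1]                                        # most recent estimate

def predict_azimuth(prev_azimuths, prev_times, t_now, degree=2):
    """Step S2023 sketch: when the current frame is unclear, fit a curve to
    the 3D azimuths of the preceding preset frames and extrapolate."""
    prev = np.asarray(prev_azimuths, dtype=float)
    t = np.asarray(prev_times, dtype=float)
    return np.array([np.polyval(np.polyfit(t, prev[:, i], degree), t_now)
                     for i in range(prev.shape[1])])
```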
On the basis of obtaining the 3D azimuth of the landing platform corresponding to the current position of the unmanned aerial vehicle and the current landing acceleration of the unmanned aerial vehicle, please refer to fig. 4, in this embodiment, the landing process of the unmanned aerial vehicle may be corrected by the following manner:
and S2031, adjusting the flight angle of the unmanned aerial vehicle relative to the landing platform according to the 3D azimuth.
And S2032, obtaining the distance between the landing platform and the unmanned aerial vehicle according to the 3D azimuth and the current position of the unmanned aerial vehicle.
And S2033, adjusting the flight speed of the unmanned aerial vehicle according to the landing acceleration and the distance so as to enable the unmanned aerial vehicle to land to the center position of the landing platform.
In this embodiment, the flight angle of the unmanned aerial vehicle is adjusted according to the 3D azimuth of the landing platform relative to the unmanned aerial vehicle, so that the unmanned aerial vehicle performs landing flight towards the landing platform.
The 3D azimuth of the landing platform relative to the current position of the unmanned aerial vehicle is known, and so is the current position of the unmanned aerial vehicle, so the distance between them can be calculated. Since the current speed, the landing acceleration, and the distance to the landing platform are all known, and the unmanned aerial vehicle is expected to come to rest stably and safely on the platform, its current flight speed can be adjusted so that after covering this distance it finally settles steadily onto the landing platform without collision.
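A minimal sketch of such a speed adjustment follows. The constant-deceleration stopping rule and all numeric values are illustrative assumptions, since the patent does not specify a control law.

```python
import math

def commanded_descent_speed(distance_m, current_speed, max_decel):
    """Cap the descent speed so the UAV can still stop within `distance_m`.

    Stopping from speed v under constant deceleration a consumes
    v**2 / (2 * a) of travel, so sqrt(2 * a * d) is the largest speed from
    which a zero-velocity touchdown is still reachable.
    """
    safe_speed = math.sqrt(2.0 * max_decel * distance_m)
    return min(current_speed, safe_speed)

# 5 m above the platform, descending at 2 m/s, braking capability 1 m/s^2:
print(commanded_descent_speed(5.0, 2.0, 1.0))  # 2.0; the sqrt(10) m/s cap is not hit
```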
Therefore, in this embodiment, geometric distortion correction, spectrum correction and accurate positioning landing correction of the image shot by the unmanned aerial vehicle can be realized based on the landing pattern set on the landing platform. In the correction processing, the landing pattern is used as a reference object, so that the correction function based on the landing pattern in multiple aspects can be realized.
In this embodiment, the Apriltag label in the landing pattern is an RGB image, that is, the Apriltag label contains color information of three primary colors. When the spectrum difference information of the calibration image is obtained by performing the spectrum comparison based on the original image and the calibration image, referring to fig. 5, the method can be implemented as follows:
S1021, obtaining standard parameters of each color channel of the original image of the landing pattern in the RGB color space.
S1022, obtaining calibration parameters of each color channel of the calibration image in the RGB color space.
S1023, comparing the corresponding standard parameters with the calibration parameters to obtain the spectrum difference information of the calibration image in each color channel.
Because the Apriltag tag is composed of the three primary reference colors, every pixel point in the landing pattern has a color represented either by one of the three primary reference colors alone or by a combination of them.
In this embodiment, after the original image including the landing pattern is generated, standard parameters of each color channel of the original image in the RGB color space, that is, standard parameters of each reference color of the three primary colors, may be recorded.
After the unmanned aerial vehicle collects the calibration image, calibration parameters of each color channel of the calibration image in the RGB color space can be obtained based on the Apriltag tag in the calibration image.
If the standard parameters of the color channels in the original image differ from the calibration parameters of the color channels in the calibration image, then even where the calibration image and the original image show the same object and the same point, their colors will differ; that is, the image captured by the unmanned aerial vehicle has a color deviation.
Therefore, the calibration parameters of the calibration image can be compared with the corresponding standard parameters, so that the spectrum difference information of the calibration image in each color channel can be obtained.
On the basis, after the unmanned aerial vehicle shoots and obtains the images, the spectrum correction can be carried out on each image based on the obtained spectrum difference information. Specifically, the spectrum of each color channel in each image can be corrected based on the spectrum difference information of each color channel, so that the color standardization of the image shot by the unmanned aerial vehicle is realized.
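A compact sketch of steps S1021 to S1023 and of applying the result is given below. Treating the per-channel means over the Apriltag region as the standard and calibration parameters, and expressing the difference as a multiplicative gain, are our simplifying assumptions.

```python
import numpy as np

def spectral_difference(original_tag, calibration_tag):
    """Per-channel spectral difference between the original and calibration
    Apriltag regions, both given as HxWx3 RGB arrays (steps S1021-S1023)."""
    standard = original_tag.reshape(-1, 3).mean(axis=0)     # standard parameters
    measured = calibration_tag.reshape(-1, 3).mean(axis=0)  # calibration parameters
    return standard / measured  # gain > 1 where the camera undershoots a channel

def correct_spectrum(image, channel_gains):
    """Apply the per-channel gains to any image captured during the flight."""
    corrected = image.astype(float) * channel_gains
    return np.clip(corrected, 0, 255).astype(np.uint8)
```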
Referring to fig. 6, in this embodiment, when comparing the geometric information of the calibration image and the geometric information of the original image, the geometric difference information of the calibration image may be obtained by the following method:
S1031, obtaining world coordinate values of the key points calibrated by the geometric figure elements in the calibration image.
S1032, converting the world coordinate values of each key point in the calibration image into rectangular coordinate values, and converting the world coordinate values of each corresponding key point in the original image into rectangular coordinate values.
S1033, converting each key point into a polar coordinate system based on the rectangular coordinate values, and obtaining the polar angle value and polar radius value of each key point in the polar coordinate system.
S1034, calculating the geometric difference information of the calibration image from the polar angle values and polar radius values of the key points in the original image and the polar angle values and polar radius values of the corresponding key points in the calibration image.
As can be seen from the above, the geometric figure elements in the landing pattern may be standardized shapes or lines in the landing pattern. The geometric figure elements map out certain key points in the image; for example, the landing pattern may be the figure shown in fig. 7, in which the geometric figure elements comprise a square at the outermost periphery, a circle inscribed in the square, and two diagonals of the square, the two diagonals dividing the square and the circle into a plurality of sub-areas.
The key points calibrated by the geometric figure elements can be, for example, the corners of the square and the intersection of the diagonals.
In the calibration image captured by the imaging equipment, the world coordinate value of each key point in the world coordinate system is known. On this basis, the world coordinate values of the key points can be converted into rectangular coordinate values, that is, coordinate values in the image coordinate system. The rectangular coordinate values of the key points are then converted into the polar coordinate system, yielding the polar angle value and polar radius value of each key point.
Similarly, the world coordinate values of the key points in the original image are converted into rectangular coordinate values and then into the polar coordinate system.
Comparing the polar angle values and polar radius values of corresponding key points in the calibration image and the original image yields geometric difference information such as the distortion-correction coefficient of each key point and the deflection angle of the image. The geometric difference information of the several key points is then combined, for example by taking the mean of the per-key-point difference information, to obtain the geometric difference information of the calibration image.
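One plausible reading of steps S1031 to S1034 is sketched below. Using the diagonal intersection as a shared polar origin and reducing the per-key-point differences to a single mean scale and deflection angle are our simplifications.

```python
import numpy as np

def to_polar(points, center):
    """Rectangular key-point coordinates -> (polar angle, polar radius)."""
    d = np.asarray(points, dtype=float) - np.asarray(center, dtype=float)
    return np.arctan2(d[:, 1], d[:, 0]), np.hypot(d[:, 0], d[:, 1])

def geometric_difference(original_pts, calibration_pts, center):
    """Mean radial scale and deflection angle between corresponding key
    points of the original and calibration images (steps S1031-S1034)."""
    ang_o, rad_o = to_polar(original_pts, center)
    ang_c, rad_c = to_polar(calibration_pts, center)
    scale = float(np.mean(rad_o / rad_c))    # distortion-correction coefficient
    d_theta = float(np.mean(ang_o - ang_c))  # deflection angle of the image
    return scale, d_theta
```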
Referring to fig. 8, on the basis, when the geometric shape correction is performed on each image shot by the unmanned aerial vehicle based on the geometric difference information of the calibration image, the following manner may be implemented:
S1041, for each image, obtaining the rectangular coordinate values of each pixel point in the image.
S1042, converting the rectangular coordinate values of each pixel point into polar coordinate values, and calculating corrected polar coordinate values from the polar coordinate values and the geometric difference information.
S1043, obtaining corrected rectangular coordinate values from the corrected polar coordinate values.
S1044, obtaining a corrected image from the corrected rectangular coordinate values of each pixel point and the pixel value of each pixel point.
When correcting each image captured by the unmanned aerial vehicle with the geometric difference information, the pixel points in the image must likewise be converted into the polar coordinate system. Once the polar coordinate values of the pixel points are obtained, the corrected polar coordinate values are calculated using the geometric difference information. The corrected polar coordinate values are then converted back into the rectangular coordinate system, and the corrected image is generated by combining them with the pixel values of the pixel points, which remain unchanged.
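The per-pixel correction of steps S1041 to S1044 could look like the following forward-mapping sketch; a production implementation would normally invert the mapping and interpolate to avoid holes. The single (scale, d_theta) pair comes from the geometric_difference sketch above.

```python
import numpy as np

def correct_geometry(image, scale, d_theta):
    """Steps S1041-S1044: rectangular -> polar -> corrected polar ->
    rectangular, carrying each pixel's value over unchanged."""
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.indices((h, w))
    theta = np.arctan2(ys - cy, xs - cx)   # polar angle of every pixel
    rho = np.hypot(xs - cx, ys - cy)       # polar radius of every pixel
    rho_c = rho * scale                    # corrected polar radius
    theta_c = theta + d_theta              # corrected polar angle
    # Back to rectangular coordinates (nearest-neighbour for brevity).
    xc = np.clip(np.round(cx + rho_c * np.cos(theta_c)).astype(int), 0, w - 1)
    yc = np.clip(np.round(cy + rho_c * np.sin(theta_c)).astype(int), 0, h - 1)
    corrected = np.zeros_like(image)
    corrected[yc, xc] = image              # pixel values remain unchanged
    return corrected
```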
In this embodiment, in order to ensure that the images acquired by the unmanned aerial vehicle at different heights contain a clear Apriltag tag with complete information, the Apriltag tag may be designed as shown in fig. 9: it is formed by nesting at least two Apriltag icons of different sizes, for example two or three nested Apriltag icons of different sizes, so that the drone can identify at least one of the Apriltag icons at different heights relative to the Apriltag tag.
Thus, by nesting Apriltag icons of different sizes to form the Apriltag tag, at least one Apriltag icon remains identifiable as the recognition distance between the unmanned aerial vehicle and the landing pattern changes dynamically.
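The sketch below illustrates why the nesting works, using a pinhole-projection estimate of how many pixels each icon spans at a given height. The focal length, the minimum decodable size, and the icon dimensions are all illustrative values, not taken from the patent.

```python
def selectable_icon(height_m, icon_sides_m, focal_px=1000.0, min_px=40.0):
    """Return the side length of the smallest nested Apriltag icon that is
    still decodable at the given height (pinhole model: span ~ f * s / h)."""
    for side in sorted(icon_sides_m):            # prefer the smaller icons
        if focal_px * side / height_m >= min_px:
            return side
    return max(icon_sides_m)                     # fall back to the largest

# Nested icons of 0.1 m, 0.3 m and 0.9 m side length:
print(selectable_icon(2.0, [0.1, 0.3, 0.9]))    # low altitude  -> 0.1 m icon
print(selectable_icon(20.0, [0.1, 0.3, 0.9]))   # high altitude -> 0.9 m icon
```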
Further, when the landing pattern is as shown in fig. 7, the Apriltag tag includes a plurality of three-primary-color Apriltag icons, and three-primary-color Apriltag icons covering all three colors are provided in each sub-area, so that the drone can perform spectral correction based on the information of the three-primary-color Apriltag icons in any one sub-area.
Specifically, as shown in fig. 7, when the landing pattern is designed, the circle inscribed in the square can define a polar coordinate system. Two larger Apriltag icons may be placed on one diagonal, the distance between them being equal to the diagonal of the smaller Apriltag icons at the corners of the square, and their colors may be set to red and green respectively. Two smaller Apriltag icons may be placed on the other diagonal, the distance between them being half the length of the diagonal, and their colors may both be set to blue. Four smaller Apriltag icons can be placed at the tips of the four corners and two more within the circle, and the color assignment of the Apriltag icons in each sub-area is optimized so that every sub-area contains three-primary-color Apriltag icons covering all three colors.
It should be noted that the design of the landing pattern shown in fig. 7 is merely illustrative, and the present embodiment is not limited thereto.
Referring to fig. 10, which shows an exemplary component schematic of an electronic device according to an embodiment of the present application, the electronic device may be the unmanned aerial vehicle described above. The electronic device may include a storage medium 110, a processor 120, a correction device 130 based on a landing pattern, and a communication interface 140. In this embodiment, the storage medium 110 and the processor 120 are both located in the electronic device and are separately disposed. However, it should be understood that the storage medium 110 may also be separate from the electronic device and accessible to the processor 120 through a bus interface. Alternatively, the storage medium 110 may be integrated into the processor 120, for example, as a cache and/or general purpose registers.
The correction device 130 based on the landing pattern may be understood as the above-mentioned electronic device or its processor 120, or it may be understood as a software functional module, independent of the electronic device or the processor 120, that implements the correction method based on the landing pattern under the control of the electronic device.
As shown in fig. 11, the correction device 130 based on the landing pattern may include an acquisition module 131, a first comparison module 132, a second comparison module 133, and a correction module 134. The functions of the respective functional blocks of the drop pattern-based correction device 130 are described in detail below.
The acquiring module 131 is configured to acquire a calibration image including the landing pattern acquired at a preset position during the flight of the unmanned aerial vehicle.
It will be appreciated that the acquisition module 131 may be used to perform step S101 described above, and reference may be made to the details of the implementation of the acquisition module 131 as described above with respect to step S101.
The first comparison module 132 is configured to perform spectral comparison on the calibration image and an original image including the landing pattern based on an Apriltag tag therein, so as to obtain spectral difference information of the calibration image.
It is understood that the first comparing module 132 may be used to perform the step S102, and reference may be made to the details of the implementation of the first comparing module 132 related to the step S102.
The second comparison module 133 is configured to compare the geometric information of the calibration image with that of the original image based on the geometric figure elements, so as to obtain the geometric difference information of the calibration image.
It will be appreciated that the second comparison module 133 may be used to perform the step S103 described above, and reference may be made to the details of the implementation of the second comparison module 133 regarding the step S103 described above.
The correction module 134 is configured to perform spectral correction on each image acquired by the unmanned aerial vehicle during the flight based on the spectral difference information, and perform geometry correction on each image based on the geometric difference information.
It is understood that the correction module 134 may be used to perform the step S104 described above, and reference may be made to the details of the implementation of the correction module 134 regarding the step S104 described above.
In one possible implementation, the correction device 130 based on the landing pattern further includes an adjustment module that can be used to:
acquiring multi-frame image frames which are continuously acquired by the unmanned aerial vehicle in the landing process and comprise the landing patterns;
obtaining the current position offset of the unmanned aerial vehicle according to the information of the landing pattern in the multi-frame image frame;
and adjusting the flight angle and the flight speed of the unmanned aerial vehicle according to the position offset, so that the unmanned aerial vehicle lands to the central position of the landing platform.
In one possible implementation, the Apriltag tag is an RGB image, and the first comparing module 132 may be configured to:
obtaining standard parameters of each color channel of the original image of the landing pattern in an RGB color space;
obtaining calibration parameters of each color channel of the calibration image in an RGB color space;
and comparing the corresponding standard parameters with the calibration parameters to obtain the spectrum difference information of the calibration image in each color channel.
In one possible embodiment, the second comparison module 133 may be used to:
obtaining world coordinate values of key points calibrated by geometric figure elements in the calibration image;
converting the world coordinate values of the key points in the calibration image into rectangular coordinate values, and converting the world coordinate values of the corresponding key points in the original image into rectangular coordinate values;
converting each key point into a polar coordinate system based on the rectangular coordinate values, and obtaining the polar angle value and polar radius value of each key point in the polar coordinate system;
and calculating the geometric difference information of the calibration image from the polar angle values and polar radius values of the key points in the original image and the polar angle values and polar radius values of the corresponding key points in the calibration image.
In one possible implementation, the correction module 134 may be configured to:
for each image, obtaining the rectangular coordinate values of each pixel point in the image;
converting the rectangular coordinate values of each pixel point into polar coordinate values, and calculating corrected polar coordinate values from the polar coordinate values and the geometric difference information;
obtaining corrected rectangular coordinate values from the corrected polar coordinate values;
and obtaining a corrected image from the corrected rectangular coordinate values of each pixel point and the pixel value of each pixel point.
In one possible implementation manner, the position offset includes a current landing acceleration of the unmanned aerial vehicle and a 3D azimuth of the landing platform relative to a current position of the unmanned aerial vehicle, and the adjustment module may be configured to:
for each image frame, extracting the Apriltag tag contained in the landing pattern in that image frame;
according to the difference information of the corresponding Apriltag labels between every two adjacent frames, calculating to obtain the landing acceleration of the unmanned aerial vehicle;
and calculating the 3D azimuth of the landing platform where the Apriltag tag is located relative to the current position of the unmanned aerial vehicle from the information of the Apriltag tags in the current image frame and in a preset number of preceding image frames.
In one possible implementation, the adjustment module may be configured to:
adjusting the flight angle of the unmanned aerial vehicle relative to the landing platform according to the 3D azimuth;
obtaining the distance between the landing platform and the unmanned aerial vehicle according to the 3D azimuth and the current position of the unmanned aerial vehicle;
and adjusting the flight speed of the unmanned aerial vehicle according to the landing acceleration and the distance so as to enable the unmanned aerial vehicle to land to the central position of the landing platform.
In one possible embodiment, the Apriltag tag is formed by nesting at least two Apriltag icons of different sizes, so that the drone can identify at least one of the Apriltag icons at different heights relative to the Apriltag tag.
In one possible embodiment, the geometric figure elements in the landing pattern comprise a square at the outermost periphery, a circle inscribed in the square, and two diagonals of the square, wherein the two diagonals divide the square and the circle into a plurality of sub-areas;
the Apriltag label comprises a plurality of three-primary-color Apriltag icons, and three-primary-color Apriltag icons with three colors are arranged in each subarea, so that the unmanned aerial vehicle can perform spectral correction based on information of the three-primary-color Apriltag icons in any subarea.
The process flow of each module in the apparatus and the interaction flow between the modules may be described with reference to the related descriptions in the above method embodiments, which are not described in detail herein.
Further, an embodiment of the present application also provides a computer readable storage medium storing machine executable instructions that when executed implement the landing pattern-based correction method provided in the above embodiment.
Specifically, the computer readable storage medium can be a general-purpose storage medium, such as a removable disk or a hard disk, and when the computer program on the computer readable storage medium is executed, the above-described correction method based on the landing pattern can be performed. For the processes involved when the executable instructions on the computer readable storage medium are run, reference is made to the relevant descriptions of the method embodiments above, which are not repeated here.
In summary, according to the correction method and device based on a landing pattern provided by the embodiments of the application, a landing pattern comprising an Apriltag tag and geometric figure elements is arranged on the landing platform. During the flight of the unmanned aerial vehicle, a calibration image containing the landing pattern is acquired at a preset position; the calibration image is compared spectrally with the original image of the landing pattern on the basis of the Apriltag tag to obtain its spectral difference information, and compared geometrically with the original image on the basis of the geometric figure elements to obtain its geometric difference information. Each image acquired by the unmanned aerial vehicle is then corrected spectrally and geometrically using the spectral difference information and the geometric difference information. In this scheme, the landing pattern comprising the Apriltag tag and the geometric figure elements serves as a standard reference for correcting the images captured by the unmanned aerial vehicle, which ensures the accuracy of the corrected images in terms of color and geometric features and avoids the problem of subsequent image-based recognition errors.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any changes or substitutions easily contemplated by those skilled in the art within the scope of the present application should be included in the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A correction method based on a landing pattern, wherein the landing pattern comprises Apriltag tags and geometric figure elements, the landing pattern is arranged on a landing platform, the method comprising:
acquiring a calibration image comprising the landing pattern acquired at a preset position in the flight process of the unmanned aerial vehicle;
performing spectrum comparison on the calibration image and an original image comprising the landing pattern based on an Apriltag label in the calibration image to obtain spectrum difference information of the calibration image;
comparing the geometric information of the calibration image with that of the original image based on the geometric figure elements, and obtaining the geometric difference information of the calibration image;
performing spectrum correction on each image acquired by the unmanned aerial vehicle in the flight process based on the spectrum difference information, and performing geometric shape correction on each image based on the geometric difference information;
wherein the Apriltag tag is an RGB image, and the step of obtaining the spectrum difference information includes:
obtaining standard parameters of each color channel of the original image of the landing pattern in an RGB color space; obtaining calibration parameters of each color channel of the calibration image in an RGB color space; comparing the corresponding standard parameters with the calibration parameters to obtain spectrum difference information of the calibration image in each color channel;
the step of obtaining the geometric difference information comprises the following steps:
obtaining world coordinate values of key points calibrated by geometric figure elements in the calibration image; converting the world coordinate values of the key points in the calibration image into right-angle coordinate values, and converting the world coordinate values of the corresponding key points in the original image into right-angle coordinate values; converting each key point into a polar coordinate system based on the right-angle coordinate values, and obtaining polar coordinate values and polar diameter values of each key point in the polar coordinate system; and calculating to obtain the geometric difference information of the calibration image according to the polar coordinate value and the polar diameter value of the key point in the original image and the polar coordinate value and the polar diameter value of the corresponding key point in the calibration image.
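A minimal illustration of the two comparison steps of this claim, in Python with numpy. The function names, the use of per-channel means as the "standard parameters" and "calibration parameters", and the gain form of the difference information are assumptions made for illustration; the claim does not fix a particular statistic.

    import numpy as np

    def channel_params(image_rgb):
        # Mean of each RGB channel over the Apriltag region; here the
        # whole array stands in for the cropped tag region.
        return image_rgb.reshape(-1, 3).mean(axis=0)

    def spectral_difference(original_rgb, calibration_rgb):
        # One gain per color channel that maps the calibration image's
        # channel statistics back onto the original pattern's.
        standard = channel_params(original_rgb)
        calibrated = channel_params(calibration_rgb)
        return standard / np.maximum(calibrated, 1e-6)

    def to_polar(points_xy):
        # Rectangular (x, y) key points -> polar angle and polar radius
        # about the pattern centre (assumed to sit at the origin).
        x, y = points_xy[:, 0], points_xy[:, 1]
        return np.arctan2(y, x), np.hypot(x, y)

    def geometric_difference(original_xy, calibration_xy):
        # Per-key-point angle offset and radius scale between the
        # original pattern and the calibration image.
        ang0, rad0 = to_polar(original_xy)
        ang1, rad1 = to_polar(calibration_xy)
        return ang0 - ang1, rad0 / np.maximum(rad1, 1e-6)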
2. The landing-pattern-based correction method according to claim 1, wherein the method further comprises:
acquiring multiple image frames containing the landing pattern, captured continuously by the unmanned aerial vehicle during landing;
obtaining a current position offset of the unmanned aerial vehicle from the information of the landing pattern in the image frames; and
adjusting the flight angle and the flight speed of the unmanned aerial vehicle according to the position offset, so that the unmanned aerial vehicle lands at the central position of the landing platform.
3. The landing-pattern-based correction method according to claim 1, wherein the step of performing geometric correction on each image based on the geometric difference information comprises:
for each image, obtaining the rectangular coordinate values of each pixel in the image;
converting the rectangular coordinate values of each pixel into polar coordinate values, and calculating corrected polar coordinate values from the polar coordinate values and the geometric difference information;
obtaining corrected rectangular coordinate values from the corrected polar coordinate values; and
obtaining a corrected image from the corrected rectangular coordinate values and the pixel values of the pixels, as illustrated in the sketch following this claim.
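A sketch of the per-pixel correction described in this claim, assuming the geometric difference information reduces to a single angle offset d_theta and radius scale r_scale applied about the image centre, and using OpenCV's remap for the resampling. These simplifications are illustrative, not the patent's exact formulation.

    import cv2
    import numpy as np

    def correct_geometry(image, d_theta, r_scale):
        h, w = image.shape[:2]
        cx, cy = w / 2.0, h / 2.0
        ys, xs = np.indices((h, w), dtype=np.float32)
        # Polar coordinates of every pixel about the image centre.
        ang = np.arctan2(ys - cy, xs - cx)
        rad = np.hypot(xs - cx, ys - cy)
        # Inverse mapping: for each corrected pixel, sample from the
        # uncorrected position that the correction would move onto it.
        src_ang = ang - d_theta
        src_rad = rad / r_scale
        map_x = (cx + src_rad * np.cos(src_ang)).astype(np.float32)
        map_y = (cy + src_rad * np.sin(src_ang)).astype(np.float32)
        return cv2.remap(image, map_x, map_y, cv2.INTER_LINEAR)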
4. The landing-pattern-based correction method according to claim 2, wherein the position offset comprises a current landing acceleration of the unmanned aerial vehicle and a 3D orientation of the landing platform relative to the current position of the unmanned aerial vehicle;
and the step of obtaining the current position offset of the unmanned aerial vehicle from the information of the landing pattern in the image frames comprises:
for each image frame, extracting the Apriltag tag contained in the landing pattern of the image frame;
calculating the landing acceleration of the unmanned aerial vehicle from the difference information of the corresponding Apriltag tags between every two adjacent frames; and
calculating the 3D orientation of the landing platform on which the Apriltag tag is located, relative to the current position of the unmanned aerial vehicle, from the information of the Apriltag tag in the current image frame and in a preset number of preceding image frames; a sketch of the acceleration estimate follows this claim.
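One way the landing acceleration could be derived from the inter-frame difference of the detected tag, assuming a pinhole camera model (height Z = f·S/s for a tag of physical size S appearing s pixels wide) and a fixed frame interval dt. The model and all names here are assumptions for illustration, not the patent's exact formulas.

    import numpy as np

    def landing_acceleration(tag_widths_px, tag_width_m, focal_px, dt):
        # Height above the tag in each frame from the pinhole model.
        z = focal_px * tag_width_m / np.asarray(tag_widths_px, float)
        v = np.diff(z) / dt   # descent rate between consecutive frames
        a = np.diff(v) / dt   # landing acceleration estimate
        return a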
5. The landing-pattern-based correction method according to claim 4, wherein the step of adjusting the flight angle and the flight speed of the unmanned aerial vehicle according to the position offset so that the unmanned aerial vehicle lands at the central position of the landing platform comprises:
adjusting the flight angle of the unmanned aerial vehicle relative to the landing platform according to the 3D orientation;
obtaining the distance between the landing platform and the unmanned aerial vehicle from the 3D orientation and the current position of the unmanned aerial vehicle; and
adjusting the flight speed of the unmanned aerial vehicle according to the landing acceleration and the distance, so that the unmanned aerial vehicle lands at the central position of the landing platform, as illustrated in the sketch following this claim.
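A hypothetical braking profile for the speed adjustment of this claim: cap the descent speed at the value from which the unmanned aerial vehicle can still stop within the remaining distance at the observed deceleration (v² = 2·a·d). The limits v_max and v_min are illustrative; the patent does not specify a control law.

    def target_speed(distance_m, decel_m_s2, v_max=2.0, v_min=0.2):
        # Largest speed from which a stop within distance_m is possible
        # at deceleration decel_m_s2, clamped to illustrative limits.
        v_stop = (2.0 * abs(decel_m_s2) * max(distance_m, 0.0)) ** 0.5
        return max(v_min, min(v_max, v_stop))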
6. The landing-pattern-based correction method according to any one of claims 1 to 5, wherein the Apriltag tag is formed by nesting at least two Apriltag icons of different sizes, so that the unmanned aerial vehicle can identify at least one of the Apriltag icons at different heights relative to the Apriltag tag.
7. The landing-pattern-based correction method according to any one of claims 1 to 5, wherein the geometric figure elements in the landing pattern comprise a square at the outermost periphery, a circle inscribed in the square, and the two diagonals of the square, the two diagonals dividing the square and the circle into a plurality of sub-areas;
and the Apriltag tag comprises a plurality of primary-color Apriltag icons, each sub-area containing Apriltag icons in all three primary colors, so that the unmanned aerial vehicle can perform spectral correction based on the information of the primary-color Apriltag icons in any sub-area; a sketch of the pattern layout follows this claim.
8. A correction device based on a landing pattern, wherein the landing pattern comprises an Apriltag tag and geometric figure elements and is arranged on a landing platform, the device comprising:
an acquisition module, configured to acquire a calibration image containing the landing pattern, captured at a preset position during the flight of an unmanned aerial vehicle;
a first comparison module, configured to perform a spectral comparison between the calibration image and an original image of the landing pattern based on the Apriltag tag in the calibration image, to obtain spectral difference information of the calibration image;
a second comparison module, configured to perform a geometric-information comparison between the calibration image and the original image based on the geometric figure elements, to obtain geometric difference information of the calibration image; and
a correction module, configured to perform spectral correction, based on the spectral difference information, on each image acquired by the unmanned aerial vehicle during flight, and to perform geometric correction on each image based on the geometric difference information;
wherein the first comparison module is configured to obtain standard parameters of each color channel of the original image of the landing pattern in the RGB color space; obtain calibration parameters of each color channel of the calibration image in the RGB color space; and compare the corresponding standard parameters with the calibration parameters to obtain the spectral difference information of the calibration image in each color channel;
and the second comparison module is configured to obtain world coordinate values of key points marked by the geometric figure elements in the calibration image; convert the world coordinate values of the key points in the calibration image into rectangular coordinate values, and convert the world coordinate values of the corresponding key points in the original image into rectangular coordinate values; convert each key point into a polar coordinate system based on its rectangular coordinate values, and obtain a polar angle value and a polar radius value of each key point in the polar coordinate system; and calculate the geometric difference information of the calibration image from the polar angle values and polar radius values of the key points in the original image and those of the corresponding key points in the calibration image.
CN202210422078.7A 2022-04-21 2022-04-21 Correction method and device based on landing pattern Active CN114782841B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210422078.7A CN114782841B (en) 2022-04-21 2022-04-21 Correction method and device based on landing pattern


Publications (2)

Publication Number Publication Date
CN114782841A (en) 2022-07-22
CN114782841B (en) 2023-12-15

Family

ID=82432049

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210422078.7A Active CN114782841B (en) 2022-04-21 2022-04-21 Correction method and device based on landing pattern

Country Status (1)

Country Link
CN (1) CN114782841B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8928758B2 (en) * 2012-06-18 2015-01-06 Electronic Warfare Associates, Inc. Imaging data correction system and method
CN106651961B * 2016-12-09 2019-10-11 中山大学 Unmanned aerial vehicle calibration method and system based on a color solid calibration object
US10997448B2 (en) * 2019-05-15 2021-05-04 Matterport, Inc. Arbitrary visual features as fiducial elements

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019182521A1 (en) * 2018-03-22 2019-09-26 Infinium Robotics Pte Ltd Autonomous taking off, positioning and landing of unmanned aerial vehicles (uav) on a mobile platform
CN110400278A * 2019-07-30 2019-11-01 广东工业大学 Fully automatic correction method, device and equipment for image color and geometric distortion
CN110991207A (en) * 2019-11-19 2020-04-10 山东大学 Unmanned aerial vehicle accurate landing method integrating H pattern recognition and Apriltag two-dimensional code recognition
CN111429356A (en) * 2020-03-31 2020-07-17 北京建筑大学 Geometric registration and cutting method for ground hyperspectral image
CN111489315A (en) * 2020-04-17 2020-08-04 南京智谱科技有限公司 Spectral band position correction method and device and computing equipment
CN113093772A * 2021-04-13 2021-07-09 中国计量大学 Accurate landing method for an unmanned aerial vehicle hangar
CN114200954A (en) * 2021-10-28 2022-03-18 佛山中科云图智能科技有限公司 Apriltag-based unmanned aerial vehicle landing method, device, medium and electronic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
J. A. García-Pulido et al.; "UAV Landing Platform Recognition Using Cognitive Computation Combining Geometric Analysis and Computer Vision Techniques"; Cognitive Computation; pp. 392-412 *
Sun Fan; He Zhiping; Dai Fangxing; Ma Yanhua; "Research on correction and registration algorithms for UAV multispectral imager images"; Infrared Technology (04); pp. 5-9 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant